How I downloaded 244 photos from xtreamyouth.com in 1 minute with 20 lines of code.

Background
Xtreamyouth.com is a very popular Sri Lankan event photography service. They shoot events for schools, universities, and the like, and publish the photos on their website afterwards so anyone can view them.
Problem
The website does not allow right-clicking: the context menu is blocked by the developers using JavaScript. The site is also very cluttered and hard to work with. The only way to download a photo is the download button the website provides, one image at a time. The album I wanted to download had about 250 photos in it, so there was no way I was going to download them one at a time. Let the hacking begin :)
Solution
1. Solution #1 (Did not work)
I tried disabling JavaScript in Chrome, but the photo album is rendered with the fancybox plugin for Joomla, so disabling JavaScript made the website unusable.

2. Solution #2 (Worked) :)
This time I got my hands dirty by writing a teeny-weeny bit of code. I used NodeJS and the classic wget command on Linux for this project.
The folks at xtreamyouth.com store images on separate domains called content-a-xy.net and content-b-xy.net, and the URL pattern looks something like this:
http://www.content-b-xy.net/albums/mcsocial-2018-da/XY_DA_0001.jpg
http://www.content-b-xy.net/albums/mcsocial-2018-da/XY_DA_0002.jpg
Do you notice the pattern?

Interestingly, the above links will land you on images from the 2018 social day (whatever that is) xD of Musaeus College.
So all that has to be done is to change the ‘0001’, ‘0002’ part programmatically, right?
I wrote the following code.
What does it do?
- It imports NodeJS file system module into the project.
- Initialises some variables.
- Creates the URL.
- Pads the number using a promise, so 1 -> 0001 and 2 -> 0002.
- Appends the URL to a text file called url_set.txt.
- Console logs the output so I can see if something breaks.
- Outputs errors, if any.
- Loops this cycle 244 times.
So I ended up getting a file like this,
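Reconstructed from the URL pattern above, the file looks like:

```
http://www.content-b-xy.net/albums/mcsocial-2018-da/XY_DA_0001.jpg
http://www.content-b-xy.net/albums/mcsocial-2018-da/XY_DA_0002.jpg
...
http://www.content-b-xy.net/albums/mcsocial-2018-da/XY_DA_0244.jpg
```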

Bingo! Now I have all the URLs.
Using Wget
Then I typed this:

wget -i url_set.txt -p

The above command reads url_set.txt and downloads every image on the list over HTTP, saving them while preserving the directory structure.
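For what it's worth, the same URL list can also be produced with a short shell loop instead of NodeJS; the path and the count of 244 come from the post, the rest is a sketch:

```shell
# Generate url_set.txt with printf zero-padding (1 -> 0001, etc.),
# then sanity-check the line count before handing it to wget.
for i in $(seq 1 244); do
  printf 'http://www.content-b-xy.net/albums/mcsocial-2018-da/XY_DA_%04d.jpg\n' "$i"
done > url_set.txt

wc -l < url_set.txt   # should print 244
```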

Drum roll please…
Now all the images are stored inside a folder in my computer.

PS
I deleted all the images later.