Q&A: How to Reduce 120 Seconds of Manual Work to 3 Seconds and Repeat It 1,200 Times?

MacLoush
4 min read · Apr 22, 2023


Task intro:

I manually create scripts in JSON-LD format for a Technical SEO project, where 1,200 schema markups need to be prepared for brand webshops.

Photo by Sophi Raju on Unsplash

Test and worries:

I ran a test, manually copying the image URLs, the brand webshop information, the introduction, and so on, nine variables in total, so that one brand in one webshop would be prepared.

It took me 90–120 seconds of manual work. The problem is that manual work at this scale is very error-prone.

I can make copy-paste mistakes, I get slower after a while, and I have a deadline to prepare the scripts.

The solution:

AutoHotkey is a free, open-source scripting language for Windows that lets users easily create anything from small to complex scripts of all kinds. This will be my solution for generating the script in JSON-LD format.

Overview of the script:

This is an example of the script that has to be prepared 1,200 times. Between pairs of percent signs (%…%) I have variables that are automatically replaced each time: %company_name%, %company_url%, %company_logo%, and so on.

So the script will be generated 1,200 times, but each time it will be filled with individual information. See the snippet below:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "OnlineStore",
  "name": "%company_name%",
  "url": "%company_url%",
  "description": "%final_brand_intro%",
  "logo": "%company_logo%",
  "brand": {
    "@type": "Brand",
    "name": "%brand_name%",
    "url": "%urlinput%",
    "logo": "%brand_logo%",
    "image": "%brand_image%",
    "description": "%brand_description%"
  }
}
</script>
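To make the fill-in step concrete, here is a minimal sketch of the placeholder substitution in Python. This is only an illustration of the idea, not the author's AutoHotkey code; the template fragment and values are made up.

```python
# Sketch of replacing %placeholders% in a template, assuming the
# made-up template and values below; the real script does this in AHK.
import re

template = '"name": "%company_name%",\n"url": "%company_url%"'

values = {
    "company_name": "Example Shop",        # hypothetical value
    "company_url": "https://example.com",  # hypothetical value
}

def fill(template: str, values: dict) -> str:
    # Replace every %key% with its value; unknown keys are left intact.
    return re.sub(r"%(\w+)%", lambda m: values.get(m.group(1), m.group(0)), template)

print(fill(template, values))
```

Running this once per brand, with a fresh set of values each time, is exactly the repetition that turns one template into 1,200 filled scripts.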

On top of this script I add another line for administrative purposes: one part for the brand webshop URL and one for the date the snippet was created.

URL: %urlinput% `n`rDate: %TimeVar%`n`r
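For clarity, the same administrative header can be sketched in Python (an illustration only; `urlinput` is a hypothetical stand-in for the URL entered in the pop-up, and the date format is an assumption):

```python
# Sketch of building the administrative header line; urlinput is a
# hypothetical stand-in for the URL collected from the user.
from datetime import date

urlinput = "https://example.com/brandshop"
time_var = date.today().strftime("%d.%m.%Y")  # any date format would do

header = f"URL: {urlinput}\nDate: {time_var}\n"
print(header)
```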

I start the process with a pop-up window that asks the user to enter the brand webshop URL.

Using this URL, the script downloads the whole page into a .txt file, and regular expressions find the content that we pass into the previously mentioned variables.

; Ask the user for the brand webshop URL
InputBox, urlinput, Enter the BrandShop URL

; Download the page's HTML source to a temporary file on the desktop
UrlDownloadToFile, %urlinput%, %A_Desktop%\brandshoptemp.txt

Sometimes it is necessary to trim the results, either from the left side of the text or from the end, but you will see how from the syntax.
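As a rough illustration, trimming a fixed number of characters from either side (what AutoHotkey's StringTrimLeft and StringTrimRight do) corresponds to simple slicing in Python; the sample strings below are made up:

```python
# Sketch of left/right trimming as done by StringTrimLeft / StringTrimRight.
match = 'html-to-react-id">Ansell'      # hypothetical regex match

brand_name = match[18:]                 # drop the 18-character marker prefix
paragraph = "<p>Some intro text</p>"    # hypothetical matched paragraph
description = paragraph[3:-4]           # drop the <p> prefix and </p> suffix
```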

So what are regular expressions? In short, I provide a fixed starting point followed by a rule, and together they pull one specific piece of information out of a big pool of text.

The pool is the .txt file the URL was downloaded into. If you make the script open the file, you will see it contains the HTML code of the website. From this code we are looking for a certain piece of code or information. Take a look at the snippet below.

; Read the downloaded page source into the variable "filevar"
FileRead, filevar, %A_Desktop%\brandshoptemp.txt

; Company logo: fixed URL base followed by a variable file name
RegExMatch(filevar, "https://images.kkeu.de/is/content/BEG/logo-claim-([\w\.-]+)", company_logo)
Sleep, 500

; Brand logo: base ending in FLA plus a variable tail
RegExMatch(filevar, "https://images.kkeu.de/is/content/BEG/FLA([\w?$]+)([\w?$]+)", brand_logo)
Sleep, 500

; Brand image: base ending in HGB plus a variable tail
RegExMatch(filevar, "https://images.kkeu.de/is/image/BEG/HGB([\w?$]+)", brand_image)
Sleep, 500

; Shop name: text right after the html-to-react-id marker
; (the doubled quote "" is AHK v1's escape for a literal quote)
RegExMatch(filevar, "html-to-react-id"">([\w?$ ]+)", shop_name)
StringTrimLeft, brand_name, shop_name, 18  ; drop the 18-character marker prefix
Sleep, 500

; Brand description: first paragraph, then strip the <p> and </p> tags
RegExMatch(filevar, "<p>.*?</p>", paragraph)
StringTrimLeft, newparagraph, paragraph, 3
StringTrimRight, brand_description, newparagraph, 4
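For readers more at home in Python, the same extraction logic looks roughly like this. It is only a sketch run against a made-up HTML fragment, not the real page source, and it covers just two of the variables:

```python
# Sketch of the extraction step in Python; the HTML below is a made-up
# stand-in for the downloaded page source.
import re

filevar = (
    '<img src="https://images.kkeu.de/is/content/BEG/logo-claim-shop.png">'
    "<p>Protective gloves for industry.</p>"
)

# Fixed base plus a variable tail, like the AHK patterns above.
m = re.search(r"https://images\.kkeu\.de/is/content/BEG/logo-claim-[\w.\-]+", filevar)
company_logo = m.group(0) if m else ""

# First paragraph; capturing the inner text replaces the trim commands.
m = re.search(r"<p>(.*?)</p>", filevar)
brand_description = m.group(1) if m else ""
```

Note that capturing groups make the StringTrimLeft/StringTrimRight steps unnecessary in Python, since the group already excludes the surrounding tags.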

FileRead reads the saved .txt file with the downloaded code and stores it in a variable called “filevar”.

In every RegExMatch command I first pass the pool of information (filevar), then the pattern: the solid base of the code that stays the same, plus the part that changes across all the websites and brand webshops.

The last argument of the command is another variable that saves the filtered data. I reference it later by wrapping it in percent signs (%…%).

Look at this example of fetching the brand logo image in four different webshops:

1. https://images.kkeu.de/is/content/BEG/FLA_LOaltrex_00_00_00_6721258

2. https://images.kkeu.de/is/content/BEG/FLA_LOasecos_00_00_00_7306081

3. https://images.kkeu.de/is/content/BEG/FLA_LOanke_00_00_00_7306080

4. https://images.kkeu.de/is/content/BEG/FLA_Ansell_9748641

The solid base of the information we are looking for has two parts. The first is fixed: https://images.kkeu.de/is/content/BEG/FLA. The second part always changes, as each brand webshop has its own image, and therefore its own unique URL for that brand image.
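A quick way to see that one pattern with a fixed base and a variable tail covers all four URLs is to run it against the list (a Python sketch mirroring the AHK pattern above):

```python
# Check that one pattern (fixed base + variable tail) matches every
# example URL from the list above.
import re

urls = [
    "https://images.kkeu.de/is/content/BEG/FLA_LOaltrex_00_00_00_6721258",
    "https://images.kkeu.de/is/content/BEG/FLA_LOasecos_00_00_00_7306081",
    "https://images.kkeu.de/is/content/BEG/FLA_LOanke_00_00_00_7306080",
    "https://images.kkeu.de/is/content/BEG/FLA_Ansell_9748641",
]

pattern = re.compile(r"https://images\.kkeu\.de/is/content/BEG/FLA\w+")
matches = [bool(pattern.fullmatch(u)) for u in urls]
```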

Once we have all the variables saved, we open another .txt file and write the information into the script template, which you can see in the first code snippet of this article.

At the end of the script, we delete the .txt files we downloaded the URLs into and show a message box saying the process is complete.

; Remove the temporary files created during the run
FileDelete, %A_Desktop%\brandshoptemp.txt
Sleep, 500
FileDelete, %A_Desktop%\generalshopintro.txt
Sleep, 500

; Notify the user that the run has finished
MsgBox, Completed the BrandMarkup creation
return

Takeaway:

I am not a process or desktop automation specialist, yet autohotkey.com, the forums, and YouTube videos seem to provide good enough material to start.

The code might seem clumsy, but it works, and with it I can save a lot of time.

If you want the script, message me and I will forward it to you.

If you want to see the script in action, check out the video below:

If you want to check out another AHK script:


MacLoush

Tips on Desktop Automation | Digital Marketing | Parenting | Savings | Investment and Corporate Life | Subscribe: https://medium.com/@macloush/subscribe