<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Dömötör Lugosi on Medium]]></title>
        <description><![CDATA[Stories by Dömötör Lugosi on Medium]]></description>
        <link>https://medium.com/@domotorlugosi?source=rss-c4e36681d963------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*pXr1oxVwt34DHfqOGyjmOg.jpeg</url>
            <title>Stories by Dömötör Lugosi on Medium</title>
            <link>https://medium.com/@domotorlugosi?source=rss-c4e36681d963------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sat, 16 May 2026 13:19:34 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@domotorlugosi/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Simple CI/CD for Azure Web App with Docker-compose and GitHub Actions ☁️]]></title>
            <link>https://medium.com/@domotorlugosi/simple-ci-cd-for-azure-web-app-with-docker-compose-and-github-actions-%EF%B8%8F-2fbec9995b65?source=rss-c4e36681d963------2</link>
            <guid isPermaLink="false">https://medium.com/p/2fbec9995b65</guid>
            <category><![CDATA[azure-web-app]]></category>
            <category><![CDATA[azure-app-service]]></category>
            <category><![CDATA[docker]]></category>
            <category><![CDATA[azure-webapp]]></category>
            <category><![CDATA[docker-compose]]></category>
            <dc:creator><![CDATA[Dömötör Lugosi]]></dc:creator>
            <pubDate>Tue, 21 May 2024 02:05:13 GMT</pubDate>
            <atom:updated>2024-05-22T21:50:09.640Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Gz4TYldkmn-skqlf3N0IUw.png" /></figure><p>In this article, we will explore how to set up a Continuous Integration and Continuous Deployment (CI/CD) pipeline using GitHub Actions to build and deploy a Dockerized application to an Azure Web App. The pipeline will handle both front-end and back-end builds, Docker image creation, and deployment to Azure. Let’s dive into the details.</p><h3>Prerequisites</h3><p>Before getting started, ensure you have the following:</p><ul><li>An Azure account</li><li>An Azure Container Registry (ACR)</li><li>An Azure Web App for Containers</li><li>A GitHub repository with your application code</li><li>GitHub Secrets configured with your Azure credentials</li></ul><h3>Folder Structure</h3><p>Your project should have the following structure:</p><pre>├── .github<br>│   └── workflows<br>│       └── azure-cicd-dev.yml<br>│<br>├── front<br>│   ├── Dockerfile<br>│   └── # other front-end related files<br>│<br>├── back<br>│   ├── Dockerfile<br>│   └── # other back-end related files<br>│<br>├── docker-compose-azure-dev.yml<br>└── # other project files</pre><ul><li>The front directory contains the Dockerfile and other files related to the front-end application.</li><li>The back directory contains the Dockerfile and other files related to the back-end application.</li><li>The docker-compose-azure-dev.yml file is at the root of the project and defines the services and their configurations for the Docker Compose setup.</li></ul><h3>Front-End Dockerfile (front/Dockerfile)</h3><pre># Use the official node image as the base image<br>FROM node:14-alpine<br><br># Set the working directory<br>WORKDIR /app<br><br># Copy package.json and package-lock.json<br>COPY package*.json ./<br><br># Install dependencies<br>RUN npm ci<br><br># Copy the rest of the application code<br>COPY . .<br><br># Build the application<br>RUN npm run build<br><br># Expose the port the app runs on<br>EXPOSE 3000<br><br># Command to run the app<br>CMD [&quot;npm&quot;, &quot;start&quot;]</pre><h3>Back-End Dockerfile (back/Dockerfile)</h3><pre># Use the official node image as the base image<br>FROM node:14-alpine<br><br># Set the working directory<br>WORKDIR /app<br><br># Copy package.json and package-lock.json<br>COPY package*.json ./<br><br># Install dependencies<br>RUN npm ci<br><br># Copy the rest of the application code<br>COPY . .<br><br># Expose the port the app runs on<br>EXPOSE 8080<br><br># Command to run the app<br>CMD [&quot;npm&quot;, &quot;start&quot;]</pre><h3>Docker Compose File (docker-compose-azure-dev.yml)</h3><pre>version: &#39;3.8&#39;<br><br>services:<br>  frontend:<br>    image: myacr.azurecr.io/frontend:latest<br>    ports:<br>      - &quot;80:3000&quot;<br>    environment:<br>      - NODE_ENV=dev<br><br>  backend:<br>    image: myacr.azurecr.io/backend:latest<br>    ports:<br>      - &quot;8080:8080&quot;<br>    environment:<br>      - NODE_ENV=dev</pre><h3>Setting Up the GitHub Actions Workflow</h3><p>We’ll create a GitHub Actions workflow file named azure-cicd-dev.yml in the .github/workflows directory of your repository. 
This workflow will have two main jobs: npm-build and docker-build-and-deploy.</p><h3>Workflow Dispatch Inputs</h3><p>The workflow can be manually triggered with the following inputs:</p><ul><li>run_npm_builds: A boolean to decide whether to run npm builds.</li><li>additional_string_for_docker_tag: An optional string to append to the Docker tag.</li></ul><pre>name: DEV compose build and deploy to Azure Web App<br><br>on:<br>  workflow_dispatch:<br>    inputs:<br>      run_npm_builds:<br>        description: &#39;Run npm builds&#39;<br>        required: true<br>        type: choice<br>        default: &#39;false&#39;<br>        options:<br>        - &#39;true&#39;<br>        - &#39;false&#39;<br>      additional_string_for_docker_tag:<br>        description: &#39;Additional string for Docker tag&#39;<br>        required: false<br>        type: string</pre><h3>Job 1: NPM Build</h3><p>The npm-build job runs only if run_npm_builds is set to true. It checks out the code and builds both the front-end and back-end using npm.</p><pre>jobs:<br>  npm-build:<br>    runs-on: ubuntu-latest<br>    if: ${{ github.event.inputs.run_npm_builds == &#39;true&#39; }}<br>    steps:<br>    - name: Checkout code<br>      uses: actions/checkout@v2<br>      <br>    - name: Build frontend <br>      run: |<br>       cd front<br>       npm ci<br>       npm run build<br>     <br>    - name: Build backend <br>      run: |<br>       cd back<br>       npm ci<br>       npm run build</pre><h3>Job 2: Docker Build and Deploy</h3><p>The docker-build-and-deploy job depends on the npm-build job and runs after it, or immediately if npm-build is not executed. 
This job involves several steps:</p><ol><li>Check out the code.</li><li>Log in to the Azure CLI.</li><li>Set environment variables for Docker tags and the agent’s IP address.</li><li>Whitelist the agent IP in the Azure Container Registry (ACR).</li><li>Build and push Docker images for both the front-end and back-end.</li><li>Update the Docker Compose file with the new image tags.</li><li>Deploy the Docker Compose file to Azure Web App.</li><li>Commit and push changes to the repository.</li><li>Log out of the Azure CLI.</li></ol><pre>  docker-build-and-deploy:<br>    runs-on: ubuntu-latest<br>    needs: npm-build<br>    if: ${{ github.event.inputs.run_npm_builds == &#39;false&#39; || success() }}<br>    steps:<br>    - name: Checkout code<br>      uses: actions/checkout@v3<br>      with:<br>        ref: ${{ github.ref }}<br>      <br>    - name: AZ CLI login<br>      uses: azure/login@v1<br>      with:<br>        creds: &#39;{&quot;clientId&quot;:&quot;${{ secrets.CLIENT_ID }}&quot;,&quot;clientSecret&quot;:&quot;${{ secrets.CLIENT_SECRET }}&quot;,&quot;subscriptionId&quot;:&quot;${{ secrets.SUBSCRIPTION_ID }}&quot;,&quot;tenantId&quot;:&quot;${{ secrets.TENANT_ID }}&quot;}&#39;<br>        <br>    - name: Set variables<br>      id: vars<br>      run: |<br>           echo &quot;docker_tag=$(git rev-parse --short HEAD)-$(TZ=&#39;Europe/Paris&#39; date +%Y-%m-%d-%H-%M)&quot; &gt;&gt; &quot;$GITHUB_OUTPUT&quot;<br>           echo &quot;agent_ip=$(dig +short myip.opendns.com @resolver1.opendns.com)&quot; &gt;&gt; &quot;$GITHUB_OUTPUT&quot;<br>        <br>    - name: Whitelist agent ip<br>      run: |<br>          if [ -z &quot;$(az acr network-rule list --name ${{ vars.ACR_REPO_SHORT }} --resource-group ${{ vars.ACR_REPO_RG }} | grep ${{ steps.vars.outputs.agent_ip }})&quot; ]<br>          then<br>            echo &quot;Adding agent IP ${{ steps.vars.outputs.agent_ip }} to Azure Container Registry firewall whitelist&quot;<br>            az acr network-rule add --name ${{ vars.ACR_REPO_SHORT }} --resource-group ${{ vars.ACR_REPO_RG }} --ip-address ${{ steps.vars.outputs.agent_ip }}<br>          else<br>            echo &quot;Agent is already whitelisted; skipping.&quot;<br>          fi<br>      <br>    - uses: azure/docker-login@v1<br>      with:<br>        login-server: ${{ vars.ACR_REPO }}<br>        username: ${{ secrets.DOCKERIO_USERNAME }}<br>        password: ${{ secrets.DOCKERIO_PASSWORD }}<br>    - run: |<br>        docker build -t ${{ vars.ACR_REPO }}/frontend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }} -f front/Dockerfile .<br>        docker push ${{ vars.ACR_REPO }}/frontend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}<br>        <br>        docker build -t ${{ vars.ACR_REPO }}/backend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }} -f back/Dockerfile .<br>        docker push ${{ vars.ACR_REPO }}/backend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}<br><br>    - name: Remove whitelisted agent ip<br>      if: always()<br>      run: |<br>          echo &quot;Removing agent IP ${{ steps.vars.outputs.agent_ip }} from Azure Container Registry firewall whitelist&quot;<br>          az acr network-rule remove --name ${{ vars.ACR_REPO_SHORT }} --resource-group ${{ vars.ACR_REPO_RG }} --ip-address ${{ steps.vars.outputs.agent_ip }} --only-show-errors --output none<br>        <br>    - name: Update the Docker Compose file<br>      env:<br>        FRONT_NEW_VERSION: &#39;${{ vars.ACR_REPO }}\/frontend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}&#39;<br>        BACK_NEW_VERSION: &#39;${{ vars.ACR_REPO }}\/backend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}&#39;<br>      run: |<br>       sed -i &#39;s/^ *image: .*frontend:.*$/    image: &#39;$FRONT_NEW_VERSION&#39;/&#39; docker-compose-azure-dev.yml<br>       sed -i &#39;s/^ *image: .*backend:.*$/    image: &#39;$BACK_NEW_VERSION&#39;/&#39; docker-compose-azure-dev.yml<br>       <br>    - name: Deploy to Azure Web App<br>      uses: azure/webapps-deploy@v2<br>      with:<br>        app-name: ${{ vars.DEV_APP_NAME }}<br>        configuration-file: &#39;docker-compose-azure-dev.yml&#39;<br>        <br>    - name: Commit files<br>      run: |<br>        git config --local user.email &quot;action@github.com&quot;<br>        git config --local user.name &quot;GitHub Action&quot;<br>        git commit -m &quot;Bump versions in docker-compose-azure&quot; docker-compose-azure-dev.yml<br>        <br>    - name: Push changes<br>      uses: ad-m/github-push-action@master<br>      with:<br>        github_token: ${{ secrets.PAT }}<br>        <br>    - name: AZ CLI logout<br>      if: always()<br>      run: |<br>        az logout</pre><h3>Full Code</h3><p>Here is the full code for the GitHub Actions workflow file azure-cicd-dev.yml:</p><pre>name: DEV compose build and deploy to Azure Web App<br><br>on:<br>  workflow_dispatch:<br>    inputs:<br>      run_npm_builds:<br>        description: &#39;Run npm builds&#39;<br>        required: true<br>        type: choice<br>        default: &#39;false&#39;<br>        options:<br>        - &#39;true&#39;<br>        - &#39;false&#39;<br>      additional_string_for_docker_tag:<br>        description: &#39;Additional string for Docker tag&#39;<br>        required: false<br>        type: string<br><br>jobs:<br>  npm-build:<br>    runs-on: ubuntu-latest<br>    if: ${{ github.event.inputs.run_npm_builds == &#39;true&#39; }}<br>    steps:<br>    - name: Checkout code<br>      uses: actions/checkout@v2<br>      <br>    - name: Build frontend<br>      run: |<br>       cd front<br>       npm ci<br>       npm run build<br>     <br>    - name: Build backend<br>      run: |<br>       cd back<br>       npm ci<br>       npm run build<br>  <br>  docker-build-and-deploy:<br>    runs-on: ubuntu-latest<br>    needs: npm-build<br>    if: ${{ github.event.inputs.run_npm_builds == &#39;false&#39; || success() }}<br>    steps:<br>    - name: Checkout code<br>      uses: actions/checkout@v3<br>      with:<br>        ref: ${{ github.ref }}<br>      <br>    - name: AZ CLI login<br>      uses: azure/login@v1<br>      with:<br>        creds: &#39;{&quot;clientId&quot;:&quot;${{ secrets.CLIENT_ID }}&quot;,&quot;clientSecret&quot;:&quot;${{ secrets.CLIENT_SECRET }}&quot;,&quot;subscriptionId&quot;:&quot;${{ secrets.SUBSCRIPTION_ID }}&quot;,&quot;tenantId&quot;:&quot;${{ secrets.TENANT_ID }}&quot;}&#39;<br>        <br>    - name: Set variables<br>      id: vars<br>      run: |<br>           echo &quot;docker_tag=$(git rev-parse --short HEAD)-$(TZ=&#39;Europe/Paris&#39; date +%Y-%m-%d-%H-%M)&quot; &gt;&gt; &quot;$GITHUB_OUTPUT&quot;<br>           echo &quot;agent_ip=$(dig +short myip.opendns.com @resolver1.opendns.com)&quot; &gt;&gt; &quot;$GITHUB_OUTPUT&quot;<br>        <br>    - name: Whitelist agent ip<br>      run: |<br>          if [ -z &quot;$(az acr network-rule list --name ${{ vars.ACR_REPO_SHORT }} --resource-group ${{ vars.ACR_REPO_RG }} | grep ${{ steps.vars.outputs.agent_ip }})&quot; ]<br>          then<br>            echo &quot;Adding agent IP ${{ steps.vars.outputs.agent_ip }} to Azure Container Registry firewall whitelist&quot;<br>            az acr network-rule add --name ${{ vars.ACR_REPO_SHORT }} --resource-group ${{ vars.ACR_REPO_RG }} --ip-address ${{ steps.vars.outputs.agent_ip }}<br>          else<br>            echo &quot;Agent is already whitelisted; skipping.&quot;<br>          fi<br>      <br>    - uses: azure/docker-login@v1<br>      with:<br>        login-server: ${{ vars.ACR_REPO }}<br>        username: ${{ secrets.DOCKERIO_USERNAME }}<br>        password: ${{ secrets.DOCKERIO_PASSWORD }}<br>    - run: |<br>        docker build -t ${{ vars.ACR_REPO }}/frontend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }} -f front/Dockerfile .<br>        docker push ${{ vars.ACR_REPO }}/frontend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}<br>        <br>        docker build -t ${{ vars.ACR_REPO }}/backend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }} -f back/Dockerfile .<br>        docker push ${{ vars.ACR_REPO }}/backend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}<br><br>    - name: Remove whitelisted agent ip<br>      if: always()<br>      run: |<br>          echo &quot;Removing agent IP ${{ steps.vars.outputs.agent_ip }} from Azure Container Registry firewall whitelist&quot;<br>          az acr network-rule remove --name ${{ vars.ACR_REPO_SHORT }} --resource-group ${{ vars.ACR_REPO_RG }} --ip-address ${{ steps.vars.outputs.agent_ip }} --only-show-errors --output none<br>        <br>    - name: Update the Docker Compose file<br>      env:<br>        FRONT_NEW_VERSION: &#39;${{ vars.ACR_REPO }}\/frontend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}&#39;<br>        BACK_NEW_VERSION: &#39;${{ vars.ACR_REPO }}\/backend:${{ steps.vars.outputs.docker_tag }}${{ github.event.inputs.additional_string_for_docker_tag }}&#39;<br>      run: |<br>       sed -i &#39;s/^ *image: .*frontend:.*$/    image: &#39;$FRONT_NEW_VERSION&#39;/&#39; docker-compose-azure-dev.yml<br>       sed -i &#39;s/^ *image: .*backend:.*$/    image: &#39;$BACK_NEW_VERSION&#39;/&#39; docker-compose-azure-dev.yml<br>       <br>    - name: Deploy to Azure Web App<br>      uses: azure/webapps-deploy@v2<br>      with:<br>        app-name: ${{ vars.DEV_APP_NAME }}<br>        configuration-file: &#39;docker-compose-azure-dev.yml&#39;<br>        <br>    - name: Commit files<br>      run: |<br>        git config --local user.email &quot;action@github.com&quot;<br>        git config --local user.name &quot;GitHub Action&quot;<br>        git commit -m &quot;Bump versions in docker-compose-azure&quot; docker-compose-azure-dev.yml<br>        <br>    - name: Push changes<br>      uses: ad-m/github-push-action@master<br>      with:<br>        github_token: ${{ secrets.PAT }}<br>        <br>    - name: AZ CLI logout<br>      if: always()<br>      run: |<br>        az logout</pre><h3>Summary</h3><p>This GitHub Actions workflow enables a seamless CI/CD process for deploying a Dockerized application to an Azure Web App. By setting up the necessary jobs and steps, you can automate the build, push, and deployment processes, ensuring that your application is always up-to-date and running smoothly in the cloud.</p><p>Feel free to customize this workflow further to fit your specific needs, and happy deploying! 🚢</p><p>…</p><p>🙌🏻 Thank you for reading! Your time and attention are greatly appreciated.</p><p>🚀 I’m grateful you stayed with me until the end. If you have questions or feedback about this blog, I’d love to hear from you!</p><p>💼 Connect with me on LinkedIn: <a href="https://www.linkedin.com/in/domotorlugosi/">Dömötör Lugosi</a></p><p>See you in the next post!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=2fbec9995b65" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Web Scraping with Free Proxies and Selenium]]></title>
            <link>https://medium.com/@domotorlugosi/web-scraping-with-free-proxies-and-selenium-%EF%B8%8F-c2368adb5f63?source=rss-c4e36681d963------2</link>
            <guid isPermaLink="false">https://medium.com/p/c2368adb5f63</guid>
            <category><![CDATA[selenium]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[web-crawling]]></category>
            <category><![CDATA[proxy-server]]></category>
            <category><![CDATA[web-scraping]]></category>
            <dc:creator><![CDATA[Dömötör Lugosi]]></dc:creator>
            <pubDate>Tue, 21 May 2024 01:18:11 GMT</pubDate>
            <atom:updated>2024-05-21T01:18:11.002Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*eYNxEdNCuRVS_BdbSZc_Qw.jpeg" /></figure><p>Web scraping has become an essential skill for data enthusiasts and professionals who need to extract information from websites efficiently. However, as websites become more sophisticated in detecting and blocking scraping attempts, it’s crucial to employ advanced techniques to avoid being blocked. One such technique is using proxies with Selenium. In this blog post, we’ll walk through how to use <strong>free proxies with Selenium for web scraping</strong> and share some code to get you started.</p><h3>Why Use Proxies?</h3><p>Websites often implement measures to block multiple requests from the same IP address to prevent scraping. Proxies help you mask your IP address by routing your requests through different IPs, making it harder for websites to detect and block your scraping activities.</p><h3>Setting Up Your Environment</h3><p>First, ensure you have Python installed on your machine. 
You’ll also need the following Python packages:</p><ul><li>selenium</li><li>requests</li><li>beautifulsoup4</li></ul><p>You can install these packages using pip:</p><pre>pip install selenium requests beautifulsoup4</pre><h3>The Code</h3><p>Below is a step-by-step explanation of the code to scrape a website using Selenium with free proxies.</p><h3>Step 1: Import Required Libraries</h3><pre>from selenium import webdriver<br>from selenium.webdriver.common.by import By<br>from selenium.webdriver.chrome.options import Options<br>from bs4 import BeautifulSoup<br>import time<br>import requests</pre><h3>Step 2: Define the Proxy Filtering Function</h3><p>This function fetches a list of proxies from sslproxies.org and returns them in a list.</p><pre>def filter_proxies():   <br>    response = requests.get(&#39;https://www.sslproxies.org/&#39;)<br>    soup = BeautifulSoup(response.text, &quot;html.parser&quot;)<br>    proxies = []<br>    for item in soup.select(&quot;table.table tbody tr&quot;):<br>        if not item.select_one(&quot;td&quot;):<br>            break<br>        ip = item.select_one(&quot;td&quot;).text<br>        port = item.select_one(&quot;td:nth-of-type(2)&quot;).text<br>        proxies.append(f&quot;{ip}:{port}&quot;)<br>    return proxies</pre><h3>Step 3: Create a Proxy-Enabled WebDriver</h3><p>This function configures Selenium to use a specified proxy.</p><pre>def create_proxy_driver(PROXY):<br>    options = Options()<br>    options.add_argument(f&#39;--proxy-server={PROXY}&#39;)<br>    driver = webdriver.Chrome(options=options)<br>    return driver</pre><h3>Step 4: Scrape the Target Website</h3><p>This function attempts to scrape the target website (https://index.hu) using the available proxies. 
If a proxy fails, it switches to another one.</p><pre>def get_content(ALL_PROXIES, driver):<br>    link = &quot;https://index.hu&quot;<br>    while True:<br>        try:<br>            driver.get(link)<br>            print(&quot;Successfully accessed the link&quot;)<br>            <br>            # Add your Selenium scraping code here<br><br>            break  # Exit loop if scraping is successful<br><br>        except Exception as e:<br>            print(f&quot;Error: {e}&quot;)<br>            driver.quit()<br>            if not ALL_PROXIES:<br>                print(&quot;Proxies used up&quot;)<br>                ALL_PROXIES = filter_proxies()<br>            new_proxy = ALL_PROXIES.pop()<br>            driver = create_proxy_driver(new_proxy)<br>            print(f&quot;New proxy being used: {new_proxy}&quot;)<br>            time.sleep(1)</pre><h3>Step 5: Main Function</h3><p>This section initializes the proxy list and starts the scraping process.</p><pre>if __name__ == &#39;__main__&#39;:<br>    ALL_PROXIES = filter_proxies()<br>    new_proxy = ALL_PROXIES.pop()<br>    driver = create_proxy_driver(new_proxy)<br>    get_content(ALL_PROXIES, driver)</pre><h3>Full Code</h3><p>Here is the complete code for reference:</p><pre>from selenium import webdriver<br>from selenium.webdriver.common.by import By<br>from selenium.webdriver.chrome.options import Options<br>from bs4 import BeautifulSoup<br>import time<br>import requests<br><br>link = &quot;https://index.hu&quot;<br><br>def filter_proxies():<br>    response = requests.get(&#39;https://www.sslproxies.org/&#39;)<br>    soup = BeautifulSoup(response.text, &quot;html.parser&quot;)<br>    proxies = []<br>    for item in soup.select(&quot;table.table tbody tr&quot;):<br>        if not item.select_one(&quot;td&quot;):<br>            break<br>        ip = item.select_one(&quot;td&quot;).text<br>        port = item.select_one(&quot;td:nth-of-type(2)&quot;).text<br>        proxies.append(f&quot;{ip}:{port}&quot;)<br>    return proxies<br><br>def create_proxy_driver(PROXY):<br>    options = Options()<br>    options.add_argument(f&#39;--proxy-server={PROXY}&#39;)<br>    driver = webdriver.Chrome(options=options)<br>    return driver<br><br>def get_content(ALL_PROXIES, driver):<br>    while True:<br>        try:<br>            driver.get(link)<br>            print(&quot;Successfully accessed the link&quot;)<br><br>            # Add your Selenium scraping code here<br><br>            break  # Exit loop if scraping is successful<br><br>        except Exception as e:<br>            print(f&quot;Error: {e}&quot;)<br>            driver.quit()<br>            if not ALL_PROXIES:<br>                print(&quot;Proxies used up&quot;)<br>                ALL_PROXIES = filter_proxies()<br>            new_proxy = ALL_PROXIES.pop()<br>            driver = create_proxy_driver(new_proxy)<br>            print(f&quot;New proxy being used: {new_proxy}&quot;)<br>            time.sleep(1)</pre><h3>Conclusion</h3><p>Using proxies with Selenium can significantly enhance your web scraping capabilities by preventing IP blocking and enabling access to sites that would otherwise restrict your scraping efforts. By following the steps outlined in this guide, you can build a robust web scraper that navigates these challenges effectively.</p><p>Happy scraping!</p><p>…</p><p>🙌🏻 Thank you for reading! Your time and attention are greatly appreciated.</p><p>🚀 I’m grateful you stayed with me until the end. 
If you have questions or feedback about this blog, I’d love to hear from you!</p><p>💼 Connect with me on LinkedIn: <a href="https://www.linkedin.com/in/domotorlugosi/">Dömötör Lugosi</a></p><p>See you in the next post!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=c2368adb5f63" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Ensuring Your Website’s Health: A Free and Effective Automated Monitoring Solution ]]></title>
            <link>https://medium.com/@domotorlugosi/ensuring-your-websites-health-a-free-and-effective-automated-monitoring-solution-11e4f9d93058?source=rss-c4e36681d963------2</link>
            <guid isPermaLink="false">https://medium.com/p/11e4f9d93058</guid>
            <category><![CDATA[website]]></category>
            <category><![CDATA[selenium]]></category>
            <category><![CDATA[github-actions]]></category>
            <category><![CDATA[automated-testing]]></category>
            <category><![CDATA[devops]]></category>
            <dc:creator><![CDATA[Dömötör Lugosi]]></dc:creator>
            <pubDate>Sun, 07 Jan 2024 01:13:42 GMT</pubDate>
            <atom:updated>2024-01-07T01:19:23.349Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*2Q57HM0piW7sKBjwrrHezQ.png" /></figure><p>In the fast-paced digital age, where the reliability and efficiency of websites are crucial, having a dependable and cost-effective solution to ensure your website is functioning correctly is essential. This article introduces an automated approach to monitor your website’s health using Selenium, a powerful tool for automating web browsers, and GitHub Actions, an excellent resource for continuous integration and deployment.</p><pre># TLDR Version<br>## Features<br><br>- **Automated Testing:** Regularly checks for console errors and measures page load times.<br>- **Screenshot Capturing:** Takes screenshots of the website for visual inspection.<br>- **Flexible Execution:** Set to run automatically on an hourly basis or can be triggered manually for specific URLs.<br>- **Automatic Notifications:** Sends email notifications via GitHub if any issues are detected during a run. Can be configured to send notifications through Microsoft Teams or Slack.<br>- **CI/CD Integration:** Can be integrated into continuous integration and deployment pipelines.<br><br>## Setup and Usage<br><br>1. **Fork the Repository:** <br>   Fork the https://github.com/lugosidomotor/smoke-test repository to your GitHub account<br><br>2. **Configure the Script:**<br>   Update the URL in two places in the script to point to your website. Lines 10 and 24 in .github/workflows/SMOKE_TEST.yml.<br><br>3. **GitHub Actions:**<br>   Once set up, use the Actions tab on GitHub to view the script&#39;s performance. It displays loading speed, logs, and screenshots.</pre><h4><strong>The Power of Selenium and GitHub Actions</strong></h4><p>Selenium is a versatile tool that allows you to automate web browser actions, making it perfect for testing web applications. 
By combining Selenium with GitHub Actions, a feature on GitHub that automates your workflow, you have a powerful toolkit at your fingertips. This combination allows you to automate tests and monitor your website’s performance regularly.</p><h4><strong>Getting Started with the Provided Script</strong></h4><p>I’ve developed a Python script that uses Selenium for automated website testing. <strong>This script can check for console errors, measure page load times, and take screenshots of your website.</strong> To use it, simply fork the repository at <a href="https://github.com/lugosidomotor/smoke-test">https://github.com/lugosidomotor/smoke-test</a>. After forking, you’ll need to update the URL in two places in the script to point to your website.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*FjE2PBcO5xLWRv78DJp1zw.png" /></figure><h4><strong>Monitoring Performance on GitHub</strong></h4><p>Once you’ve set up the script and pushed it to your GitHub repository, you can use the Actions tab on GitHub to view each run. It displays the loading speed, logs, and screenshots generated by the script, giving you a comprehensive view of your website’s performance.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*8Q8ob8WWwEcAlpdr5ZRWFQ.png" /></figure><h4>Automated and Manual Execution</h4><p>The true beauty of this script lies in its flexibility. It’s set up to run automatically on an hourly basis, ensuring consistent monitoring of your website. However, if you wish to check a different URL or perform an immediate test, the script can also be triggered manually, giving you control when you need it.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*R39J54lpHKW9UET-gIbuBA.png" /></figure><h4>Automatic Notifications for Errors</h4><p>When the script identifies an issue during its run, GitHub automatically sends an email notification. This feature ensures that you’re always in the loop should any problems arise. 
Additionally, the script can be easily modified to send notifications via Microsoft Teams or Slack, allowing for seamless integration into your existing communication channels.</p><h4><strong>Integration into CI/CD Pipelines</strong></h4><p>This approach isn’t just limited to monitoring; it can be integrated into your continuous integration and continuous deployment (CI/CD) pipelines. By incorporating this script into your workflow, you ensure that every update or change made to your website doesn’t compromise its performance and functionality.</p><h4><strong>Real-World Application — My E-commerce Store</strong></h4><p>In my own e-commerce store, this script has been a game-changer. It not only performs the standard checks but also goes through a test purchase with each run. Crucially, this test order is automatically cancelled afterwards, ensuring no disruptions to our inventory or sales records. This step was added after facing multiple instances where unnoticed issues prevented customers from completing purchases, a problem we no longer face thanks to this automated system.</p><h4>Conclusion</h4><p>Automating your website monitoring doesn’t just save time; it provides peace of mind. With this simple, yet effective setup, you can continuously keep an eye on your website’s health, ensuring that your visitors always have the best experience possible. This proactive approach to web management is a game-changer, especially in today’s digital-first marketplace.</p><p>…</p><p>🙌🏻 Thank you for reading! Your time and attention are greatly appreciated.</p><p>🚀 I’m grateful you stayed with me until the end. 
If you have questions or feedback about this blog, I’d love to hear from you!</p><p>💼 Connect with me on LinkedIn: <a href="https://www.linkedin.com/in/domotorlugosi/">Dömötör Lugosi</a></p><p>See you in the next post!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=11e4f9d93058" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Run Existing WordPress Site Locally With Docker — Step-by-Step Guide ]]></title>
            <link>https://medium.com/@domotorlugosi/run-existing-wordpress-site-locally-with-docker-step-by-step-guide-0d4d6d92865e?source=rss-c4e36681d963------2</link>
            <guid isPermaLink="false">https://medium.com/p/0d4d6d92865e</guid>
            <category><![CDATA[wordpress]]></category>
            <category><![CDATA[staging-environments]]></category>
            <category><![CDATA[woocommerce]]></category>
            <category><![CDATA[docker]]></category>
            <dc:creator><![CDATA[Dömötör Lugosi]]></dc:creator>
            <pubDate>Sun, 31 Dec 2023 20:28:42 GMT</pubDate>
            <atom:updated>2024-03-22T08:41:07.258Z</atom:updated>
            <content:encoded><![CDATA[<h3>Run Existing WordPress Site Locally With Docker — Step-by-Step Guide 🚀</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*cVM62iD7xqbn1wXYgEhSPA.png" /></figure><h4>Introduction</h4><p>Docker has revolutionized the way developers work with environments, including WordPress. This small script, “<a href="https://github.com/lugosidomotor/DockerLocalWordPress">Run Existing WordPress Site Locally With Docker</a>”, harnesses this technology, making local WordPress development seamless and efficient.</p><h4>Why a Local Docker Environment is Crucial</h4><p>Local environments are indispensable for safe experimentation. They allow you to test updates, plugins, and customizations without risking your live site. This approach is especially important for staging environments, where you can comprehensively test and review changes prior to implementing them in the live environment. It’s a safety net for your website’s integrity.</p><h4>🌟 Overview</h4><p>The tool sets up a local WordPress environment using Docker. 
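Under the hood, a compose file along these lines drives the two containers. This is a sketch only — the service names, port mapping, and environment variable names are assumptions; consult the repository for the actual file:

```yaml
# docker-compose.yml (sketch, not the repository's exact file)
version: "3.8"
services:
  wordpress:
    image: wordpress:latest
    container_name: wordpress
    ports:
      - "80:80"
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_NAME: ${DB_NAME}
      WORDPRESS_DB_USER: ${DB_USER}
      WORDPRESS_DB_PASSWORD: ${DB_PASSWORD}
    volumes:
      # your live site's themes, plugins, and uploads
      - ./site/wp-content:/var/www/html/wp-content
  db:
    image: mysql:8.0
    environment:
      MYSQL_DATABASE: ${DB_NAME}
      MYSQL_USER: ${DB_USER}
      MYSQL_PASSWORD: ${DB_PASSWORD}
      MYSQL_ROOT_PASSWORD: ${DB_PASSWORD}
    volumes:
      # .sql.gz dumps here are imported automatically on first start
      - ./mysqldumps:/docker-entrypoint-initdb.d
```

The official MySQL image imports any `.sql.gz` file it finds in `/docker-entrypoint-initdb.d` the first time the data directory is initialized, which is how the snapshot of your live database gets loaded.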
It mirrors your live site, providing an accurate testing ground.</p><h4>🛠️ Requirements</h4><p>You’ll need Docker and Docker Compose.</p><h4>🧠 How It Works</h4><ul><li><strong>WordPress Container:</strong> Manages your site’s files, themes, plugins, and uploads.</li><li><strong>Database Container:</strong> Handles MySQL files and database snapshots.</li></ul><h4>🔧 The Setup Process</h4><ol><li><strong>Clone/Download:</strong> Get the project files:<br><a href="https://github.com/lugosidomotor/DockerLocalWordPress">https://github.com/lugosidomotor/DockerLocalWordPress</a></li><li><strong>Database Prep:</strong> Create a mysqldump from your site’s DB and save as mysqldumps/backup.sql.gz</li><li><strong>Content Prep:</strong> Copy wp-content from your site to site/wp-content</li><li><strong>Configuration:</strong> Set variables in .env</li><li><strong>Run: </strong>In the project root, execute:</li></ol><pre>docker-compose up -d &amp;&amp; docker exec -ti wordpress &#39;/prep.sh&#39;</pre><p>6. <strong>Open in Browser:</strong> <a href="http://localhost">http://localhost</a></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*dcq9uIrSwJp-oRsvFX_33A.png" /></figure><h4>Conclusion</h4><p>The beauty of this tool lies in its staging capabilities. Test updates, new features, or any site changes in a risk-free environment. It’s ideal for developers and site administrators who prioritize stability and reliability.</p><p>…</p><p>🙌🏻 Thank you for reading! Your time and attention are greatly appreciated.</p><p>🚀 I’m grateful you stayed with me until the end. If you have questions or feedback about this blog, I’d love to hear from you!</p><p>💼 Connect with me on LinkedIn: <a href="https://www.linkedin.com/in/domotorlugosi/">Dömötör Lugosi</a></p><p>See you in the next post!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=0d4d6d92865e" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Build Your Own Website & Custom Email for Free: A Google Sites & Cloudflare Guide ]]></title>
            <link>https://medium.com/@domotorlugosi/creating-a-free-website-with-personalized-email-using-google-sites-and-cloudflare-33fb512d9835?source=rss-c4e36681d963------2</link>
            <guid isPermaLink="false">https://medium.com/p/33fb512d9835</guid>
            <category><![CDATA[website]]></category>
            <category><![CDATA[cloudflare]]></category>
            <category><![CDATA[google-sites]]></category>
            <category><![CDATA[website-development]]></category>
            <category><![CDATA[free]]></category>
            <dc:creator><![CDATA[Dömötör Lugosi]]></dc:creator>
            <pubDate>Sun, 31 Dec 2023 16:34:21 GMT</pubDate>
            <atom:updated>2024-03-04T22:11:54.179Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*sAp8QBvaVxkfkBVHD603lQ.jpeg" /></figure><p>Having a personal online presence is more accessible than ever before, and for those seeking simplicity and ease of use, Google Sites emerges as a top choice. <strong>While I am a great enthusiast of WordPress for its flexibility and extensive features, it requires continuous attention and maintenance</strong>.<br>On the other hand, <strong>most managed services come with a subscription fee</strong>. Therefore, if your goal is to establish a straightforward, swift, and hassle-free online presence, Google Sites stands out. Coupled with Cloudflare, which offers free services to manage your domain and security, you can create a website and a professional email address without any associated costs. <strong>This makes it an excellent solution for those who prefer a zero-maintenance and free solution</strong>.</p><p><strong>Step 1: Purchase a Domain </strong><br>The first step is to purchase a domain from a registrar. You can choose from a variety of service providers such as Namecheap, or if you require a country-specific domain (e.g., .hu or .ca), you would go to a local registrar in your country.</p><p><strong>Step 2: Adding Your Domain to Cloudflare</strong><br>After securing your domain name, the next step is to create a free account with Cloudflare. Once you’ve registered, you can add your domain to your Cloudflare account. This process involves updating the nameserver (NS) records at your domain registrar’s platform with the NS records provided by Cloudflare. By doing this, you transfer the management of your domain’s DNS settings to Cloudflare. 
Your domain will then appear as active on your Cloudflare dashboard, indicating that it’s ready for the next steps in configuration.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*h0PRhoNV5iC2deTisC8vQQ.png" /></figure><p><strong>Step 3: Clean Up DNS Records</strong><br>Navigate to the DNS section in your Cloudflare account and remove all existing records (using the Edit -&gt; Delete function). This is done to prevent any conflicts with previous configurations, as you will be adding fresh records that correspond to your new setup.</p><p><strong>Step 4: Setting Up Your Custom Email</strong><br>To have a personalized email address such as hello@yourdomain.com, go to the <strong>Email -&gt; Destination addresses</strong> section in Cloudflare and input an existing email address you wish to use, like your Gmail address. After this, establish a ‘Catch-All’ routing rule on the <strong>Routing rules page</strong> which will redirect all emails sent to @yourdomain.com to the email address you have verified. This step will also automatically create the necessary MX records for email routing.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*0Kspr78qrTt-SfQ1rD4RBg.png" /></figure><p>Be mindful that with this setup, you will receive all emails sent to your address, but if you wish to reply from your custom address, additional SMTP settings are required, such as those provided by services like Sendgrid or Brevo. These services also offer free options.</p><p><strong>Step 5: Create a Google Sites Webpage<br></strong>With your domain ready, proceed to create your webpage using Google Sites. 
This platform is user-friendly and requires no previous web development experience.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*xAe9s0R_AOvc74Q8-uIPZw.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*mnELezCLE-0EDopDeTMI9Q.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5QLjXIHLH1PlA8waFNhH5w.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Na83OKKpTQ_7Uy5rPmPKBg.png" /></figure><p><strong>Step 6: Publish your website</strong><br>Go to the icon with a little person and a plus sign and select “General access” -&gt; “Published site” -&gt; Public.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/727/1*A_w71rIxFH09aX2VacDkbQ.png" /></figure><p><strong>Step 7: Publishing with Custom Domain on Google Sites</strong><br>In Google Sites, go to the publish settings and select the option to use a custom domain. Here, you will need to create a TXT record for domain verification purposes and then a CNAME record (<strong>the Proxy Status should be set to DNS-only</strong>) to direct your domain to Google’s servers, following the instructions provided by Google Sites. These steps will integrate your domain with your Google Sites webpage, making it accessible to the public under your own domain name.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*q2ip6owqJ-iUgSEyeiGYKQ.png" /></figure><p><strong>Step 8: Redirect to ‘www’ Using Cloudflare Page Rules </strong><br>Since Google Sites primarily operates with the ‘www’ prefix, it’s essential to ensure that your site is reachable whether visitors type ‘www’ before your domain or not. 
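To make the goal concrete before configuring anything: every request to the bare domain should end up at the https ‘www’ address, with path and query preserved. The rule logic the Page Rules implement can be modelled with a tiny Python function (a hypothetical helper, purely for illustration):

```python
from urllib.parse import urlsplit

def redirect_target(url):
    """Return the canonical https://www.<domain> URL a request should be
    301-redirected to, or None if the URL is already canonical."""
    parts = urlsplit(url)
    host = parts.netloc
    if parts.scheme == "https" and host.startswith("www."):
        return None  # already canonical: no redirect needed
    bare = host[4:] if host.startswith("www.") else host
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    return "https://www." + bare + path
```

Whatever combination of scheme and host a visitor types, the result is always the same https ‘www’ address.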
To achieve this, you’ll need to configure Page Rules in Cloudflare to redirect all traffic consistently to the ‘www’ version of your domain.</p><p>Before setting up the Page Rules, you must first create an A record (<strong>the Proxy Status should be set to Proxied</strong>) in Cloudflare for your domain without the ‘www’. This A record can point to any IP address (8.8.8.8 is commonly used); because the record is proxied, Cloudflare answers the request at its edge, so the IP itself is never contacted. Once the A record is in place, the Page Rules for redirecting to ‘www’ should be configured as follows:</p><ol><li>*yourdomain.yourdomainending/* Forwarding URL (Status Code: 301 - Permanent Redirect, Url: https://www.yourdomain.yourdomainending) Enabled</li><li>https://yourdomain.yourdomainending/* Forwarding URL (Status Code: 301 - Permanent Redirect, Url: https://www.yourdomain.yourdomainending) Enabled</li><li>http://yourdomain.yourdomainending/* Forwarding URL (Status Code: 301 - Permanent Redirect, Url: https://www.yourdomain.yourdomainending) Enabled</li></ol><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*dJG7N9ra-xAPNrrveW22eQ.png" /></figure><p>By following these steps, you can set up a website and email domain that enhances your digital identity, all at no cost. Remember to allow some time for DNS changes to propagate across the internet. Once complete, your website and custom email will be up and running.</p><p>…</p><p>🙌🏻 Thank you for reading! Your time and attention are greatly appreciated.</p><p>🚀 I’m grateful you stayed with me until the end. 
If you have questions or feedback about this blog, I’d love to hear from you!</p><p>💼 Connect with me on LinkedIn: <a href="https://www.linkedin.com/in/domotorlugosi/">Dömötör Lugosi</a></p><p>See you in the next post!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=33fb512d9835" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Streamlining WooCommerce Analytics Without the Bloat of Plugins]]></title>
            <link>https://medium.com/@domotorlugosi/streamlining-woocommerce-analytics-without-the-bloat-of-plugins-3b0e936cca16?source=rss-c4e36681d963------2</link>
            <guid isPermaLink="false">https://medium.com/p/3b0e936cca16</guid>
            <category><![CDATA[streamlit]]></category>
            <category><![CDATA[woocommerce]]></category>
            <category><![CDATA[sales-dashboard]]></category>
            <category><![CDATA[ecommerce]]></category>
            <dc:creator><![CDATA[Dömötör Lugosi]]></dc:creator>
            <pubDate>Sun, 31 Dec 2023 00:07:41 GMT</pubDate>
            <atom:updated>2024-01-06T13:02:11.855Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*WYI7fX1OwOfciXzq7aQbPg.png" /></figure><p>Check out the project repository at <a href="https://github.com/lugosidomotor/WooCommerce-Analytics">WooCommerce Analytics on GitHub</a> and streamline your WooCommerce analytics today.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*fH8MhOy6nY7tdw47x6E4vQ.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*uLtz-2a-5gWZynFB0gONXA.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*KvPkV56Rz62xH_AMBP4HYg.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Rr6n3Ncd-gzy_mQgjkAhrQ.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*aQmddPuwMEJdXfOv1NKcJw.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*dnugvCpVGNzwSuM_Zn0Q5Q.png" /></figure><p>In the bustling e-commerce space, WooCommerce stands out as a versatile platform, powering millions of online stores. It’s flexible, open-source, and, most importantly, can be tailored to the unique needs of every store owner. However, with flexibility often comes complexity, especially when it comes to analytics. The typical solution? Plugins. But for those of us who prefer a leaner, more efficient setup, this approach doesn’t cut it. That’s why I embarked on creating a WooCommerce analytics dashboard that operates independently of the WooCommerce site — free from the weight of additional plugins.</p><h3>Why Opt for a Plugin-Free Solution?</h3><p>Every plugin you add to your WooCommerce site is a gamble. It can potentially slow down your site, open up security vulnerabilities, or just add to the visual clutter of your admin area. As someone who appreciates speed and simplicity, I found that the existing solutions were simply too cumbersome for my taste. 
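To give a taste of what “insights at a glance” boils down to, a metric like monthly sales is ultimately a simple aggregation. The sketch below uses plain Python on (date, total) pairs — in the real dashboard the equivalent figures come out of the MySQL dump and are rendered by Streamlit, and the function name here is mine, not the project’s:

```python
from collections import defaultdict
from datetime import datetime

def monthly_sales(orders):
    """Sum order totals per calendar month.

    orders: iterable of (iso_date_string, order_total) pairs,
    e.g. as fetched from the store's orders table.
    Returns a dict mapping 'YYYY-MM' to that month's revenue."""
    totals = defaultdict(float)
    for date_str, total in orders:
        month = datetime.fromisoformat(date_str).strftime("%Y-%m")
        totals[month] += total
    return dict(totals)
```

In the Streamlit app, a mapping like this can be fed straight into a chart widget such as `st.bar_chart`.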
I wanted something that could run outside of the WooCommerce environment, be lightning fast, and give me the insights I needed at a glance without any extra fluff. Thus, the WooCommerce Sales Dashboard was born.</p><h3>Introducing the WooCommerce Sales Dashboard</h3><p>The WooCommerce Sales Dashboard is a standalone application that feeds off a database dump from your WooCommerce store. Using the power of MySQL and the interactivity of Streamlit, it offers a rich, dynamic visualization of sales data. This includes:</p><ul><li>Monthly and yearly sales trends.</li><li>The ratio of returning customers.</li><li>Geographical distribution of sales.</li><li>Top-performing products and categories.</li><li>Average order value over time.</li></ul><p>And the best part? It’s all accessible through a clear and concise dashboard, minus the drag on your main WooCommerce site.</p><h3>How It Works</h3><p>The core of this solution is a set of Docker containers that handle the database and visualization components separately from your live WooCommerce site. Here’s a quick rundown:</p><ol><li>Data Dump: Export your WooCommerce database into a dump.sql file. 
This will be the foundation of your analytics.</li><li>Database Container: Using Docker, spin up a MySQL container that houses your data securely and independently.</li><li>Visualization with Streamlit: A Python-based Streamlit application reads from your database and presents the data through interactive charts and graphs.</li></ol><p>All of this operates on your local machine or a separate server, ensuring that your live site remains untouched and unencumbered by analytics processing.</p><h3>Setting It Up</h3><p>Getting started with the WooCommerce Sales Dashboard is straightforward:</p><ol><li>Clone the repository from GitHub.</li><li>Place your dump.sql file into the project directory.</li><li>Run the start.sh script to kick everything off.</li></ol><p>The provided start.sh script automates the process, setting up the containers, importing your data, and firing up the Streamlit application, which you can then access via your browser at <a href="http://localhost:8501.">http://localhost:8501.</a></p><h3>Lighter, Faster, Smarter</h3><p>The WooCommerce Sales Dashboard project embodies the principle of “less is more.” By decoupling analytics from the main site, you get a faster, more responsive, and focused experience. Your WooCommerce site remains snappy, and you get the added peace of mind from keeping your analytics processing isolated.</p><p>…</p><p>🙌🏻 Thank you for reading! Your time and attention are greatly appreciated.</p><p>🚀 I’m grateful you stayed with me until the end. If you have questions or feedback about this blog, I’d love to hear from you!</p><p>💼 Connect with me on LinkedIn: <a href="https://www.linkedin.com/in/domotorlugosi/">Dömötör Lugosi</a></p><p>See you in the next post!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=3b0e936cca16" width="1" height="1" alt="">]]></content:encoded>
        </item>
    </channel>
</rss>