Google Cloud Storage — Set up a dynamic robots.txt file in Nuxt with a public Google Cloud Storage bucket

There are different ways to manage your robots.txt file and other static files inside your project, but what if you could hand that responsibility off entirely and still have a fallback in place if something is missing?

Sounds too good to be true? I will show you how to use a public Google Cloud Storage bucket containing a custom file and a fallback file, and then how to use it inside your Nuxt project.
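The idea above can be sketched in plain Node/Nuxt-style JavaScript: fetch robots.txt over the public HTTPS URL that Google Cloud Storage exposes for public objects, and fall back to a safe default if the object is missing or the request fails. The bucket name, object name, and fallback content below are assumptions for illustration, not values from the article.

```javascript
// Sketch (assumed names): serve robots.txt from a public GCS bucket,
// with a hard-coded fallback used whenever the bucket copy is unavailable.
const FALLBACK_ROBOTS = 'User-agent: *\nDisallow:';

// Public objects in a GCS bucket are reachable over plain HTTPS
// at https://storage.googleapis.com/<bucket>/<object>.
function publicObjectUrl(bucket, objectName) {
  return `https://storage.googleapis.com/${bucket}/${objectName}`;
}

// Try the bucket first; on any error (404, network failure, ...)
// return the fallback content instead of failing the request.
async function loadRobotsTxt(bucket, objectName = 'robots.txt') {
  try {
    const res = await fetch(publicObjectUrl(bucket, objectName));
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return await res.text();
  } catch {
    return FALLBACK_ROBOTS;
  }
}
```

In a Nuxt project this function could back a server middleware registered for the `/robots.txt` path, so the file is resolved at request time rather than being baked into the build.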




Paris Nakita Kejser

DevOps Engineer, Software Architect, Software Developer, and Data Scientist; I identify as a non-binary person.
