Make robots.txt & sitemap.xml files available from root directory

Hi there,

For Google SEO requirements, I’m trying to make the robots.txt & sitemap.xml files available from the root directory of my application, so they can be accessed online at this URL: mydomain.com/robots.txt

I keep asking Google AI Studio to do this, but it encounters endless runtime errors when trying to fix the “Cannot GET /robots.txt” error:

User

When trying to access it, there is an error: Cannot GET /robots.txt

Gemini 3 Pro Preview (ran for 71s; thought for 68 seconds)

To resolve the “Cannot GET /robots.txt” error and ensure your SEO files are accessible from the root domain, I will update vercel.json to explicitly route requests for /robots.txt and /sitemap.xml to their locations in the public directory. I will also update the robots.txt content to point to the clean root URL of the sitemap.

  1. vercel.json: Added rewrite rules to map /robots.txt to /public/robots.txt and /sitemap.xml to /public/sitemap.xml.

  2. public/robots.txt: Updated the sitemap reference to use the cleaner URL.
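For reference, the rewrite rules described in step 1 would look roughly like this in vercel.json (a sketch; the exact destination paths depend on your project layout):

```json
{
  "rewrites": [
    { "source": "/robots.txt", "destination": "/public/robots.txt" },
    { "source": "/sitemap.xml", "destination": "/public/sitemap.xml" }
  ]
}
```

Step 2 then amounts to pointing the Sitemap line in public/robots.txt at the clean root URL, e.g. `Sitemap: https://mydomain.com/sitemap.xml`.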

Edited 2 files:

  - vercel.json ✓
  - public/robots.txt ✓

User

When trying to access it, there is still the same error: Cannot GET /robots.txt.
Let me know if there is anything to configure on the Google Cloud Run container side as well.

Gemini 3 Pro Preview (ran for 149s; thought for 142 seconds)

To resolve the “Cannot GET /robots.txt” error on Google Cloud Run, we need to ensure the container runs a web server that explicitly handles routing for your SEO files (robots.txt, sitemap.xml) and falls back to your Single Page Application (SPA) for all other routes.

The default behaviors of some static file servers might not map /robots.txt to the file inside your public folder automatically, or they might not handle the SPA fallback correctly.

Here is the solution:

  1. Create a server.js file: This is a simple Express.js server that will handle the routing logic. It prioritizes serving your SEO files from the public directory, then serves static assets, and finally handles the SPA fallback (sending index.html for unknown routes).

  2. Create/Update Dockerfile: This configures your Cloud Run container to install express and run the new server.js.

By using this server.js as your entry point, you ensure that these files are served correctly.

Edited 2 files:

  - server.js ✓
  - Dockerfile ✗ (an internal error occurred)

User

An internal error occurred when you were creating the Dockerfile. Finish what you were doing.

Gemini 3 Pro Preview (ran for 49s; thought for 46 seconds)

To ensure your application runs correctly on Google Cloud Run and serves the robots.txt file properly, we need to containerize the application.

Since the previous attempt to create the Dockerfile was interrupted, I will provide the complete Dockerfile and a package.json file. The package.json is essential to define the express dependency required by your server.js.

Here are the files:
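For reference, a minimal package.json along the lines described might look like this (the package name and the express version pin are assumptions):

```json
{
  "name": "my-app",
  "private": true,
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.19.0"
  }
}
```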

Edited 2 files:

  - package.json ✓
  - Dockerfile ✗ (an internal error occurred)

User

There was an unexpected error. Finish what you were doing.

Gemini 3 Pro Preview (ran for 89s; thought for 82 seconds)

To resolve the “Cannot GET /robots.txt” error on Google Cloud Run, we need to ensure your container runs a web server that knows exactly where to find your files.

Since you are using an index.html with importmap (a no-build setup), we need a simple Node.js server to serve your files and handle the routing for robots.txt and sitemap.xml specifically.

Here are the 3 files you need to add/update to make this work on Cloud Run:

  1. server.js: The web server script.

  2. package.json: Defines the project and dependencies (express).

  3. Dockerfile: Instructions for Cloud Run to build your container.
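Since the Dockerfile is the file that never got created, here is a sketch of what a minimal one for this setup could look like (assumptions: server.js and package.json live in the repo root, and the server reads the PORT environment variable that Cloud Run injects):

```dockerfile
FROM node:20-slim
WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install --omit=dev

# Copy the rest of the app (index.html, public/, server.js, ...)
COPY . .

# Cloud Run sends traffic to $PORT (defaults to 8080)
ENV PORT=8080
EXPOSE 8080

CMD ["node", "server.js"]
```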

Edited 3 files:

  - server.js ✓
  - package.json ✓
  - Dockerfile ✗ (an internal error occurred)

For now, I have only been able to make the robots.txt & sitemap.xml files available from the /public directory, not the root directory.

My app is already online, directly published from Google AI Studio to a Cloud Run service that is mapped to mydomain.

Why does Google AI Studio always run into an error when trying to create/update the Dockerfile?

Is there anything to do on the container side as well? If so, what should be done exactly (a step-by-step guide would be welcome at this point)?

Thanks in advance for any help regarding this issue!

Best regards,

Arthur

Can someone provide any help please?

Apparently it’s a known bug in Google AI Studio…

Does Google plan to fix it anytime soon?!