More control over robots.txt from env variables.
Also, the site is missing a sitemap.xml. It should at least ship a basic default single-page sitemap.xml file.
In terms of robots.txt, the only thing I could think of would be an env variable that controls whether or not the site should be indexed.
What was your idea on that?
I want to control robots so that only Googlebot is allowed to index the site while all other crawlers are blocked.
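For illustration, a minimal robots.txt along those lines could look like this (a sketch, not necessarily what the feature would generate):

```
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
```

Crawlers match the most specific `User-agent` group that applies to them, so Googlebot follows the first group and everyone else falls through to the blanket `Disallow`.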
For sitemap.xml, at least a basic entry listing my custom domain as the main page. I believe there is already a variable in the Dockerfile called SHARE where someone would put, say, example.com, the domain where the page is hosted.
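A single-page sitemap.xml built from that value might look like the following, assuming the SHARE variable resolves to `example.com` (hypothetical here, just to show the shape):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
</urlset>
```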