The default robots.txt installed in the Dataverse Docker image disallows all crawling of the site. It would be helpful to be able to configure the image with a customized robots.txt similar to the one we use in production.
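As a stopgap while the file is baked into the image, one option is to bind mount a custom robots.txt over the default at container start. This is only a sketch: the service name, the container path, and the permissive robots.txt below are assumptions, not the image's confirmed layout.

```yaml
# docker-compose.override.yml -- illustrative only; the service name and the
# container path are assumptions and would need to match the actual image.
services:
  dataverse:
    volumes:
      - ./conf/robots.txt:/opt/payara/appserver/glassfish/domains/domain1/docroot/robots.txt:ro
```

```text
# conf/robots.txt -- a permissive allow-all policy for crawler testing;
# replace with the rules actually used in production.
User-agent: *
Disallow:
```

A build-time COPY into the image, or an image configuration option that selects which robots.txt gets installed, would achieve the same thing more cleanly.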
This issue was encountered while testing the SPA: we need to verify that the JSON-LD added to the page header is accessible to web crawlers.
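For that test, a rough manual check against a running container could look like the following. The base URL and persistentId are placeholders, and the grep simply confirms that an `application/ld+json` script block is present in the served HTML.

```bash
# Base URL of the local Dataverse container (placeholder).
BASE=http://localhost:8080

# 1. Confirm robots.txt no longer disallows everything.
curl -s "$BASE/robots.txt"

# 2. Fetch a dataset page with a crawler user agent and check for JSON-LD.
#    The persistentId is a placeholder, not a real dataset.
curl -s -A "Googlebot" \
  "$BASE/dataset.xhtml?persistentId=doi:10.5072/FK2/EXAMPLE" \
  | grep -c 'application/ld+json'
```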
SPA issue: IQSS/dataverse-frontend#350