Great question! There's nothing explicitly written up about this (although I'd be happy to do so), but my view is that the correct approach is: when a user wants docs for "some.domain/path/sub", the system should look for an llms.txt at "some.domain/path/sub/llms.txt", then "some.domain/path/llms.txt", and finally "some.domain/llms.txt".
If the site owner wants /path/sub to *also* include the content from parent directories, they should simply include it in that llms.txt.
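The fallback order described above could be sketched as follows. This is a hypothetical helper, not part of any spec; the function name and behavior are assumptions based on the reply:

```python
from urllib.parse import urlsplit


def llms_txt_candidates(url: str) -> list[str]:
    """Return candidate llms.txt URLs for `url`, most specific first.

    Walks from the requested path up to the domain root, mirroring the
    fallback order described above (path/sub, then path, then root).
    """
    parts = urlsplit(url)
    base = f"{parts.scheme}://{parts.netloc}"
    segments = [s for s in parts.path.split("/") if s]
    candidates = []
    for i in range(len(segments), -1, -1):
        prefix = "/".join(segments[:i])
        path = f"/{prefix}/llms.txt" if prefix else "/llms.txt"
        candidates.append(base + path)
    return candidates


# llms_txt_candidates("https://some.domain/path/sub") →
# ['https://some.domain/path/sub/llms.txt',
#  'https://some.domain/path/llms.txt',
#  'https://some.domain/llms.txt']
```

A client would simply request each candidate in order and use the first one that returns a 200.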
Hi @jph00 and team. 👋🏼
I'm thinking about implementing llms.txt in the context of a platform like Replicate. where there are multiple models and each one has its own documentation and API interface.
Would it make sense to have a unique llms.txt path for each model?
https://replicate.com/llms.txt
https://replicate.com/black-forest-labs/flux-schnell/llms.txt
https://replicate.com/daanelson/imagebind/llms.txt
I haven't delved too much into the spec yet. What are the mechanisms for discovering these paths? Is it just a matter of concatenating the domain + llms.txt, e.g.
https://subdomain.example.com/llms.txt
?
If we wanted to support multiple files, we'd have to come up with some way of signalling to crawlers that an llms.txt file exists for a given route. Maybe a meta tag?
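As a rough illustration of the meta-tag idea, a crawler could look for a hint in the page head. To be clear, a `llms-txt` meta tag is NOT part of the llms.txt proposal; the tag name here is purely hypothetical:

```python
from html.parser import HTMLParser


class LLMSTxtMetaFinder(HTMLParser):
    """Find a hypothetical <meta name="llms-txt" content="..."> hint.

    This tag is not defined by any spec; it is one possible way a page
    could signal a route-specific llms.txt file to crawlers.
    """

    def __init__(self):
        super().__init__()
        self.llms_txt_url = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "llms-txt":
            self.llms_txt_url = a.get("content")


def find_llms_txt_hint(html: str):
    """Return the hinted llms.txt URL from a page, or None if absent."""
    finder = LLMSTxtMetaFinder()
    finder.feed(html)
    return finder.llms_txt_url
```

For example, `find_llms_txt_hint('<head><meta name="llms-txt" content="/black-forest-labs/flux-schnell/llms.txt"></head>')` would return that path, which the crawler could then resolve against the page's origin.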