The text generated by the LLM sometimes appears in the current document (e.g., an MS Word document) and sometimes in a new window, which makes the behavior difficult to anticipate. I hope the developers can address this confusion. I don't have a better suggestion yet (would it be possible to add a code snippet that identifies the file type, such as a Word document, PDF, or HTML, before processing the text? OK, it's just a hypothesis).
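To sketch the hypothesis above: the idea of checking the active document's file type before deciding where to place the output could look something like this. This is purely illustrative — the extension mapping, the `choose_output_target` function, and the "in place vs. new window" policy are my assumptions, not how the app actually works.

```python
from pathlib import Path

# Hypothetical policy: editable text formats get in-place replacement,
# read-only formats get a separate response window.
IN_PLACE_TYPES = {".txt", ".md", ".docx", ".rtf"}   # assumed editable targets
WINDOW_TYPES = {".pdf", ".html", ".htm"}            # assumed read-only targets

def choose_output_target(file_path: str) -> str:
    """Return 'in_place' for editable documents, 'new_window' otherwise."""
    suffix = Path(file_path).suffix.lower()
    if suffix in IN_PLACE_TYPES:
        return "in_place"
    return "new_window"
```

For example, `choose_output_target("notes.docx")` would pick in-place replacement, while a PDF would open the response in a new window. In practice, detecting the focused application is platform-specific and harder than checking an extension, so this is only a starting point.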
If the app could support iCloud LLMs, that would be even better. It would be awesome if people could use the better LLMs that they've set up themselves; sometimes the devices people are using just aren't powerful enough to run the really big models, so this would be a great solution!
I hope this app can be the best one!
To clarify the intuition behind the buttons:
The Proofread, Rewrite, Friendly, Concise, and Professional options replace text you select.
Only the options meant for reading and understanding the selected text (Summary, Key Points, Table) open in a separate window, so you can chat with the response & have it rendered in Markdown.
Also, Writing Tools already supports cloud-based LLMs, if that's what you meant: there's support for providers such as Gemini, OpenAI, and Mixtral in Settings.
@Aryamirsepasi is also adding support for built-in local LLMs (without having to download Ollama etc.).