Struggling to stay abreast of the rapid advancements in AI, overwhelmed by the sheer volume of incoming information? Let those worries be a thing of the past with researchAssistant, your personal AI-powered ally. Simply feed it a list of URLs, and it will scan, summarize, and rank the information, presenting you with a detailed analysis of each source. The beauty lies in the seamless integration of these analyses into Obsidian, making once-overwhelming information instantly navigable. ResearchAssistant transforms your digital research experience, ensuring you never miss a beat in the fast-paced world of AI (or any other field of interest).
Before starting the app, you need Obsidian installed on your computer and a few variables set in a .env file. Use .env-example as a template: add your keys and your Obsidian folder path, then rename the file to .env.
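A minimal sketch of what the resulting .env might contain (the actual variable names are defined in .env-example; OPENAI_API_KEY and OBSIDIAN_VAULT_PATH below are placeholders, not confirmed names):

# hypothetical .env — copy the real variable names from .env-example
OPENAI_API_KEY=your-api-key-here
OBSIDIAN_VAULT_PATH=/path/to/your/obsidian/vault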
# start the services defined in docker-compose.yml
docker-compose up -d
# 1. activate python3 venv
source venv/bin/activate
# 2. install requirements
pip3 install -r requirements.txt
# 3. start the app
python3 app.py
Once the app is running, two REST endpoints can be triggered. In the future these will be triggered by a browser plugin; for now they have to be called manually (a Python equivalent of the curl calls is sketched after the list):
- Analyse URLs: send a POST request with a list of URLs to http://localhost:5000/urls
curl --location 'http://localhost:5000/urls' \
--header 'Content-Type: application/json' \
--data '{
"urls": ["https://some-example-url.com", "https://www.a-second-example-url.com/lates/post"]
}'
- Create Canvas: send a GET request to http://localhost:5000/knowledgeMap
curl --location 'http://localhost:5000/knowledgeMap'
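The same two calls can be made from Python. This is a minimal sketch using the requests library; the URLs in the payload are placeholders, and the shape of the responses depends on app.py:

import requests

BASE_URL = "http://localhost:5000"

# 1. send a list of URLs to be scanned, summarized, and ranked
payload = {"urls": ["https://some-example-url.com"]}
resp = requests.post(f"{BASE_URL}/urls", json=payload, timeout=300)
resp.raise_for_status()
print(resp.text)

# 2. trigger creation of the Obsidian canvas (knowledge map)
resp = requests.get(f"{BASE_URL}/knowledgeMap", timeout=300)
resp.raise_for_status()
print(resp.text)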