IDinsight/momconnect-aaq

Developer Docs | Features | Usage | Architecture | Funders and Partners

Ask A Question is a free and open-source tool created to help non-profit organizations, governments in developing nations, and social sector organizations use Large Language Models for responding to citizen inquiries in their native languages.

🤸‍♀️ Features

❓ LLM-powered search

Match your questions to content in the database using embeddings from LLMs.

🤖 LLM responses

Craft a custom response to the question using LLM chat and the content in your database.

🔌 Integrate with your own chatbot

Connect to your own chatbot on platforms like Turn.io, Glific, and Typebot using our APIs.

📚 Manage content

Use the AAQ App to add, edit, and delete content in the database (sign up for a demo here).

🚨 Message Triaging

Identify urgent or important messages based on your own criteria.

🧑‍💼 Content manager dashboard

See which content is most sought after, which kinds of questions receive poor feedback, where content is missing, and more.

🚧 Upcoming

💬 Conversation capability

Refine or clarify your question through conversation.

📹 Multimedia content

Respond with not just text but voice, images, and videos as well.

🧑‍💻 Engineering dashboard

Monitor uptime, response rates, throughput, HTTP response codes, and more.

Note

Looking for other features? Please raise an issue with [FEATURE REQUEST] at the start of the title.

Usage

To get answers from your content database, use the /search endpoint. It returns the following:

  • Search results: Finds the most similar content in the database using cosine distance between embeddings (see the sketch below).
  • (Optional) LLM-generated response: Crafts a custom response with LLM chat, using the most similar content.
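
For intuition, ranking content by cosine distance between embeddings works roughly as follows. This is a minimal Python sketch with made-up vectors and content titles, not the app's actual retrieval code:

import numpy as np

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine distance = 1 - cosine similarity."""
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings standing in for LLM embeddings of the query and the contents.
query_embedding = np.array([0.1, 0.9, 0.2])
content_embeddings = {
    "breastfeeding tips": np.array([0.1, 0.8, 0.3]),
    "clinic opening hours": np.array([0.9, 0.1, 0.1]),
}

# Rank contents by distance to the query; the closest ones are returned first.
ranked = sorted(
    content_embeddings.items(),
    key=lambda item: cosine_distance(query_embedding, item[1]),
)
print(ranked[0][0])  # -> "breastfeeding tips"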

See docs or API docs for more details and other API endpoints.

❓ Simple content search

curl -X 'POST' \
  'https://[DOMAIN]/api/search' \
  -H 'accept: application/json' \
  -H 'Authorization: Bearer <BEARER TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
  "query_text": "how are you?",
  "generate_llm_response": false,
  "query_metadata": {}
}'

🤖 With LLM response

The query looks the same as above, except generate_llm_response is set to true:

curl -X 'POST' \
  'https://[DOMAIN]/api/search' \
  -H 'accept: application/json' \
  -H 'Authorization: Bearer <BEARER TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
  "query_text": "this is my question",
  "generate_llm_response": true,
  "query_metadata": {}
}'
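
The same request can also be made programmatically, for example from your own chatbot integration. Below is a minimal sketch using the Python requests library; DOMAIN and BEARER_TOKEN are the same placeholders as in the curl examples:

import requests

# Placeholders: substitute your deployment's domain and API token.
DOMAIN = "[DOMAIN]"
BEARER_TOKEN = "<BEARER TOKEN>"

response = requests.post(
    f"https://{DOMAIN}/api/search",
    headers={
        "accept": "application/json",
        "Authorization": f"Bearer {BEARER_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        "query_text": "this is my question",
        "generate_llm_response": True,
        "query_metadata": {},
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())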

📚 Manage content

You can access the admin console at

https://[DOMAIN]/

Architecture

We use docker-compose to orchestrate containers with a reverse proxy that manages all incoming traffic to the service. The database and LiteLLM proxy are only accessed by the core app.

Flow

Documentation

See here for full documentation.

Funders and Partners

Google.org