Gemini LLM Model

Gemini LLM Model is a question-answering application built on Google's Gemini large language model. It was developed using the Gemini API and uses Streamlit for the frontend, all powered by Python.
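
As a rough sketch of the core flow (not the repository's actual code), the snippet below sends a question to the Gemini API and prints the answer. The `google-generativeai` package and the `gemini-pro` model name are assumptions here.

```python
# Minimal sketch: ask Gemini a question and print the answer.
# Assumes the official SDK is installed: pip install google-generativeai
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")      # key from Google AI Studio
model = genai.GenerativeModel("gemini-pro")  # model name is an assumption

response = model.generate_content("What is Streamlit?")
print(response.text)                         # the model's answer as plain text
```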

Tech Stack

Client: Python, Streamlit, Gemini API

Demo

Demo GIF: gemini1

Demo: https://rb.gy/u5b495

Run Locally

Clone the project

  git clone https://github.com/modamaan/Gemini_LLM_model.git

Go to the project directory

  cd Gemini_LLM_model

Install dependencies

  pip install -r requirements.txt

Start the server

  streamlit run app.py
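
For orientation, here is a hypothetical sketch of how a Streamlit front end like this one can be wired to Gemini. It is not the repository's actual `app.py`: the `GOOGLE_API_KEY` variable name, the widget labels, and the `gemini-pro` model are illustrative assumptions.

```python
# app_sketch.py -- hypothetical Streamlit Q&A front end, not the repo's app.py
import os

import streamlit as st
import google.generativeai as genai
from dotenv import load_dotenv

load_dotenv()                                         # read the API key from .env
genai.configure(api_key=os.getenv("GOOGLE_API_KEY"))  # variable name is an assumption
model = genai.GenerativeModel("gemini-pro")

st.set_page_config(page_title="Gemini Q&A")
st.header("Gemini LLM Model")

question = st.text_input("Ask a question:")
if st.button("Ask") and question:
    response = model.generate_content(question)       # one-shot question/answer
    st.subheader("Answer")
    st.write(response.text)
```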

Related Projects

Here are some related projects:

Gemini_Invoice_Reader_Application

Product Hunt

Color Reference

| Color      | Hex     |
| ---------- | ------- |
| Headline   | #87CEEB |
| Subheading | #FAFAFA |
| Background | #0E1117 |

Contributing

Contributions are always welcome!

See contributing.md for ways to get started.

Please adhere to this project's code of conduct.

Environment Variables

To run this project, you will need to add the following environment variables to your .env file (a minimal example follows the list):

API_KEY

ANOTHER_API_KEY
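
A .env file is a plain list of key=value pairs that the app reads at startup (typically via the python-dotenv package). A minimal sketch, assuming the two variable names listed above are used as-is:

```
# .env -- keep this file out of version control
API_KEY=your-gemini-api-key
ANOTHER_API_KEY=your-second-key-if-needed
```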

Badges

MIT License

GPLv3 License

AGPL License

Authors

@modamaan (https://github.com/modamaan)

License

MIT

Support

LinkedIn

Portfolio

For support, email [email protected].