Commit

code checkin
juncongmoo committed Feb 27, 2023
1 parent f7a1d17 commit c6a3556
Showing 17 changed files with 2,430 additions and 3 deletions.
142 changes: 142 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,142 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version
.idea

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# MacOS DS_Store
.DS_Store

# Pickle folder
.pkl_memoize_py3

# Folder where optimized models are stored
optimized_model

# Config file for tests coverage
.coveragerc
13 changes: 10 additions & 3 deletions README.md
@@ -1,4 +1,6 @@
# **Open source implementation for LLaMA-based ChatGPT. 15x faster training process than ChatGPT (wip)**
# ChatLLaMA

> 📢 Open source implementation for LLaMA-based ChatGPT, runnable on a single GPU. 15x faster training process than `ChatGPT`

Meta has recently released LLaMA, a collection of foundational large language models ranging from 7 to 65 billion parameters.
LLaMA is creating a lot of excitement because it is smaller than GPT-3 but offers better performance. For example, LLaMA's 13B model outperforms GPT-3 despite being 10 times smaller. This new collection of foundation models opens the door to faster inference and ChatGPT-like real-time assistants, while being cost-effective and running on a single GPU.
@@ -12,14 +14,19 @@ The good news is that we introduce `ChatLLaMA`, the first open source implementa
- ChatLLaMA has built-in support for DeepSpeed ZeRO to speed up the fine-tuning process.
- The library also supports all LLaMA model architectures (7B, 13B, 33B, 65B), so that you can fine-tune the model according to your preferences for training time and inference performance.
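To give a sense of the DeepSpeed ZeRO support mentioned above, a minimal configuration could look like the following. This sketch is not taken from this repository; the keys are standard DeepSpeed JSON config options, and the values (batch size, ZeRO stage, CPU offload) are illustrative assumptions:

```json
{
  "train_batch_size": 8,
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 2,
    "offload_optimizer": { "device": "cpu" }
  }
}
```

`zero_optimization.stage` selects the ZeRO stage (1–3); higher stages partition more training state across GPUs, trading communication for memory, which is what makes single-GPU fine-tuning of larger models feasible.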

If you like the project, please show your support by [leaving a star ⭐](https://github.com/nebuly-ai/nebullvm/stargazers).


<img width="1032" alt="Screen Shot 2023-02-26 at 10 56 13 PM" src="https://user-images.githubusercontent.com/83510798/221439813-5972d029-dae5-4561-ab3d-5a55fa5cde09.png">

Image from [OpenAI’s blog](https://openai.com/blog/chatgpt).


# Installation

```
pip install chatllama
```


# Get started with ChatLLaMA

> :warning: Please note that this code represents the algorithmic implementation of the RLHF training process for LLaMA and does not contain the model weights. To access the model weights, you need to apply via Meta's [form](https://forms.gle/jk851eBVbX1m5TAv5).
1 change: 1 addition & 0 deletions chatllama/__init__.py
@@ -0,0 +1 @@
__version__ = '0.0.3'
Empty file.
62 changes: 62 additions & 0 deletions chatllama/langchain_modules/prompt_templates.py
@@ -0,0 +1,62 @@
REWARD_TEMPLATE = dict(
    template=(
        "Let's pretend that you are a lawyer and you have to "
        "evaluate the following completion task from a given "
        "assignment with a score between 0 and 5, where 0 represents "
        "a bad assignment completion and 5 a perfect completion.\n"
        "You MUST evaluate: text quality, content quality and "
        "coherence.\n"
        "You MUST return only the number that represents your "
        "judgment.\n"
        "The assignment is:\n{user_input}\n"
        "The completion is:\n{completion}\n"
    ),
    input_variables=["user_input", "completion"],
)


AI_CHATBOT_TEMPLATE = dict(
    template=(
        "Assistant is a large language model trained by Meta and Nebuly.ai\n"
        "Assistant is designed to be able to assist with a wide range of "
        "tasks, from answering simple questions to providing in-depth "
        "explanations and discussions on a wide range of topics. As a "
        "language model, Assistant is able to generate human-like text "
        "based on the input it receives, allowing it to engage in "
        "natural-sounding conversations and provide responses that are "
        "coherent and relevant to the topic at hand.\n\n"
        "Assistant is constantly learning and improving, and its capabilities "
        "are constantly evolving. It is able to process and understand large "
        "amounts of text, and can use this knowledge to provide accurate and "
        "informative responses to a wide range of questions. Additionally, "
        "Assistant is able to generate its own text based on the input it "
        "receives, allowing it to engage in discussions and provide "
        "explanations and descriptions on a wide range of topics.\n\n"
        "Overall, Assistant is a powerful tool that can help with a wide "
        "range of tasks and provide valuable insights and information on a "
        "wide range of topics. Whether you need help with a specific "
        "question or just want to have a conversation about a particular "
        "topic, Assistant is here to assist.\n\n{history}\n\n"
        "Human: {human_input}\n"
        "Assistant:"
    ),
    input_variables=["history", "human_input"],
)


PERSON_CHATBOT_TEMPLATE = dict(
    template=(
        "You are a human chatting with a chatbot. The chatbot is a large "
        "language model trained by Meta and Nebuly-ai\n"
        "The chatbot is designed to be able to assist you with a wide range "
        "of tasks, from answering simple questions to providing in-depth "
        "explanations and discussions on a wide range of topics. You are a "
        "human and you are testing the chatbot. Ask the chatbot questions and "
        "see how it responds. You can also ask the chatbot to tell you a "
        "story.\n\n{history}\n\n"
        "Chatbot: {chatbot_input}\n"
        "Human:"
    ),
    input_variables=["history", "chatbot_input"],
)
