docs: update baseline result
junewgl committed Dec 25, 2023
1 parent 8634628 commit 45d2c55
Showing 2 changed files with 522 additions and 0 deletions.
261 changes: 261 additions & 0 deletions README.md
@@ -27,6 +27,267 @@
[**简体中文**](README.zh.md) | [**Discord**](https://discord.gg/7uQnPuveTY) | [**Wechat**](https://github.com/eosphoros-ai/DB-GPT/blob/main/README.zh.md#%E8%81%94%E7%B3%BB%E6%88%91%E4%BB%AC) | [**Huggingface**](https://huggingface.co/eosphoros) | [**Community**](https://github.com/eosphoros-ai/community)
</div>

## Baseline
- update time: 2023/12/08
- metric: execution accuracy (ex)
- for more details, see [docs/eval_llm_result.md](https://github.com/eosphoros-ai/DB-GPT-Hub/blob/main/docs/eval_llm_result.md)
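
Execution accuracy (EX) counts a prediction as correct when the predicted SQL returns the same result set as the gold SQL, both executed against the target database. The snippet below is a minimal, order-insensitive sketch of that matching rule using `sqlite3`; it is not the repository's official evaluator, and the helper `execution_match`, its signature, and the error handling are illustrative assumptions.

```python
import sqlite3

def execution_match(db_path: str, gold_sql: str, pred_sql: str) -> bool:
    """Hypothetical helper: True when pred_sql yields the same rows as
    gold_sql on the SQLite database at db_path (order-insensitive)."""
    conn = sqlite3.connect(db_path)
    try:
        gold = conn.execute(gold_sql).fetchall()
        try:
            pred = conn.execute(pred_sql).fetchall()
        except sqlite3.Error:
            return False  # a prediction that fails to execute scores 0
        # Compare as multisets of rows; repr keeps mixed types sortable.
        return sorted(map(repr, gold)) == sorted(map(repr, pred))
    finally:
        conn.close()
```

The scores in the table come from the Spider-style evaluation described in the linked doc; this sketch only illustrates the underlying comparison.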

<table style="text-align: center;">
<tr>
<th style="text-align: center;">Model</th>
<th>Method</th>
<th>Easy</th>
<th>Medium</th>
<th>Hard</th>
<th>Extra</th>
<th>All</th>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
</tr>
<tr>
<td>Llama2-7B-Chat</td>
<td>lora</td>
<td>0.887</td>
<td>0.641</td>
<td>0.489</td>
<td>0.331</td>
<td>0.626</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.847</td>
<td>0.623</td>
<td>0.466</td>
<td>0.361</td>
<td>0.608</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
</tr>
<tr>
<td>Llama2-13B-Chat</td>
<td>lora</td>
<td>0.907</td>
<td>0.729</td>
<td>0.552</td>
<td>0.343</td>
<td>0.680</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.911</td>
<td>0.700</td>
<td>0.552</td>
<td>0.319</td>
<td>0.664</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0.214</td>
<td>0.177</td>
<td>0.092</td>
<td>0.036</td>
<td>0.149</td>
</tr>
<tr>
<td>CodeLlama-7B-Instruct</td>
<td>lora</td>
<td>0.923</td>
<td>0.756</td>
<td>0.586</td>
<td>0.349</td>
<td>0.702</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.911</td>
<td>0.751</td>
<td>0.598</td>
<td>0.331</td>
<td>0.696</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0.698</td>
<td>0.601</td>
<td>0.408</td>
<td>0.271</td>
<td>0.539</td>
</tr>
<tr>
<td>CodeLlama-13B-Instruct</td>
<td>lora</td>
<td>0.940</td>
<td>0.789</td>
<td>0.684</td>
<td>0.404</td>
<td>0.746</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.940</td>
<td>0.774</td>
<td>0.626</td>
<td>0.392</td>
<td>0.727</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0.577</td>
<td>0.352</td>
<td>0.201</td>
<td>0.066</td>
<td>0.335</td>
</tr>
<tr>
<td>Baichuan2-7B-Chat</td>
<td>lora</td>
<td>0.871</td>
<td>0.630</td>
<td>0.448</td>
<td>0.295</td>
<td>0.603</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.891</td>
<td>0.637</td>
<td>0.489</td>
<td>0.331</td>
<td>0.624</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0.581</td>
<td>0.413</td>
<td>0.264</td>
<td>0.187</td>
<td>0.392</td>
</tr>
<tr>
<td>Baichuan2-13B-Chat</td>
<td>lora</td>
<td>0.903</td>
<td>0.702</td>
<td>0.569</td>
<td>0.392</td>
<td>0.678</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.895</td>
<td>0.675</td>
<td>0.580</td>
<td>0.343</td>
<td>0.659</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0.395</td>
<td>0.256</td>
<td>0.138</td>
<td>0.042</td>
<td>0.235</td>
</tr>
<tr>
<td>Qwen-7B-Chat</td>
<td>lora</td>
<td>0.855</td>
<td>0.688</td>
<td>0.575</td>
<td>0.331</td>
<td>0.652</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.911</td>
<td>0.675</td>
<td>0.575</td>
<td>0.343</td>
<td>0.662</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0.871</td>
<td>0.632</td>
<td>0.368</td>
<td>0.181</td>
<td>0.573</td>
</tr>
<tr>
<td>Qwen-14B-Chat</td>
<td>lora</td>
<td>0.895</td>
<td>0.702</td>
<td>0.552</td>
<td>0.331</td>
<td>0.663</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.919</td>
<td>0.744</td>
<td>0.598</td>
<td>0.367</td>
<td>0.701</td>
</tr>
<tr>
<td></td>
<td>base</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
</tr>
<tr>
<td>ChatGLM3-6B</td>
<td>lora</td>
<td>0.855</td>
<td>0.605</td>
<td>0.477</td>
<td>0.271</td>
<td>0.590</td>
</tr>
<tr>
<td></td>
<td>qlora</td>
<td>0.843</td>
<td>0.603</td>
<td>0.506</td>
<td>0.211</td>
<td>0.581</td>
</tr>
</table>
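
The `All` column is consistent with a per-question weighted average of the four difficulty scores. The sanity-check sketch below assumes the commonly cited Spider dev-set split sizes (248 easy, 446 medium, 174 hard, 166 extra-hard; 1034 questions in total); these counts are an assumption, not stated in this README.

```python
# Assumed Spider dev-set split sizes (not stated in this README).
SPLIT_SIZES = {"easy": 248, "medium": 446, "hard": 174, "extra": 166}

def overall_ex(scores: dict) -> float:
    """Weighted mean of per-difficulty execution accuracies."""
    total = sum(SPLIT_SIZES.values())  # 1034 under the assumed split
    return sum(scores[k] * SPLIT_SIZES[k] for k in SPLIT_SIZES) / total

# CodeLlama-13B-Instruct + lora row from the table above:
row = {"easy": 0.940, "medium": 0.789, "hard": 0.684, "extra": 0.404}
print(round(overall_ex(row), 3))  # matches the reported 0.746
```

Because the per-difficulty scores are themselves rounded to three decimals, other rows can drift from the reported `All` value by about ±0.001.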


## Contents
- [DB-GPT-Hub: Text-to-SQL parsing with LLMs](#db-gpt-hub-text-to-sql-parsing-with-llms)
- [Contents](#contents)
