
Commit

fixed escaping issue (#55)
soldni authored Oct 15, 2023
1 parent 1728f4f commit c1c3cbd
Showing 1 changed file with 0 additions and 23 deletions.
23 changes: 0 additions & 23 deletions README.md
@@ -28,26 +28,3 @@ Dolma is a toolkit to curate large datasets for (pre)-training ML models. Its ke
 To install, simply type `pip install dolma` in your terminal.
 
 To learn more about how to use the Dolma Toolkit, please visit the [documentation](/docs).
-
-## Citation
-
-If you use the Dolma dataset or toolkit, please cite the following items:
-
-```bibtex
-@techreport{DolmaDataset,
-  author = {Soldaini, Luca and Kinney, Rodney and Bhagia, Akshita and Schwenk, Dustin and Atkinson, David and Authur, Russell and Chandu, Khyathi and Dumas, Jennifer and Lucy, Li and Lyu, Xinxi and Magnusson, Ian and Naik, Aakanksha and Nam, Crystal and Peters, Matthew E. and Ravichander, Abhilasha and Shen, Zejiang and Strubell, Emma and Subramani, Nishant and Tafjord, Oyvind and Walsh, Evan Pete and Hajishirzi, Hannaneh and Smith, Noah A. and Zettlemoyer, Luke and Beltagy, Iz and Groeneveld, Dirk and Dodge, Jesse and Lo, Kyle},
-  title = {{Dolma: An Open Corpus of 3 Trillion Tokens for Language Model Pretraining Research}},
-  institution = {{Allen Institute for AI}},
-  year = {2023},
-  note = {Released under ImpACT License as Medium Risk artifact, \url{https://github.com/allenai/dolma}}
-}
-```
-
-```bibtex
-@software{DolmaToolkit,
-  author = {{Soldaini, Luca and Lo, Kyle and Kinney, Rodney and Naik, Aakanksha and Ravichander, Abhilasha and Bhagia, Akshita and Groeneveld, Dirk and Schwenk, Dustin and Magnusson, Ian and Chandu, Khyathi}},
-  title = {{The Dolma Toolkit}},
-  year = {2023},
-  note = {{Apache 2.0 License, Version \texttt{0.9.0}, \url{https://github.com/allenai/dolma}}}
-}
-```
