Develop #1 (Merged)

72 commits, merged on Jun 28, 2024

Commits
6102f38  X (knc6, Aug 25, 2023)
d238a72  vv (knc6, Feb 15, 2024)
7b8141a  Add vv (knc6, Feb 15, 2024)
b3137a9  Add finetune (knc6, Feb 16, 2024)
a92e3e3  Update finteune. (knc6, Feb 16, 2024)
cf09a2e  ft1 (knc6, Feb 18, 2024)
bd97247  Add finetune with chemnlp. (knc6, Feb 22, 2024)
1b84feb  Describe. (knc6, Feb 23, 2024)
3703919  x (knc6, Feb 27, 2024)
eae8c5c  unsloth models. (knc6, Mar 4, 2024)
5fbfaf3  Organize repo. (knc6, Mar 8, 2024)
23e5b0e  Update train_prop.py (knc6, Mar 8, 2024)
d42f9a4  Update train_prop.py (knc6, Mar 8, 2024)
c13e3e9  First LLM FF? (knc6, Mar 12, 2024)
3f05d93  Add chem desc (Mar 24, 2024)
d84c27a  Add robo (knc6, Mar 24, 2024)
9ce7caa  Add robo (knc6, Mar 24, 2024)
c69bcdf  Merge branch 'develop' of https://github.com/usnistgov/atomgpt into d… (knc6, Mar 24, 2024)
812d10f  X (Mar 25, 2024)
e46067e  X (Mar 25, 2024)
28ccfc0  X (knc6, Mar 25, 2024)
3605031  Update train. (knc6, Mar 25, 2024)
598abf9  Update fix. (knc6, Mar 25, 2024)
4ed0288  Update train prop (knc6, Mar 28, 2024)
b4ba1c8  Add new chemnlp. (knc6, Apr 10, 2024)
d599208  Add environment.yml. (knc6, May 17, 2024)
07a7b66  Create main.yml (knc6, May 17, 2024)
9c056fe  Update main.yml (knc6, May 17, 2024)
cd6f0a6  Update environment.yml (knc6, May 17, 2024)
2627d19  Update environment.yml (knc6, May 17, 2024)
356a6b3  Update environment.yml (knc6, May 17, 2024)
ff55d66  Update environment.yml (knc6, May 17, 2024)
fe94571  Add new env. (knc6, May 17, 2024)
108b8a9  Add new env. (knc6, May 17, 2024)
38c3c4e  Add new env. (knc6, May 17, 2024)
14dce75  Merge branch 'develop' into HEAD (knc6, May 17, 2024)
853b563  Add new env. (knc6, May 17, 2024)
d428c07  Update environment.yml (knc6, May 17, 2024)
cb5af8d  Update main.yml (knc6, May 17, 2024)
302bff0  Update main.yml (knc6, May 17, 2024)
1c61180  Update environment.yml (knc6, May 17, 2024)
1c0b321  Add new env. (knc6, May 17, 2024)
3e91284  Add new env. (knc6, May 17, 2024)
ee42ca7  Merge branch 'develop' of https://github.com/usnistgov/atomgpt into HEAD (knc6, May 17, 2024)
0223c2f  Add new env. (knc6, May 17, 2024)
5a33b6f  Add new env. (knc6, May 17, 2024)
d529aa0  Add new env. (knc6, May 17, 2024)
6714ccb  Add new env model. (knc6, May 17, 2024)
4a8dc0f  No CPU test. (knc6, May 17, 2024)
9c76d26  No CPU test. (knc6, May 17, 2024)
c42cfc1  Add inverse and forward model folder. (knc6, May 20, 2024)
f51fb06  update forward. (knc6, May 28, 2024)
1229f42  Add forward. (knc6, Jun 22, 2024)
70dc6a8  Add forward in GAaction. (knc6, Jun 22, 2024)
9ce6831  Add forward in GAaction. (knc6, Jun 22, 2024)
f163f7b  Add forward in GAaction. (knc6, Jun 22, 2024)
9fb34dc  Fix test. (knc6, Jun 22, 2024)
539b64e  Fix test. (knc6, Jun 22, 2024)
111c73e  Fix test. (knc6, Jun 22, 2024)
41f53e4  Fix test. (knc6, Jun 22, 2024)
f721b84  Fix test. (knc6, Jun 22, 2024)
cb17475  Fix py. (knc6, Jun 22, 2024)
62e97d0  Add env. (knc6, Jun 23, 2024)
826df4d  Add env. (knc6, Jun 23, 2024)
a20769b  Add env. (knc6, Jun 23, 2024)
13da6ed  Add inverse. (knc6, Jun 23, 2024)
e187062  Bettr environment, lower forward samples. (knc6, Jun 23, 2024)
20d8dda  Fix py. (knc6, Jun 23, 2024)
ef82af7  Fix py. (knc6, Jun 23, 2024)
0de5faa  Add t5. (knc6, Jun 23, 2024)
080bb58  Add inverse test. (knc6, Jun 23, 2024)
58bf747  Add inverse better example. (knc6, Jun 24, 2024)
73 changes: 73 additions & 0 deletions .github/workflows/main.yml
@@ -0,0 +1,73 @@
name: AtomGPT github action
on: [push, pull_request]

jobs:
  miniconda:
    name: Miniconda ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: ["ubuntu-latest"]
    steps:
      - uses: actions/checkout@v2
      - uses: conda-incubator/setup-miniconda@v2
        with:
          activate-environment: test
          environment-file: environment.yml
          python-version: "3.9"
          auto-activate-base: false
      - shell: bash -l {0}
        run: |
          conda info
          conda list
      - name: Lint
        shell: bash -l {0}
        run: |
          conda install flake8 pycodestyle pydocstyle
          #flake8 --ignore E203,W503 --exclude=examples,tests,scripts --statistics --count --exit-zero intermat/calculators.py intermat/generate.py
          #pycodestyle --ignore E203,W503 --exclude=examples,tests,scripts intermat
          #pydocstyle --match-dir=core --match-dir=io --match-dir=io --match-dir=ai --match-dir=analysis --match-dir=db --match-dir=tasks --count intermat

      - name: Run pytest
        shell: bash -l {0}
        run: |
          #source ~/.bashrc
          find . -type f > before_test_files.txt
          conda env create -f environment.yml
          conda activate atomgpt
          pip install triton
          #conda install alignn dgl=2.1.0 pytorch torchvision torchaudio pytorch-cuda transformers peft trl triton -c pytorch -c nvidia -y
          conda install pytest coverage codecov -y
          #conda install pytest coverage codecov pandas numpy matplotlib phonopy scikit-learn jarvis-tools --quiet
          #export DGLBACKEND=pytorch
          #export CUDA_VISIBLE_DEVICES="-1"
          #pip install phonopy flake8 pytest pycodestyle pydocstyle codecov pytest-cov coverage

          python setup.py develop
          echo 'environment.yml'
          conda env export
          echo 'forward model'
          echo 'ls'
          ls
          echo 'ls atomgpt'
          ls atomgpt
          echo 'atomgpt/examples/'
          ls atomgpt/examples/
          echo 'atomgpt/examples/forward_model/'
          ls atomgpt/examples/forward_model/
          python atomgpt/forward_models/forward_models.py --config_name atomgpt/examples/forward_model/config.json

          echo 'inverse model'
          python atomgpt/examples/inverse_model/run.py
          coverage run -m pytest
          coverage report -m -i
          codecov
          #codecov --token="85bd9c5d-9e55-4f6d-bd69-350ee5e3bb41"

          #train_alignn.py -h
          #echo 'Pre-trained models'
          #pretrained.py -h
          #find . -type f > after_test_files.txt
12 changes: 11 additions & 1 deletion README.md
@@ -1 +1,11 @@
# atomgpt
# AtomGPT: atomistic generative pre-trained transformer for forward and inverse materials design

## Forward model example

python atomgpt/forward_models/forward_models.py --config_name atomgpt/examples/forward_model/config.json

## Inverse model example

python atomgpt/inverse_models/inverse_models.py --config_name atomgpt/examples/inverse_model/config.json

#python atomgpt/examples/inverse_model/run.py
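Note: as a rough sketch of producing the config.json these example commands point at, the snippet below writes a minimal JSON file whose keys mirror the TrainingPropConfig class added in atomgpt/config.py later in this PR. Whether forward_models.py accepts exactly these keys is an assumption, and the id_prop_path value is a hypothetical placeholder; the shipped file at atomgpt/examples/forward_model/config.json remains the authoritative example.

import json

# Hypothetical minimal forward-model config; keys follow TrainingPropConfig
# in atomgpt/config.py (an assumption about what the forward script reads).
config = {
    "id_prop_path": "my_dataset/id_prop.csv",  # illustrative placeholder path
    "model_name": "gpt2",
    "batch_size": 8,
    "num_epochs": 500,
    "output_dir": "temp",
}
with open("config.json", "w") as f:
    json.dump(config, f, indent=2)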
4 changes: 2 additions & 2 deletions atomgpt/__init__.py
@@ -1,2 +1,2 @@
"""Initialize atomgpt."""
# Code coming soon
"""Version number."""
__version__ = "2024.6.8"
30 changes: 30 additions & 0 deletions atomgpt/config.py
@@ -0,0 +1,30 @@
from typing import Optional
from pydantic_settings import BaseSettings


class TrainingPropConfig(BaseSettings):
    """Training config defaults and validation."""

    benchmark_file: Optional[str] = None
    # "AI-SinglePropertyPrediction-exfoliation_energy-dft_3d-test-mae"
    id_prop_path: Optional[str] = None
    prefix: str = "xyz"
    model_name: str = "gpt2"
    leaderboard_dir: str = (
        "/wrk/knc6/AFFBench/jarvis_leaderboard/jarvis_leaderboard"
    )
    batch_size: int = 8
    max_length: int = 512
    num_epochs: int = 500
    latent_dim: int = 1024
    learning_rate: float = 1e-3
    test_each_run: bool = True
    include_struct: bool = False
    pretrained_path: str = ""
    seed_val: int = 42
    n_train: Optional[int] = None
    n_val: Optional[int] = None
    n_test: Optional[int] = None
    train_ratio: Optional[float] = None
    val_ratio: float = 0.1
    test_ratio: float = 0.1
    keep_data_order: bool = False
    output_dir: str = "temp"
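For context, a minimal sketch of how this config class could be used, assuming standard pydantic-settings semantics: every field falls back to the default declared above, values are validated on construction, and loading from a JSON file works as long as the file contains only fields defined on the class.

import json
from atomgpt.config import TrainingPropConfig

# Defaults only: fields not supplied take the values declared in the class.
cfg = TrainingPropConfig()
print(cfg.model_name, cfg.batch_size, cfg.learning_rate)

# Override selected fields from a JSON file (e.g., one passed via --config_name).
with open("atomgpt/examples/forward_model/config.json") as f:
    cfg = TrainingPropConfig(**json.load(f))
print(cfg.output_dir)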
1 change: 1 addition & 0 deletions atomgpt/data/__init__.py
@@ -0,0 +1 @@
"""Module for dataclass."""
Binary file added atomgpt/data/chemnlp_desc.json.zip
Binary file not shown.
Binary file added atomgpt/data/chemnlp_new_desc.json.zip
Binary file not shown.