Hi,
Nice work! I'm trying to run inference on a card that doesn't support bf16 or fp16 ops, so I need to disable bf16 during inference.
I've changed the following config that I found in the model folder:
params.json:
```json
{
  "async_checkpointing": false,
  "async_eval_ngpus": -1,
  "batch_size": 2,
  "data": "",
  "disable_logging": false,
  "disable_workers_print": false,
  "dtype": "fp32",  # I changed this to fp32
  "dump_after_steps": 0,
  ....
}
```
and consolidate_params.json:
```json
{
  "dtype": "fp32",  # I changed this to fp32
  "model_parallel_size": 1,
  "on_gpu": true,
  "src": "/fsx-onellm/rpasunuru/SFT/v2.1_textpp_7b_1366k_sftv1.4_exp1/v2.1_textpp_7b_1366k_sftv1.4_exp1_run000/checkpoints/checkpoint_0001200_noimggen/",
  "tgt": "/fsx-onellm/rpasunuru/SFT/v2.1_textpp_7b_1366k_sftv1.4_exp1/v2.1_textpp_7b_1366k_sftv1.4_exp1_run000/checkpoints/checkpoint_0001200_noimggen_consolidated/",
  "tokenizer_path": null
}
```
However, nothing changed during inference: I still get an error saying bf16 ops aren't supported when I run `anole/chameleon/inference/chameleon.py` (line 646 at commit 219a9a3).
I get the same error message as facebookresearch/chameleon#39.
Any help would be greatly appreciated.
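For reference, what I'm trying to achieve is equivalent to casting the loaded model to fp32 in plain PyTorch. This is only a minimal sketch with a hypothetical stand-in module (the real loader in this repo may hard-code the dtype when it builds the model, which would explain why editing the JSON has no effect):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the loaded model; the actual Chameleon loader
# is assumed to return an nn.Module whose parameters are bf16.
model = nn.Linear(4, 4).to(torch.bfloat16)

# Cast every parameter and buffer to fp32 so no bf16 kernels are needed.
model = model.to(torch.float32)

# Inputs must match the parameter dtype.
x = torch.randn(1, 4, dtype=torch.float32)
with torch.no_grad():
    y = model(x)

print(y.dtype)  # torch.float32
```

If the inference code creates tensors with an explicit `dtype=torch.bfloat16` somewhere (e.g. around the line referenced above), a cast like this after loading would not be enough on its own.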