Easy, fast and very cheap training and inference on AWS Trainium and Inferentia chips.


Optimum Neuron

🤗 Optimum Neuron is the interface between the 🤗 Transformers library and AWS Accelerators, including AWS Trainium and AWS Inferentia. It provides a set of tools enabling easy model loading, training, and inference in single- and multi-Accelerator settings for different downstream tasks. The list of officially validated models and tasks is available here. Users can try other models and tasks with only a few changes.

Install

To install the latest release of this package:

pip install optimum[neuron]

Optimum Neuron is a fast-moving project, and you may want to install it from source:

pip install git+https://github.com/huggingface/optimum-neuron.git

Alternatively, you can install the package without pip as follows:

git clone https://github.com/huggingface/optimum-neuron.git
cd optimum-neuron
python setup.py install

Last but not least, don't forget to install the requirements for every example:

cd <example-folder>
pip install -r requirements.txt

How to use it?

Quick Start

🤗 Optimum Neuron was designed with one goal in mind: to make training and inference straightforward for any 🤗 Transformers user while leveraging the complete power of AWS Accelerators.

Transformers Interface

There are two main classes one needs to know:

  • TrainiumArgumentParser: inherits from the original HfArgumentParser in Transformers, with additional checks on the argument values to make sure they will work well with AWS Trainium instances.
  • TrainiumTrainer: this version of the Trainer takes care of performing the proper checks and changes to the supported models to make them trainable on AWS Trainium instances.

The TrainiumTrainer is very similar to the 🤗 Transformers Trainer, and adapting a script that uses the Trainer to make it work with Trainium mostly consists of swapping the Trainer class for the TrainiumTrainer one. That is how most of the example scripts were adapted from their original counterparts.

from transformers import TrainingArguments
from optimum.neuron import TrainiumTrainer as Trainer

training_args = TrainingArguments(
  # training arguments...
)

# A lot of code here

# Initialize our Trainer
trainer = Trainer(
    model=model,
    args=training_args,  # Original training arguments.
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    compute_metrics=compute_metrics,
    tokenizer=tokenizer,
    data_collator=data_collator,
)
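The swap above works because `import ... as` binds the Trainium-aware class under the name the rest of the script already uses, so no other line needs to change. A toy sketch of that mechanism in plain Python (the classes here are hypothetical stand-ins, not the real optimum-neuron API):

```python
# Toy stand-ins illustrating the drop-in pattern: a subclass bound under the
# original name means the rest of the script runs unchanged.
class BaseTrainer:
    """Stand-in for transformers.Trainer."""
    def train(self):
        return "training on CPU/GPU"

class TrainiumLikeTrainer(BaseTrainer):
    """Hypothetical stand-in for optimum.neuron.TrainiumTrainer."""
    def train(self):
        # A real TrainiumTrainer would add Trainium-specific checks and
        # model changes here before training.
        return "training on Trainium"

# Equivalent, in effect, to: from optimum.neuron import TrainiumTrainer as Trainer
Trainer = TrainiumLikeTrainer

trainer = Trainer()
print(trainer.train())  # the downstream code never needs to know the class changed
```

Because the subclass keeps the parent's interface, everything that constructs or calls a `Trainer` keeps working, which is why adapting the example scripts was mostly a one-line change.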

Documentation

Check out the documentation of Optimum Neuron for more advanced usage.

If you find any issues while using these tools, please open an issue or submit a pull request.
