
Integrate safetensors for model serialization #2532

Open
strickvl opened this issue Mar 15, 2024 · 11 comments
Labels: enhancement (New feature or request), good first issue (Good for newcomers)

@strickvl (Contributor)

Open Source Contributors Welcomed!

Please comment below if you would like to work on this issue!

Contact Details [Optional]

[email protected]

What happened?

ZenML currently uses Python's pickle module (via the cloudpickle library) for model serialization and materialization. However, the safetensors library is fast becoming a standard for storing tensors and model weights, offering a safer and faster alternative to pickle. Integrating safetensors into ZenML would provide users with a more efficient and secure option for model serialization.

Task Description

Implement support for using safetensors instead of pickle for model materialization in ZenML. The task involves the following:

  1. Modify the base materializers to use safetensors for model serialization.
  2. Update the integration-specific materializers (located in src/zenml/integrations) to utilize safetensors where appropriate.
  3. Ensure backward compatibility with existing pickle-based serialized models.
  4. Update relevant documentation and examples to reflect the new safetensors option.

Expected Outcome

  • ZenML will support model serialization using safetensors, providing a faster and more secure alternative to pickle.
  • Users will have the option to choose between pickle and safetensors for model materialization.
  • The integration of safetensors will be seamless, maintaining compatibility with existing ZenML workflows.
  • Documentation and examples will be updated to guide users on how to utilize the safetensors option effectively.

Steps to Implement

  1. Familiarize yourself with the safetensors library and its usage for model serialization.
  2. Modify the base materializers in ZenML to include support for safetensors serialization (a rough sketch of what such a materializer could look like follows this list).
  3. Identify integration-specific materializers in src/zenml/integrations that would benefit from safetensors and update them accordingly.
  4. Implement backward compatibility measures to ensure existing pickle-based serialized models can still be loaded.
  5. Update relevant documentation, including the user guide and API reference, to explain the new safetensors option and provide examples of its usage.
  6. Write unit / integration tests to verify the functionality of safetensors serialization in various scenarios.
  7. Submit a pull request with the implemented changes for review.
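
As a rough starting point, here is a hypothetical sketch of what such a materializer could look like for PyTorch. The class and file names are illustrative, the load/save signatures are assumed from ZenML's custom-materializer documentation, and it assumes a local artifact store (remote stores are discussed further down the thread):

```python
# Hypothetical sketch, not a final design. Class and file names are
# illustrative; the BaseMaterializer API is assumed from ZenML's
# custom-materializer documentation.
import os
from typing import Any, ClassVar, Dict, Tuple, Type

import torch
from safetensors.torch import load_file, save_file

from zenml.enums import ArtifactType
from zenml.materializers.base_materializer import BaseMaterializer


class SafetensorsTorchMaterializer(BaseMaterializer):
    """Persists a torch.nn.Module's state dict with safetensors."""

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (torch.nn.Module,)
    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.MODEL
    FILENAME = "model.safetensors"

    def save(self, model: torch.nn.Module) -> None:
        # safetensors stores tensors only, so we persist the state dict.
        # Assumes self.uri is a local path; remote artifact stores would
        # need to go through ZenML's fileio wrappers.
        save_file(model.state_dict(), os.path.join(self.uri, self.FILENAME))

    def load(self, data_type: Type[Any]) -> Dict[str, torch.Tensor]:
        # Returns only the weights; rebuilding the module architecture and
        # calling load_state_dict() is left to the caller in this sketch.
        return load_file(os.path.join(self.uri, self.FILENAME))
```

How (or whether) the materializer should also persist the model architecture alongside the weights is a design decision to settle in the pull request.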

Additional Context

Integrating safetensors into ZenML aligns with the project's goal of providing efficient and secure tools for machine learning workflows. By offering an alternative to pickle, ZenML empowers users with more options for model serialization, catering to their specific needs and preferences.

Code of Conduct

  • I agree to follow this project's Code of Conduct
strickvl added the enhancement (New feature or request) and good first issue (Good for newcomers) labels on Mar 15, 2024

@Dev-Khant commented Mar 17, 2024

Hi @strickvl I want to work on this task.

From your detailed description, I gather that safetensors could be used for the Hugging Face, PyTorch, PyTorch Lightning, and TensorFlow integrations, and that we could change the current method to store models with safetensors. Please correct me if I'm wrong here.

Here I could add a new materializer named, for example, SafetensorsMaterializer, similar to CloudPickleMaterializer.

But I'm struggling to figure out how we can offer users both the pickle and safetensors options, i.e. where we would make this change. As I'm new to this repo, could you guide me a little on this? Thanks :)

@strickvl (Contributor, Author)

Hi @Dev-Khant good question!

I think the best place to start would be simply to add new materializers that use safetensors. Then we can allow users to specify them as a custom materializer for their chosen outputs. (See here for more details on that.)

We can keep the new materializers as part of the standard library, but they just wouldn't be the default. (The alternative would be to have a config option on the materializer itself, but that's a big / complicated feature to implement and I think we shouldn't start there).

So, don't change the existing materializers but add new ones that use safetensors and update the docs so that people know how to use these parallel options. Hope that makes sense!
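
For anyone picking this up, attaching such a materializer to a step output would then look roughly like this (illustrative only: SafetensorsTorchMaterializer is a hypothetical class name, and output_materializers is the parameter the custom-materializer docs describe):

```python
# Illustrative usage only. SafetensorsTorchMaterializer stands in for a
# hypothetical safetensors-based materializer that would ship in the
# standard library but not be used by default.
import torch
from zenml import step

# from zenml.materializers import SafetensorsTorchMaterializer  # hypothetical


@step(output_materializers=SafetensorsTorchMaterializer)
def train_model() -> torch.nn.Module:
    model = torch.nn.Linear(4, 2)
    # ... training logic would go here ...
    return model
```

The existing cloudpickle-based default stays untouched; users opt in per output.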

@Dev-Khant

@strickvl Totally understood. As you said, we will have parallel options for materializers. So correct me if I'm wrong: we would have, say, two HFPTModelMaterializer variants, one using the current approach and another using safetensors.

@strickvl (Contributor, Author)

Correct.

@Saedbhati

@strickvl Can you assign this issue to me? Thanks.

@htahir1 (Contributor) commented Oct 7, 2024

@Saedbhati sure, go for it! I've assigned it to you. Please keep the conversation in this thread in mind, however :-)

@CoreyJness

I am working on this as well!

@JasonBodzy commented Oct 29, 2024

@htahir1
I was looking through #2539 and am curious (in terms of a torch materializer) if it would be more efficient to use safetensors.safe_open and safetensors.save_file for load and save functionality respectively.

While this approach would require handling single versus multiple tensors slightly differently, I feel it would avoid the problem of saving/loading models twice.
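
For reference, that pattern with safetensors' own I/O would look roughly like this (local file paths only, which is where the remote-store question below comes in):

```python
# Roughly the suggested pattern, using safetensors' native I/O
# (local paths only; remote artifact stores are discussed below).
import torch
from safetensors import safe_open
from safetensors.torch import save_file

state_dict = {"weight": torch.randn(2, 4), "bias": torch.zeros(2)}

# Save: writes all tensors in one pass, with no intermediate torch.save().
save_file(state_dict, "model.safetensors")

# Load: safe_open memory-maps the file and fetches tensors lazily by key.
loaded = {}
with safe_open("model.safetensors", framework="pt", device="cpu") as f:
    for key in f.keys():
        loaded[key] = f.get_tensor(key)
```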

Would there be a downside to this approach?

@htahir1 (Contributor) commented Oct 29, 2024

@JasonBodzy thanks for the interest - it's an interesting suggestion! Using safetensors' native functions could potentially help avoid the double-save problem we ran into earlier.

Though we'd need to solve a couple of challenges:

  1. Remote storage compatibility - safetensors functions need to work with ZenML's fileio wrappers to support remote artifact stores (which was a blocker in previous attempts; a rough sketch of one way around this follows below)
  2. Model architecture handling - we still need an efficient way to save/load this alongside the weights
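
For challenge (1), here is a hedged sketch of one possible shape: stage through a local temp file and move it with ZenML's fileio so remote artifact stores keep working. The helper names are illustrative, not an agreed design, and challenge (2) is not addressed here:

```python
# Hedged sketch for challenge (1) only: safetensors reads and writes local
# paths, so stage through a temp file and move it with zenml.io.fileio so
# remote artifact stores (s3://, gs://, ...) keep working.
import os
import tempfile
from typing import Dict

import torch
from safetensors.torch import load_file, save_file

from zenml.io import fileio


def save_state_dict(state_dict: Dict[str, torch.Tensor], artifact_uri: str) -> None:
    target = os.path.join(artifact_uri, "model.safetensors")
    with tempfile.TemporaryDirectory() as tmp:
        local_path = os.path.join(tmp, "model.safetensors")
        save_file(state_dict, local_path)                # local write
        fileio.copy(local_path, target, overwrite=True)  # push to the artifact store


def load_state_dict(artifact_uri: str) -> Dict[str, torch.Tensor]:
    source = os.path.join(artifact_uri, "model.safetensors")
    with tempfile.TemporaryDirectory() as tmp:
        local_path = os.path.join(tmp, "model.safetensors")
        fileio.copy(source, local_path)                  # pull from the artifact store
        return load_file(local_path)
```

One trade-off worth noting: staging through a local temp file gives up safe_open's lazy, memory-mapped reads when the artifact store is remote.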

What do you think? If you're keen to explore this approach further, would be great to see how these pieces could come together. Feel free to share more thoughts or suggestions on tackling these requirements :-)

@bcdurak curious about your thoughts here too!

@Squishedmac

Hi! Is this still being worked on? If it hasn't been resolved, I'd love to try and resolve it.

@htahir1 (Contributor) commented Nov 17, 2024

@Squishedmac go for it!
