
Add telemetry #4441

Open · valentinsulzer wants to merge 15 commits into develop
Conversation

@valentinsulzer (Member) commented Sep 15, 2024:

Description

Adds very basic telemetry: it records when a simulation is solved.

The following section has been added to the user guide to explain the telemetry we are doing:

PyBaMM collects anonymous usage data to help improve the library. This telemetry is enabled by default but can be easily disabled. Here's what you need to know:

  • What is collected: Basic usage information like PyBaMM version, Python version, and which functions are run.
  • Why: To understand how PyBaMM is used and prioritize development efforts.
  • Opt-out: To disable telemetry, set the environment variable PYBAMM_OPTOUT_TELEMETRY=true or use pybamm.telemetry.disable() in your code (see the sketch after this list).
  • Privacy: No personal information (name, email, etc) or sensitive information (parameter values, simulation results) is ever collected.
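
A minimal sketch of the two opt-out routes described above (the variable and function names are taken from this PR's description and could change before merge):

    import os

    # Option 1: set the environment variable (before importing PyBaMM, to be safe)
    os.environ["PYBAMM_OPTOUT_TELEMETRY"] = "true"

    import pybamm

    # Option 2: call the opt-out helper at runtime
    pybamm.telemetry.disable()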


codecov bot commented Sep 15, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.42%. Comparing base (444ecc1) to head (2e222c6).
Report is 1 commit behind head on develop.

Additional details and impacted files
@@           Coverage Diff            @@
##           develop    #4441   +/-   ##
========================================
  Coverage    99.41%   99.42%           
========================================
  Files          293      295    +2     
  Lines        22554    22599   +45     
========================================
+ Hits         22423    22468   +45     
  Misses         131      131           

☔ View full report in Codecov by Sentry.

@agriyakhetarpal (Member) commented Sep 15, 2024:

We could also make pybamm.telemetry.disable() a top-level pytest fixture as a cleaner way to disable telemetry when running tests

@valentinsulzer (Member, Author) replied:

> We could also make pybamm.telemetry.disable() a top-level pytest fixture as a cleaner way to disable telemetry when running tests

Yes, but we also want to disable it when our dependencies run their tests, so that wouldn't be enough, right?

@agriyakhetarpal (Member) commented Sep 15, 2024:

> Yes, but we also want to disable it when our dependencies run their tests, so that wouldn't be enough, right?

Oh, yes, forgot about that! In that case, we would need both the logic you added and the fixture to call it.
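
A minimal sketch of such an autouse fixture (a hypothetical conftest.py; only pybamm.telemetry.disable() is taken from this PR):

    # conftest.py
    import pytest

    import pybamm


    @pytest.fixture(autouse=True, scope="session")
    def _disable_telemetry():
        # Keep test runs out of the collected usage data.
        pybamm.telemetry.disable()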

@kratman (Contributor) left a review comment:

I understand why we want to know which parts of PyBaMM our users are using, but I am not sure this is the best approach.

My concerns:

  • I would be upset to find out that a free research tool was tracking my usage and sending it to a server. For a paid tool, the expectation that the vendor wants to know about your usage is a little more reasonable.
  • Users installing through pip might not realize we are tracking their usage. It would be better to make this opt-in, so that users know their data is being collected.
  • Are we only going to track simulation data, so the tracking can live in the simulation class, or will it need to be added in many different places and remembered for every new piece of functionality?

examples/scripts/compare_lithium_ion.py: outdated review thread (resolved)
pyproject.toml: outdated review thread (resolved)
@martinjrobins (Contributor) left a review comment:

Thanks @valentinsulzer. I suggest a timeout with a default opt-out so we don't break people's CI, see below.

"This is entirely optional and does not impact the functionality of PyBaMM.\n"
"For more information, see https://docs.pybamm.org/en/latest/source/user_guide/index.html#telemetry"
)
while True:
A contributor commented:

This should time out after a short period with a default of "no", so we don't break everyone's automated setups (say 10-20 sec?).
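
A minimal sketch of a prompt that defaults to "no" after a timeout (the helper name and the 15-second timeout are illustrative, not from this PR):

    import threading


    def ask_to_enable_telemetry(timeout=15.0):
        """Return True if the user answers yes (or presses Enter) in time; opt out otherwise."""
        answer = []

        def _read():
            answer.append(input("Do you want to enable telemetry? (Y/n): "))

        reader = threading.Thread(target=_read, daemon=True)
        reader.start()
        reader.join(timeout)
        if not answer:
            # No response within the timeout (e.g. an automated environment): default to "no".
            return False
        return answer[0].strip().lower() in ["yes", "y", ""]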

Another contributor replied:

Yeah I agree, we should not have an infinite loop here

A Member commented:

A better option could be to check for the ci_env_vars above (which can be set as a constant at the top of the file), assume a non-interactive environment if any is set, and return early (like nox does with session.interactive).
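
A minimal sketch of that kind of early-exit check (the set of environment variables is a common-convention assumption, not taken from this PR):

    import os

    # Hypothetical constant at the top of the file; the exact set of variables is an assumption.
    CI_ENV_VARS = ["CI", "GITHUB_ACTIONS", "GITLAB_CI", "JENKINS_URL"]


    def is_running_in_ci():
        """Assume a non-interactive environment if any CI marker variable is set."""
        return any(os.getenv(var) for var in CI_ENV_VARS)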

@valentinsulzer (Member, Author) replied:

This function is only ever called by generate, which already returns early if it detects an automated environment, but I'll also add the time check as a backup.

A second inline thread on the prompt's input handling (diff context):

        )
        while True:
            user_input = input("Do you want to enable telemetry? (Y/n): ").strip().lower()
            if user_input in ["yes", "y"]:
A contributor commented:

Suggested change:

    - if user_input in ["yes", "y"]:
    + if user_input in ["yes", "y", ""]:

So if the user presses Enter, it will be interpreted as a yes (as per the prompt).

4 participants