Add pinnx submodule #1932
Conversation
Hi @chaoming0625, thank you for adding the code. It is nice. My first suggestion: since there is some overlapping code between DeepXDE and PINNx, such as geometry, we should reduce the duplication as much as possible. You can import and reuse the code in DeepXDE.
Hi @lululxvi, I have made some changes. However, …
Therefore, hosting …
Hi @lululxvi, next my colleague will continue to work on the integration of `pinnx` into `deepxde`, and he will focus on solving every problem in the integration.
No problem.
We can install the `pinnx` module using the following command: `pip install deepxde[pinnx]`
I have updated the PR introduction to clearly explain the motivation behind the …
I have also added an option to install via `pip install deepxde[pinnx]`.
I updated the PR introduction once again.
…s/transformers`, reusing `deepxde` functions
I have maximized function reuse from `deepxde`.
I think this PR is now ready to merge. I have tried most models. I also propose adopting a more agile development process, in which we could consider submitting PRs in smaller increments.
Hi @lululxvi, could you check whether there are issues that need to be fixed? I now have time to update the code.
Sounds great! As this PR has a lot of changes, I will gradually review each file.
In response to chaobrain/pinnx#19, #1904, and #1899, I am trying to merge pinnx into DeepXDE. This pull request includes a submodule `deepxde.pinnx`, which enables explicit variables and physical units in physics-informed neural networks.

Motivation
The pinnx module was engineered to address a pivotal challenge prevalent in current PINN libraries: the absence of explicit physical semantics.

Existing PINN libraries, such as DeepXDE, require users to manually track the order and meaning of variables. For instance, within these frameworks, `variables[0]` might denote amplitude, `variables[1]` frequency, and so forth. This convention provides no intrinsic link between the order of variables and their physical meanings, which increases complexity and the potential for errors. In contrast, pinnx lets users assign clear, meaningful names to variables (e.g., `variables["amplitude"]` and `variables["frequency"]`). This approach removes the need for manual management of variable order, enhancing both code readability and maintainability.

Another significant limitation of existing PINN libraries is their reliance on users to manage intricate Jacobian and Hessian relationships between variables. This process is not only cumbersome but also prone to mistakes. PINNx addresses this by using a straightforward dictionary indexing mechanism to track gradient relationships. For example, users can access the Hessian matrix element via `hessian["y"]["x"]` and the Jacobian matrix element via `jacobian["y"]["t"]`. This simplification streamlines the workflow, allowing users to focus on modeling rather than matrix bookkeeping.

Another prevalent deficiency in current PINN frameworks is the lack of support for physical units, which are crucial for maintaining dimensional consistency in physical equations. Take the Burgers' equation, for example: the left-hand side and the right-hand side must share the same physical units (e.g., meters per second squared). To ensure such consistency, PINNx integrates the `brainunit.autograd` module, enabling unit-aware automatic differentiation. This preserves unit information during the computation of first-, second-, or higher-order derivatives, ensuring that physical unit dimensions remain consistent throughout the differentiation process. Consequently, PINNx effectively mitigates errors arising from unit mismatches, thereby enhancing the reliability of simulations.
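To make the unit-aware differentiation concrete, here is a minimal sketch. It assumes that `brainunit.autograd` exposes a `grad` transform analogous to `jax.grad` and that units such as `u.meter` and `u.second` are available as module-level objects; consult the `brainunit` documentation for the exact API.

```python
import brainunit as u
import brainunit.autograd as uad  # unit-aware autodiff module mentioned above

def position(t):
    # position in meters as a function of time in seconds
    return 0.5 * (t / u.second) ** 2 * u.meter

# Assumption: uad.grad mirrors jax.grad while propagating units, so differentiating
# a quantity in meters with respect to seconds yields a quantity in meter / second.
velocity = uad.grad(position)
print(velocity(2.0 * u.second))
```

A plain `jax.grad` would lose the unit information here; the point of unit-aware differentiation is that the derivative keeps its physical dimension.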
A quick example

This new submodule supports most of the PINN examples in DeepXDE. The documentation is hosted in the `docs/pinnx_docs` directory, and ample examples are included in `examples/pinnx_examples`.

Installation
To use the `pinnx` module, install it with the following command:
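```bash
pip install deepxde[pinnx]
```

This uses pip's optional-extra syntax, so it installs DeepXDE together with the additional dependencies required by `pinnx` (listed in the next section).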
Dependency

The `pinnx` module heavily depends on the following packages:

- `brainstate` provides a state-based compilation system in JAX. Its syntax is very similar to PyTorch's, yet it benefits from the JIT compilation of JAX and XLA. We recommend that users learn `brainstate` via its documentation: https://brainstate.readthedocs.io/quickstart/concepts-en.html (a minimal sketch follows after this list).
- `brainunit` provides physical units and unit-aware mathematical systems tailored for scientific AI within JAX. For usage, please see its documentation: https://brainunit.readthedocs.io
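To give a flavor of the state-based style, here is a minimal sketch. It assumes that `brainstate.State` is the basic mutable container and that its contents are read and written through a `.value` attribute; treat this as an illustration rather than a definitive account of the `brainstate` API.

```python
import jax.numpy as jnp
import brainstate as bst

# Assumption: brainstate.State wraps an array that can be updated in place
# (PyTorch-like syntax) while remaining compatible with JAX transformations.
weight = bst.State(jnp.zeros(3))

def step(x):
    # read and update the state through its .value attribute (assumed API)
    weight.value = weight.value + x
    return weight.value

print(step(jnp.ones(3)))  # expected: [1. 1. 1.]
```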
Low-precision training support

Changing the training precision is very easy in `pinnx`. With a single precision setting, models will be trained using `bfloat16`. Here is an example:
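A minimal sketch of what such an example might look like, assuming the training precision is configured globally through `brainstate.environ` (the exact call is an assumption; consult the `pinnx` documentation):

```python
import brainstate as bst
import deepxde.pinnx as pinnx  # the submodule added in this PR

# Assumption: brainstate.environ.set controls the global compute precision and
# accepts a bfloat16 setting; models built afterwards are trained in bfloat16.
bst.environ.set(precision='bf16')

# ... then define the geometry, PDE, and network with pinnx and train as usual;
# parameters and intermediate tensors will use bfloat16.
```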