Research at UC Berkeley using IPython

Dav Clark edited this page Feb 7, 2014 · 21 revisions

If you are at Berkeley and use IPython for your research, please add yourself to this list.

  • Kathryn (Katy) Huff - I use IPython and the IPython notebook for everything. In particular, I use IPython and the notebook for prototyping code that contributes to my nuclear engineering research. My most recent IPython notebook uses Python packages such as PyNE and yt to generate a large 3-D regular mesh of the core of the Pebble-Bed Fluoride-Salt-Cooled High-Temperature Nuclear Reactor. This mesh and the materials described on it will be used as input for high-fidelity neutronics simulations using external codes such as Serpent, MCNP, and MOOSE.

  • Min Ragan-Kelley (Plasma Theory and Simulation Group, Applied Science & Technology) We had an existing C++ plasma physics simulation application with a primitive GUI (OOPIC, OOPD1). I removed the GUI code and turned the application into a C++ library, which I exposed to Python via Cython wrappers. With the new Python API, I was able to use IPython notebooks to build and run simulations interactively, with immediate feedback, plotting (with matplotlib), and analysis (with numpy and scipy). This new code was applied to a system of bipolar flow of electrons and ions in a diode, enabling automated optimization of physical parameters — a more efficient approach that reproduced in minutes results that had previously taken days. This work is currently under review for publication. I was also able to easily build HTML reports with nbconvert, which I frequently used to discuss results with my advisor, who had moved to a different university.
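    The optimization workflow Min describes — wrapping the simulation in a Python API and letting an optimizer drive it instead of scanning parameters by hand — can be sketched with a toy objective. The `run_simulation` function and its parameter here are hypothetical stand-ins, not the actual OOPIC/OOPD1 API:

    ```python
    from scipy.optimize import minimize_scalar

    # Hypothetical stand-in for one run of the wrapped C++ simulation:
    # given a physical parameter, return a figure of merit to minimize.
    # The real code would call the Cython-wrapped library instead.
    def run_simulation(param):
        # Toy objective with a known optimum at param = 2.5.
        return (param - 2.5) ** 2 + 1.0

    # Let scipy search the parameter space automatically; each candidate
    # triggers one simulation run, so days of manual scans become one call.
    result = minimize_scalar(run_simulation, bounds=(0.0, 10.0), method="bounded")
    print(result.x)  # optimal parameter found by the optimizer
    ```

    In a notebook, each candidate run can also be plotted inline with matplotlib for immediate feedback.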

  • Isaac Shivvers - I'm a graduate student in astrophysics, and I use the IPython notebook extensively in my research on the optical spectra of supernovae. I've found it incredibly useful as a lab notebook of sorts, keeping track of my ideas and calculations at the beginning of a new project as I try various approaches and try to understand the results. I've been able to run a parallel notebook cluster on one of the large astronomy computers and then access and run it through the browser on my laptop, which has made very long or difficult calculations much more manageable.

  • Chris Holdgraf - I'm studying computational and cognitive neuroscience, using a method called electrocorticography to collect brain activity from surgical patients. I'm interested in understanding the mechanisms that the brain has for making sense of noisy or incomplete auditory information, and I investigate this by fitting linear regression models as well as doing classification. It's fairly code-intensive work, and IPython has drastically improved the readability and effectiveness of my codebase. I have five notebooks that I use for my analysis. My raw data goes into the first notebook, and the final product of each notebook is saved to an HDF file so that it can be reopened by the next notebook. These are all version controlled, so the complete history of my analysis for a particular set of data is preserved. It's also really useful because it lets me keep a visual record of my findings via the inline plots of the notebook.
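    The notebook-to-notebook handoff Chris describes can be sketched with h5py, one common way to read and write HDF5 files from Python; the file and dataset names here are illustrative stand-ins, not his actual pipeline:

    ```python
    import numpy as np
    import h5py

    # End of notebook 1: save the stage's final product to an HDF5 file.
    processed = np.random.randn(100, 4)  # stand-in for processed data
    with h5py.File("stage1_output.h5", "w") as f:
        f.create_dataset("processed", data=processed)

    # Start of notebook 2: reopen the same file and pick up where the
    # previous notebook left off.
    with h5py.File("stage1_output.h5", "r") as f:
        loaded = f["processed"][:]
    ```

    Because the intermediate `.h5` files sit on disk while the notebooks themselves live in version control, each analysis stage can be rerun or inspected independently.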

  • Beth Reid (Cosmology Data Science Fellow @ Berkeley Center for Cosmological Physics) - My work centers on making three-dimensional maps of the cosmos with galaxy redshift surveys, and using those maps to infer fundamental properties of our universe (dark matter, dark energy, gravity, galaxy formation, ...). I use IPython daily in my data analysis workflow. I was a Mathematica Notebook user in graduate school, but quickly decided when I saw the license fees that keeping my code in Mathematica was not sustainable through the frequent institutional shuffle of postdoc-hood. I transitioned to a C/Python combination, but really missed the notebook feature of Mathematica. It's the perfect format to save the scientific process -- wrangle/plot data and perform calculations until you conclude something ("I can ignore the XYZ effect for this analysis" or "Our data gives a 5-sigma detection of ZYX"). Saving the PROCESS to later look back at when you're writing a paper (or reconsidering why you made some decision 6 months ago) is invaluable. So that's why I started using IPython. However, I'm slowly learning that IPython has a lot more to offer as well -- for example, tab completion on functions and argument names makes analysis much quicker and smoother.

  • Cindee Madison (Helen Wills Neuroscience Institute) - The lab uses multi-modal brain imaging to study healthy aging and dementia. We use IPython notebooks as part of our provenance tracking for many different kinds of biomarker and cognitive data processing. Most studies have an IPython notebook directory with a number of relevant notebooks for many stages of data analysis. In addition, I use IPython notebooks to teach everything from image processing, to machine learning, to basic statistics in the lab.

  • Dav Clark (D-Lab) - I and other instructors use IPython notebooks extensively in teaching, and we are actively building out a research arm focused on effective and inclusive teaching. Likewise, for my dissertation work on the attitude and belief impacts of climate change education, I used IPython notebooks extensively to provide a narrative along with data processing steps, before evaluating models and visualizing data in R using rpy2 (again, in the notebook).
