Install Conda and friends on Google Colab, easily.
TLDR: Check the example notebook here!
On your Colab notebook, run the following code as the first executable cell:
!pip install -q condacolab
import condacolab
condacolab.install()
After the kernel restart, you can optionally add a new cell to check that everything is in place:
import condacolab
condacolab.check()
It is important that you run the installation as the very first thing in the notebook, because it requires a kernel restart and will therefore reset any variables defined up to that point.
The default condacolab.install() provides Mambaforge, but there are other conda distributions to choose from:
- install_anaconda(): This will install the Anaconda 2022.05 distribution, the most recent version built for Python 3.7 at the time of update. This contains plenty of data science packages (link to current version docs).
- install_miniconda(): This will install the Miniconda 4.12.0 distribution, using a version built for Python 3.7. Unlike Anaconda, this distribution only contains python and conda.
- install_miniforge(): Like Miniconda, but built off conda-forge packages. The Miniforge distribution is officially provided by conda-forge, but I forked and patched it so it's built for Python 3.7.
- install_mambaforge(): Like Miniforge, but with mamba included. The Mambaforge distribution is officially provided by conda-forge, but I forked and patched it so it's built for Python 3.7.
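For example, if you prefer Miniforge over the default Mambaforge, call the corresponding function from the list above in that same first cell. A minimal sketch:

!pip install -q condacolab
import condacolab
condacolab.install_miniforge()  # use instead of condacolab.install()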
For advanced users, install_from_url() is also available. It expects a URL pointing to a constructor-like installer, so you can prebuild a Python 3.7 distribution that fulfills your own needs.
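As a quick sketch of that advanced route, install_from_url() takes the URL of your installer; the URL below is only a placeholder for wherever you end up hosting the .sh file:

!pip install -q condacolab
import condacolab
# placeholder URL; point this at your own constructor-built installer
condacolab.install_from_url("https://example.com/my-custom-installer-Linux-x86_64.sh")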
If you want to build your own constructor-based installer, check the FAQ below!
Once the installation is done, you can use conda and/or mamba to install the needed packages:
!conda install openmm
# or, faster:
!mamba install openmm
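If your packages come from a specific channel, you can pass it explicitly. A small sketch, assuming conda-forge and purely illustrative package names:

# -c selects the channel; swap in whatever packages you actually need
!mamba install -c conda-forge openmm pandas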
If you have an environment file (e.g. environment.yml), you can use it like this:
!conda env update -n base -f environment.yml
# or, faster:
!mamba env update -n base -f environment.yml
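For reference, a minimal environment.yml could look like the sketch below; the channel and package names are only examples, not requirements of condacolab:

# a minimal environment file; conda/mamba env update merges this into base
name: base
channels:
  - conda-forge
dependencies:
  - openmm
  - numpy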
- The Python kernel needs to be restarted for changes to be applied. This happens automatically. If you are wondering why you are seeing a message saying "Your session crashed for an unknown reason", this is why. You can safely ignore this message!
- You can only use the base environment, so do not try to create more environments with conda create.
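If you are used to per-project environments, just install everything into base instead. A one-line sketch with an illustrative package name:

# no conda create: install straight into the (only) base environment
!mamba install -n base -c conda-forge scipy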
Google Colab runs on Python 3.7. We install the chosen conda distribution on top of the existing Python installation at /usr/local and add a few configuration files so we stay on Python 3.7 (conda auto-updates by default) and so the newly installed packages are available. Finally, we wrap the Python executable to redirect it and inject some environment variables needed to load the new libraries. Since we need to re-read LD_LIBRARY_PATH, a kernel restart is needed.
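If you are curious, you can verify this after the restart with plain Python and shell, nothing condacolab-specific:

import os, sys
print(sys.version)                        # still Python 3.7.x
print(os.environ.get("LD_LIBRARY_PATH"))  # should now include the conda library directories
!which python                             # should point at the wrapped executable under /usr/local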
The recommended approach is to build your own constructor-based installer. We have provided an example in constructor-example/construct.yaml.
You can generate a constructor installer on Colab too! Follow this tutorial.
Locally, follow these steps:
- On your local computer:
conda create -n constructor -c conda-forge constructor
conda activate constructor
mkdir my-installer
cd my-installer
curl -sLO https://raw.githubusercontent.com/jaimergp/condacolab/main/constructor-example/construct.yaml
curl -sLO https://raw.githubusercontent.com/jaimergp/condacolab/main/constructor-example/pip-dependencies.sh
- Add your conda packages to construct.yaml in the specs section. Read the comments to respect the constraints already present! You can also adapt the metadata to your liking.
- If you do need to install pip requirements, uncomment the post_install line and edit pip-dependencies.sh.
- Run constructor --platform linux-64 .
- Upload the resulting .sh installer to an online location with a permanent URL. GitHub Releases is great for this!
- In Colab, run:
!pip install -q condacolab
import condacolab
condacolab.install_from_url(URL_TO_YOUR_CUSTOM_CONSTRUCTOR_INSTALLER)
Yes, as long as you make sure you also install rpy2 to overwrite Colab's installation. See issue #26 for more details.
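A hedged sketch of that fix (rpy2 is available on conda-forge; pin a version if you need a specific one):

# overwrite Colab's preinstalled rpy2 with a conda-forge build
!mamba install -c conda-forge rpy2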