Python installation

Note

In the B202 (FRIZ), conda is already installed. You first need to initialize it by typing this command in a terminal:

conda init tcsh

Close the terminal, open a new one, and type:

conda create --name neurocomputing python=3.9
conda activate neurocomputing
conda install -c conda-forge numpy matplotlib jupyterlab scikit-learn

This can take a while, so be patient.

Before every session, or when you open a new terminal, you will need to type:

conda activate neurocomputing

Here are the main Python dependencies necessary for the exercises:

- numpy
- matplotlib
- jupyterlab
- scikit-learn
- tensorflow

If you are using Linux, you can probably install all these dependencies from your package manager. For the others, use either Anaconda or Colab.

Anaconda

Installing Anaconda

Python should already be installed if you use Linux, only a very old version if you use MacOS, and probably nothing under Windows. Moreover, Python 2.7 became obsolete in December 2019 but is still the default on some distributions.

For these reasons, we strongly recommend installing Python 3 using the Anaconda distribution, or even better the community-driven fork Miniforge:

https://github.com/conda-forge/miniforge

Anaconda offers all the major Python packages in one place, with a focus on data science and machine learning. To install it, simply download the installer / script for your OS and follow the instructions. Beware, the installation takes quite a lot of space on the disk (around 1 GB), so choose the installation path wisely.

Installing packages

To install packages (for example numpy), you just have to type in a terminal:

conda install numpy

Refer to the docs (https://docs.anaconda.com/anaconda/) to know more.

If you prefer your local Python installation, or if a package is not available or outdated in Anaconda, the pip utility also allows you to install virtually any Python package:

pip install numpy
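To check that a package was installed correctly, you can import it in Python and print its version. A minimal sketch, using numpy as an example (if the import fails, the package is not visible to the interpreter you are running, so check which environment is active):

```python
# Sanity check: import numpy and print its version.
# An ImportError here means the active Python does not see the package.
import numpy

print(numpy.__version__)
```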

Virtual environments

It is a good idea to isolate the required packages from the rest of your Python installation, otherwise conflicts between package versions may arise.

Virtual environments allow you to create an isolated Python distribution for a project. The Python ecosystem offers many tools for that (e.g. venv, virtualenv, or conda environments).

As we advise using Anaconda (or Miniforge), we focus here on conda environments, but the logic is always the same.

To create a conda environment with the name neurocomputing using Python 3.9, type in a terminal:

conda create --name neurocomputing python=3.9

You should see that it installs a bunch of basic packages along with Python:

(base) ~/ conda create --name neurocomputing python=3.9
Collecting package metadata (current_repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /Users/vitay/Applications/miniforge3/envs/neurocomputing

  added / updated specs:
    - python=3.9


The following NEW packages will be INSTALLED:

  bzip2              conda-forge/osx-arm64::bzip2-1.0.8-h3422bc3_4 None
  ca-certificates    conda-forge/osx-arm64::ca-certificates-2022.9.24-h4653dfc_0 None
  libffi             conda-forge/osx-arm64::libffi-3.4.2-h3422bc3_5 None
  libsqlite          conda-forge/osx-arm64::libsqlite-3.39.4-h76d750c_0 None
  libzlib            conda-forge/osx-arm64::libzlib-1.2.12-h03a7124_4 None
  ncurses            conda-forge/osx-arm64::ncurses-6.3-h07bb92c_1 None
  openssl            conda-forge/osx-arm64::openssl-3.0.5-h03a7124_2 None
  pip                conda-forge/noarch::pip-22.2.2-pyhd8ed1ab_0 None
  python             conda-forge/osx-arm64::python-3.9.13-h96fcbfb_0_cpython None
  readline           conda-forge/osx-arm64::readline-8.1.2-h46ed386_0 None
  setuptools         conda-forge/noarch::setuptools-65.4.1-pyhd8ed1ab_0 None
  sqlite             conda-forge/osx-arm64::sqlite-3.39.4-h2229b38_0 None
  tk                 conda-forge/osx-arm64::tk-8.6.12-he1e0b03_0 None
  tzdata             conda-forge/noarch::tzdata-2022d-h191b570_0 None
  wheel              conda-forge/noarch::wheel-0.37.1-pyhd8ed1ab_0 None
  xz                 conda-forge/osx-arm64::xz-5.2.6-h57fd34a_0 None


Proceed ([y]/n)?

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate neurocomputing
#
# To deactivate an active environment, use
#
#     $ conda deactivate

Retrieving notices: ...working... done

As indicated at the end of the message, you need to activate the environment to use its packages:

conda activate neurocomputing

When you are done, you can deactivate it, or simply close the terminal.

Note

You need to activate the environment every time you start an exercise or open a new terminal!

You can then install all the required packages to their latest versions, alternating between conda and pip:

conda install numpy matplotlib jupyterlab scikit-learn
pip install tensorflow

If you installed the regular Anaconda instead of Miniforge, we strongly advise forcing the conda-forge channel:

conda install -c conda-forge numpy matplotlib jupyterlab scikit-learn
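Before starting an exercise, it can be useful to verify which of the required packages are actually importable in the active environment. A small sketch using only the standard library (note that import names can differ from conda package names, e.g. scikit-learn is imported as sklearn):

```python
# Report which of the course packages the active environment can import,
# without actually loading the (potentially heavy) libraries.
import importlib.util

for name in ["numpy", "matplotlib", "sklearn", "tensorflow"]:
    status = "ok" if importlib.util.find_spec(name) else "MISSING"
    print(f"{name:12s} {status}")
```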

Alternatively, you can use one of the following files and install everything in one shot:

conda env create -f conda-linux.yml
conda env create -f conda-macos.yml
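Such an environment file bundles the environment name, the channel, and the package list in one place. A sketch of what it might contain, based on the packages used above (the actual conda-linux.yml / conda-macos.yml files provided with the course may differ):

```yaml
# Hypothetical environment file, assuming the package list from this page.
name: neurocomputing
channels:
  - conda-forge
dependencies:
  - python=3.9
  - numpy
  - matplotlib
  - jupyterlab
  - scikit-learn
  - pip
  - pip:
      - tensorflow
```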

Note

If you have a CUDA-capable NVIDIA graphical card, follow these instructions to install tensorflow:

https://www.tensorflow.org/install/pip

Using notebooks

When the installation is complete, you just need to download the Jupyter notebooks (.ipynb) from this page, activate your environment, and type:

jupyter lab name_of_the_notebook.ipynb

to open a browser tab with the notebook.
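A useful first cell in any notebook is a quick check that it runs inside the right environment, for example by printing the interpreter path and Python version. A minimal sketch (the path should point inside your conda environment, e.g. somewhere under envs/neurocomputing):

```python
# Print which Python interpreter runs this notebook and its version.
import sys

print(sys.executable)
print(sys.version)
```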

Colab

Another option is to run the notebooks in the cloud, for example on Google Colab:

https://colab.research.google.com/

Colab has all major ML packages already installed, so you do not have to install anything yourself. Under certain conditions, you can also use a GPU for free (but for at most 24 hours in a row).

A link to run the notebooks on Colab is provided in the list of exercises. Note that you will need a Google account (a dedicated one is fine if you are concerned about privacy).

If you want to save your progress, make a copy of the notebook in your Google Drive when you open the link, or download the notebook at the end.