Currently we have two SLURM job scripts for Spider, one supporting conda and one supporting tykky. We could maybe combine these into a single, more flexible script, something like:
```bash
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --time=24:00:00
#SBATCH --cpus-per-task=4
#SBATCH --partition=normal

source ~/.bashrc

NODE=`hostname -s`
PORT=`shuf -i 8400-9400 -n 1`
LPORT=${LPORT:-8889}
PYTHON=${PYTHON:-python}

echo "Run the following on your local machine: "
echo "ssh -i /path/to/private/ssh/key -N -L ${LPORT}:${NODE}:${PORT} ${USER}@spider.surf.nl"

${PYTHON} -m jupyter lab --no-browser --port=${PORT} --ip=${NODE}
```
With such a script, one could set it up to use Python from conda:
```bash
export PYTHON=`conda activate jupyter_dask && which python`
sbatch jupyter_dask_spider.bsh
```
or from tykky:
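(A minimal sketch, assuming tykky installs wrapper executables, including `python`, under the environment's `bin/` directory; the install prefix `~/tykky-envs/jupyter_dask` is illustrative.)

```bash
# Point PYTHON at the python wrapper generated by tykky
# (the install prefix below is illustrative).
export PYTHON=~/tykky-envs/jupyter_dask/bin/python
sbatch jupyter_dask_spider.bsh
```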
or even from another container image to be retrieved e.g. from Docker Hub:
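(A sketch assuming Apptainer/Singularity is available on the compute nodes; the image name is illustrative. Since `${PYTHON}` is unquoted in the job script, word splitting expands it to `apptainer exec jupyter.sif python -m jupyter lab ...`.)

```bash
# Pull an image from Docker Hub and run python inside the container
# (the image name is illustrative).
apptainer pull jupyter.sif docker://jupyter/scipy-notebook
export PYTHON="apptainer exec jupyter.sif python"
sbatch jupyter_dask_spider.bsh
```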