Student Cluster
This topic gives you a quick overview of the steps involved to run a job on the D-INFK student cluster.
Logging In
If your course or project uses Jupyter, go to https://student-jupyter.inf.ethz.ch and log in there. More information can be found here.
For running jobs directly, you can log in to student-cluster.inf.ethz.ch or to the actual login nodes student-cluster1.inf.ethz.ch and student-cluster2.inf.ethz.ch via secure shell (ssh).
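A typical login from a terminal looks like the following sketch (the user name is a placeholder; use your own ETH account name):

```shell
# Connect to the load-balanced login address; replace the user name
# with your own ETH account name.
ssh your-username@student-cluster.inf.ethz.ch
```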

When you log in you will be informed about the remaining time per course or project and how much free space you have in your home directory. Keep an eye on these numbers so that you do not run out of time or space before a deadline.
Running Jobs
Please read on here.
GPUs
Currently there are two NVIDIA cards in the cluster:
Limits
You have the following general limitations on resources that you can use.
Home Directory
You have 20 GB of space in your home directory, independent of how many courses or projects you have.
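To keep an eye on the 20 GB limit, you can check your current usage from a shell. This is a minimal sketch using a generic size summary; a dedicated quota command may also exist on the cluster:

```shell
# Summarize the total size of your home directory.
# Note: this du-based check is a generic fallback; the cluster may
# provide a dedicated quota tool as well.
USAGE=$(du -sh "$HOME" | cut -f1)
echo "Home directory usage: ${USAGE} of the 20 GB limit"
```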
Scratch Space
Your individual scratch space under /work/scratch/{your user name} has a hard space limit of 100 GB. Data in there has a retention period that depends on the amount of data:
The cleaning job that deletes data according to age starts at 23:00 every day.
You are not allowed to keep data alive by automatically updating time stamps of files.
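To see which of your scratch files might be affected by the cleaning job, you can list them by modification age. This is a minimal sketch with a hypothetical 7-day threshold; the actual retention periods depend on the amount of data you store:

```shell
SCRATCH_DIR="/work/scratch/${USER}"
if [ -d "$SCRATCH_DIR" ]; then
    # List files last modified more than 7 days ago (7 is a placeholder;
    # the real retention period depends on the amount of data stored).
    find "$SCRATCH_DIR" -type f -mtime +7
fi
```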
Work Space
Some courses and projects provide additional work space for you or your team under /work/courses, /work/projects or /work/users, in which case your TAs or supervisors will inform you.
Jobs
Resources available for jobs have the following default limits:

| | GPU jobs | CPU jobs |
| --- | --- | --- |
| Number of running jobs | 1 | 1 |
| GPUs per job | 1 | - |
| CPU cores per job | 2 | 1, with time sharing |
| RAM per job | 24 GB | 4 to 8 GB, course or project specific |
| Space in /tmp | 40 GB | 4 GB |
| Queued jobs | 2 | - |
The number of hours available to you, as well as the maximum runtime per job, depends on your courses and projects. Each course or project comes with its own budget, which is displayed when you log in to a login node as well as on the spawner page of JupyterHub.
For courses and projects with special needs some of these limits may be different in which case your TAs or supervisors will inform you.
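As an illustration of how such limits typically map onto a job submission, here is a sketch of a batch script, assuming the cluster uses the Slurm scheduler (the actual submission mechanism and flags are described on the page linked under Running Jobs; the job name and program are placeholders):

```shell
#!/bin/bash
#SBATCH --job-name=example-gpu-job   # hypothetical job name
#SBATCH --gpus=1                     # at most 1 GPU per job
#SBATCH --cpus-per-task=2            # at most 2 CPU cores for GPU jobs
#SBATCH --mem=24G                    # at most 24 GB RAM for GPU jobs
#SBATCH --time=01:00:00              # runtime counts against your budget

python train.py                      # placeholder for your own program
```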
Login Nodes
On the login nodes you also have the following restrictions:
Expiration of Access
For courses, access to the cluster will be disabled on the morning of the last Monday of the semester holidays. For BSc and MSc projects, access ends on the date requested by your supervisor.
All data in your home directory will also be deleted, so please copy away all data that you still need before this happens.