Storage in the Aurora cluster is organized as detailed by Lunarc. Read the following: http://lunarc-documentation.readthedocs.io/en/latest/quick_reference/#file-systems
The home folders are maintained and backed up by Lunarc, and they reside on a storage server dedicated to users.
Please follow these rules to keep the storage tidy.
```
# create a folder
[pflorido@aurora1 hep]$ mkdir /projects/hep/nobackup/shared/pp/pflorido
# copy data
[pflorido@aurora1 hep]$ cp mydata /projects/hep/nobackup/shared/pp/pflorido/mydata
# list file permissions
[pflorido@aurora1 ~]$ ls -ltrah /projects/hep/nobackup/shared/pp/pflorido/mydata
-rw-r--r--. 1 pflorido hep 0 12 apr 12.32 /projects/hep/nobackup/shared/pp/pflorido/mydata
# change to read/write for user and group only (no rights for everyone else)
[pflorido@aurora1 ~]$ chmod 660 /projects/hep/nobackup/shared/pp/pflorido/mydata
# check the applied permissions
[pflorido@aurora1 ~]$ ls -ltrah /projects/hep/nobackup/shared/pp/pflorido/mydata
-rw-rw----. 1 pflorido hep 0 12 apr 12.32 /projects/hep/nobackup/shared/pp/pflorido/mydata
```
Remember to clean up your scratch folders. Be kind to your colleagues! Storage has a cost.
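One way to keep a scratch folder tidy is to list (and only then delete) files that have not been modified for a while. A minimal sketch, assuming a per-user scratch path and a 30-day threshold (both `SCRATCH_DIR` and the threshold are illustrative assumptions, not site policy):

```shell
#!/bin/sh
# Sketch: find stale files in a personal scratch folder.
# SCRATCH_DIR and the 30-day threshold are assumptions for illustration.
SCRATCH_DIR="${SCRATCH_DIR:-/projects/hep/nobackup/scratch/pp/$USER}"

# Dry run: list files older than 30 days without deleting anything.
find "$SCRATCH_DIR" -type f -mtime +30 -print

# After reviewing the list, uncomment to actually delete:
# find "$SCRATCH_DIR" -type f -mtime +30 -delete
```

Running the dry-run `-print` step first avoids deleting data that a long calculation still needs.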
In particular, the HEP-dedicated storage, usable by members of all divisions, is accessible on the HEP nodes au[193-216] at this path:
/projects/hep/nobackup/
The organization of data in the above folder is as follows; users are encouraged to follow it:
| Folder name | Folder purpose | Description | Subfolders |
|---|---|---|---|
| software | Application software | This folder hosts software that is not accessible via cvmfs (see later). This usually includes user- or project-specific libraries and frameworks. | /np for Nuclear Physics users, /pp for Particle Physics users, /tp for Theoretical Physics users |
| shared | Data that will stay for the long term | This folder should be used for long-term stored data: for example, long-term data sets, or data needed for the whole duration of a PhD project or shared among people belonging to the same research group. | /np for Nuclear Physics users, /pp for Particle Physics users, /tp for Theoretical Physics users |
| scratch | Data that will stay for the short term | This folder should be used for short-term stored data: for example, data needed for a week-long calculation or temporary results. This folder should be considered unreliable, as its contents will be purged from time to time. The cleanup interval is yet to be decided. | /np for Nuclear Physics users, /pp for Particle Physics users, /tp for Theoretical Physics users |
| users | Copy of Iridium user homes | This folder contains a copy of each user's private home folder on Iridium. DO NOT use this on Aurora. This folder will be used as a backup until we plan some other Iridium backup. | /&lt;username&gt;: each user has her own folder, /npguests/&lt;username&gt; for Nuclear Physics guests, /ppguests/&lt;username&gt; for Particle Physics guests |
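The layout in the table above can be summarized as a directory skeleton. A minimal sketch that recreates it under an arbitrary base directory (the `BASE` default here is an illustrative placeholder; on Aurora the real base is /projects/hep/nobackup):

```shell
#!/bin/sh
# Sketch: the top-level/division layout described in the table above,
# recreated under BASE. BASE is an illustrative placeholder so this
# runs anywhere; on Aurora the base is /projects/hep/nobackup.
BASE="${BASE:-/tmp/hep-nobackup-demo}"

for top in software shared scratch; do
    for div in np pp tp; do
        mkdir -p "$BASE/$top/$div"
    done
done
mkdir -p "$BASE/users"

# Show the resulting layout, two levels deep.
find "$BASE" -maxdepth 2 -type d | sort
```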
Other special folders:
| Folder name | Folder purpose | Permissions | Description | Subfolders |
|---|---|---|---|---|
| /cvmfs | Special folder containing CERN-maintained software | Users cannot write | This special folder is dedicated to software provided by CERN and is read-only. Its contents are usually managed via specific scripts that a user can run. If you need some software that you cannot find, contact the administrators. | /geant4.cern.ch for Nuclear Physics users, /atlas.cern.ch for Particle Physics users |
Description yet to come.
The Iridium storage is still accessible at
/projects/hep/iridium/
Please refer to the basic_information page for its documentation.
There are two storage servers installed for the exclusive use of Particle Physics.
The recommendation is to use these instead of /project/hep/nobackup, so as to leave space for its other users.
Note that /project/hep/fs2 is a symbolic link to /project/hep/nobackup.
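Where a symbolic link points can be checked with `readlink -f`, the same check one could run on /project/hep/fs2 on Aurora. A small self-contained demo (the temporary paths are illustrative, so it runs anywhere):

```shell
#!/bin/sh
# Demo: resolve a symbolic link with readlink -f. The temporary
# target and link paths are illustrative; on Aurora one would run
# readlink -f /project/hep/fs2 instead.
target=$(mktemp -d)
link="${target}-link"
ln -sfn "$target" "$link"
readlink -f "$link"   # prints the resolved target directory
```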
| Storage path | Intended usage | Size |
|---|---|---|
| /project/hep/fs3/share | data meant to stay for a long time | 35 TB |
| /project/hep/fs4/scratch | temporary data | 35 TB |
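Creating a personal folder on these servers can follow the same pattern as on nobackup. A sketch, assuming a per-user folder under a division subdirectory (the pp/$USER convention is an assumption that mirrors the nobackup layout, not a documented rule):

```shell
#!/bin/sh
# Sketch: per-user folders on the Particle Physics storage servers.
# The pp/$USER subfolder convention is an assumption mirroring the
# nobackup layout. The BASE defaults are the real fs3/fs4 paths;
# override them to try this elsewhere.
SHARE_BASE="${SHARE_BASE:-/project/hep/fs3/share}"
SCRATCH_BASE="${SCRATCH_BASE:-/project/hep/fs4/scratch}"

mkdir -p "$SHARE_BASE/pp/$USER" "$SCRATCH_BASE/pp/$USER"
# Readable/writable by user and group only, matching the chmod
# example earlier on this page.
chmod 770 "$SHARE_BASE/pp/$USER" "$SCRATCH_BASE/pp/$USER"
```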
At the moment, only the user folders are available. We are negotiating 10 TB of common storage with Lunarc; more information will follow soon.