iridium_cluster:testnodes [2015/09/25 08:42] florido
^ hostname ^ purpose ^
| **nptest**-iridium.lunarc.lu.se | test node for nuclear physics |
| **pptest**-iridium.lunarc.lu.se | test node for particle physics and theoretical physics |

They can also be used from time to time to host temporary courses.

They are meant for **interactive** access to the cluster, as opposed to the ''arc-iridium.lunarc.lu.se'' interface, which is used for batch submission of jobs. (link and explanation will come)
----

**Particle physics and Theoretical Physics:**
<code bash>
# access pptest-iridium
ssh <username>@pptest-iridium.lunarc.lu.se
</code>
Administrators provide scripts for quick setup of your work environment.

Just execute the command in the column //Commands to run// at the shell prompt, or add it to your ''.bash_profile'' file so that it is executed every time you login.

:!: **NOTE:** do NOT add these scripts to ''.bashrc'' as suggested previously, or you will not be able to rsync/scp. The contents of ''.bashrc'' are NOT supposed to generate output, but unfortunately some of these scripts do.
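If you do want something in ''.bashrc'' anyway, a common bash pattern is to guard any output behind an interactive-shell check, so that rsync/scp (which run a non-interactive shell) see nothing. This is a sketch; ''print_login_banner'' is a hypothetical stand-in for anything that writes to the terminal:

```shell
# Guard output in ~/.bashrc: only print when the shell is interactive.
# print_login_banner is a hypothetical stand-in for any command that
# writes to the terminal; rsync/scp must see no output at all.
print_login_banner() {
  case $- in
    *i*) echo "Welcome to the test node" ;;  # interactive login only
    *)   ;;                                  # non-interactive: stay silent
  esac
}
print_login_banner
```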
:!: **NEW!** 20150923 :!: All the environments should be enabled using ''modules''. This will also work with the batch system; the previous solution did not. Please update your scripts.

^ Environment ^ Commands to run ^ Description ^
^ List existing environments | <code bash>module avail</code> | Will show a list of available environments. To enable one, execute the command <code bash>module load <name of environment></code> More info on modules at http://modules.sourceforge.net/ |
^ ATLAS Experiment environment | <code bash>module load enableATLAS;
setupATLAS</code> | Will set up all the needed environment variables for the ATLAS experiment, and present a selection of other environments that the user can set up. |

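For example, to load the ATLAS environment automatically at every login, the commands from the table can go at the end of ''.bash_profile'' (not ''.bashrc'', see the note above). This is a sketch, assuming ''enableATLAS'' appears in your ''module avail'' list:

```shell
# ~/.bash_profile -- executed once per login shell
# load the ATLAS module environment (name as listed by `module avail`)
module load enableATLAS
setupATLAS
```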
==== Keeping your session alive and your work running even if you disconnect ====

The cluster offers you various tools, among them [[:it_tips:screen|Screen]] and the usual ''nohup'' command to detach from terminal output.

However, I personally suggest a tool called byobu, which is essentially Screen or Tmux on steroids. You can read about it here: https://help.ubuntu.com/community/Byobu
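As a minimal ''nohup'' sketch (using ''sleep'' as a stand-in for a real computation):

```shell
# Start a long job immune to hangup; output goes to a log file.
# `sleep 30` is a placeholder -- substitute your own script or command.
nohup sleep 30 > job.log 2>&1 &
jobpid=$!
echo "job running with PID $jobpid"
# You can now log out; check progress later with: tail -f job.log
```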
==== Local disk space on the test nodes ====

Every node has a local ''/tmp'' temporary disk space that can be used for computations. The contents of this space are deleted regularly. Users can put any sort of data there. Currently the available space is 300 GB.