First steps

To start using Tycho, please follow these steps:

* Get a user account by contacting a sponsor and signing up at https://hpc.ku.dk/account.html. You have to select "Astro" as your group. '''Remember to sign the rules-of-conduct form.'''
* If you have received the old welcome e-mail without a link to this wiki, the info is not correct. You should never log in to fend0X.hpc.ku.dk; instead, use the Tycho-specific frontends called astro0X.hpc.ku.dk (see [[Hardware]] for an up-to-date list of available frontends / analysis hardware).
* Set up an SSH key pair for secure passwordless login (see [[Accessing Tycho]]).
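The key setup can be sketched as follows. The ed25519 key type, the key file name, and the <code>astro01.hpc.ku.dk</code> frontend are illustrative assumptions (see [[Hardware]] for the actual frontend names):

```shell
# Sketch only: key type, file name, and frontend host are example choices.
# Generate a key pair on your own machine (a passphrase is recommended).
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519_tycho

# Copy the public key to the cluster (astro01 is a placeholder frontend name).
ssh-copy-id -i ~/.ssh/id_ed25519_tycho.pub yourusername@astro01.hpc.ku.dk

# Subsequent logins no longer ask for a password.
ssh -i ~/.ssh/id_ed25519_tycho yourusername@astro01.hpc.ku.dk
```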
 
* Add <code>module load astro</code> to your <code>$HOME/.bashrc</code> file to have access to all the custom installed software. You can read more about the <code>module</code> command.
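A sketch of the <code>.bashrc</code> change, together with two standard subcommands of the usual Environment Modules / Lmod interface for exploring the software stack:

```shell
# Sketch: append the line once to ~/.bashrc so every login shell loads the stack.
grep -qx 'module load astro' "$HOME/.bashrc" || echo 'module load astro' >> "$HOME/.bashrc"

# Standard module subcommands for exploring what is available:
module list     # modules currently loaded
module avail    # everything that can be loaded
```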
* Consider changing <code>umask 077</code> to <code>umask 027</code> in <code>$HOME/.bashrc</code> to allow collaborators and/or your supervisor read access to your files when logged in to the cluster.
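What the change does, as a minimal sketch:

```shell
# With umask 027, new files get mode rw-r----- (640) and new directories
# rwxr-x--- (750): readable by your group, invisible to everyone else.
# (umask 077 would strip group access as well.)
umask 027
touch data.txt
mkdir results
ls -ld data.txt results
```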
* If you are going to travel, but want to continue working on Tycho while travelling, you need to set up a Dynamic Firewall '''before''' travelling. You can do so by following the detailed instructions [https://hpc.ku.dk/documentation/otp.html here].
* You can set up e.g. Visual Studio Code for remote development, which gives transparent editing of files on the cluster. See [[Visual Studio Remote Development]].
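If you use an SSH-based remote-development workflow, a host entry in <code>~/.ssh/config</code> keeps the connection details in one place. The host alias, frontend name, and key path below are illustrative assumptions:

```shell
# Sketch: add a host alias so tools (and plain ssh) can connect as "tycho".
cat >> ~/.ssh/config <<'EOF'
Host tycho
    HostName astro01.hpc.ku.dk
    User yourusername
    IdentityFile ~/.ssh/id_ed25519_tycho
EOF
# In VS Code, "Remote-SSH: Connect to Host..." will then list "tycho".
```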
* Every user has a 50 GB quota for the home folder (<code>/groups/astro/yourusername</code>). Whenever you are going to work with large amounts of data, consider using the <code>/lustre/astro/yourusername</code> directory. This scratch folder resides on a ZFS-based, high-performance Lustre filesystem with dedicated hardware for our group. No quotas are enforced, and the total space (disregarding the transparent compression) is 1300 TB.
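A sketch of the intended workflow, using the paths from the step above; <code>myproject</code> is a placeholder, and <code>$USER</code> stands in for <code>yourusername</code>:

```shell
# Sketch: keep large datasets on the Lustre scratch, not in the 50 GB home.
mkdir -p /lustre/astro/$USER/myproject   # working area on scratch
df -h /lustre/astro                      # free space on the scratch filesystem
du -sh /groups/astro/$USER               # current use of the home quota
```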

Latest revision as of 08:14, 15 January 2024
