CryoSPARC-Sapelo2
Category: Engineering
Program On: Sapelo2
Version: 3.3.1
Author / Distributor: See https://guide.cryosparc.com/
Description
"CryoSPARC (Cryo-EM Single Particle Ab-Initio Reconstruction and Classification) is a state of the art HPC software solution for complete processing of single-particle cryo-electron microscopy (cryo-EM) data. CryoSPARC is useful for solving cryo-EM structures of membrane proteins, viruses, complexes, flexible molecules, small particles, phase plate data and negative stain data." For more information, please see https://guide.cryosparc.com/.
NOTE: Users are required to be added to the GACRC cryosparc group before they are allowed to run this software. Please fill out the GACRC General Support form to request access. We will reach out to you once we have received your request.
Configurations
Master node VM:
- Host name: ss-cryo.gacrc.uga.edu
- Intel Xeon processors (8 cores and 24GB of RAM)
- MongoDB is installed (used for the cryoSPARC database)
Worker nodes:
- Two NVIDIA Tesla K40m GPU nodes, each with Intel Xeon processors (16 cores and 128GB of RAM) and 8 NVIDIA K40m GPU cards.
- cryoSPARC recommends using an SSD for caching particle data; /lscratch/gacrc-cryo was set up on the worker nodes for this purpose (see the sketch below).
- The amount of space that cryoSPARC can use in /lscratch/gacrc-cryo is capped at 100GB.
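The cache location and quota are typically declared when the cluster lane is registered with the master. Below is a minimal sketch of the cache-related fields of cryoSPARC's cluster_info.json, assuming the values above; the actual file kept under /work/cryosparc/cryosparc_cluster/ may differ, and the lane name "sapelo2" is hypothetical:

{
    "name": "sapelo2",
    "worker_bin_path": "/work/cryosparc/cryosparc_worker/bin/cryosparcw",
    "cache_path": "/lscratch/gacrc-cryo",
    "cache_quota_mb": 100000,
    "cache_reserve_mb": 10000
}

Here cache_quota_mb caps the per-node cache at roughly 100GB, and cache_reserve_mb keeps a safety margin free on the SSD.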
cryoSPARC group: cryosparc
cryoSPARC service account: gacrc-cryo
- gacrc-cryo is the service user account that will run the cryoSPARC workflow jobs for all regular cryoSPARC users on the master node and each worker node that will be used for computation.
- Some tasks can only be performed by gacrc-cryo, like start or stop cryosparcm from the master node, user management, connect or update worker nodes to master, etc.
- Regular cryoSPARC users can still run cryosparcm on the master node to check the status of the master and its database, using cryosparcm status and cryosparcm checkdb (see the example below).
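For instance, a regular user could run the following on the master node; both commands only report state and change nothing:

cryosparcm status    # report whether the master processes and the web app are running
cryosparcm checkdb   # verify that the cryoSPARC MongoDB database is reachable and healthy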
cryoSPARC group space: /work/cryosparc/. There are 6 subdirectories in /work/cryosparc/:
- cryosparc_master/ , cryosparc_worker/ : Master and worker installation folders
- database/ : cryoSPARC database folder
- users/ : cryoSPARC user project folder
- cryosparc_cluster/ : The folder storing cluster integration scripts (a Slurm template sketch follows this list)
- testdataset/ : The folder storing cryoSPARC test data
- src_v3.3.1/ : The folder storing cryoSPARC sources v3.3.1
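As a hedged illustration of what a cluster integration script in cryosparc_cluster/ might contain, below is a minimal Slurm submission template in the style of cryoSPARC's cluster integration guide. The {{ ... }} placeholders are standard cryoSPARC cluster-script template variables that the master fills in per job; the partition name gpu_p and the exact resource lines are assumptions, not the actual Sapelo2 script:

#!/usr/bin/env bash
#SBATCH --job-name=cryosparc_{{ project_uid }}_{{ job_uid }}
#SBATCH --partition=gpu_p                          # hypothetical partition name
#SBATCH --ntasks={{ num_cpu }}
#SBATCH --gres=gpu:{{ num_gpu }}
#SBATCH --mem={{ (ram_gb*1000)|int }}M
#SBATCH --output={{ job_dir_abs }}/slurm-%j.out

{{ run_cmd }}

When gacrc-cryo registers such a lane with cryosparcm cluster connect, each job submitted from the web interface is wrapped in this template and handed to Slurm.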
Running cryoSPARC from Sapelo2
User Login
The user needs SSH port forwarding from a local computer, for example:
ssh -N -L 39000:128.192.75.59:39000 zhuofei@ss-cryo.gacrc.uga.edu
Then open a browser (e.g., Chrome) on the local machine and navigate to http://localhost:39000. The user should be presented with the cryoSPARC login page (log in with the UGA email address and the initial password "123abc").
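For convenience, the tunnel can also be defined in the local ~/.ssh/config; this is a sketch, and MyID stands in for the user's actual UGA MyID:

# ~/.ssh/config on the local machine
Host cryo-tunnel
    HostName ss-cryo.gacrc.uga.edu
    User MyID
    LocalForward 39000 128.192.75.59:39000

The tunnel can then be opened with ssh -N cryo-tunnel and left running while the browser session is active.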
Documentation
About cryoSPARC: https://guide.cryosparc.com/
User Interface and Usage Guide: https://guide.cryosparc.com/processing-data/user-interface-and-usage-guide
All Job Types in cryoSPARC: https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc
Management and Monitoring: https://guide.cryosparc.com/setup-configuration-and-management/management-and-monitoring
Introductory Tutorial: https://guide.cryosparc.com/processing-data/cryo-em-data-processing-in-cryosparc-introductory-tutorial
Tutorials and Usage Guides: https://guide.cryosparc.com/processing-data/tutorials-and-case-studies
Cluster (Slurm) integration: https://guide.cryosparc.com/setup-configuration-and-management/how-to-download-install-and-configure/downloading-and-installing-cryosparc#connect-a-cluster-to-cryosparc
Installation
- Version 3.3.1 master is installed on the master node (ss-cryo.gacrc.uga.edu). The source code is downloaded to /work/cryosparc/cryosparc_master on the master node.
- Version 3.3.1 workers are installed on the two worker nodes (NVIDIA Tesla K40m GPU nodes rb6-[3-4]). The source code is downloaded to /work/cryosparc/cryosparc_worker on the master node (see the sketch below for how the pieces are typically connected).
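As a sketch of how such an installation is typically wired together (run as gacrc-cryo; the host names and paths come from this page, but the exact invocation is an assumption based on cryoSPARC's installation guide):

# start the master services on ss-cryo.gacrc.uga.edu
cryosparcm start

# on each worker node (e.g., rb6-3), register the worker with the master
# and point the SSD cache at the local scratch area
/work/cryosparc/cryosparc_worker/bin/cryosparcw connect \
    --worker rb6-3 \
    --master ss-cryo.gacrc.uga.edu \
    --port 39000 \
    --ssdpath /lscratch/gacrc-cryo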
System
64-bit Linux