CryoSPARC-Sapelo2


Category

Engineering

Program On

Sapelo2

Version

3.3.1

Author / Distributor

See https://guide.cryosparc.com/

Description

"CryoSPARC (Cryo-EM Single Particle Ab-Initio Reconstruction and Classification) is a state of the art HPC software solution for complete processing of single-particle cryo-electron microscopy (cryo-EM) data. CryoSPARC is useful for solving cryo-EM structures of membrane proteins, viruses, complexes, flexible molecules, small particles, phase plate data and negative stain data." For more information, please see https://guide.cryosparc.com/.

NOTE: Users are required to be added to the GACRC cryosparc group before they are allowed to run this software. Please fill out the GACRC General Support form to request access. We will reach out to you once we receive your request.

Configurations

Master node VM:

  1. Host name: ss-cryo.gacrc.uga.edu
  2. Intel Xeon processors (8 cores and 24GB of RAM)
  3. MongoDB is installed (the backend for the cryoSPARC database)

Worker nodes:

  1. Two NVIDIA Tesla K40m GPU nodes, each with Intel Xeon processors (16 cores and 128GB of RAM) and 8 NVIDIA K40m GPU cards.
  2. cryoSPARC recommends using an SSD for caching particle data. /lscratch/gacrc-cryo was set up on the worker nodes for this purpose (see the sketch after this list).
  3. The amount of space that cryoSPARC can use in /lscratch/gacrc-cryo is capped at 100GB.
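
For reference, the cache location and quota are normally registered when a worker node is connected to the master. The following is a minimal sketch of that step using the values above and the standard cryosparcw connect options from the cryoSPARC guide; it is run as gacrc-cryo and is not necessarily the exact command that was used on Sapelo2:

  # from /work/cryosparc/cryosparc_worker on a worker node (e.g. rb6-3)
  ./bin/cryosparcw connect \
      --worker rb6-3 \
      --master ss-cryo.gacrc.uga.edu \
      --port 39000 \
      --ssdpath /lscratch/gacrc-cryo \
      --ssdquota 100000    # cache cap in MB (100GB)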

cryoSPARC group: cryosparc

cryoSPARC service account: gacrc-cryo

  1. gacrc-cryo is the service user account that runs the cryoSPARC workflow jobs for all regular cryoSPARC users, on the master node and on each worker node used for computation.
  2. Some tasks can only be performed by gacrc-cryo, such as starting or stopping cryosparcm on the master node, managing users, and connecting or updating worker nodes to the master.
  3. Regular cryoSPARC users can still run cryosparcm on the master node to check the status of the master and its database, using cryosparcm status and cryosparcm checkdb (see the example after this list).
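
For example, a regular user can run the following on the master node; both commands only report status and do not modify the installation:

  # on the master node (ss-cryo.gacrc.uga.edu)
  cryosparcm status    # lists the state of the cryoSPARC master processes
  cryosparcm checkdb   # checks that the cryoSPARC MongoDB database is reachable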

cryoSPARC group space: /work/cryosparc/. There are six sub-directories in /work/cryosparc/:

  1. cryosparc_master/ , cryosparc_worker/ : Master and worker installation folders
  2. database/ : cryoSPARC database folder
  3. users/ : cryoSPARC user project folder
  4. cryosparc_cluster/ : The folder storing the cluster (Slurm) integration scripts (see the sketch after this list)
  5. testdataset/ : The folder storing cryoSPARC test data
  6. src_v3.3.1/ : The folder storing cryoSPARC sources v3.3.1
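
For context, the cluster integration scripts referenced in item 4 follow the layout described in the cryoSPARC guide: a cluster_info.json file describing how to submit to Slurm, plus a cluster_script.sh submission template, registered with cryosparcm cluster connect. The sketch below shows the general shape with illustrative values; the field names come from the guide, but the "name" value shown here is a placeholder and not necessarily what is configured on Sapelo2:

  # illustrative cluster_info.json (kept in /work/cryosparc/cryosparc_cluster/)
  {
      "name": "sapelo2-slurm",
      "worker_bin_path": "/work/cryosparc/cryosparc_worker/bin/cryosparcw",
      "cache_path": "/lscratch/gacrc-cryo",
      "send_cmd_tpl": "{{ command }}",
      "qsub_cmd_tpl": "sbatch {{ script_path_abs }}",
      "qstat_cmd_tpl": "squeue -j {{ cluster_job_id }}",
      "qdel_cmd_tpl": "scancel {{ cluster_job_id }}",
      "qinfo_cmd_tpl": "sinfo"
  }

  # registered by gacrc-cryo from the directory containing cluster_info.json and cluster_script.sh
  cryosparcm cluster connect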

Running cryoSPARC from Sapelo2

User Login

The user needs to set up SSH port forwarding from a local computer to the master node, for example:

ssh -N -L 39000:128.192.75.59:39000 <username>@<remote host>

Then open a browser (e.g., Chrome) on the local machine and navigate to http://localhost:39000. The user should be presented with the cryoSPARC login page (log in with the UGA email address and login password "123abc").
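
If the terminal needs to stay free while the tunnel is up, the same command can be backgrounded. A small sketch using standard OpenSSH options, with the same placeholder target as above:

  ssh -f -N -L 39000:128.192.75.59:39000 <username>@<remote host>
  # -f: go to background after authentication; -N: do not run a remote command
  # to stop the tunnel later, find and kill the process, e.g.:
  pkill -f "ssh -f -N -L 39000"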

Documentation

About cryoSPARC: https://guide.cryosparc.com/

User Interface and Usage Guide: https://guide.cryosparc.com/processing-data/user-interface-and-usage-guide

All Job Types in cryoSPARC: https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc

Management and Monitoring: https://guide.cryosparc.com/setup-configuration-and-management/management-and-monitoring

Introductory Tutorial: https://guide.cryosparc.com/processing-data/cryo-em-data-processing-in-cryosparc-introductory-tutorial

Tutorials and Usage Guides: https://guide.cryosparc.com/processing-data/tutorials-and-case-studies

Cluster (Slurm) integration: https://guide.cryosparc.com/setup-configuration-and-management/how-to-download-install-and-configure/downloading-and-installing-cryosparc#connect-a-cluster-to-cryosparc

Installation

  • Version 3.3.1 of the master is installed on the master node (ss-cryo.gacrc.uga.edu). The source code is downloaded in /work/cryosparc/cryosparc_master on the master node.
  • Version 3.3.1 of the worker is installed on the two worker nodes (NVIDIA Tesla K40m GPU nodes rb6-[3-4]). The source code is downloaded in /work/cryosparc/cryosparc_worker on the master node (see the sketch below for the general installation pattern).
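
For reference, a cryoSPARC v3.x installation of this kind typically follows the pattern below. This is a sketch based on the generic steps in the cryoSPARC guide, not a record of the exact commands used on Sapelo2; $LICENSE_ID and the CUDA path are placeholders:

  # master install, run as gacrc-cryo on ss-cryo.gacrc.uga.edu
  cd /work/cryosparc/cryosparc_master
  ./install.sh --license $LICENSE_ID \
               --hostname ss-cryo.gacrc.uga.edu \
               --dbpath /work/cryosparc/database \
               --port 39000

  # worker install, run as gacrc-cryo on each GPU node (e.g. rb6-3)
  cd /work/cryosparc/cryosparc_worker
  ./install.sh --license $LICENSE_ID --cudapath /usr/local/cuda   # CUDA path is illustrative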

System

64-bit Linux