CryoSPARC-Sapelo2
Category
Engineering
Program On
Sapelo2
Version
3.3.1
Author / Distributor
See https://guide.cryosparc.com/
Description
"CryoSPARC (Cryo-EM Single Particle Ab-Initio Reconstruction and Classification) is a state of the art HPC software solution for complete processing of single-particle cryo-electron microscopy (cryo-EM) data. CryoSPARC is useful for solving cryo-EM structures of membrane proteins, viruses, complexes, flexible molecules, small particles, phase plate data and negative stain data." For more information, please see https://guide.cryosparc.com/.
NOTE: Users are required to be added to the GACRC cryosparc group before they can run this software. Please fill out the GACRC General Support form to request access. We will reach out to you after we receive your request.
Configurations on Sapelo2
Master node VM:
- Host name: ss-cryo.gacrc.uga.edu
- Intel Xeon processors (8 cores and 24GB of RAM)
- MongoDB is installed
Worker nodes:
- Two NVIDIA Tesla K40m GPU nodes, each with Intel Xeon processors (16 cores and 128GB of RAM) and 8 NVIDIA K40m GPU cards.
- cryoSPARC recommends using an SSD for caching particle data; /lscratch/gacrc-cryo was set up on the worker nodes for this purpose (see the sketch after this list).
- The amount of space that cryoSPARC can use in /lscratch/gacrc-cryo is capped at 100GB.
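How this cache path and cap are expressed in cryoSPARC is not shown on this page; the following is only an illustrative sketch, based on the cryoSPARC worker documentation, of how a worker's SSD cache path and quota are typically registered when a worker is connected to the master. The worker hostname and the exact quota value below are assumptions.

 # Illustration only: run on a worker node from the cryosparc_worker directory
 ./bin/cryosparcw connect \
     --worker rb6-3 \
     --master ss-cryo.gacrc.uga.edu \
     --ssdpath /lscratch/gacrc-cryo \
     --ssdquota 102400    # quota in MB, i.e. roughly the 100GB cap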
Running Program
Also refer to Running Jobs on Sapelo2.
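cryoSPARC is driven through a web interface served from the master node. As a minimal sketch only (it assumes the default cryoSPARC base port 39000 and access via an SSH tunnel through the Sapelo2 login node; check with GACRC for the actual port and the supported access method), the interface could be reached from a local workstation like this:

 # Illustration only: forward the web interface to your local machine, then
 # browse to http://localhost:39000 and log in with your cryoSPARC credentials
 ssh -N -L 39000:ss-cryo.gacrc.uga.edu:39000 MyID@sapelo2.gacrc.uga.edu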
Documentation
About cryoSPARC: https://guide.cryosparc.com/
User Interface and Usage Guide: https://guide.cryosparc.com/processing-data/user-interface-and-usage-guide
All Job Types in cryoSPARC: https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc
Management and Monitoring: https://guide.cryosparc.com/setup-configuration-and-management/management-and-monitoring
Introductory Tutorial: https://guide.cryosparc.com/processing-data/cryo-em-data-processing-in-cryosparc-introductory-tutorial
Tutorials and Case Studies: https://guide.cryosparc.com/processing-data/tutorials-and-case-studies
Cluster (Slurm) integration: https://guide.cryosparc.com/setup-configuration-and-management/how-to-download-install-and-configure/downloading-and-installing-cryosparc#connect-a-cluster-to-cryosparc
Installation
- Version 3.3.1 of the master package is installed on the master node (ss-cryo.gacrc.uga.edu). The source code is downloaded to /work/cryosparc/cryosparc_master on the master node.
- Version 3.3.1 of the worker package is installed on the two worker nodes (NVIDIA Tesla K40m GPU nodes rb6-[3-4]). The source code is downloaded to /work/cryosparc/cryosparc_worker on the master node (see the verification sketch after this list).
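As a hedged sketch of how such an installation can be verified (the commands are standard cryoSPARC tools; the paths simply reuse the install locations listed above), the master and a worker can be queried from their install directories:

 # On the master node: show the status of the cryoSPARC services and database
 /work/cryosparc/cryosparc_master/bin/cryosparcm status

 # On a worker node: list the GPUs that cryoSPARC can see
 /work/cryosparc/cryosparc_worker/bin/cryosparcw gpulist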
System
64-bit Linux