A Spark cluster to be used with Scala from the [[https://spark.apache.org/docs/latest/quick-start.html|interactive console]] can be spawned in a similar fashion, except we start an interactive Slurm job and use the wrapper script ''lsf-spark-shell.sh'' instead:
<code>
srun -p int -N 4 --ntasks-per-node=20 -t 01:00:00 lsf-spark-shell.sh
</code>
===== Running Hail =====
A Slurm job running the ''pyspark''-based console for Hail can then be submitted as follows:
<code>
srun -p int -N 4 --ntasks-per-node=20 -t 01:00:00 lsf-pyspark-hail.sh
</code>
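Before initializing Hail, it can be worth confirming that the Spark context is actually up. A minimal sanity check in the ''pyspark'' console might look like the following (the global context ''sc'' is provided by pyspark; the job parameters above determine how many executors take part):

<code>
>>> sc.parallelize(range(1000)).sum()   # distributes the summation across the executors
499500
</code>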
Once the console is running, initialize Hail with the global Spark context ''sc'' in the following way: