HPC

To facilitate faculty and student research, the College operates two computing clusters for different purposes:

  • A general-purpose computing cluster that can distribute many copies of an application across many computing cores, with the copies running either:
    • independently, each with different parameters, or
    • in coordination, passing messages among themselves (see the first sketch after this list)
  • A Hadoop cluster designed for big data that is distributed across many computing cores and processed with the MapReduce or Spark framework (see the second sketch after this list).
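
As a concrete illustration of the message-passing mode on the general-purpose cluster, the sketch below is a minimal Python example. It assumes the mpi4py library is available and that the job is launched with a command such as "mpirun -n 4 python sweep.py"; the parameter list and the squaring step are stand-ins for a real computation.

    # Minimal sketch: many copies of one program, coordinated by message passing.
    # Assumes mpi4py is installed and the script is started with, e.g.,
    #   mpirun -n 4 python sweep.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # index of this copy among all copies
    size = comm.Get_size()   # total number of copies launched

    # Each copy picks a different parameter based on its rank.
    parameters = [0.1 * (i + 1) for i in range(size)]
    local_result = parameters[rank] ** 2   # placeholder for the real work

    # Pass every copy's result back to copy 0 as messages and print them there.
    results = comm.gather(local_result, root=0)
    if rank == 0:
        print("Collected results:", results)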

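For the Hadoop cluster, the sketch below shows the classic word-count pattern written with the Spark Python API (PySpark). It assumes PySpark is installed, that the script is submitted with "spark-submit wordcount.py <input-path>", and that the input path is supplied by the user; it is not a real file on the cluster.

    # Minimal sketch: a MapReduce-style word count expressed in Spark.
    # Assumes submission via: spark-submit wordcount.py <input-path>
    import sys
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("WordCount").getOrCreate()
    sc = spark.sparkContext

    # The input file is split into partitions distributed across the cluster.
    lines = sc.textFile(sys.argv[1])

    # Map each line to (word, 1) pairs, then reduce by key to count each word.
    counts = (lines.flatMap(lambda line: line.split())
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))

    for word, count in counts.take(10):   # print a small sample of the result
        print(word, count)

    spark.stop()
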
Such distributed computing is related to parallel computing (which can take place within a single computer that has multiple CPUs or cores) and to grid computing (which describes a large number of mostly independent computers cooperating on a project).