Cluster Information

Revision as of 10:27, 14 May 2009

Summer of Fun (2009)

Implementations of area under the curve (a sketch of the serial/OpenMP version follows this list)

  • Serial
  • OpenMP (shared)
  • MPI (message passing)
  • MPI (hybrid message passing and shared memory)
  • OpenMP + MPI (hybrid)
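
A minimal sketch of the serial and OpenMP (shared) variants, assuming a midpoint Riemann sum; the integrand f(x) = x*x, the interval [0, 1], and the rectangle count are illustrative placeholders rather than values taken from the course materials.

  /* Riemann-sum "area under the curve" sketch: serial loop with an
   * optional OpenMP reduction.  f(), the interval, and the step count
   * are placeholders, not values from the course materials. */
  #include <stdio.h>

  static double f(double x) { return x * x; }   /* example integrand */

  int main(void)
  {
      const double a = 0.0, b = 1.0;            /* integration bounds */
      const long   n = 10000000;                /* number of rectangles */
      const double h = (b - a) / n;
      double sum = 0.0;

      /* Midpoint-rule sum; with the pragma (compiled -fopenmp) this is the
       * OpenMP (shared) variant, without it the same loop is the serial one. */
      #pragma omp parallel for reduction(+:sum)
      for (long i = 0; i < n; i++)
          sum += f(a + (i + 0.5) * h);

      printf("area ~= %.10f\n", sum * h);
      return 0;
  }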

GalaxSee Goals

  • Good piece of code, serves as teaching example for n-body problems in petascale.
  • Dials, knobs, etc. in place to easily control how work is distributed when running in parallel (see the hybrid sketch after this list).
  • Architecture generally supports hybrid model running on large-scale constellations.
  • Produces runtime data that enables nice comparisons across multiple resources (scaling, speedup, efficiency).
  • Render in BCCD, metaverse, and /dev/null environments.
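
A hypothetical sketch of the hybrid work-distribution idea, not GalaxSee source: each MPI rank takes a contiguous block of bodies and OpenMP threads split that block's force loop. The body count, the 1-D gravity kernel, and the softening constant are all illustrative assumptions.

  /* Hybrid MPI + OpenMP work-distribution sketch (not GalaxSee code). */
  #include <mpi.h>
  #include <math.h>
  #include <stdio.h>
  #include <stdlib.h>

  #define N    4096          /* total bodies (placeholder) */
  #define SOFT 1e-9          /* softening term to avoid divide-by-zero */

  int main(int argc, char **argv)
  {
      int rank, size;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);

      /* Every rank holds all positions/masses; forces are computed only
       * for this rank's block -- the "knob" that controls distribution. */
      double *x = malloc(N * sizeof *x), *m = malloc(N * sizeof *m);
      double *acc = calloc(N, sizeof *acc);
      for (int i = 0; i < N; i++) { x[i] = (double)i; m[i] = 1.0; }

      int chunk = N / size;
      int lo = rank * chunk;
      int hi = (rank == size - 1) ? N : lo + chunk;

      double t0 = MPI_Wtime();
      #pragma omp parallel for schedule(static)
      for (int i = lo; i < hi; i++)            /* MPI block, OpenMP split */
          for (int j = 0; j < N; j++) {
              double dx = x[j] - x[i];
              acc[i] += m[j] * dx / pow(dx * dx + SOFT, 1.5);  /* 1-D gravity */
          }
      double dt = MPI_Wtime() - t0;

      /* Per-rank timing is the raw data for scaling/speedup/efficiency plots. */
      printf("rank %d computed bodies %d..%d in %.3f s\n", rank, lo, hi - 1, dt);

      free(x); free(m); free(acc);
      MPI_Finalize();
      return 0;
  }

The per-rank wall-clock times printed above are the kind of runtime data the comparisons call for: speedup is S(p) = T(1) / T(p) and efficiency is E(p) = S(p) / p.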

GalaxSee - scale to petascale with MPI and OpenMP hybrid.

  • GalaxSee - render in-world and steer from in-world.
  • Area under a curve - serial, MPI, and OpenMP implementations (an MPI sketch follows this list).
  • OpenMPI - testing, performance.
  • Start May 11th
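
A minimal MPI (message passing) counterpart to the OpenMP sketch above, assuming a cyclic distribution of rectangles and an MPI_Reduce to combine partial sums; f(x) and the parameters are again placeholders.

  /* MPI variant of the area-under-the-curve sketch: each rank sums its
   * stride of rectangles, then MPI_Reduce combines the partial sums. */
  #include <mpi.h>
  #include <stdio.h>

  static double f(double x) { return x * x; }   /* example integrand */

  int main(int argc, char **argv)
  {
      const double a = 0.0, b = 1.0;
      const long   n = 10000000;
      const double h = (b - a) / n;
      int rank, size;
      double local = 0.0, total = 0.0;

      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);

      /* Cyclic (round-robin) distribution of rectangles across ranks. */
      for (long i = rank; i < n; i += size)
          local += f(a + (i + 0.5) * h);

      MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

      if (rank == 0)
          printf("area ~= %.10f (on %d ranks)\n", total * h, size);

      MPI_Finalize();
      return 0;
  }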

To Do

  • Subscribe to ccg@cs.earlham.edu
  • Work on two poster abstracts
  • Work on team essay

(Old) To Do

BCCD Liberation

  • v1.1 release - upgrade procedures

Curriculum Modules

  • POVRay
  • GROMACS
  • Energy and Weather
  • Dave's math modules
  • Standard format, templates, how-to for verification and validation (V&V)

LittleFe

Infrastructure

  • Masa's GROMACS interface on Cairo
  • gridgate configuration, Open Science Grid peering
  • hopper

SC Education

Current Projects

Past Projects

General Stuff

Items Particular to a Specific Cluster

Curriculum Modules

Possible Future Projects

Archive