Named in honor of American inventor Thomas Alva Edison, Berkeley Lab's newest supercomputer can execute nearly 2.4 quadrillion calculations per second at peak theoretical speeds. | Photo by Roy Kaltschmidt, Berkeley Lab.

Titan, Mira, Sequoia and Hopper. Superheroes? No, supercomputers.

This month, we'll be highlighting these supercomputers and the amazing science they make possible.

Supercomputers are used to model and simulate complex, dynamic systems with many data points that would be too expensive, impractical or impossible to demonstrate physically. Data scientists write code and algorithms that simulate each individual component of the model or process being explored. This means that scientists can simulate the evolution of our universe star by star and galaxy by galaxy, or even the heart's electrical system at the cellular level. Supercomputers also play a critical role in keeping our nuclear stockpile safe, secure and effective. Supercomputer simulations help scientists understand everything from weapon design to safety features and overall performance -- all without physical testing.

The Energy Department's National Labs have incredible computational resources, including some of the fastest supercomputers in the world. Several of these computers operate at the petascale, or in excess of one quadrillion floating point operations, or calculations, per second. The National Labs are even exploring the next frontier in high performance computing -- the exascale. Though none have been built just yet, exascale supercomputers will achieve at least one quintillion calculations per second.
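To put these scales in perspective, here is a back-of-the-envelope sketch in Python. It assumes perfect efficiency and a hypothetical workload size -- real scientific codes never achieve peak speed -- so treat the numbers as illustrations of the peta/exa prefixes, not benchmarks.

```python
# Rough illustration of the compute scales mentioned above.
# Petascale: ~1e15 floating point operations per second (one quadrillion).
# Exascale:  ~1e18 floating point operations per second (one quintillion).

PETA = 10**15
EXA = 10**18

# Edison's quoted theoretical peak: nearly 2.4 quadrillion calculations/second.
edison_peak = 2.4 * PETA

# Hypothetical workload (an assumption for illustration): 1e18 total operations.
workload = 10**18

def seconds_to_finish(total_ops, flops):
    """Idealized runtime, assuming the machine sustains its peak rate."""
    return total_ops / flops

print(seconds_to_finish(workload, edison_peak))  # roughly 417 seconds at peak
print(seconds_to_finish(workload, EXA))          # 1 second on an exascale machine
```

The thousand-fold jump from petascale to exascale is why the exascale frontier matters: a simulation that ties up a petascale machine for days could, in principle, finish in minutes.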

Many of the supercomputers at the National Labs also serve as user facilities -- scientific facilities that are available for use by researchers from other government agencies, the private sector and academia.

Join us on Facebook, Twitter, Google+ and Instagram this month as we explore the science of supercomputing.