Navy DoD Supercomputing Resource Center's New Supercomputer Adds Over 17 PetaFLOPS of Computing Power to DoD's Capacity

The Department of Defense’s (DoD) High Performance Computing Modernization Program (HPCMP) recently completed a portion of its FY 2023 investment in supercomputing capabilities to support the DoD science and technology (S&T), test and evaluation (T&E), and procurement engineering communities. The acquisition consists of a supercomputing system with related hardware and software maintenance services. At 17.7 petaFLOPS, this system will replace three older supercomputers in the DoD HPCMP’s ecosystem and ensure its aggregate supercomputing capacity remains above 100 petaFLOPS, with the latest available technology. This system significantly improves the ability of programs to support the Department of Defense’s most demanding computational challenges and includes AMD’s next-generation acceleration technology in the form of 128 AMD MI300A Accelerated Processing Units (APUs).
The system will be installed at the Navy DoD Supercomputing Resource Center (Navy DSRC) facility operated by the Commander, Naval Meteorology and Oceanography Command (CNMOC) at Stennis Space Center, Mississippi, and will provide high performance computing capabilities for users across all DoD services and agencies. The system architecture is as follows:
An HPE Cray EX4000 system with 256,512 total processing cores, consisting of AMD EPYC Genoa processors, 128 AMD MI300A accelerated processing units (APUs), and 24 NVIDIA L40 GPGPUs, connected by a 200-gigabit-per-second Cray Slingshot-11 interconnect and backed by 20 PB of usable Cray ClusterStor E1000 Lustre storage, including 2 PB of NVMe-based solid-state storage, and 538 TiB of system memory.
The system is expected to enter production service in 2024. It will be named BLUEBACK after the United States Navy submarine USS Blueback (SS-581). It will join the existing Navy DSRC HPC systems NARWHAL, a 308,480-core HPE Cray EX supercomputer that is currently the largest unclassified supercomputer in the DoD, named in honor of USS Narwhal (SSN-671); and NAUTILUS, a 176,128-core Penguin TrueHPC supercomputer named in honor of USS Nautilus (SSN-571).
About the Department of Defense’s High Performance Computing Modernization Program (HPCMP)
The HPCMP provides the Department of Defense with supercomputing capabilities, high-speed network communications, and computational science expertise that enable DoD scientists and engineers to conduct a broad range of focused research and development, test and evaluation, and procurement engineering activities. This partnership puts advanced technology into the hands of U.S. forces faster, at lower cost, and with greater certainty of success. Today, the HPCMP provides a comprehensive and advanced computing environment for the DoD that includes unique software development and systems design expertise, powerful high-performance computing systems, and a world-class research network. The HPCMP is operated on behalf of the Department of Defense from the U.S. Army Engineer Research and Development Center in Vicksburg, Mississippi.
For more information, please visit our website at: https://www.hpc.mil.

About the Naval Meteorology and Oceanography Command
Naval Oceanography has approximately 2,900 globally distributed military and civilian personnel who collect, process, and exploit environmental information to assist fleet and joint commanders in all warfare areas, ensuring the freedom of action of the United States Navy in the physical battlespace, from the depths of the ocean to the stars.


Date taken: 20.07.2023
Publication date: 07/20/2023 15:21
Story ID: 449663
Location: STENNIS SPACE CENTER, MS, USA


PUBLIC DOMAIN
Image Source: www.dvidshub.net