A Golden Era of HPC in Government Meets Accelerating Demands
When Seymour Cray built what is generally considered the first supercomputer in 1964, it performed at 1 megaflop (a million floating-point operations per second). Today, the Frontier supercomputer at Oak Ridge National Laboratory in Tennessee performs one quintillion (a billion billion) operations per second, a trillion-fold increase. Meanwhile, the New York Times recently reported that Chinese researchers had broken the exascale computing barrier, completing in a mere 304 seconds a calculation that would have taken Oak Ridge National Laboratory's former machines 10,000 years.
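For scale, here is a minimal back-of-the-envelope sketch in Python. The figures come from the paragraph above; the 365.25-day year is an assumption.

```python
# Back-of-the-envelope arithmetic for the figures above; illustrative only.

megaflop = 1e6   # Cray's 1964 machine: ~10^6 floating-point ops per second
exaflop = 1e18   # Frontier today: ~10^18 ops per second (one quintillion)

print(f"1964 to today: {exaflop / megaflop:.0e}x faster")  # 1e+12x

# The reported Chinese result: 304 seconds for a calculation estimated
# at 10,000 years on earlier machines (365.25-day year assumed).
seconds_per_year = 365.25 * 24 * 3600
print(f"Reported speedup: {10_000 * seconds_per_year / 304:.1e}x")  # ~1.0e+09x
```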
High performance computing (HPC) is not only getting faster and more capable of massive, complex calculations; it has also become far more accessible and affordable to businesses of all sizes and kinds. Governments around the world, however, are the most aggressive users of HPC, viewing it as a vital tool for advancing R&D and, with it, a nation's competitiveness on the world stage.
HPC Comes of Age
The global HPC market is hot, generating $42 billion in 2021 and projected to reach $71.7 billion by 2030. Within the U.S. federal government, HPC is being used to accelerate basic science, develop therapeutics and other treatments for COVID-19, run military simulations, model climate and weather, and tackle a myriad of other tasks across diverse agencies.
HPC solutions were once built, purchased or leased, and then delivered to and managed in the customer's own data centers. Capacity and demand were hard to predict, so these systems could sit idle at some times and be oversubscribed at others. Now, using cloud resources in conjunction with on-premises HPC infrastructure lets organizations run HPC workloads as needed, on a pay-as-you-go basis, with demonstrable cost savings.
Another factor drawing more interest and investment in HPC is the availability of smaller, affordable machines. In addition to leadership-class supercomputers costing over $500,000, there are divisional machines priced between $250,000 and $500,000 and departmental machines costing as little as $100,000. Specialized servers and processors are available to fit different uses and budgets.
A recent research study calculated that each dollar invested in HPC in a business environment led to $507 in sales revenue and $47 in cost savings. Beyond those quantitative ROI metrics, HPC research was also shown to save lives, foster important public/private partnerships, and spur innovation.
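As a hedged illustration of what those ratios imply, the sketch below applies the study's per-dollar figures to a hypothetical departmental-class purchase; the $100,000 price point comes from the tiers above, and linear scaling is an assumption, not a finding of the study.

```python
# Illustrative only: applies the cited study's per-dollar returns and
# assumes (hypothetically) that they scale linearly with investment size.

REVENUE_PER_DOLLAR = 507  # sales revenue per $1 of HPC investment (from the study)
SAVINGS_PER_DOLLAR = 47   # cost savings per $1 of HPC investment (from the study)

investment = 100_000      # hypothetical departmental-class machine

print(f"Projected revenue:      ${investment * REVENUE_PER_DOLLAR:,}")  # $50,700,000
print(f"Projected cost savings: ${investment * SAVINGS_PER_DOLLAR:,}")  # $4,700,000
```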
HPC Growth in U.S. Government
For years, U.S. federal government agencies like the Department of Defense (DOD), Department of Energy (DOE), and National Science Foundation (NSF) have been conducting R&D on HPC hardware and software and training people to use the machines and create novel applications. While the DOD keeps the details of its supercomputer usage classified to protect national security, the DOE has become a global leader in developing HPC solutions for genomics, advanced and sustainable energy, large-scale scientific instrumentation, and quantum information science. The NSF supports HPC infrastructure at academic research facilities across the U.S., with supercomputer centers in California, Pennsylvania, Illinois, Tennessee, and Texas.
Other organizations intersecting with HPC include the Intelligence Advanced Research Projects Activity (IARPA) and the National Institute of Standards and Technology (NIST), which support scientific and engineering research using artificial intelligence, quantum computing, and synthetic biology. The Department of Homeland Security (DHS), Federal Bureau of Investigation (FBI), National Aeronautics and Space Administration (NASA), National Institutes of Health (NIH), and National Oceanic and Atmospheric Administration (NOAA) all use HPC and are active in defining new requirements for HPC solutions across a broad array of applications.
New Applications, New Architectures
Real-time big data analytics, deep learning, and modeling and simulation are newer uses of HPC that governments are embracing for a variety of applications. Big data analytics is being used to uncover crime, while deep learning and machine learning can detect cyber threats faster and more efficiently.
Modeling and simulation on supercomputers, fed by vast amounts of data, are producing high-fidelity simulations that let scientists test theories and validate experiments. From events in the solar system to the mechanisms within a proton to blood flow throughout the body, scientific simulations are varied, offer tremendous promise, involve enormous numbers of data points, and require substantial processing power. In the U.S., HPC allows the DOE to model and simulate the impacts of a warming climate on the ice sheets of Antarctica and Greenland, and to model the movement and impact of water, gases, and storm systems on sea level and ocean temperature. The results of this research will inform important community planning and public safety decisions.
Whether government agencies rely on HPC environments that are on-premises, in the cloud, or both, it's vital that they develop an ecosystem of solution and services partners to help with the challenges ahead. With the state of the art in HPC hardware, software, AI, and other innovations evolving rapidly, a lack of preparedness and action could impact a country's future viability in the global marketplace, on a warming planet, or on the battlefield.
For more on HPC in government, read “To Out-compute is to Out-compete: Competitive Threats and Opportunities Relative to U.S. Government HPC Leadership.”
***
Intel® Technologies Move Analytics Forward
Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.
Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.