A large number of companies, including IBM, Microsoft, and Google, along with universities and national laboratories, have teamed up to form the COVID-19 High Performance Computing (HPC) Consortium. The new partnership is designed to provide supercomputing resources to scientists working out how to fight COVID-19, the disease caused by the novel coronavirus.
Faced with a rapidly spreading disease, scientists can run thousands of models on supercomputers to better understand the epidemic, characterize the virus, and devise possible vaccines and drug treatments. The organizers of the new consortium will make 16 supercomputing systems available to researchers, as well as a community in which to collaborate.
“The advantage of having the consortium is to accelerate the scientific discovery that needs to happen to develop a vaccine, understand the virus, and eventually kill it,” Michael Rosenfeld, vice president of data centric solutions at IBM, told Gizmodo. He said that powerful supercomputers can do in minutes or hours what ordinary computers would need days, months, or years to accomplish.
The consortium currently includes corporate supercomputers from IBM, Amazon, Google, and Microsoft; universities including the Massachusetts Institute of Technology and Rensselaer Polytechnic Institute; and Department of Energy national laboratories including Lawrence Livermore, Oak Ridge, and Los Alamos, as well as NASA and the National Science Foundation. The consortium is encouraging COVID-19 researchers to submit proposals through a central portal, where a steering committee will assess them and match researchers with the right supercomputers.
Supercomputing centers have always set aside discretionary computing time for emergencies, such as hurricane response, said Kelly Gaither, director of health analytics at the Texas Advanced Computing Center. She told Gizmodo that it made sense to devote that time to fighting the coronavirus.
As for what scientists are actually doing with supercomputers during this pandemic, many are trying to determine the structure of the virus and its spike protein, and how it differs from other coronaviruses, such as the virus behind SARS. Supercomputers have already shown their value on this front: for example, the Summit supercomputer at Oak Ridge National Laboratory allowed researchers to narrow a list of 8,000 potential virus-fighting molecules down to just 77. Others are using the computers to run simulations of how the pandemic could play out, when the peak will occur, how long it will last depending on what measures are in place, and which locations will need the most supplies.
Researchers have already submitted proposals to investigate the virus with U.S. supercomputing resources. The National Science Foundation (NSF) issued a call for proposals related to COVID-19 earlier this month and has already funded 10 rapid-response grants totaling $1,592,789, an NSF spokesperson told Gizmodo.
Rosenfeld told Gizmodo that the consortium offers researchers an opportunity to work together in ways they might not have before, such as helping each other get their code running on the processors faster. Gaither said it encourages scientists from various specialties to tackle problems in new ways and to think creatively about how to incorporate supercomputers into their research.
While it is impossible to predict how long this pandemic will last, we can hope that new scientific progress will help us beat it faster and strengthen our defenses against future pandemics.