In its pursuit of a data-centric approach to system design, IBM has collaborated with the University of Michigan to develop and deliver “data-centric” supercomputing systems that accelerate the pace of scientific discovery in fields as diverse as aircraft and rocket engine design, cardiovascular disease treatment, materials physics, climate modeling and cosmology.
The system is designed to let high performance computing applications in physics interact, in real time, with big data, improving scientists’ ability to make quantitative predictions. IBM’s systems use a GPU-accelerated, data-centric approach that integrates massive datasets seamlessly with high performance computing power, yielding new predictive simulation techniques that promise to expand the limits of scientific knowledge.
The collaboration was announced in San Jose at the second annual OpenPOWER Summit 2016. Several other Foundation members contributed to the development of this new high performance computing system, which has the potential to reduce computing costs by accelerating statistical inference and machine learning.
ConFlux – The game changer?
The joint effort of IBM and U-M researchers has produced a computing resource called ConFlux, designed to enable high performance computing clusters to communicate directly, and at interactive speeds, with data-intensive operations. Hosted at U-M, the project establishes a hardware and software ecosystem for large-scale data-driven modeling of complex physical problems, such as the performance of an aircraft engine, which involves trillions of molecular interactions.
“There is a pressing need for data-driven predictive modeling to help re-envision traditional computing models in our pursuit to bring forth groundbreaking research,” said Karthik Duraisamy, Assistant Professor in the U-M Department of Aerospace Engineering and director of U-M’s Center for Data-driven Computational Physics.
“The recent acceleration in computational power and measurement resolution has made possible the availability of extreme scale simulations and data sets. ConFlux allows us to bring together large scale scientific computing and machine learning for the first time to accomplish research that was previously impossible.”
ConFlux is funded by a grant from the National Science Foundation that aims to advance predictive modeling in several fields of computational science, with IBM providing the servers and software solutions.
“Scientific research is now at the crossroads of big data and high performance computing,” said Sumit Gupta, Vice President, high performance computing and data analytics, IBM.
“The explosion of data requires systems and infrastructures based on POWER8 plus accelerators that can both stream and manage the data and quickly synthesize and make sense of data to enable faster insights.”
U-M researchers understand the significance of IBM’s shift to data-centric systems, said Michael J. Henesey, Vice President, business development, data centric systems and innovation centers at IBM.
“They were enthusiastic about the application of this architecture to problems that are essential to the university and to the country,” Henesey said. “We will stay close to U-M to help inform our future system designs.”
As one of the first projects U-M will undertake with its advanced supercomputing system, researchers are working with NASA to use cognitive techniques to simulate turbulence around aircraft and rocket engines. They’re combining large amounts of data from wind tunnel experiments and simulations to build computing models that are used to predict the aerodynamics around new configurations of an aircraft wing or engine.
With ConFlux, U-M can more accurately model and study turbulence, helping to speed development of more efficient airplane designs. It will also improve weather forecasting, climate science and other fields that involve the flow of liquids or gases.
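The data-driven modeling workflow described above can be sketched in a few lines: pool measurements from different sources (here, hypothetical wind-tunnel and simulation samples with made-up numbers), fit a simple regression model, and use it to predict a quantity of interest for a configuration that appears in neither data set. This is a minimal illustration of the general idea, not ConFlux’s actual pipeline; the feature names and values are assumptions.

```python
import numpy as np

# Illustrative synthetic data (not real measurements).
# Each row: [angle_of_attack_deg, mach_number]; target: drag coefficient.
X_tunnel = np.array([[2.0, 0.3], [4.0, 0.3], [6.0, 0.5]])
y_tunnel = np.array([0.021, 0.028, 0.041])
X_sim = np.array([[3.0, 0.4], [5.0, 0.6], [7.0, 0.7]])
y_sim = np.array([0.025, 0.038, 0.052])

# Pool both data sources, as the article describes.
X = np.vstack([X_tunnel, X_sim])
y = np.concatenate([y_tunnel, y_sim])

# Fit a linear model y ~ a*alpha + b*mach + c via least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict drag for a new configuration not present in either data set.
new_config = np.array([4.5, 0.45, 1.0])
pred = new_config @ coef
print(f"predicted drag coefficient: {pred:.4f}")
```

In practice the models are far richer (machine-learning corrections to turbulence closures rather than a three-parameter fit), but the pattern of merging experimental and simulated data into one predictive model is the same.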
Progress in a wide spectrum of fields ranging from medicine to transportation relies critically on the ability to gather, store, search and analyze big data and to construct truly predictive models of complex, multi-scale systems. Effective data-centric design would let computer scientists engineer systems optimized for specific jobs, improving the speed and quality of insights.