Office of the Vice President for Research


Breakthrough Magazine

Q&A with Jason Hattrick-Simpers, College of Engineering & Computing

The stuff that makes modern life so comfortable — coffeemakers, smartphones, cars, airplanes and all the other devices that make the technological world go around — depends on materials, the building blocks of all that stuff. To speed the pace of research progress in developing new and better materials, the U.S. started the Materials Genome Initiative, and two of the national leaders in the effort are here at the University of South Carolina: Jochen Lauterbach and Jason Hattrick-Simpers of the College of Engineering and Computing. We asked Hattrick-Simpers a few questions about the MGI and the progress that’s been made so far.

When did the MGI get underway?

The MGI has its roots in the so-called Integrated Computational Materials Engineering movement, which really started gaining traction in the last few years of the 20th century. Scientists suddenly had access to powerful personal computers, and reliable modeling tools were being developed that bridged atomic-scale and macro-scale properties. So for the first time, scientists were able to model something like a metal blade in a jet turbine, obtain temperature and stress profiles, and use that information to create materials-design criteria. The actual MGI was formally announced by the White House in 2011.

What is the goal of the MGI?

The plan is to reduce the amount of time required to develop and deploy new materials in applications, both old and emerging. The stated goal is a two-fold reduction: from the current 10 to 20 years down to 5 to 10 years.

How is the MGI meant to achieve that?

The goal is to create an integrated research workflow that brings together computational, experimental and data science tools. The basic premise is to come up with a new idea or hypothesis, then use modern high-throughput experimental instruments to test that idea. The large data sets produced from these experiments are then combined with existing theoretical and experimental data sets and probed with data science tools to identify so-called "hidden" correlations. These correlations can then be investigated using computational models to uncover the new "rules" for material selection that guide the next round of experiments.
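For readers who like to see the idea in concrete form, the iterative loop described above can be sketched in a few lines of code. This is a toy illustration only: the "compositions," the scoring function standing in for a high-throughput experiment, and the simple pick-the-best analysis step are all invented placeholders, not anything from the actual MGI projects.

```python
# Toy sketch of an MGI-style screening loop: propose candidates, "measure"
# them, pool the data, and let the analysis guide the next round.
# Everything here is a synthetic placeholder, not real materials data.
import random

random.seed(0)


def measure(composition):
    """Stand-in for a high-throughput experiment: a noisy performance score.

    The 'true' optimum is placed arbitrarily at composition = 0.7.
    """
    return -abs(composition - 0.7) + random.gauss(0, 0.02)


def screening_loop(rounds=5, batch=8):
    data = []  # pooled (composition, score) pairs across all rounds
    for _ in range(rounds):
        # 1. Hypothesis step: propose a batch of candidate "compositions"
        #    (here just numbers in [0, 1]).
        candidates = [random.random() for _ in range(batch)]
        # 2. High-throughput experiment: score every candidate.
        data.extend((c, measure(c)) for c in candidates)
        # 3. Data analysis: in a real workflow, a model fit to `data`
        #    would pick the next round's candidates; here we simply
        #    keep pooling measurements.
    # Report the best-performing composition found.
    return max(data, key=lambda pair: pair[1])


best_comp, best_score = screening_loop()
print(f"best composition found: {best_comp:.2f}")
```

In a real MGI workflow, step 3 would be far richer, with machine-learning models mining the pooled experimental and theoretical data for hidden correlations, but the propose-measure-analyze cycle is the same.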

The name is similar to the Human Genome Project. How are they related?

The overlap between the two names is intentional. A genome is a set of information encoded in the language of DNA, which is the blueprint for an organism's growth and development. Because that genome was mapped, in an effort beginning in the latter part of the 20th century, revolutionary new treatments for all manner of human diseases, among other applications, are now being developed. We know that materials also have genomes that can be associated with their final functionality, and that if we can successfully map them, then true "materials by design" could be realized. The benefits for existing technologies, as well as next-generation technologies, will be incredible if we are successful.

What kind of progress have you made so far?

A DARPA project on which I collaborated with Professor Lauterbach was very successful because of our combined expertise in using high-throughput experimental techniques and combing through large data sets to find optimal materials. In another project, we recently demonstrated the creation of high-temperature coatings for nuclear cladding materials, which could have far-reaching implications for the safety of nuclear reactors. During a nuclear meltdown, high-temperature steam reacts with the zircaloy cladding to produce zirconia and hydrogen gas, which can explode when exposed to the atmosphere. Such explosions were primary causes of the uncontrolled release of radioactive byproducts during the Fukushima Daiichi and Chernobyl nuclear disasters. New materials that can slow the rate of this reaction, thus reducing the quantity of hydrogen produced, are an area of urgent need. We used a combination of computationally guided materials synthesis, high-throughput experimentation and large-data analysis tools to identify novel iron-based coatings that address this need.