TAMEST annual conference showcases state of computational science

The Computational Revolution was the theme of the 2014 annual conference of The Academy of Medicine, Engineering & Science of Texas

Mapping the intricate neuronal networks of the human brain; finding and fixing flaws in a product before a single prototype is built; discovering brand-new materials with never-before-seen properties through systematic, virtual experiments.

This is just a sample of the computational science that was showcased earlier this month at the 11th annual conference of The Academy of Medicine, Engineering & Science of Texas (TAMEST) at the Hyatt Regency Lost Pines resort in Bastrop, Texas.

The two-day conference, titled “The Computational Revolution in Medicine, Engineering & Science,” examined computational science’s current state and future potential through a diverse array of talks presented by leaders in research and industry alike.

“In planning this meeting we tried to look for a theme that was natural to each of the three components of TAMEST,” said Tinsley J. Oden, conference program chair and director of ICES, during his introductory remarks. “[Computational science] has transformed forever the way scientific discoveries are made and how engineering and medicine are done.”

Eleven speakers offered a look into the state of the field. Read on for an overview of all the talks.

Modeling molecular and clinical biology

Many of the TAMEST talks focused on computational science as a tool to explore the form and function of life at the molecular level.

Bruce A. Beutler, director of the Center for the Genetics of Host Defense at The University of Texas Southwestern Medical Center and a Nobel laureate, described how he is uncovering genes that affect the immune system by inducing carefully tracked mutations in the mouse genome, a technique that can make the immune system falter or fail.

“This approach is akin to reverse engineering, and one applies it to a living system and one learns what is important by making the system fail over and over and over again in different ways,” said Beutler, who has thus far mutated 81 percent of coding genes in the black mouse.

Reverse engineering through induced mutation is a method that dates to the early days of genetics. However, the increased computational power of gene-sequencing machines, statistical analysis programs and associated mapping technology allows Beutler to identify the genetic underpinnings of phenotypes almost as soon as they are observed.

“These days when a phenotype is seen, immediately we know the cause, within the same minute or two,” said Beutler. In his method, the genetic mutations present in mice with similar phenotypes are cross-referenced against the mutation source (the mutated sperm of the paternal mouse), and the shared mutations are analyzed for statistically significant correlations between phenotype and genotype.
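The cross-referencing idea can be illustrated with a toy sketch. This is a hypothetical simplification, not Beutler's actual pipeline: the gene names, the data layout and the naive association score are all invented for illustration; a real analysis would use proper statistical tests on sequencing data.

```python
# Hypothetical sketch (not Beutler's actual software): mutations carried by
# phenotype-positive mice are compared against the full set of mutations
# known from the paternal sperm, and candidates are ranked by how strongly
# they track the phenotype.

def rank_candidate_mutations(mice, paternal_mutations):
    """mice: list of (mutation_set, has_phenotype) pairs."""
    scores = {}
    for mutation in paternal_mutations:
        with_mut = [has_p for muts, has_p in mice if mutation in muts]
        without_mut = [has_p for muts, has_p in mice if mutation not in muts]
        # Naive association score: phenotype rate among carriers minus the
        # rate among non-carriers (a stand-in for a real statistical test).
        rate_with = sum(with_mut) / len(with_mut) if with_mut else 0.0
        rate_without = sum(without_mut) / len(without_mut) if without_mut else 0.0
        scores[mutation] = rate_with - rate_without
    return sorted(scores, key=scores.get, reverse=True)

# Invented example: two mice show the phenotype, two do not.
mice = [
    ({"geneA", "geneB"}, True),
    ({"geneA"}, True),
    ({"geneB", "geneC"}, False),
    ({"geneC"}, False),
]
ranking = rank_candidate_mutations(mice, {"geneA", "geneB", "geneC"})
print(ranking[0])  # geneA: carried by every affected mouse, no unaffected one
```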

Some of the new gene associations discovered implicate genes in the activation and dampening of the immune system, knowledge that could help in the development of more effective vaccines and treatments for autoimmune diseases, said Beutler.

But while Beutler is seeking to understand biology by breaking it down, Jose N. Onuchic, co-director of the Center for Theoretical Biological Physics at Rice University, spoke about using computational simulations to understand how proteins build themselves from strings of polypeptides into “molecular machines” like myosin, the protein responsible for muscle contraction, through a complex, and often disordered, folding process.

“You start from an unfolded protein and they come down toward your native state,” said Onuchic. “These motors...they’re not rigid machines, they fold and they unfold, as they evolve, so the entire methodology of folding should be helpful to understand [how they work].”

Computational science can produce fully computer-generated simulations, but it can also be applied in tandem with electron microscope images to generate detailed 3-D models of microscopic agents at nearly atomic resolution.

Wah Chiu, director of the National Center for Macromolecular Imaging at Baylor College of Medicine, described in his talk how he’s using these techniques to visualize bacteria-infecting viruses, called bacteriophages, before and after they claim a host.

One of Chiu’s successes includes modeling the “biological buckyball” shape of a variety of bacteriophages. The structure, made up of 60 tile-like units, was modeled by computationally compiling 20,000 images of viruses in various orientations into a single 3-D model. In addition, he has used similar techniques to investigate the individual structure of the units, elucidating the amino acids that make up the 14 proteins that form each structure.
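Why does compiling thousands of images sharpen a model? A toy sketch of the underlying principle (my own illustration, not Chiu's reconstruction software, which also has to solve the much harder problem of aligning views in 3-D): averaging many independently noisy views shrinks the random noise while the true signal survives.

```python
# Toy illustration of signal averaging: the random error of an average of N
# noisy views falls roughly as 1/sqrt(N), which is one reason combining
# 20,000 micrographs can resolve detail a single image cannot.
import random

random.seed(0)
true_signal = [0.0, 1.0, 0.0, 1.0, 1.0, 0.0]  # stand-in for one projection

def noisy_view(signal, sigma=0.5):
    return [v + random.gauss(0.0, sigma) for v in signal]

def average_views(views):
    n = len(views)
    return [sum(col) / n for col in zip(*views)]

def rms_error(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

one_view = noisy_view(true_signal)
many = average_views([noisy_view(true_signal) for _ in range(2000)])

# The averaged reconstruction sits far closer to the truth than any
# single noisy view does.
print(rms_error(one_view, true_signal) > rms_error(many, true_signal))
```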

Although they don’t infect humans, bacteriophages’ sheer abundance, effects on bacterial communities, and promise as a biomolecular tool for gene transfer make the viruses a “fantastic system that molecular biology can manipulate and should understand,” said Chiu.

Natalia Trayanova, the Murray B. Sachs Endowed Chair at the Institute for Computational Medicine at Johns Hopkins University, and a keynote speaker, took biology into the clinical realm when discussing her heart models, which incorporate the heart’s electrical and mechanical behavior all the way from the molecular level to that of the whole organ.

“In a way we view our heart models as like a ‘Google heart,’” said Trayanova. “You can look at the behavior of the whole heart but you can zoom in and look at the behavior at the molecular level, and the infrastructure is such that new information as it arrives for the dynamics of molecular processes or signaling can be incorporated within the whole infrastructure.”

Beyond being a general model, it serves as a template that can be adjusted with patient-specific details recovered during an MRI scan. Trayanova said this was recently done using data from 13 patients with heart arrhythmia due to structural disease who underwent ablation therapy, a technique that burns away electrically errant tissue.

“We had our prediction and we compared it to what was done in the clinic,” said Trayanova. “And our predictions were always within what was successfully ablated, but not only that, they were always much smaller.”

Trayanova said that she is in the process of using her models to predict ablation sites before surgeries as a guide for physicians. It’s a guide that’s sorely needed. Ablation therapy for arrhythmia caused by structural disease has about a 51 percent success rate, said Trayanova.

Modeling the familiar

Many of the TAMEST speakers showed how computational science is enabling science-fiction-like feats. But keynote speaker Thomas J. Lange, Procter & Gamble's director for Modeling & Simulation in corporate R&D, showed that it’s a technology essential to engineering and testing the familiar products people use each day, from dish soap to diapers.

“Everybody gets that you need science and engineering for jet engines, farm machines, military and defense. But really, seriously, for toilet paper? For diapers?” said Lange.

It’s the contradictions many consumer products possess and the scale at which they’re produced that make computational science a must, said Lange. Creating detergent that removes dirt but not dyes, or producing billions of diapers in a matter of days with consistent quality, is no easy order. So researchers at Procter & Gamble use computational simulation to virtually test product components, manufacturing and packaging.

As examples, Lange showed simulations that ranged from chemical reactions between soap and dirt, to the kinematics of different jar opening methods, to the stability of different bottle shapes rushing down a product line, to a finite element model of a razor, which was included in a commercial for the product.

In a similar vein, Thomas Halsey, chief computational scientist at ExxonMobil Upstream Research Company, showcased how another familiar industry, oil and gas, uses computational tools to model oil fields and predict drilling pathways and fluid flow through them. The tools are so vital, Halsey said, that computational science is becoming just as important as the more traditional geophysics and engineering disciplines.

“What we have seen in the last decade to half decade is the emergence of computational science as a third major discipline in our industry,” said Halsey. “We’re facing challenges that we simply could not have addressed with the technology of 10 or 20 years ago.”

Halsey, like many TAMEST speakers, credited the advancement of high-performance computing (HPC) systems, or supercomputers, as a major enabler of increasingly complex computational methods. Today’s fastest computers operate at the petascale, performing one quadrillion operations per second.

Dan Stanzione, deputy director of UT's Texas Advanced Computing Center (TACC), focused on HPC technology during his talk, detailing the explosion of data and giving overviews of TACC’s two (soon to be three) HPC systems. But one of the main points of Stanzione’s talk was that for HPC growth to be meaningful, software needs to be optimized to run on the complex, distributed hardware of modern parallel machines.

“High level languages are the antithesis of performance today,” said Stanzione, describing two instances in which rewriting genomics research code increased processing speed by factors of 2,600 and 3 million, with no change in hardware needed.

Modeling Big

More processing power means it’s feasible for researchers to investigate bigger problems. “Big” can mean small in physical size but heavy in data, like the virus, gene and protein research of the talks above. But it can also mean phenomena large in size and scope. These ‘big picture’ topics were well represented at the conference.

For example, Sharon C. Glotzer, a University of Michigan professor of chemical engineering, materials science & engineering, macromolecular science and engineering, and physics, spoke about the “nearly infinite design space” of computationally aided materials engineering.

“…It is through the use of computing that we are driving ourselves into this new age where not only our civilization will continue to be defined by new materials, but for the first time we can start to imagine a future where we can have materials that do whatever we want, in whatever way we want, for any purpose,” said Glotzer, describing self-assembling materials with the ability to self-camouflage, change shape, serve as biological implants, or enhance energy efficiency.

Computational science is the method to make such materials a reality, said Glotzer, because it enables essentially boundless digital experimentation and manipulation.

“Computer simulation, with the right models and with enough computer power, is in the position now to rapidly go through all of these different possibilities. We can take our codes and tune upon different dimensions independently in a way that experiment cannot.”

Glotzer and her team are developing simulation codes to help enable such experiments, and also to reverse-engineer materials with uncertain developmental steps.

ICES’ Clint Dawson and Omar Ghattas also took on the big picture in their TAMEST talks, describing how they are using computational science to predict hurricane storm surge and ice sheet flow, respectively.

Dawson, director of the ICES Computational Hydraulics Group, focused on how historical and real-time data is used to model the long and short waves created by hurricanes, which can cause storm surge flooding that creates problems long after the hurricane is gone. He spoke about how his simulations are applied widely outside the lab: they are used by Texas' Division of Emergency Management to help develop hurricane evacuation routes, and were even applied to forecast the drift of surface oil after the Deepwater Horizon oil spill of 2010, a forecast that overlapped the observed slick by about 60 percent.

Ghattas, director of the ICES Center for Computational Geosciences, on the other hand, used his Antarctic ice sheet research, which is built around inferring causes for ice flow from observed effects, to highlight a question that was raised by many attendees and lecturers: how do you quantify uncertainty in simulations?

The approach Ghattas offered in his talk was Bayesian inference, a computationally enabled technique that, rather than taking the “simplest” likely solution, a la Occam’s razor, evaluates the probability of many different scenarios.

“The Bayesian approach says rather than exclude a large number of possible models that are consistent with the data and picking one that is smoothest, let’s try to statistically characterize all possible models that are consistent with the data and attach a probability to them. So a solution to the inverse problem in the Bayesian context is a probability distribution,” said Ghattas.

“If we’re going to decide how to mitigate sea level rise, we better have models that are capable of not only giving some number for sea level rise, but equipping that number with the degree of confidence we have in that prediction.”
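The Bayesian recipe Ghattas described can be sketched in a few lines. This is an illustrative toy with invented numbers, not his ice sheet code: the candidate models, the observation and the noise level are all assumptions for demonstration.

```python
# Minimal sketch of Bayesian inference over a handful of candidate models:
# instead of picking one "best" model, attach a posterior probability to
# every model consistent with the data.
import math

candidate_models = [0.5, 1.0, 1.5, 2.0]  # hypothetical model parameter values
observation = 1.2                         # hypothetical noisy measurement
noise_sigma = 0.4

def likelihood(model, obs):
    # Gaussian measurement noise: how probable is the data given this model?
    return math.exp(-((obs - model) ** 2) / (2 * noise_sigma ** 2))

prior = 1.0 / len(candidate_models)       # uniform prior over the candidates
unnormalized = [prior * likelihood(m, observation) for m in candidate_models]
evidence = sum(unnormalized)              # normalizing constant
posterior = [w / evidence for w in unnormalized]

# The "solution" is the whole distribution, not a single model.
for m, p in zip(candidate_models, posterior):
    print(f"model {m}: posterior probability {p:.2f}")
```

The key design point is the last one: the output is a probability distribution over models, which is exactly what lets a prediction like sea level rise carry a stated degree of confidence.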

The final keynote speaker, Henry Markram, director of the Blue Brain Project and the Human Brain Project, focused his talk on his work mapping a space with likely infinite potential: the human brain.

“What’s the causal chain of events all the way from the genes, to the proteins, to the cells, to the synapses, to the circuits, to the brain regions, to the whole brain?” said Markram. “That’s what we want to try and construct, and that’s not possible with biology.”

Instead, Markram proposes using computers to predict biology, filling in the missing gaps and conducting experiments that could “literally take centuries to do.” The foundation of the brain simulation is built on established knowledge of the 55 cell types of the neocortex, their morphology, the statistics of their branching, and their volumes and densities within the brain. Markram is finding new information about brain behavior by integrating information from published literature into the model, as well as using the model itself to investigate questions.

Already, results are promising. For example, Markram and his team used the model to successfully predict the location of all the brain’s synapses. Predicting how many of those were inhibitory synapses took the team five minutes. A similar prediction carried out by a research team using electromagnetic scanning techniques took three years.

“It’s the most practical way to develop technology, rather than to reinvent 4 million years of human nature,” said Markram.

Written by Monica Kortsha

Posted: Jan. 27, 2014