Ramping Up to Multiscale: Taking Biomedical Modeling to a New Level

Multi-scale modeling is now at what might be called its gestational stage

For centuries, mathematics has been an indispensable ally of the physical sciences and engineering. Planes fly and telephones work because engineers know how to simplify physical systems into convenient mathematical models. But biologists and mathematicians have had a harder time communicating. As the old joke says, when you ask a mathematician to explain why a cow isn’t producing milk, he’ll probably begin, “Consider a spherical cow…”


Figure: Computer models of the heart incorporate detailed experimental information, both at the level of individual cells and at the level of anatomy. Here, a model developed by Peter Hunter’s team at the University of Auckland portrays the changing orientation of the heart’s muscle fibers from the outside to the inside of the heart wall. The spiraling of the fibers is believed to affect the flow of electric signals through the heart. Courtesy of Peter Hunter, PhD, Bioengineering Institute, The University of Auckland, New Zealand.

However, attitudes are changing in both disciplines. With the advent of computational biology, some biologists are shifting toward more quantitative models. And today’s vast computing power means that mathematicians no longer have to simplify as much as they used to. The days of the “spherical cow” are over. Bioengineers can program an anatomically correct cow (or human) into their computers. The organs can be made out of virtual cells that behave the same way real cells do, and contain virtual proteins that interact like real proteins. Each biological scale—organism, organ, tissue, cell, protein, DNA—has been successfully modeled in isolation. Now, biologists and mathematicians are beginning to grapple with the problem of unifying all of these layers into a single multi-scale model.


Among the most mature types of multi-scale models are simulations of the human heart. Accurate equations that describe individual heart cells have existed since the early 1960s. They have greatly clarified how the flow of ions through channels in cell membranes causes heart cells to transmit electric signals at precisely timed intervals. Now the models are reaching down to the molecular level, to explain how gene expression or drugs cause changes in the ion channels. At the same time, they are reaching up to the organ level, placing the cell models in the context of macroscopic physiology.
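
To get a feel for what these cell-level equations look like, here is a minimal sketch. It uses the two-variable FitzHugh-Nagumo caricature of an excitable cell rather than a full ion-channel model such as Noble’s; all parameter values are standard textbook choices, not fitted to any real heart cell.

```python
import numpy as np

# FitzHugh-Nagumo: a two-variable caricature of an excitable heart cell.
# v ~ membrane voltage, w ~ slow recovery (a stand-in for ion-channel gating).
# Textbook parameters; illustrative only, not fitted to any real cell.
A, B, EPS, DT = 0.7, 0.8, 0.08, 0.1

def step(v, w, i_stim):
    dv = v - v**3 / 3.0 - w + i_stim          # fast voltage dynamics
    dw = EPS * (v + A - B * w)                # slow recovery dynamics
    return v + DT * dv, w + DT * dw

v, w = -1.2, -0.6                             # near the resting state
trace = []
for t in range(2000):
    i_stim = 0.5 if 100 <= t < 130 else 0.0   # brief stimulus current
    v, w = step(v, w, i_stim)
    trace.append(v)
# 'trace' now contains one action-potential-like excursion: a sharp upstroke,
# a slow plateau, and a return to rest -- the basic event that cell-level
# heart models reproduce with far greater biophysical detail.
```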


In ischemia, for example, a local event—the blocking of blood flow in a coronary artery—creates organ-wide consequences, as a whole region of heart muscle is deprived of oxygen. This in turn affects the heart tissue at a cellular level, by altering the chemistry inside the cells. The intracellular changes create an arrhythmia, which propagates back up to the whole-organ level. This interplay between the different physical laws at different levels is what multi-scale modeling is all about.


Even so, cardiac models are not necessarily a blueprint for other parts of biology. “We’re a long way from generating the principles by which multilevel work should be done,” says Denis Noble, PhD, professor of cardiovascular physiology at Oxford University, one of the pioneers of cellular modeling of the heart. Indeed, multi-scale modeling is now at what might be called its gestational stage. Everybody knows it’s important, but no one quite knows how to do it.


Nevertheless, money is flowing. Last year, an interagency NIH/NSF/NASA/DOE program funded 24 investigators, to the tune of $20 million, to work on various projects in multi-scale modeling. A journal, Multiscale Modeling & Simulation, launched in 2002 and published its first articles in 2003. In almost every part of biology—from bacteria to humans, from the heart to the brain—scientists want to uncover the rules that organize nature’s complexity. “You have to hope there are underlying principles,” says James Glazier, PhD, the director of the Biocomplexity Institute at Indiana University and organizer of eight biocomplexity conferences. “If not, you’re out of luck.”


Engineering the Cell

In 2002, Yuri Lazebnik, PhD, of Cold Spring Harbor Laboratory wrote a much-discussed satirical article for Cancer Cell called “Can a biologist fix a radio?” Lazebnik’s answer was no. He argued that the usual research method of biologists—knock out one component at a time, and see which ones stop the cell from working—would not enable them to figure out how a transistor radio works. Why, then, should we expect to understand the workings of a cell in this way?


Figure: The circuit diagram of an AM transistor radio looks forbiddingly complex until it is overlain with functional modules. According to Herbert Sauro, the same can be true of protein interactions.

Last year, at the Biocomplexity 7 conference, Herb Sauro, PhD, an assistant professor of biochemical control systems at Keck Graduate Institute, turned the question around and asked: “Can an engineer fix a cell?” His answer was a qualified yes. “Engineers deal with complex systems day in and day out,” Sauro says. “Today’s computer systems have hundreds of millions of components, a level of complexity that is rapidly approaching that found in biological systems.” But engineers have a secret that not all biologists have learned, Sauro says: “Engineers modularize.”


It is still far from clear whether nature modularizes. If so, it does so in a very different way from human engineers, because natural systems are not rationally designed; they arise through natural selection. Nevertheless, the final outcome may be the same. A particular network may offer a powerful selective advantage precisely because it performs some function in an optimal manner.


To the layman, the circuit diagram of an AM radio looks incomprehensible. But the system becomes easier to understand once you realize it has three modules: a resonance detector, a demodulator, and an amplifier (see figure). From there, an engineer can break the circuit diagram down into smaller modules, each with a specific function. In this way, possibly passing through many layers, the engineer can tell how any electronic device works.


A cell, like a radio or a computer chip, contains many components that interact with each other in a dizzyingly complex network. Most biologists, Sauro contends, are satisfied simply to list the components (the proteome) and identify which ones interact with each other (the “interactome”). He says they should also ask: What are the modules and what do they do?


Figure: The MAP kinase cascade (in blue) at the bottom of the protein interaction network looks like a negative-feedback amplifier; however, some of the other “widgets” in the network have functions that are still unknown. Courtesy of Herbert Sauro.

As an example, Sauro cites the mitogen-activated protein kinase (MAPK) cascade, a complicated series of protein-protein interactions that senses conditions outside the cell and initiates cell division. When the MAPK cascade goes haywire, one possible result is cancer—which explains why many biologists are interested in it. It has a very distinctive and well-understood structure: three staircase steps with a feedback from the third back to the first.


When Sauro showed the “circuit diagram” of the MAPK cascade to engineers, they immediately told him what the circuit does. It’s a negative-feedback amplifier, a type of circuit invented in the 1920s to transmit transcontinental telephone calls. The purpose of the feedback is to cancel out distortion, amplifying only the true signal. Sauro admits that it is “still just a hypothesis” that it performs the same function in a cell. However, if this is the optimal way to amplify a signal without distortion, it’s possible that, during the course of evolution, nature may have stumbled onto the same solution that human engineers did.
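
The engineering logic is easy to demonstrate. In a negative-feedback amplifier with open-loop gain A and feedback fraction f, the closed-loop gain is A / (1 + A*f); when A*f is large, the output depends almost entirely on f and hardly at all on A, so distortion and drift in the amplifier itself get canceled. A toy calculation, with made-up numbers:

```python
# Closed-loop gain of a negative-feedback amplifier: A / (1 + A*f).
# When A*f >> 1 the gain approaches 1/f, nearly independent of A -- the
# property engineers recognized in the MAPK cascade's wiring.
# All numbers below are illustrative.
f = 0.1                                   # feedback fraction

for A in (1_000.0, 2_000.0, 5_000.0):     # a drifting, distorting open-loop gain
    print(f"A = {A:6.0f}  ->  closed-loop gain = {A / (1 + A * f):.3f}")
# A varies five-fold, yet the closed-loop gain stays within about 1% of 1/f = 10.
```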


Adam Arkin, PhD, an assistant professor of bioengineering and chemistry at the University of California, Berkeley, is one researcher who is taking an engineering approach to the study of cells. He has already compiled a library of protein interaction pathways, organized by their possible functions: switches, oscillators, amplifiers, noise filters (such as the MAPK cascade), and so on. Some of these are very widespread. As far as biologists know, the MAPK cascade is found in all eukaryotes. Unlike electronic components, Arkin says, biological modules have the ability to evolve and adapt. One particular switch, called the sin operon, is ubiquitous in bacteria but plays flexible roles. Arkin has shown that it can function as a graded switch, like a light dimmer; a bistable switch, like a normal wall switch; or a single-pulse generator, like the switch of a flashbulb.
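
A toy model shows how one wiring diagram can play several of these roles. In the sketch below, a gene activates its own production; with a shallow (non-cooperative) response curve the module behaves as a graded dimmer, while a steep, cooperative one yields two stable states, like a wall switch. This is a generic illustration of the principle, not Arkin’s actual model of the sin operon.

```python
import numpy as np

# Self-activating gene:  dx/dt = basal + vmax * x^n / (K^n + x^n) - gamma * x
# The Hill coefficient n sets the character of the switch:
#   n = 1 -> a single steady state (graded "dimmer");
#   n = 4 -> three crossings: two stable states plus one unstable threshold.
# All parameter values are illustrative.
def steady_states(n, basal=0.05, vmax=1.0, K=0.5, gamma=1.0):
    x = np.linspace(0.0, 2.0, 200_001)
    rate = basal + vmax * x**n / (K**n + x**n) - gamma * x
    crossings = np.where(np.diff(np.sign(rate)) != 0)[0]  # where dx/dt = 0
    return x[crossings]

print("n=1 steady states:", steady_states(1))   # graded switch
print("n=4 steady states:", steady_states(4))   # bistable switch
```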


If nature does modularize, that raises the possibility that humans could actually design bacteria to perform certain functions. For larger organisms, such as humans, modularity is important because it simplifies multi-scale modeling. “If you’ve identified a module with a crisp function, then you can substitute that whole network with a single equation,” Sauro says. This kind of substitution is what will make multi-scale modeling possible. And such models will generate hypotheses that can be tested experimentally—one of the most important ways that computational biology can contribute to biological discovery.


The Heart of the Matter

Can an engineer repair a heart? The answer, again, is a qualified yes. Every day, defibrillation—a massive external shock applied to the heart—saves the lives of many people who would otherwise die within minutes. When done correctly and promptly, defibrillation has a success rate well over 90 percent. Ironically, though, scientists are not quite certain why it works. It is certainly a more violent and painful treatment than necessary—although, as Noble says, “In a condition where you otherwise die, you put up with that.”


Multi-scale models have enabled heart researchers to “see” much more clearly into the fibrillating heart. The models work on at least two scales. They couple cellular properties, such as the way a heart muscle cell reacts to ionic currents, with equations from physics that describe how electric currents propagate through conductors. The anatomy of the heart plays an important role, because heart muscle does not conduct electricity equally in every direction: the current flows preferentially along muscle fibers.
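
In equation form, this coupling is usually written as a reaction-diffusion system; the version below is the standard “monodomain” simplification found in textbooks, not any particular group’s model. The anisotropy lives in the conductivity tensor D, which is larger along the local muscle-fiber direction than across it:

```latex
% Monodomain model of cardiac tissue (standard textbook form).
% V_m: membrane voltage; s: cell-level state (channel gates, ion concentrations);
% D: conductivity tensor aligned with the local fiber direction.
\frac{\partial V_m}{\partial t}
  = \nabla \cdot \left( \mathbf{D} \, \nabla V_m \right)
  - \frac{I_{\mathrm{ion}}(V_m, \mathbf{s})}{C_m},
\qquad
\frac{d\mathbf{s}}{dt} = \mathbf{f}(V_m, \mathbf{s})
```

The diffusion term carries the organ-scale physics, while everything cell-biological is packed into the ionic current I_ion and its state variables s: that one line is where the two scales meet.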


The models show that fibrillation starts with tachycardia. This may feel like a “rapid heartbeat,” but it is not really a heartbeat at all. A normal heartbeat is a wave of electrical excitation that progresses from the heart’s pacemaker (the sinoatrial node) and sweeps over the whole heart. Ventricular tachycardia, on the other hand, is a self-organizing spiral of electrical activity that rotates around a center, like a dog chasing its tail. Opinions differ as to whether the center is an anatomical defect, such as a piece of scar tissue, or whether the “rotor” can form anywhere. Either way, the heart muscle cannot sustain it, and the single spiral wave disintegrates into many. That is the onset of fibrillation.
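
Rotating spirals of this kind are easy to reproduce in a toy excitable medium. The sketch below extends the single-cell FitzHugh-Nagumo equations from the earlier sketch to a two-dimensional sheet and starts it with the classic “crossed-gradient” trick: one half of the sheet is excited while an orthogonal half is still refractory, leaving a broken wavefront whose free end curls into a spiral. Parameters are again illustrative, not physiological.

```python
import numpy as np

# 2D excitable sheet: FitzHugh-Nagumo kinetics plus diffusion (forward Euler).
N, DT, D = 200, 0.05, 1.0
A, B, EPS = 0.7, 0.8, 0.08

def laplacian(u):                      # no-flux boundaries via edge padding
    p = np.pad(u, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] +
            p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u)

v = np.full((N, N), -1.2)              # resting sheet
w = np.full((N, N), -0.6)
v[:, : N // 2] = 1.5                   # left half excited...
w[N // 2 :, :] = 1.0                   # ...bottom half refractory: broken front

for _ in range(8000):
    v += DT * (v - v**3 / 3.0 - w + D * laplacian(v))
    w += DT * EPS * (v + A - B * w)
# 'v' now holds a rotating spiral wave -- the 2D analog of the "rotor" behind
# ventricular tachycardia. When such a spiral breaks up into many, the result
# resembles fibrillation.
```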


Figure: Ventricular fibrillation is a complex three-dimensional phenomenon, but experimental methods can probe only the two dimensions of the heart’s surface. Using computer simulations, researchers can observe (left) a tornado-like “scroll wave” of electric activity spiraling around a filament that passes through the heart muscle (the colors reflect the wave’s time of arrival at the heart surface). At right, the wave front inside this semi-transparent rendering of the heart is red and the filament is blue. Lab experiments can only image the places where the filament reaches the heart surface (black dots). Courtesy of KHWJ ten Tusscher and AV Panfilov, Utrecht University, The Netherlands.

Defibrillation is a mystery. If the heart were a uniform electrical conductor, the shock from the defibrillator would have no way of penetrating the interior of the muscle, and the gadget would never work. Evidently the heart is not homogeneous, but a debate still rages over where to look for the inhomogeneities. Some heart physiologists believe that the relevant features are large-scale (the muscle fibers). Others claim that the shock sets up a voltage gradient across the gaps between layers of cells (or “interlaminar clefts”). Either explanation, if it could be proved by experiment, would be a triumph for computational biology’s ability to turn qualitative hypotheses into quantitative, testable predictions. The second hypothesis, which proceeds from cells up to the organ level, is perhaps more in the spirit of multi-scale reasoning, but in fact both explanations require multi-scale modeling to work in a quantitative fashion.


Meanwhile, heart models are contributing to scientists’ understanding of other heart diseases as well. For example, long QT syndrome is an irregular heartbeat that can be caused either by drugs or by genetic mutations that affect the potassium channel. Often its first symptom is the sudden death of an apparently healthy young person. Many drugs affect potassium channels, and it makes much more sense to test their side effects first on a computer model than on a live human.


Simulations can also help identify drugs with positive effects. Noble has used them to study an anti-anginal drug called ranolazine, which affects two channels at once, the potassium and sodium channels. So-called “multiple action drugs,” like ranolazine, have a poor reputation, says Noble, precisely because “our minds can’t wrap themselves around them.” Doctors prefer drugs with a single clear effect. But in the case of ranolazine, either action by itself would cause arrhythmia. The combination avoids arrhythmia as well as the undesirable side effects of other anti-anginal drugs, such as low blood pressure. In January 2006, the FDA approved ranolazine for general use, making it the first new anti-anginal drug in two decades. While it is impossible to tell whether the computer models affected the FDA decision, Noble says that such models “can help a new drug application, since understanding what is going on is an important part of the regulatory process. People feel happier with a new compound as a possible drug the more we understand why it acts the way it does.”


A Panoply of Projects

Last year, the Interagency Modeling and Analysis Group (IMAG), a combined effort of several government agencies coordinated by Grace Peng of the National Institute of Biomedical Imaging and Bioengineering (NIBIB), awarded 24 grants for multi-scale modeling projects in biology. The grants were funded by the individual agencies (twelve by NIH, ten by NSF, and one each by NASA and the Department of Energy). Although many of the projects are just beginning, they illustrate the wide diversity of applications envisioned for multi-scale models. Here are a few examples:


Figure: A multi-scale model by George Karniadakis and Igor Pivkin aims to be the first to predict clotting time from physical principles. A key ingredient in the model is “dissipative particle dynamics,” a stochastic method designed to model the flow of polymers through a fluid. In the simulation shown here, blue platelets are inactive, green platelets are “triggered,” and red platelets are activated. Note that some blood continues to flow through the growing clot. Courtesy of Igor Pivkin and George Karniadakis, Brown University.

James Glazier, PhD, of Indiana University, will study the processes of limb formation and tissue regeneration. He believes that people in the field count too much on the amazing abilities of stem cells. “The genomic determinists think you’ll plunk a stem cell down in the body, and it will spontaneously regrow the tissue that should be there,” Glazier says. “Maybe you’ll be lucky and it will work that way. But I think that you will have to give complex spatiotemporal signals to those cells.” He plans to develop a model of the feedback between the molecular scale—the instructions encoded by DNA—and the large-scale forces that act on cells as a limb grows and takes shape.


George Karniadakis, PhD, professor of applied mathematics at Brown University, will model the flow of platelets and the formation of blood clots. Platelets ordinarily look like smooth disks. But when they sense a defect in the arterial wall, they pump themselves up into sticky, spiny spheres. “This kind of phenomenon has never been modeled from first principles, because it’s computationally very complex,” says Karniadakis. Mathematicians and engineers are not used to working with flowing particles that suddenly change their shape and adhesiveness. However, Karniadakis plans to borrow a new method called “dissipative particle dynamics,” or DPD, developed by polymer physicists in Europe. DPD is a typical “mesoscale,” or intermediate-scale, mathematical technique: it uses probabilities rather than the deterministic equations of classical physical models. Ultimately, Karniadakis would like to plug this intermediate-scale model into a large-scale model of the body’s arterial tree. Last year, he and a group of colleagues used a grid of four supercomputers (based in San Diego, Urbana, Pittsburgh, and Argonne) to prove the basic proposition that blood flow can be simulated in a set of vessels as complicated as the human arterial tree (see figure).
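
The essence of DPD fits in a few lines. Every pair of nearby particles feels three soft forces along the line joining them: a conservative repulsion, a dissipative drag on their relative velocity, and a random thermal kick whose strength is tied to the drag by the fluctuation-dissipation relation. The fragment below implements the generic textbook force law (due to Groot and Warren); it is an illustration of the method, not the Brown group’s code.

```python
import numpy as np

# Pairwise DPD force (generic Groot-Warren form; parameters illustrative).
A_REP, GAMMA, KBT, RC, DT = 25.0, 4.5, 1.0, 1.0, 0.01
SIGMA = np.sqrt(2.0 * GAMMA * KBT)       # fluctuation-dissipation relation

def dpd_force(r_i, r_j, v_i, v_j, rng):
    r_vec = r_i - r_j
    r = np.linalg.norm(r_vec)
    if r >= RC or r == 0.0:              # interactions vanish beyond cutoff RC
        return np.zeros(3)
    e = r_vec / r                        # unit vector along the pair
    wt = 1.0 - r / RC                    # soft weight, zero at the cutoff
    f_cons = A_REP * wt * e                                        # soft repulsion
    f_diss = -GAMMA * wt**2 * np.dot(e, v_i - v_j) * e             # drag on relative motion
    f_rand = SIGMA * wt * rng.standard_normal() * e / np.sqrt(DT)  # thermal kick
    return f_cons + f_diss + f_rand      # force on i; reaction on j is the negative

rng = np.random.default_rng(0)
print(dpd_force(np.zeros(3), np.array([0.5, 0.0, 0.0]),
                np.array([0.1, 0.0, 0.0]), np.array([-0.1, 0.0, 0.0]), rng))
```

Because the forces are soft and pairwise, DPD can take much larger time steps than molecular dynamics, which is what makes it a practical bridge between molecular detail and continuum flow.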


Robert Kunz, PhD, a physicist at Pennsylvania State University, also plans to apply modeling techniques from outside biology. He is developing a simulation of airflow in the human lung inspired, in part, by software used in the nuclear reactor industry. The flow of coolant in a nuclear reactor is too complicated to model in three dimensions, so computer programs represent the flow with a simplified, one-dimensional model. But if an accident occurs, such as a loss of coolant, the programs immediately switch over to a three-dimensional model of the affected region and integrate the results seamlessly with the one-dimensional model of the whole reactor. Similarly, Kunz’s large-scale lung model will use three-dimensional fluid dynamics to track the flow of air through the wider bronchial passages. However, in the sponge-like outer layer of the lung, where the flow becomes too complicated, his code will switch over to a one-dimensional approximation. In other words, it won’t track the twists and turns of every single air molecule, but it will track the progress of an entire breath of air toward its final destination, the alveoli. The model could be used to calculate the uptake of drugs such as inhaled insulin (another drug newly approved by the FDA), or to study how lungs decrease in efficiency with age. One of the other IMAG projects, led by Ching-Long Lin of the University of Iowa, will also focus on the human lung.
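
To give a flavor of the one-dimensional side of such a hybrid, the sketch below treats each generation of small airways as a set of identical parallel tubes with Poiseuille resistance and simply accumulates the pressure drop, rather than resolving any 3D flow field. The branching numbers and dimensions are hypothetical, loosely Weibel-like; this is an illustration of the 1D reduction, not Kunz’s code.

```python
import math

# 1D reduction of the distal airway tree: each generation is modeled as
# n identical tubes in parallel, each with Poiseuille resistance
#     R = 8 * mu * L / (pi * r^4).
# Dimensions are hypothetical, loosely Weibel-like.
MU = 1.8e-5                          # viscosity of air, Pa*s

def generation_resistance(radius, length, n_tubes):
    r_tube = 8.0 * MU * length / (math.pi * radius**4)
    return r_tube / n_tubes          # parallel tubes share the flow

flow = 0.5e-3                        # 0.5 L/s inspiratory flow, in m^3/s
radius, length = 2.5e-3, 1.2e-2      # airway entering the 1D region
r_total = 0.0
for gen in range(10):                # ten generations of symmetric branching
    r_total += generation_resistance(radius, length, 2**gen)
    radius *= 0.79                   # daughters shrink at each generation
    length *= 0.79

print(f"1D pressure drop across the region: {r_total * flow:.0f} Pa")
```

In a coupled code, the 3D simulation of the large bronchi would hand its outlet flow to a reduced model like this one and receive back a pressure boundary condition.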


Challenges and Pitfalls

At present, the number of realistic multi-scale models in biology is very small. “In reality, it has been achieved in only one organ system, the heart,” says Peter Hunter, PhD, professor of bioengineering at the University of Auckland. “The lungs are getting close. They have all the anatomy of the airways, pulmonary vessels, and gas exchange at the alveolar level, and they are starting to look at the smooth muscle.”


Figure: A three-dimensional model of air flow through the lung enables Robert Kunz to predict oxygen concentration (a) and the vorticity of air flow through the bronchi (b). However, only five percent of the lung’s volume is contained in its largest bronchi, shown here; 95 percent is contained in its spongelike outer layer. This layer contains billions of bronchi, far too many to model in complete anatomical detail. Kunz is working on a way to integrate the three-dimensional model of the larger bronchi with a smaller-scale, one-dimensional model that describes the terminal bronchi. Courtesy of Robert Kunz, Pennsylvania State University.

Hunter is the co-chair of the Physiome Project of the International Union of Physiological Sciences (www.physiome.org.nz), which runs a website that archives mathematical models of physiology. Currently the site lists around 300 models, essentially all of which work at a single scale. That’s no big surprise, because modelers have to learn to walk before they can run. The site is also, at this point, only descriptive: visitors can see computer code for the models but not actually run them. However, James Bassingthwaighte, PhD, a bioengineer at the University of Washington, is taking the next step by putting working versions of the Physiome Project models online. In theory, this will make it much easier to mix and match models at different scales.


However, there is more to multi-scale modeling than picking from a menu of single-scale models. Another thing you need is a lot of data. “Complex models have not caught on in biology the way they have caught on in, say, weather forecasting, because weather forecasters have sensors everywhere,” says Arkin. By contrast, clinical medicine has to make do with few sensors and intermittent data. Some fields, on the other hand, are swimming in data—genomics and proteomics, for example—but do not have enough models that can handle that level of complexity.


Glazier feels stymied not only by the lack of data, but also by the lack of desire to acquire the right kind of data. Every mathematical model incorporates measured parameters. These are like the labels on the radio’s circuit diagram that indicate the properties of a resistor or transistor. Glazier itemizes a few that are relevant to biology: “association and dissociation constants, diffusion constants, decay rates, cellular production rates...” But biologists aren’t convinced that it is worth the effort to measure them. “Biology is still a ninety percent qualitative discipline,” Glazier says. “There’s a basic bootstrapping problem. Until experimenters take modeling seriously, you won’t have people making measurements to pin down the parameters. And without the right parameters, the record of predictions is not very great.”


Another great challenge of multi-scale modeling is that the models at different scales may involve different physical principles and different assumptions. It will help to put the models on the same computing platform, as Bassingthwaighte is doing, but other fundamental questions need to be addressed. For one thing, “We have no sense of how error propagates from one level to the next,” Arkin says. For another, he asks, “Where are the boundaries between fast and slow reactions, or between deterministic and stochastic models?” Physical scientists have developed a very good sense of where such boundaries should be, and of which details can be left out when going from one scale to the next. At present, biologists make these decisions in an ad hoc fashion, Arkin says. But perhaps Sauro’s modular approach, or switch-on-the-fly software like Kunz’s, can put the decisions on a more rational footing.


In some cases, the mathematical tools to fit the different scales together may be unfamiliar to biologists. The IMAG program is having a demonstrable effect by attracting researchers like Karniadakis and Kunz, who are bringing in new techniques from physics. Stochastic differential equations, for example, are unlikely to come up in a biologist’s mathematical training, but they are a natural fit for biological multi-scale models, because they address the elusive mesoscale. This is the level where there are too many components (such as cells or molecules) to simulate individually, but too few to trust in the law of averages. At the mesoscale, deviations from the average matter. “How elastic are arterial walls?” Karniadakis says. “The answer varies day by day, and across genders and ethnic groups. Even if you know the properties precisely, you need to know how they vary.” But including variation in a model is harder than it sounds. It means abandoning the comfortable deterministic models of classical mathematics and using probabilities. The elasticity of an arterial wall is no longer a number but a distribution, a miniature bell-shaped curve of possible values.
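
For readers who have not met them: a stochastic differential equation is an ordinary differential equation with a noise term added, and the workhorse for simulating one is the Euler-Maruyama scheme. The sketch below integrates a toy mean-reverting equation and, in the spirit of Karniadakis’s remark, treats the target parameter as a distribution rather than a fixed number. Everything here is illustrative.

```python
import numpy as np

# Euler-Maruyama for a toy mean-reverting SDE:
#     dX = theta * (mu - X) dt + sigma dW
rng = np.random.default_rng(42)
DT, N_STEPS, N_PATHS = 0.01, 5000, 200
THETA, SIGMA = 1.0, 0.3

# The parameter itself is uncertain: draw one 'mu' per simulated subject,
# the way an arterial-wall property varies across individuals (hypothetical).
mu = rng.normal(loc=1.0, scale=0.1, size=N_PATHS)

x = np.ones(N_PATHS)
for _ in range(N_STEPS):
    dw = rng.standard_normal(N_PATHS) * np.sqrt(DT)  # Brownian increment
    x += THETA * (mu - x) * DT + SIGMA * dw          # Euler-Maruyama step

print(f"ensemble mean {x.mean():.3f}, spread (std) {x.std():.3f}")
# The spread mixes intrinsic noise (sigma) with parameter uncertainty (mu) --
# variability that a single deterministic equation cannot represent.
```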


Even cardiac models, which have performed well with deterministic equations, may need a dose of randomness. “There’s a growing understanding that in some cases one has to do stochastic differential equations,” says James Keener, PhD, a mathematician at the University of Utah. One such place is the modeling of calcium flow, which he says is “highly inhomogeneous” within the cell. The classical models, which treat the interior of the cell as a uniform fluid, may be getting the right results for the wrong reasons.


Keener’s comment suggests a final word of caution about all mathematical models. Even the best-validated model is not guaranteed to last forever. It is always subject to correction, as experimenters discover new phenomena that weren’t included in the original assumptions. “Models are never right,” says Bassingthwaighte, “they’re just not wrong yet.”


