The Physiome: A Mission Imperative

To understand biology—and provide appropriate medical care—scientists need to understand interactions across multiple scales. Hence the Physiome.

This is the reality of human biology: events span a 10⁹ range in lengthscale (molecular to organismal) and a 10¹⁴ range in timescale (molecular movement to years). To understand this biology—and provide appropriate medical care—scientists need to understand the interactions across these scales.


When complex (top two rows) and simple (bottom two rows) rabbit ventricular models were each induced into arrhythmia, the patterns of electrical activity differed noticeably. The complex model included finer anatomical features such as vessels and endocardial structures. These extended the period during which arrhythmic activity was observed (compare last two panels of each sequence), illustrating the importance of using realistic heart models for patient-specific diagnosis and treatment prediction. Simulations performed with the Cardiac Arrhythmia Research Package. Figure courtesy of Dr Martin Bishop, University of Oxford. For further detail see Bishop M., et al., Development of an anatomically detailed MRI-derived rabbit ventricular model and assessment of its impact on simulations of electrophysiological function, Am J Physiol Heart Circ Physiol 298:699-718, 2010.

“Systems that have clinical relevance and involve how to treat or prevent disease are always multi-scale and multi-feedback,” says Peter Kohl, MD, PhD, reader in cardiac physiology at the University of Oxford and a coordinator of the Virtual Physiological Human Network of Excellence (VPH NOE) funded by the European Commission.


Hence the physiome: an international effort to quantitatively describe human physiology across this vast range of scales. “Basically, with the range of scales involved, you have a mission impossible in front of you,” Kohl says. “On the other hand, whether you get your head around it with or without quantitative models, it remains the same range. It’s therefore a mission imperative rather than a mission impossible.”


Fortunately, the goal is not to simulate an entire physiological human on a computer in full detail (which would require more computational power than is available on the planet), but rather to develop models that “simplify the system to provide insight and identify causal relations,” Kohl says.


Some physiome models are already providing remarkable insight, says Jim Bassingthwaighte, PhD, professor of bioengineering at the University of Washington, who defined and named physiome in the early 1990s. Physiome efforts have sprung up in many different countries, with projects involving just about every organ in the body.  Models of the heart are the most advanced, and are currently being used in clinical studies to optimize treatments for a variety of heart problems. Physiome models of the lung and neuromuscular system are also making breakthroughs. Peter Hunter, PhD, director of the Auckland Bioengineering Institute (ABI) at the University of Auckland in New Zealand, believes that, over the next few years, multi-scale approaches will achieve clinical outcomes that wouldn’t have been otherwise possible.


One reason the physiome can make a difference is that medical treatments are often built on a phenomenology of “I do this thing and get this result,” says Nic Smith, PhD, professor of computational physiology at Oxford University and scientific coordinator of euHeart, the VPH heart physiome project. “In many ways, the physiome effort is trying to change that: To underpin the result with mechanisms that we can identify and think we understand.”



The notion of the physiome, Kohl says, is “more a philosophy than a project, and there are many people around the world who have adopted that philosophy.” The European Commission funds the “Virtual Physiological Human” (VPH) effort; the National Institutes of Health in the United States have the Interagency Modeling and Analysis Group and the Multiscale Modeling Consortium with specific funding initiatives directed at physiome research. The International Union of Physiological Sciences houses “The Physiome Project” and in particular the CellML project, fed largely by Peter Hunter’s Institute in New Zealand. And countries such as the United Kingdom, Japan and China also have their own physiome projects (not described further here, but meritorious in their own right).


Though they share a similar philosophy, the various efforts have quite different focuses. For example, Peter Hunter’s project in New Zealand seeks to build models of every organ system and every level within each organ system. In the future, the plan is to have modules that can be shared and connected to study whatever someone wants to study.


Meanwhile, Europe’s VPH project, formerly called the Europhysiome, puts a strong emphasis on engaging industry and clinical centers, says Smith, because that’s what interested the European Commission. “It’s quite translational and outcome focused, which has really moved the physiome from what is a very appealing scientific vision to being something that might really matter to people,” he says.  “I think that has been a very positive thing.”


In the US, the focus has been more on basic science, says Grace Che-Yaw Peng, PhD, program director at the National Institute of Biomedical Imaging and Bioengineering at the National Institutes of Health. “US investigators are digging deeper within each scale or between scales but not necessarily reaching that far out to clinic or industry,” she says.


These complementary approaches may help ensure the best research outcomes, Peng says. “Should researchers in Japan, China, US, and Europe agree on the heart model that gets incorporated into clinical process?” That’s not necessarily the goal, she says. “Everyone has a different question.”


Though a more coordinated approach might be more efficient, there is a precedent for grassroots efforts producing a valuable result: the human genome project. “It wasn’t coordinated at the outset, and there was no prescribed effort to control overlap,” Kohl says, “yet it’s been one of the most successful integration activities worldwide, sharing information and data. The physiome effort can learn a lot from that approach.”



Here, a multi-scale biophysical electromechanics model of the rat left ventricle progresses through a single heartbeat cycle from the end of diastole (A) through multiple key steps including (B) end iso-volumetric contraction, (C) end ejection, (D) end relaxation, (E) end recoil and (F) end diastasis. The orientation and size of the cones embedded within the mesh indicate the direction and magnitude of principal strain, respectively. Blue and red cones represent states of compression and tension, respectively. Gold streamlines indicate the fiber orientation. The 3 colored spheres assist in visualizing the rotation of the ellipsoid. Researchers used this model to investigate how feedback loops regulate heart contraction. Reprinted from: Niederer SA, Smith NP, 2009, The Role of the Frank–Starling Law in the Transduction of Cellular Work to Whole Organ Pump Function: A Computational Modeling Analysis. PLoS Comput Biol 5(4): e1000371. doi:10.1371/journal.pcbi.1000371.

The heart physiome project began 20 years ago as a collaborative endeavour between Auckland and Oxford. “In the physiome vision,” Smith says, “the heart is arguably the most advanced example of taking information from lots of sources and putting it into a consistent framework that you can probe in ways that you can’t think about all at once.”


Although the heart is in some ways simple—it’s a pump—it is nevertheless complex as it depends on electrical activation, mechanical contraction, and fluid dynamics. This is perhaps why multi-scale research related to the heart has moved farther and faster than it has for other organs. “Lots of different people can all offer a piece of the puzzle,” Smith says, and none can understand it alone.


Heart physiome models integrate an impressive number of scales and data types. For example, to understand how a mutation in the myosin regulatory light chain filters up through the scales to alter the dynamics of the heart beat, producing heart failure in humans and in a genetically engineered mouse model, Andrew McCulloch, PhD, professor of bioengineering at the University of California, San Diego, and his colleagues created a multi-scale computer model of the mouse heart. The ingredients included models of molecular motors, whole cell twitch forces, tissue stresses and 3-D muscle fiber stresses, ventricular geometry, and hemodynamic loading conditions. The output showed that changes in phosphorylation of the regulatory light chain (due to a mutation) reduce the twist of the mouse ventricle during systole, which can be an early indicator of heart failure. The changes were then validated in mice in vivo.
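The chain McCulloch describes, in which each scale’s output parameterizes the next, can be sketched in code. Everything below is a hypothetical toy—the function names, constants, and linear relationships are invented for illustration and are not the group’s actual model:

```python
import math

# Toy multi-scale pipeline: molecule -> cell -> tissue -> organ.
# All functions, constants, and parameter names are hypothetical.

def crossbridge_cycling_rate(rlc_phosphorylation):
    """Molecular scale: regulatory light chain phosphorylation
    modulates myosin crossbridge cycling (arbitrary units)."""
    return 1.0 + 0.5 * rlc_phosphorylation

def twitch_force(cycling_rate):
    """Cell scale: peak twitch force scales with crossbridge cycling."""
    return 120.0 * cycling_rate  # placeholder constant

def fiber_stress(force, fiber_angle_deg=60.0):
    """Tissue scale: project active force along the fiber direction."""
    return force * math.cos(math.radians(fiber_angle_deg))

def ventricular_twist(stress, baseline_twist_deg=12.0):
    """Organ scale: systolic twist grows with net fiber stress."""
    return baseline_twist_deg * stress / 60.0

def simulate(rlc_phosphorylation):
    """Run the full chain from phosphorylation level to twist."""
    return ventricular_twist(fiber_stress(twitch_force(
        crossbridge_cycling_rate(rlc_phosphorylation))))

# Reduced phosphorylation (as in the mutant) yields reduced twist --
# the qualitative pattern described in the text.
assert simulate(0.2) < simulate(1.0)
```

The point of even so crude a sketch is structural: a change injected at the molecular scale propagates up to an organ-level observable that clinicians can actually measure.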


When a large deformation mechanical model of the heart is coupled to a Navier-Stokes model of blood flow within its chambers, researchers can track streamlines through the ventricular volume and observe deformation of the finite element geometry. Courtesy of Nic Smith, Matthew McCormick and David Nordsletten at Oxford University in the UK.

The heart provides a great testbed for applying the physiome approach to direct outcomes, whether clinical or commercial. For example, if the heart stops contracting synchronously and loses efficiency, patients may benefit from a treatment called cardiac resynchronization therapy: an implanted pacemaker is used to help both ventricles of the heart contract simultaneously. But only about two-thirds of patients given resynchronization therapy respond. So McCulloch’s team is developing heart models to optimize this therapy for individual patients. The models rely on patient-specific data that is clinically available: catheter systems map the heart’s electroanatomical activity and hemodynamics; echocardiography measures heart function; and CT scans measure heart structures. The resulting model reconstructs the heart’s baseline function and predicts the likely outcome of specific cardiac resynchronization therapy plans. These predictions are then compared to the patient’s outcomes at three months. “At this point, we are just following diagnosis and treatment to see if the model can predict what actually occurs,” McCulloch says. “If we could better identify responders, or could better identify ways to optimize the therapy to improve results, then the models could have clinical predictive utility.”


EuHeart, the VPH program for advancing the heart physiome, is also developing physiome models for optimizing cardiac resynchronization therapy in particular patients, Smith says. “The clinical decisions are in many cases still relatively high level. For example, ‘should we put the pacemaker at the back or the front,’” Smith notes. “This means there are often big clinical windows. We don’t have to get it exactly right straight away. What we do need is to demonstrate an improvement over best practice. I think this is now possible in an increasing number of contexts.”


McCulloch and euHeart researchers are now working together to share clinical data. “One of the huge problems with clinical studies is patient variability,” McCulloch says. “Any one model is not going to be that useful.  We need to understand sources of variation: What’s different such that some patients respond and some don’t?”


Another euHeart project models patient-specific coronary blood flow to help doctors determine whether the best treatment for a blockage is a stent, medical therapy, or angioplasty. “We want to be sure we’ve chosen the right therapy for the right patients,” Smith says. Through euHeart, Smith has funding to simulate a number of different patient cases.  “Our goal is to get to the point where we have compelling evidence to do clinical trials,” he says.


Other members of euHeart are determining how to best stop atrial fibrillation.  If medication is not successful, the typical treatment is ablation—essentially burning scars in the heart muscle to block the wave of electrical conduction so that it doesn’t end up in a cardiac spiral. Currently, more than 25 percent of treated patients have to come back for additional treatments after about three months, says Olivier Ecabert, PhD, of Philips Research Laboratories in Aachen, Germany, who is also joint coordinator of euHeart with Smith. The problem is: where to burn?  “Ideally,” Ecabert says, “doctors would have a patient-specific model and simulate several ablation line options to see how the patient will recover.”  The physician could then select the ablation line that seems most promising or, pushing it further, the computer could optimize the surgical plan.


Other euHeart projects include predicting when valves are wearing out and should be replaced; and figuring out how to make left-ventricular assist devices (LVADs) work best for the patients who have them.


Philips Research Laboratories, euHeart’s project coordinator and one of its industrial partners, joined euHeart because the models might result in some kind of “proof of principle” for software or hardware that Philips could then develop. For example, Philips has already developed a geometric heart model that can be adapted to automatically analyze data from 3-D images of the heart. “Now we would like to learn what is necessary to integrate physiological information into the model and then incorporate that into imaging equipment software,” Ecabert says.


At mid-contraction during an ischemic event, this model of coronary perfusion within the cardiac ventricles shows large gradients in the concentration of oxygenated blood delivered to the heart around the ischemic region but relatively constant perfusion elsewhere. Courtesy of Nic Smith.

The fact that industry is interested suggests that physiome modeling, at least of the heart, is coming of age. “We are convinced that this type of physiological model will be more and more applied in the future by clinicians. It’s on the rising side of the curve and Philips would like to join the trend early,” Ecabert says.



This multi-scale model displays the distribution of ventilation in the lung. The model couples the elasticity of the alveolar tissue to a model of airflow through the entire conducting airway tree. Red represents highest flow; blue represents lowest flow. The vertical distribution occurs because deformation of the lung tissue under its weight makes the tissue in the base of the lung effectively more compliant, so it expands readily when the lung breathes in. Courtesy of Merryn Tawhai.

The lungs sit within the chest cavity where they expand and recoil as the diaphragm contracts and relaxes. Embedded within the sponge-like tissue of the lungs is a branching tree of conducting airways that expand and recoil with each breath. Air flows in through the larger to smaller branches to reach the gas exchange surface, and back out again. Gas exchange—which occurs in the alveoli—also requires a matching supply of blood to and from the gas exchange surfaces.


Multi-scale models of the lung’s complexity are starting to yield some interesting findings but are still a step or two away from clinical application. One group of collaborating researchers in New Zealand and Iowa is coupling subject-specific imaging with geometric lung modeling and computational fluid dynamics. The result is a multi-scale lung model that incorporates the entire airway from the oropharynx to the alveoli.


“We’re really taking a very systematic and structured approach, similar to what’s been done with the heart, in creating anatomically realistic models,” says Merryn Tawhai, PhD, associate professor in the Auckland Bioengineering Institute at the University of Auckland, New Zealand. “We’re building up patient-specific databases and then working down toward modeling cellular functions and putting that into our whole organ model.”


One of Tawhai's collaborators is Eric Hoffman, PhD, professor of radiology, medicine and biomedical engineering at the University of Iowa. He acquires detailed images of the lung using state-of-the-art spiral computed tomography imaging. The imaging is then converted into a 3-D mesh model of the airway down to the 28th generation of branching using a combination of imaging and mathematical algorithms. This provides a far more realistic image of the airway tree than previous lung models, which typically extended only to the 6th to 9th generation. The uppermost airways in the model are shaped specifically to match the individual subject’s imaging. To fill in the remaining lung tissue down to the airways just before the alveoli, the team uses a volume-filling approach developed and previously validated by Tawhai’s lab. Next comes computational fluid dynamics, to look at airflow and regional ventilation. And this is where some exciting results are starting to filter in.
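To give a flavor of how a branching airway tree can be generated programmatically, here is a deliberately crude symmetric bifurcation sketch. The length and diameter ratios are invented, and this is not Tawhai’s validated volume-filling algorithm, which grows asymmetric branches toward seed points filling the imaged lung volume:

```python
# Hypothetical sketch: a symmetric airway tree in which each branch
# bifurcates and shrinks by a fixed ratio per generation.

def grow_tree(generations, length0=12.0, diameter0=18.0, ratio=0.78):
    """Return a list of (generation, length_mm, diameter_mm, branch_count).
    All dimensions and the shrink ratio are illustrative placeholders."""
    tree = []
    length, diameter = length0, diameter0
    for gen in range(generations + 1):
        tree.append((gen, round(length, 2), round(diameter, 2), 2 ** gen))
        length *= ratio
        diameter *= ratio
    return tree

tree = grow_tree(28)
# A symmetric tree reaching the 28th generation ends in 2**28 (about
# 268 million) terminal branches -- which is why only the uppermost
# generations can be meshed directly from imaging, and the rest must
# be generated algorithmically.
assert tree[-1][3] == 2 ** 28
```

The combinatorial explosion in the comment above is the crux: imaging resolves only the first several generations, so an algorithm must supply the remaining millions of branches in an anatomically consistent way.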


Ching-Long Lin, PhD, professor of mechanical and industrial engineering at the University of Iowa, developed a parallel computational fluid dynamics model to predict airflow in Hoffman and Tawhai’s model lung. The team demonstrated that the model can capture detailed flow structures in regions of interest, and can match experimental studies of regional ventilation for the whole lung in a subject-specific way.


The complete model should prove useful for studying the progression of diseases such as asthma and for predicting particle deposition in individual patients, which is important for dosing of inhaled medication, Lin says. One problem, however, is computational cost. To capture laminar-transitional-turbulent flow in the multi-scale airway model required about one week on the TeraGrid Lonestar and Ranger clusters at the Texas Advanced Computing Center for one human subject. “In terms of clinical applications we have to reduce that,” Lin says. “Doctors probably don’t want that much detail anyway. But the multi-scale computational framework of the human lungs can provide the detailed information needed to understand the interplays between pulmonary structure and function at their most fundamental level.”


Tawhai’s group is also studying airway hyper-responsiveness in collaboration with several other groups. Together with James Sneyd, PhD, professor of mathematics at the University of Auckland, they’ve developed a model of contraction within the airway smooth muscle cell that is then embedded in a model of the cross section of the airway and the surrounding parenchymal tissue. This is in turn embedded within the whole anatomically structured airway tree model, which is embedded in the lung tissue. “It produces a lung that breathes and develops different forces depending where you are within the lung and so each airway experiences its own particular force balance,” Tawhai says. Experiments by collaborators at the University of Massachusetts, McGill University, and the University of Vermont informed and validated the model. “So we’re starting to get a handle on the emergence of patterns of broncho-constriction within the lung and how those vary in different parts of the lung,” she says.  “Some parts are more susceptible to airway closure than others.  This is ongoing work.”


In a different project, Tawhai’s lab is trying to understand the safest level of heat and humidity for air delivered to mechanically ventilated patients (when an endotracheal tube bypasses the nose and mouth). To get at that question, they had to start at the cell level inside the lungs. On top of the ciliated epithelial cells that line the airway, there’s a layer of liquid that must be maintained at a very specific depth in order to achieve mucus clearance. Nicolas Warren, PhD, a graduate student in Tawhai’s group in the ABI, co-supervised by Edmund Crampin, DPhil, also of the ABI, developed and validated a model of such cells joined together with liquid moving through multiple cells. Tawhai’s team then put the cell model into the whole organ model, distributing cells along airways and through the airway tree, and then directing the lung to breathe with different temperatures and humidity. They found that the epithelial cells alone couldn’t transport enough moisture to maintain the depth of the surface liquid during normal breathing. “So there has to be some other significant source of moisture,” Tawhai says. “And it’s something we couldn’t have seen without putting it into the real anatomical framework.” Possibly submucosal glands or transport of fluid from the lung periphery provide the additional fluid needed, Tawhai says, but it’s really not known. Still, now there’s a model on which experimentalists can test various hypotheses. Tawhai’s team is currently working on adapting the epithelial cell model to make it more specific to disorders such as cystic fibrosis.
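The liquid-depth question can be caricatured as a simple mass balance. The flux values, units, and integration scheme below are invented for illustration and bear no quantitative relation to Warren’s validated cell model; the sketch only shows the qualitative logic of the finding:

```python
# Illustrative mass balance for airway surface liquid (ASL) depth:
# evaporation removes water as inspired air humidifies, while
# epithelial transport resupplies it. All numbers are hypothetical.

def asl_depth(depth0_um, evap_flux, epithelial_flux, dt=0.1, steps=600):
    """Integrate d(depth)/dt = supply - loss with forward Euler.
    Fluxes in um/s; depth clamped at zero."""
    depth = depth0_um
    for _ in range(steps):
        depth += dt * (epithelial_flux - evap_flux)
        depth = max(depth, 0.0)
    return depth

# If evaporative loss exceeds what the epithelium alone can supply,
# the liquid layer thins -- the model's hint that another moisture
# source (e.g. submucosal glands) must exist.
final = asl_depth(depth0_um=7.0, evap_flux=0.02, epithelial_flux=0.005)
assert final < 7.0
```

Running the whole-organ version of this balance at every airway in the tree, under realistic breathing, is what let the team conclude that epithelial transport alone cannot close the budget.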


The epithelial cell model is also the starting point for a new NIH grant led by Lin. It will integrate in vitro cell data and in vivo image data together with Lin's in-house computational fluid-structure-interaction technologies and the cell model to understand the interplay between organ, tissue, and cells. A predictive computational lung model across these scales will allow researchers to assess individuals’ response to therapy over time.  Ultimately, Lin says, “We will be able to use this information to better tailor a treatment plan for the individual at the most basic level.”



A clinical application of a multi-scale model is used several times a year at Viceconti’s institute to help monitor children who’ve received bone transplants as a treatment for a rare type of bone cancer called Ewing’s sarcoma. These children need aggressive rehabilitation, but doctors don’t want to risk fracturing the reconstructed bone. So Rizzoli bioengineers do a full scan and gait analysis with markers, collecting all possible data over a whole day. They then generate a whole body and organ level model for the bone and simulate rehabilitation exercise to predict loads acting on the bone and determine fracture risk. “With this information, we can assist in determining the rehabilitation program for each patient,” Viceconti says. “This is the only real-world application we have in clinical practice today.” Courtesy of Marco Viceconti.

Physiome efforts for neuromuscular modeling are ramping up. A relatively new and major effort is Europe’s VPH Osteoporosis project (VPHOP), a collaboration among 19 European academic and industrial partners, led by Marco Viceconti, PhD, technical director of the Medical Technology Laboratory at the Istituto Ortopedico Rizzoli di Bologna in Italy. The project seeks to predict the risk of fracture in people with osteoporosis. As people age, their bones weaken and lose calcium, causing a condition known as osteoporosis. Meanwhile, they lose neuromuscular control, which can lead to falls. These changes happen at the cellular level in the bones and muscles and manifest as changes in morphology at the tissue level. “So in order to predict risk of fracture over time, you have to account for whole body, organ and tissue scales,” Viceconti says. “That’s what we’re doing in VPHOP.”
By September of this year, two years into the VPHOP project, Viceconti expects to run a very large probabilistic model that accounts directly or indirectly for all factors that act or contribute to the risk of fracture in one patient at any possible scale. The simulation should answer the question: “Of the dozens of possible parameters that can define the multi-scale phenomenon, which ones really are important and make a difference?” he says. “That answer will drive the most critical part of the project—not the modeling itself, but the ability to measure in patients the information we need with the accuracy we need.”


The VPHOP is partnering closely with industry to develop the technologies for measuring this key patient information in a cost-effective way. For example, they’ve developed ACTIBELT, a device embedded in a belt buckle that can record the kinematics of the body for five days. Also, jointly with Philips, VPHOP is developing a system, based on Philips Medical Systems XpertCT imaging technology, that can generate 3-D images of bone at the tissue scale—primarily for patients at high risk for a fracture. And with BioSpaceMed, they are building a very low-dose whole body X-ray machine called EOS-QT that can generate a 3-D model of the patient skeleton using sophisticated morphing technology and possibly even estimate densitometry at each point and provide a preliminary estimate of the risk of fracture. “We are trying to push limits of the current imaging technology by using all possible tricks,” Viceconti says.


The VPHOP project has recently started a cooperation project with Simbios, a National Center for Biomedical Computing at Stanford (and publishers of this magazine). Much of Simbios’ neuromuscular modeling work has a multi-scale aspect, which opens the door to musculoskeletal physiome modeling. VPHOP and Simbios hope to connect their online communities and integrate their tools. Eventually, Viceconti says, “we’d like to join forces to attack a grand problem where the multi-scale approach can make a difference.”


For example, both Viceconti and Scott Delp, PhD, professor of bioengineering and mechanical engineering at Stanford University and co-PI of Simbios, are interested in exploring probabilistic approaches. “Probabilistic approaches provide a rigorous method to account for variability between subjects,” says Delp.


Viceconti says the deterministic nature of existing models—one input produces one output—really limits their ability to bring the models into clinical practice. “As far as input is good, the output is good. But this is not real life,” he says. “If you include a probabilistic approach, you can factor in your ignorance. So you can let a parameter vary widely to see if it matters, and if it doesn’t then you can leave it out.”
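The "let a parameter vary widely to see if it matters" idea is easy to sketch as a one-at-a-time Monte Carlo screen. The fracture-risk function, its parameters, and their ranges below are invented toys, not anything from VPHOP:

```python
import random

def fracture_risk(bone_density, fall_rate, shoe_size):
    """Toy model: risk depends on density and fall rate, and (by
    construction) not at all on the deliberately irrelevant third
    parameter. All weights are hypothetical."""
    return 0.6 * (1.0 - bone_density) + 0.4 * fall_rate + 0.0 * shoe_size

def sensitivity(param_index, n=5000, seed=1):
    """Output spread when only one parameter varies over [0, 1]
    while the others stay fixed at 0.5."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        params = [0.5, 0.5, 0.5]
        params[param_index] = rng.random()
        outputs.append(fracture_risk(*params))
    return max(outputs) - min(outputs)

# Density and fall rate move the output; the irrelevant parameter
# does not, so -- in Viceconti's phrase -- it can be left out.
assert sensitivity(0) > sensitivity(2)
```

A production screen would use a proper global method (e.g. variance-based indices) rather than this one-at-a-time sweep, but the decision it supports is the same: measure accurately only the parameters the output actually cares about.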



Progress on modeling the physiome reaches well beyond the heart, lung and musculoskeletal examples covered here. Researchers are taking a physiome approach to the kidneys, digestive tract, lymphatic system, and even to some extent the nervous system and the brain, Hunter says.


Before now, Bassingthwaighte says, people have been thinking too narrowly. “But many bright molecular biologists are now trying to be more integrational,” he says. “The physiome provides context for that, and for inspiring people to think more broadly.”


Standardizing the Physiome

Multi-scale quantitative models need to be validated and reproducible if they are to be useful for clinical workflows, says Hunter. The Physiome infrastructure developed by Hunter, Dr Poul Nielsen and their colleagues makes that process more robust and transparent, he says. Researchers can confidently download an annotated model knowing that it’s reproducible. The model can then be incorporated into larger scale workflows for use in a clinical setting.


“Having the means to incorporate the outputs of different groups through standards and interoperability is quite a worthwhile goal,” Hunter says. “And an essential one if we’re to get the modeling of biology into clinical processes.”  
Models held in the model repository use CellML, a markup language for biophysical models of cells. A repository at the European Bioinformatics Institute (EBI) contains models marked up with SBML, a language for systems biology models. Hunter’s group is also creating a new standard called FieldML for integrating spatial information. In recent years, Hunter says, the CellML and SBML communities have become more integrated. “SBML and CellML are now working together jointly to curate models and develop standards around metadata.”
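Because CellML is XML, standard tooling can at least enumerate a model’s parts. The fragment below is hand-written for illustration only (a real CellML document carries MathML equations and curation metadata defined by the CellML specification); the parsing itself uses only the Python standard library:

```python
import xml.etree.ElementTree as ET

# Illustrative, hand-written fragment in the style of CellML 1.1;
# not a complete or curated model.
CELLML_NS = "http://www.cellml.org/cellml/1.1#"
fragment = f"""
<model xmlns="{CELLML_NS}" name="toy_membrane">
  <component name="membrane">
    <variable name="V" units="millivolt" initial_value="-80"/>
    <variable name="Cm" units="microF_per_cm2" initial_value="1"/>
  </component>
</model>
"""

root = ET.fromstring(fragment)
# ElementTree addresses namespaced tags as {namespace}localname.
variables = [v.get("name") for v in root.iter(f"{{{CELLML_NS}}}variable")]
assert variables == ["V", "Cm"]
```

This machine-readability is the practical payoff of the standards effort: a repository can automatically validate, annotate, and recombine models because tools, not humans, read the declarations.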


From funding agencies’ point of view, “We don’t want people to have to reinvent models,” Peng says. But at this point, “The different formats are all co-existing.  No one wants to stand up and say one is better than another.”


It’s also true that some multi-scale models require information that goes beyond what CellML or SBML can provide, McCulloch says. “It’s not possible to describe everything in our cardiovascular model using that system.” So McCulloch is building a database that includes metadata about his models that will be consistent with CellML and other model description formats but goes beyond them to include additional information.


Nic Smith agrees that standards are useful for sharing between different academic centers. But, he says, an important step toward embedding multi-scale models in clinical workflows is demonstrating that they add extra information that can be made available to physicians in a familiar format. “We are working on developing interfaces and putting them in a context where physicians are used to seeing them—in connection with imaging and clinical data accessed directly from the hospital’s computer system.”
