Student or Learner
Does "computational systems biology models of the circuitry underlying each toxicity pathway" mean "computational systems biology models of the circuitry that is the foundation of each toxicity pathway"?
In 2007, the U.S. National Academy of Sciences released a report, Toxicity Testing in the 21st Century: A Vision and a Strategy, that envisions a not-so-distant future in which virtually all routine toxicity testing would be conducted in human cells or cell lines in vitro by evaluating cellular responses in a suite of toxicity pathway assays using high-throughput tests that could be implemented with robotic assistance. Risk assessment based on results of these types of tests would shift towards the avoidance of significant perturbations of these pathways in exposed human populations. Dose-response modeling of perturbations of pathway function would be organized around computational systems biology models of the circuitry underlying each toxicity pathway. In vitro to in vivo extrapolations would rely on pharmacokinetic models to predict human blood and tissue concentrations under specific exposure conditions. All of the scientific tools needed to effect these changes in toxicity testing practices are either currently available or in an advanced state of development. A broad scientific discussion of this new vision for the future of toxicity testing is needed to motivate a departure from traditional high-dose, animal-based toxicological tests, with their attendant challenges for dose and species extrapolation, towards a new approach more firmly grounded in human biology. The present paper, and invited commentaries on the report that will appear in Toxicological Sciences over the next year, are intended to initiate a dialog to identify challenges in implementing the vision and address obstacles to change.