Toward reducing animal testing while predicting a chemical’s effects on human health, researchers at the National Institutes of Health’s (NIH) National Center for Advancing Translational Sciences (NCATS) and their colleagues have developed an in vitro robotic screening tool able to systematically screen thousands of chemicals in human cell lines. In a study published today (January 26) in Nature Communications, the NIH-led team demonstrates the ability to use cell-based assays to test environmental chemicals found in drugs, food and food packaging, and consumer products, as well as chemicals produced during manufacturing and industrial processes.
The work is part of Tox21, a collaboration among four government agencies—the NIH, the Environmental Protection Agency (EPA), the National Toxicology Program (NTP), and the Food and Drug Administration (FDA)—that officially kicked off in 2008.
“I think this is one of the best examples of big data entering [the field of] toxicology,” said Thomas Hartung, director of the Center for Alternatives to Animal Testing at Johns Hopkins University, who was not involved with the work. “Because of the high quality of the data set and its transparency and data-sharing, this is really an enabling step” toward in vitro toxicology testing.
“[The group] is testing chemicals of broad environmental relevance including drugs, cosmetics, and food ingredients,” said Richard Judson, a bioinformatician at the EPA’s National Center for Computational Toxicology in Research Triangle Park, North Carolina, who collaborates with the authors as part of the Tox21 effort but was not involved in the present study.
Traditionally, toxicity testing of chemicals and compounds has been conducted in animals as a surrogate for human health safety. In an effort to reduce harm to animals and decrease the cost and time it takes to generate animal-safety data, Ruili Huang, an informatics group leader at NCATS, and her colleagues screened 10,000 chemicals through 30 different automated, cell-based assays. All of the chemicals were of special interest to the EPA, NTP, or the NIH.
“The system is very efficient,” Huang told The Scientist. “We can test all the chemicals at 15 different concentrations each and in three independent experiment runs in one week. With animal testing, this would take years.”
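The scale Huang describes can be put in rough numbers. A minimal back-of-the-envelope sketch, using the figures reported in the article (10,000 chemicals, 15 concentrations, three replicate runs, 30 assays); the well-level arithmetic itself is illustrative, not a description of the actual plate layout:

```python
# Rough data volume for the screen described above.
# Counts are taken from the article; the arithmetic is illustrative only.
chemicals = 10_000
concentrations = 15
replicates = 3
assays = 30

# Each assay measures every chemical at every concentration, in triplicate.
measurements_per_assay = chemicals * concentrations * replicates
total_measurements = measurements_per_assay * assays

print(measurements_per_assay)  # 450,000 data points per assay
print(total_measurements)      # 13,500,000 data points across all 30 assays
```

That volume of dose-response data, generated in a week of robotic runs, is what would take years to approach with animal studies.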
“The many replicates and concentrations tested make the data more reliable and directly usable for estimating the concentrations at which chemicals can be active in the real world,” Judson told The Scientist.
“The data gives us an indication of the tissue concentration that we should be worried about,” added Hartung.
Each assay assesses the ability of the chemicals to interact with either a nuclear receptor pathway or a cellular stress response pathway, including mitochondrial signaling pathways. The nuclear receptor assays include testing environmental chemicals for their abilities to disrupt endocrine signaling, which can result in harmful effects on development, reproduction, and neurological functions. Several of the assays test the ability of the chemicals to alter signaling of PPAR, a fatty acid receptor and master regulator of fat cell development. Such chemicals, linked to fat gain, are called obesogens. (The EPA’s chemical screening program has adopted several of the endocrine disruptor assays.)
The team observed that certain chemicals clustered together by function, based on their ability to activate certain signaling pathways such as the estrogen receptor pathway. “We can partly predict animal and human toxicity data, but the information is not perfect; we still need to add chemical structure data to achieve a more accurate prediction,” explained Huang. With the addition of more assays, the information generated could be used to create models to more accurately predict in vivo toxicity, she added.
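The kind of activity-profile clustering described above can be sketched in a few lines. All chemical names and activity scores below are invented for illustration; the idea is simply that chemicals whose responses across a panel of assays correlate strongly end up grouped together:

```python
# Illustrative sketch: grouping chemicals by similarity of their assay
# activity profiles. Names and scores are synthetic, not Tox21 data.
from math import sqrt

# Hypothetical activity scores across four assays
# (e.g. ER agonism, ER antagonism, PPAR, mitochondrial toxicity).
profiles = {
    "chem_A": [0.9, 0.1, 0.0, 0.2],  # estrogen-receptor-active
    "chem_B": [0.8, 0.2, 0.1, 0.1],  # estrogen-receptor-active
    "chem_C": [0.0, 0.1, 0.9, 0.0],  # PPAR-active
}

def pearson(x, y):
    """Pearson correlation between two activity profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Highly correlated profiles fall into the same functional cluster.
print(pearson(profiles["chem_A"], profiles["chem_B"]))  # high: same cluster
print(pearson(profiles["chem_A"], profiles["chem_C"]))  # low: different cluster
```

In practice the Tox21 analyses work with far richer dose-response profiles and more sophisticated clustering, but the grouping principle is the same.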
“What is interesting is that the data has so far been more predictive of human rather than animal toxicity,” said Fiona Sewell, a toxicology and regulatory sciences researcher at the U.K.’s National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) who was not involved with the work. “Regulatory agencies are often more comfortable with using animal data to predict human toxicity, but we now need to switch to using more human-relevant in vitro assays, which may prove to be as or more informative than animal studies,” Sewell added. “This [study] is a positive first step but there is still a long way to go before we reach the ultimate goal of being able to assess the safety of chemicals to humans without using animals.”
The team will next optimize some of the cell-based assays and increase the coverage of cellular targets relevant for toxicity by developing additional tests. The group also hopes to increase its library of chemical compounds to include around 80,000 manmade chemicals that are currently released into the environment. The researchers are also homing in on some of the most interesting chemicals—such as those identified from the mitochondrial toxicity assay—to validate their in vitro results with further testing, including in animals.
“What is really remarkable is that, in the U.S., this is being driven from the top down, by government agencies,” said Hartung. “There is a revolution taking place of how safety assessment will be done in the future.”
R. Huang et al., “Modelling the Tox21 10 K chemical profiles for in vivo toxicity prediction and mechanism characterization,” Nature Communications, doi:10.1038/ncomms10425, 2016.
Clarifications (January 26): The first sentence of this article has been updated to note that the researchers’ results are a step toward reducing animal testing (not toward eliminating it, as was previously written). The second-to-last paragraph has been updated to note that the group hopes to grow the compound library. (It was previously written that the group planned to do so.)