[Image courtesy of Marisa Dolled-Filhart, Robert L. Camp, and David L. Rimm]
If there's anyone who can appreciate tissue microarrays, it's histology technician Sabina Magedson. Having worked in a pathology laboratory at M.D. Anderson Cancer Center in Houston for years, Magedson knows all too well the tedium of staining and analyzing hundreds upon hundreds of individual tissue sections--all in the name of one part of one experiment.
Increasingly, such low-throughput monotony is giving way to 'omics-style science, thanks to tissue microarrays (TMAs). Originally developed in the mid-1980s, tissue arrays never really caught on until Juha Kononen, who was then a postdoctoral fellow at the National Human Genome Research Institute, developed a relatively simple way to construct them in 1997.1 Today, TMAs can contain tens to hundreds of minute tissue samples (0.6 to 2 mm in diameter) arranged on one slide. By reducing the amount of time and effort required to process them, not to mention the amount of necessary tissue and reagents, these chips are accelerating the pace of research for oncologists, drug discoverers, and other scientists seeking to make sense of the data churned out by genomics and proteomics laboratories. "You're getting much more work done with a lot less work and a lot less money," said Magedson, who is now a TMA technical specialist.
POPULARITY CONTEST Though tissue arrays are not as simple to create as DNA microarrays, researchers can make their own using commercially available instruments; both manual and automated versions exist. Alternatively, they can buy arrays off-the-shelf from a growing list of suppliers. Additionally, a number of companies and university core facilities will prepare custom arrays, either from user-supplied tissue blocks or by tapping into in-house tissue collections.
Most arrays use formalin-fixed tissue samples, but some companies, such as Clinomics Biosciences, specialize in frozen-tissue arrays; the lack of a fixing step tends to preserve the integrity of the RNA and proteins. "For any meaningful genomics procedure, you want to use frozen tissues that haven't suffered any RNA degradation from the formalin fixing," Clinomics CEO Steve Turner says.
But, counters researcher Gregory Fuller, associate professor of pathology at M.D. Anderson, "Many studies have been done beautifully" in tissues fixed with formalin. "That's why tissue microarrays have become so popular."
They've also gained popularity because they use only a fraction of the reagents, stains, antibodies, and--perhaps most importantly--tissue that their traditional counterparts do.1 This is especially important because researchers often rely on archived tissue samples, which can't be replaced, for retrospective studies. And when it comes to validating a biomarker, testing it against hundreds of samples and correlating it to clinical outcome is key.
INTO THE ARCHIVES Two recent studies highlight this point. Yale researchers recently published a study on HER2 expression in breast cancer tissue using TMAs, in which they found that higher levels of HER2 protein correlated with poorer clinical outcomes.2 The research used 300 archived tissue specimens, which were taken from patients diagnosed with invasive breast carcinoma from 1962 to 1977. But the scientists took just two 0.6-mm diameter cores from each sample, thereby preserving the archived tissues for future studies. In an earlier report, the same team studied the prognostic value of beta-catenin expression in 310 colon carcinoma specimens collected between 1971 and 1982.3
This latter study illustrates another benefit of TMA technology: quantitative analysis. Traditionally, pathologists rate specimens on a four-point scale. Having a pathologist score each specimen is not only slow and laborious, but also yields results that are subjective, difficult to reproduce, and blind to subtle differences in expression.
When the team analyzed these tissue sections using the traditional four-point scale, they saw no correlation between the amount of nuclear beta-catenin and clinical prognosis. But when the group stratified the differing amounts of expression among the samples using a continuous 1,000-point scale, they found that tissue cores in the top 10% of nuclear beta-catenin expression correlated with significantly worse prognosis.
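The effect of scoring resolution can be illustrated with a short sketch (the scores below are invented for illustration, not the Yale data): collapsing continuous expression values onto a four-point scale lumps together many cores that a top-decile cutoff on a fine-grained scale would still distinguish.

```python
# Illustrative only: hypothetical expression scores, not the published data.
# Continuous scores on a 0-1000 scale are binned to a coarse 4-point scale,
# then compared with a top-10% cutoff applied to the raw values.

def four_point(score, max_score=1000):
    """Collapse a continuous score onto a coarse 0-3 pathologist scale."""
    return min(3, int(score / (max_score / 4)))

def top_decile_flags(scores):
    """Flag cores whose score falls in the top 10% of the cohort."""
    cutoff = sorted(scores)[int(len(scores) * 0.9)]
    return [s >= cutoff for s in scores]

scores = list(range(0, 1000, 10))        # 100 hypothetical cores
bins = [four_point(s) for s in scores]
flags = top_decile_flags(scores)

print(sum(1 for b in bins if b == 3))    # cores lumped into the top bin: 25
print(sum(flags))                        # cores in the top decile: 10
```

The coarse scale puts a quarter of these cores in its highest category, while the continuous cutoff isolates the extreme 10%, which is the kind of stratification the beta-catenin study relied on.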
"Quantitative measurements ultimately allow us to make predictions about patient outcomes and their response to therapy," says David Rimm, director of the Yale Cancer Center Tissue Microarray Facility and senior author of both studies.
But for most, the promise of TMAs remains unfulfilled, because scientists lack methods of high-speed automated quantitative analysis. "Analysis of data is killing them," says Steven Hewitt, director of the Tissue Array Research Program at the National Cancer Institute. "The biggest hurdle they have to get past is to get outstanding image quality and in a format that's portable and can be analyzed."
TISSUE VISUALIZATION The first step in speeding up analysis is automating tissue core visualization. Companies have developed a variety of hardware and software solutions to this problem. Bacus Laboratories' BLISS system uses a tiling approach that scans the array piecemeal and then stitches together all the tiles to produce a single composite image.
Aperio Technologies' imaging method is 10 to 15 times faster than the tiling approach, says CEO Dirk Soenksen. The company's ScanScope® digitizes the entire array slide by applying the "linear detector" technology used in fax machines. "The ability to achieve optimal focus is dramatically better than the tiling system," he says. "And better images lead to better accuracy in analysis."
A newcomer on the block is Trestle, with its just-introduced MedScan™. Unlike Aperio's line-detection method, MedScan employs area scanning, which is three times faster, says chief scientific officer Jack Zeineh. Further speeding the process, the device compresses images as they are scanned; for many competing systems, compression is a separate step performed afterward.
Applied Imaging's Ariol™ imaging and analysis system can image both colorimetric and fluorescently labeled samples. It has two light sources and shutters that block one or the other depending on what's being imaged and analyzed, according to Paddy O'Kelly, the company's vice president of operations. O'Kelly says he believes that brightfield microscopy will remain the more prevalent approach for imaging colorimetric stains. "Brightfield is much more practical," he said. "Imaging fluorescence takes much longer."
ANALYSIS SOFTWARE Joining these new imaging instruments are newly developed analysis programs tasked with prescreening the tissue cores to relieve--and perhaps eventually replace--the pathologist.
Extracting information from a tissue array is a greater challenge than with DNA and protein microarrays, says Hewitt, because tissue cores contain far more detail that needs to be examined. Merely registering positive for antibody binding is only the beginning. Pathologists need to know where on the tissue core the antibody or cDNA probe is bound: cancer cells or normal cells, stromal tissue or glandular tissue, cell membrane or cytoplasm.
Beecher Instruments, where tissue microarray inventor Kononen is now vice president and chief scientific officer, is developing a promising product. Borrowing technology from German company Definiens, which specializes in satellite imaging software, Beecher is producing an analysis package based upon "contextual information rather than pixel information," Beecher CEO Dan Rohwer-Nutter says. "The software isn't looking at colors, brightness, and shapes," as most other systems attempt to do.
For example, he explains, the software identifies a finger and then knows that the adjacent object must be a hand, which the software knows is connected to an arm, and so forth, based on a set of installed rules. Beecher is writing rule sets for different types of tissues and diseases; researchers interested in the software can decide on which rules to purchase depending on their areas of study.
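The context-chaining idea can be sketched in miniature (a toy rule set invented for illustration, not Beecher's or Definiens' actual software): each rule infers the label of an adjacent object from what its neighbor has already been identified as, rather than from pixel features.

```python
# Toy sketch of context-based labeling: an object is classified by what
# its already-identified neighbor is, not by color, brightness, or shape.
# The rule set here is invented for illustration.

RULES = {
    # neighbor's label -> inferred label for the adjacent object
    "finger": "hand",
    "hand": "arm",
    "arm": "shoulder",
}

def label_chain(seed, length):
    """Propagate labels outward from one identified seed object."""
    labels = [seed]
    for _ in range(length - 1):
        labels.append(RULES.get(labels[-1], "unknown"))
    return labels

print(label_chain("finger", 4))  # ['finger', 'hand', 'arm', 'shoulder']
```

A vendor's real rule sets would encode tissue relationships (tumor cells adjacent to stroma, membrane surrounding cytoplasm) rather than anatomy, but the propagation logic is the same.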
"We hope that our software will replace human scoring of tissues within one year," Rohwer-Nutter says. But, he concedes, with automated high-speed processing and analysis of the arrays, researchers sacrifice precision for each individual tissue core. "That's why you can't use tissue microarrays for a diagnostic when the information is based on just a 0.6-mm spot," he says. Instead, he adds, the true value of TMAs lies in the large sample numbers needed to reach statistical significance in finding biomarkers, assessing protein and gene expression, and other applications.
Still, Todd Joron, senior vice president at TissueInformatics, says his company's analytical software offers a high level of precision and detail. Called TissueAnalytics Arrayf(x)™, the software can give information about the subcellular location of staining and detect rare events, such as proteins expressed at low levels.
Yale's Rimm and his research team have come up with an alternative system, which he calls Professor Marvel, after the wizard behind the curtain in the movie The Wizard of Oz. The imaging system, coupled with AQUA (Automated Quantitative Analysis) software, could ultimately help clinicians assess protein expression in tissue samples without the aid of a pathologist, according to a press release.4
AQUA relies on an algorithm that uses "fluorescent tags to separate tumors from stroma and to define subcellular compartments," Rimm wrote in the colon carcinoma paper.3 "A novel, rapid exponential subtraction algorithm ... dramatically improves the assignment of pixels to a particular subcellular compartment." Rimm and Yale recently founded a new company called Histometrix to commercialize the imaging system.
DATA MANAGEMENT As databases continue to be flooded with array results, overwhelmed researchers are looking for ways to manage, organize, and cull all the data. "With tissue microarrays, you get so much data that it's hard to keep in your head," says cancer researcher Matt van de Rijn, associate professor of pathology at Stanford University Medical Center.
[Image courtesy of ChromaVision Medical Systems]
Drawing from his experience in dealing with DNA microarray studies, van de Rijn has developed a system in which staining results are recorded in a Microsoft Excel worksheet, which is then reformatted by TMA-Deconvoluter, a new program that prepares data for hierarchical clustering, statistical, and other types of analyses. A Web-based program called Stainfinder enables users to link between clustered data and a digital image database.5
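The reformatting step such a program performs can be imagined as a pivot from flat, per-stain spreadsheet rows into a core-by-antibody matrix ready for hierarchical clustering (the field names and scores below are hypothetical, not TMA-Deconvoluter's actual format):

```python
# Hypothetical sketch of pivoting flat TMA staining records into a
# core-by-antibody matrix suitable for clustering software.
# Column names and values are invented; the real program's format differs.

records = [
    {"core": "A1", "antibody": "HER2", "score": 2},
    {"core": "A1", "antibody": "bcat", "score": 1},
    {"core": "A2", "antibody": "HER2", "score": 3},
    {"core": "A2", "antibody": "bcat", "score": 0},
]

def pivot(rows):
    """Build {core: {antibody: score}} from flat per-stain rows."""
    matrix = {}
    for r in rows:
        matrix.setdefault(r["core"], {})[r["antibody"]] = r["score"]
    return matrix

matrix = pivot(records)
print(matrix["A2"]["HER2"])  # 3
```

A companion lookup from each (core, antibody) cell back to a stored image ID would then give the kind of one-click link to digital images that Stainfinder provides.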
"We have over 70,000 digital images stored that we can rapidly access," van de Rijn says, pointing out that linking from the numerical data to the exact digital image takes only a few seconds. "Our software rapidly shows data sets in a way that the human brain can envision," he said. "Then you have a better chance of coming out sane at the end of the session."
Kevin Coombes, at M.D. Anderson Cancer Center, developed another TMA data management system, called TAD (Tissue Array Database). Consisting of an SQL back-end and an ASP front-end, the system tracks all information pertinent to setting up and scoring tissue array experiments.
A GLOBAL ENTERPRISE Even with institutions coming up with good data management systems for themselves, such data is only minimally useful if it isn't accessible to others in the scientific community. Researchers at different institutions studying the same cancer tissue type or the same tissue specimen could benefit from the synergy of sharing results.
To see this come to fruition, Jules Berman at the National Cancer Institute, with support from the Association of Pathology Informatics, is leading efforts to develop guidelines for formatting data so that it can be shared, exchanged, and understood by all. He recently coauthored a paper6 that describes, based on a series of workshops at the National Cancer Institute, the information necessary to describe an array, as well as a list of common data elements.
In the few years since they were first described, tissue microarrays have exploded in popularity, if the 3,000 hits retrieved by a Google search are any indicator. That popularity likely will continue to grow as array fabrication, imaging, and analysis improve, and that could lead to benefits in the drug discovery arena. Pharmaceutical companies will be able to use tissue microarrays to identify clinical trial participants who will respond favorably to a particular drug. "If you can tell which patients will respond, then you can show increased efficacy of the drug," says Yale's Rimm. "Then you'll increase your chances of getting FDA approval."
Laura Lane (firstname.lastname@example.org) is a freelance writer in San Francisco.
1. A.M. DeMarzo, H. Fedor, "The principles, uses and construction of tissue microarrays in pathology research," Presentation at the 92nd Annual Meeting of the United States and Canadian Academy of Pathology, March 27, 2003, Washington DC.
2. R.L. Camp et al., "Quantitative analysis of breast cancer tissue microarrays shows that both high and normal levels of HER2 expression are associated with poor outcome," Cancer Res, 63:1445-8, April 2003.
3. R.L. Camp et al., "Automated subcellular localization and quantification of protein expression in tissue microarrays," Nat Med, 8:1323-7, 2002.
4. "Making magic with Professor Marvel," Breast Cancer Alliance, www.breastcanceralliance.com/whatsnew/rimm.asp
5. C.L. Liu et al., "Software tools for high-throughput analysis and archiving of immunohistochemistry staining data obtained with tissue microarrays," Am J Path, 161:1557-65, 2002.
6. J.J. Berman et al., "The tissue microarray data exchange specification: A community-based, open source tool for sharing tissue microarray data," BMC Med Inform Decis Mak, 3:5, May 23, 2003.