Tuesday, February 26, 2013

Portable, Quantitative Detection of Bacillus Bacterial Spores Using Surface-Enhanced Raman Scattering

The following abstract describes a recent paper accepted for publication in Analytical Chemistry. The full article may be downloaded from the American Chemical Society.

Portable rapid detection of pathogenic bacteria such as Bacillus is highly desirable for safety in food manufacture and under the current heightened risk of biological terrorism. Surface-enhanced Raman scattering (SERS) is becoming the preferred analytical technique for bacterial detection, due to its speed of analysis and high sensitivity. However, in seeking methods offering the lowest limits of detection, current research has tended towards highly confocal, microscopy-based analysis, which requires somewhat bulky instrumentation and precisely synthesised SERS substrates. By contrast, in this study we have improved SERS for bacterial analyses using silver colloidal substrates, which are easily and cheaply synthesised in bulk, and which we demonstrate permit analysis using portable instrumentation. All analyses were conducted in triplicate to assess the reproducibility of this approach, which was excellent. We demonstrate that SERS is able to rapidly detect and quantify the dipicolinate (DPA) biomarker for Bacillus spores at 5 ppb (29.9 nM) levels, which are significantly lower than those previously reported for SERS and well below the infective dose of 10^4 B. anthracis cells for inhalation anthrax. Finally, we show the potential of multivariate data analysis to improve detection levels in complex DPA extracts from viable spores.
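As a quick sanity check of the concentration units quoted in the abstract, the 5 ppb figure for DPA can be converted to molarity. The code is my own illustration, not from the paper; the molar mass used is the standard value for dipicolinic acid.

```python
# Sanity check of the concentration conversion quoted in the abstract:
# 5 ppb (w/v) of dipicolinic acid (DPA), expressed in molar units.
# Molar mass of DPA (pyridine-2,6-dicarboxylic acid, C7H5NO4): ~167.12 g/mol.

DPA_MOLAR_MASS = 167.12  # g/mol

def ppb_to_nanomolar(ppb, molar_mass):
    """Convert a w/v concentration in ppb (ug/L) to nanomolar."""
    grams_per_liter = ppb * 1e-6          # 1 ppb = 1 ug/L = 1e-6 g/L
    mol_per_liter = grams_per_liter / molar_mass
    return mol_per_liter * 1e9            # mol/L -> nmol/L

print(round(ppb_to_nanomolar(5, DPA_MOLAR_MASS), 1))  # -> 29.9, matching the abstract
```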

Friday, February 22, 2013

New Rapid Methods have been Added to Our RMM Product Matrix

Two new rapid method technologies have been added to our RMM Product Matrix.

The Advencis Lynx technology is a non-destructive, growth-based system that rapidly detects and enumerates micro-colonies using high magnification imaging.

The Vivione Biosciences RAPID-B technology is based on flow cytometry and has the capability to enumerate microorganisms and detect target microorganisms via DNA dyes and immunoprobes.

The following information is provided for both systems: scientific method, applications, time to result, sample size or type, sensitivity levels, organisms detected and workflow. Please click on the RMM Product Matrix to read more about these and more than 60 other rapid methods.

Isothermal PCR Gets Off The Dime

Last year, EnviroLogix and Douglas Scientific signed a collaborative agreement to develop and optimize technology that they say will enable high-throughput real-time and endpoint nucleic acid analysis.

Put another way, the companies intend to make polymerase chain reaction (PCR) and its need for thermal cycling instrumentation obsolete by leveraging EnviroLogix’s isothermal DNAble chemistry and Douglas’ Array Tape™ platform into a novel high-throughput platform for fast PCR.

PCR has long been the method of choice for scientists wishing to make sufficient amounts of identical genetic material for study and analysis. Applications for PCR include genotyping, SNP analysis, drug target validation, and quantitative gene expression analysis, which require quantification of the amount of DNA in a starting sample.

And to get results in real time, researchers use real-time PCR, or quantitative PCR (qPCR), which monitors PCR amplification as it occurs. qPCR can also be combined with reverse transcription (RT-qPCR) to quantify the amount of RNA in a starting sample.
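As a rough illustration of how qPCR readouts are turned into quantities, here is the widely used 2^-ΔΔCt relative-quantification calculation; the Ct values below are made-up example data, not figures from the article.

```python
# Illustrative sketch (not from the article): relative quantification in qPCR
# via the common 2^-ddCt method. Ct is the cycle at which fluorescence crosses
# the detection threshold; fewer cycles means more starting template.

def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Fold change of a target gene, normalized to a reference gene."""
    delta_ct_treated = ct_target_treated - ct_ref_treated
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_treated - delta_ct_control
    return 2 ** (-delta_delta_ct)

# A ddCt of -2 corresponds to a 4-fold increase in expression.
print(fold_change(22.0, 18.0, 24.0, 18.0))  # -> 4.0
```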

How PCR Works

Closely patterned after the natural DNA replication cycle that occurs in cells, each PCR cycle involves a denaturation, annealing, and extension step in a programmed thermal cycler. The cycler automatically controls and alternates the temperatures for each stage of the reaction, raising and lowering the temperature of the chemical components at specific times and for a preset number of cycles.
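The cycle described above can be sketched as a short program. The temperatures and hold times below are typical textbook values rather than figures from the article, and the doubling-per-cycle assumption is the ideal case.

```python
# Illustrative thermal cycler program; temperatures and hold times are
# typical textbook values, not parameters from the article.

PCR_PROGRAM = [
    ("denaturation", 95, 30),  # deg C, seconds: strands separate
    ("annealing",    55, 30),  # primers bind the single strands
    ("extension",    72, 60),  # polymerase copies each template
]

def amplify(template_copies, cycles):
    """Ideal PCR: each full cycle doubles the number of copies."""
    for _ in range(cycles):
        for step, temp_c, hold_s in PCR_PROGRAM:
            pass  # a real cycler holds temp_c for hold_s seconds here
        template_copies *= 2
    return template_copies

print(amplify(1, 30))  # 30 ideal cycles: 2**30, about a billion copies from one template
```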

Two factors crucial in enabling PCR to be used as an efficient laboratory technique were the discovery of thermostable DNA polymerases, derived from the thermophilic bacterium Thermus aquaticus, and the invention of the automated thermal cycler. Unlike regular DNA polymerases, T. aquaticus (Taq) polymerases can withstand heating to 95°C, meaning they do not need to be added at each new PCR cycle. Using Taq polymerase, the extension step can be performed at 72°C.

Modern thermal cyclers comprise metal blocks into which tubes containing the PCR reaction are inserted. The thermocycler is pre-programmed to move between temperatures for highly accurate and fast cycling, enabling most PCR reactions to be completed in less than two hours.

PCR is now the most widely used technique in molecular biology, biochemistry, and medicine, with the market growing at an annual rate of over 10%. By 2015, the global market is expected to surpass the $38 billion mark.

Currently, there are more than 50 PCR techniques in use, with qPCR, RT-PCR, Hot Start, Multiplex, and TaqMan techniques having the market lead, and expected to see continued growth.

But none of these change the basic PCR paradigm, which remains expensive, time-consuming, and instrument- and reagent-intensive, and requires extensive sample preparation. And high throughput for real-time PCR assays has been hard to achieve because of the relatively long reaction times required for the PCR cycle of denaturation, annealing, and extension.

Alternatives to PCR

Isothermal amplification technology, or DNA amplification without the need for a thermal cycler, potentially offers a cheaper, much faster, less reagent-intensive, and more adaptable method than PCR. Isothermal approaches include methods based on, for example, strand displacement amplification, nucleic acid sequence-based amplification, self-sustained sequence replication, rolling circle amplification, loop-mediated amplification, and helicase-dependent amplification.

Scientists say that while these methods may overcome some of the technical and cost barriers associated with PCR, and do provide amplification of target sequences equal to PCR, they may require complex protocols, multiple enzymes and/or special reagents and, in particular, either a high-temperature denaturation step or an enzyme-based method for promoting nicking and strand displacement.

But according to Douglas Scientific (manufacturer of the Array Tape platform) and its partner, EnviroLogix (which provides its proprietary DNAble novel DNA amplification chemistry), combining these technologies into an automated solution in one platform could represent a new paradigm in real-time nucleic acid analysis.

Meet DNAble

DNAble, EnviroLogix says, is a very rapid, highly specific isothermal nucleic acid amplification technology that can amplify both RNA and DNA targets with single-base resolution a billion-fold in 5–15 minutes at a single and constant temperature. It features EnviroLogix’ DNAble v2.0 chemistry, which reportedly produces highly specific, rapid, multiplexed, quantitative results rivaling qPCR. Unlike qPCR, which requires purified DNA, the company says DNAble can amplify a target sequence from a crude sample preparation and do so with minimal equipment.

DNAble works by using a nicking enzyme and a strand-displacing polymerase to generate small pieces of DNA that feed a DNA extension reaction; it consists of alternating cycles of nicking and extension processes, leading to exponential amplification. This is much faster, the company says, than the time-consuming thermal cycling that relies on costly equipment. But in contrast to other isothermal methods that rely on DNA strand nicking and displacement synthesis, DNAble chemistry v2.0 has eliminated off-target amplification of nonspecific products.
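As a back-of-envelope check of the quoted billion-fold amplification in 5-15 minutes (my own arithmetic, not company data), one can ask how many doublings that implies and what per-doubling time would be required.

```python
import math

# Back-of-envelope plausibility check: a billion-fold amplification requires
# log2(1e9) doublings; spread those over the quoted 5-15 minute window.

doublings = math.log2(1e9)            # ~29.9 doublings for a 10^9-fold gain
print(round(doublings, 1))            # -> 29.9

for total_minutes in (5, 15):
    seconds_per_doubling = total_minutes * 60 / doublings
    # Each nick-extend doubling would need to finish in roughly 10-30 seconds.
    print(f"{total_minutes} min -> ~{seconds_per_doubling:.0f} s per doubling")
```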

Further, the company said, DNAble shows little inhibitory effects from a crude extraction or exogenous DNA. As a result, DNAble can reportedly allow for highly sensitive detection with minimal to no sample preparation. The company says its data confirms the high specificity of the assay with no cross-reactivity to closely related bacteria or synthetic mismatch targets.

Lars Erik Peters, EnviroLogix’s director of product development for molecular diagnostics, told GEN, “Isothermal methods are prone to amplify huge haystacks of nonspecific byproducts but finding the needles of specific product has been difficult. You do not have the high temperature control that you have with PCR to control specificity.” He explained that EnviroLogix “set out to make isothermal technology into a process that is directly comparable to PCR regarding assay design and simplicity of use. This was a real technology endeavor because it not only required optimization of chemistry, but also development of assay design software and instrumentation.”

He further explained that the “background noise in isothermal was extraordinary. But now we have solved the background generation problem and have data that looks like PCR data.”

And Douglas Scientific’s president and COO Dan Malmstrom said, “When we delivered Array Tape for PCR we increased throughput by 10X, reduced reagent costs up to 80–90%, and offered an automated solution that saved laboratories huge amounts of money. We are taking it to the next level—this is a game changer.”

Both companies say that primary targets for their platform are the agbio and plant genomics markets, where they say they already have a good footprint. “This is only possible,” they said, “because in contrast to PCR the technology works with crude extracts, and DNA preparation for PCR has often been more expensive than the PCR reactions themselves.”

Whether this technology produces the anticipated revolution in DNA amplification and extends its applicability remains to be seen.

Source: Genetic Engineering & Biotechnology News 

Monday, February 18, 2013

DARPA Moves Bio-Agent Detection Capabilities to DoD

Defense department researchers are transitioning their antibody-based biosensor detection programs, which have developed advanced, more durable sensors, to the Department of Defense.

The Defense Advanced Research Projects Agency (DARPA) said on Feb. 15 that it is transitioning antibody detection technology to the Defense Department’s Critical Reagents Program. The technology, it said, provides more stability and accuracy in detection of biological toxins and gives the DoD more cost-efficient and more effective bio threat sensors.

The DoD, it said, has been using antibody-based biosensors as its immediate detection tool for dangerous antigens such as anthrax and biological toxins. The biosensors rely on the fact that antibodies bind to antigens; however, sensors have had limitations in functioning over time and in maintaining a tight bond between antibody and antigen. The tighter the bond, or affinity, between antigen and antibody, the more sensitive a biosensor is over a wider range of threats. Existing DoD biosensors, while effective, had restricted shelf lives, were quickly rendered inoperable by high temperatures, and offered limited affinity, it said.

DARPA said it launched the Antibody Technology Program (ATP) in 2009 to address the technological limitations of current antibody-based biosensors. The program set out to achieve revolutionary improvements in the stability of antibodies over time, even in extreme conditions; and control affinity in biosensors to enable detection of numerous antigens by a single unit.

DARPA said the ATP ended in 2012 having achieved both goals and with a plan in place to transition the technologies to DoD’s Critical Reagents Program, part of the Joint Program Executive Office – Chemical and Biological Defense (JPE-CBD), for biosensor deployment throughout the military services.

Specifically, DARPA said performers demonstrated the ability to increase antibody temperature stability at 70 degrees Celsius (158 degrees Fahrenheit) to 48 hours, up from the current limit of five to ten minutes.  When transitioned to DoD biosensors, the results are projected to eliminate the need for refrigeration while increasing the shelf life by a factor of 36, extending survivability at room temperature (approx. 25 degrees Celsius or 77 degrees Fahrenheit) from one month to approximately three years. DARPA also increased antibody affinity by a factor of 400, thus opening the door to vastly more sensitive, multiplexed biosensors that can test for numerous antigens.
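A quick check of the arithmetic behind the quoted figures (my own calculation from the numbers in the text, not DARPA's):

```python
# Shelf life: a 36x increase takes room-temperature survivability
# from one month to 36 months, i.e. three years, as stated.
shelf_life_months = 1 * 36
print(shelf_life_months / 12)  # -> 3.0 years

# Temperature stability at 70 deg C: from ~5-10 minutes to 48 hours,
# i.e. roughly a 290x to 580x improvement in hold time.
gain_low, gain_high = 48 * 60 / 10, 48 * 60 / 5
print(gain_low, gain_high)  # -> 288.0 576.0
```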

“When you consider the locations of warfighters who have the most potential for biological weapons to be used against them, they are typically environments with extreme temperatures and harsh conditions, and the warfighters themselves are probably operating in small groups,” said Mildred Donlon, the DARPA program manager for ATP. “If it’s going to be useful to these teams, DoD equipment needs to be ruggedized to survive conditions and be easy to use by non-experts. The ATP technology hits these goals.

“By removing temperature stability as a limiting factor, troops will now be able to carry sensors with them without worrying about refrigeration and wondering if the sensor will return an accurate reading,” she said. “According to the Chemical Biological Medical Systems Joint Project Management Office at JPE-CBD, eliminating the need for cold-chain logistics in transport and deployment of sensors is estimated to save DoD in the range of $10 million per year,” Donlon said. “The new stability also means antibodies can be attached to new materials to make potentially more practical sensors to take the place of current beads and strips. Most importantly, by pairing more stable sensors with a huge increase in sensitivity, DARPA is giving troops the confidence to trust the results of what can be literally life-or-death measurements.”

According to DARPA, ATP achieved the results by altering the amino acid sequences within the antibody molecules. Rather than creating an additive stabilizing material, it said, ATP devised methods to make the altered amino acids an integral part of the structure of the antibody molecule.

“Antibody-based biosensors have been in use for roughly 30 years,” Donlon said. “DARPA used recent advances in understanding of protein structure and analysis to determine new ways to alter amino acids, integrate them into an antibody structure, and do so at a sustainable scale.”

DARPA partnered with the U.S. Army’s Edgewood Chemical Biological Center (ECBC) from the beginning of ATP, to first assist with evaluation of performer research proposals, then later in the program to provide ATP performers with unaltered antibodies, conduct testing on the performers’ altered antibodies, and validate results. To ensure that the production methods for modifying antibodies are scalable and cost effective, performers had to submit one-gram samples for testing. The positive results mean that existing DoD antibody stockpiles can be altered to incorporate the new properties of stability and high affinity.

Program performers for ATP included: Affomix Corp. (Branford, CT), purchased by Illumina, Inc. (San Diego, CA.); AnaptysBio, Inc. (San Diego, CA); the Naval Research Laboratory (Washington, D.C.); StableBody Technologies, LLC (Lemont, IL); The University of Texas at Austin (Austin, TX); and the ECBC (Aberdeen, MD), which participated as the validation laboratory. AxioMx, Inc. (Branford, CT) was created to rapidly generate high-quality recombinant antibodies.

Tuesday, February 12, 2013

DNAzymes and Gold Nanoparticles: A Colorimetric Assay for Diagnostics

Infectious diseases such as malaria and syphilis can be diagnosed rapidly and reliably in the field by using a simple test developed by Canadian scientists. The test is based on the use of DNAzymes and gold nanoparticles. As the researchers report in the journal Angewandte Chemie, their test allows for the sensitive detection of bacteria, viruses, and parasites.

Dangerous infectious diseases must be identified in time in order to prevent them from spreading. The DNA of pathogens is an ideal biomarker and can easily be identified by PCR. However, this is only possible if expensive laboratory equipment and trained personnel are on hand. This may not be the case in remote locations or developing nations. Alternative methods that are simple and inexpensive while also remaining sensitive and specific are needed. 

Kyryl Zagorovsky and Warren C.W. Chan at the University of Toronto (Canada) have now combined two modern technologies in a novel way: They have used DNAzymes as signal amplifiers and gold nanoparticles for detection. Gold nanoparticles (GNPs) absorb light. The wavelength of the light absorbed depends on whether the nanoparticles are separate or aggregated. The difference in color can be seen with the naked eye. A solution of individual particles appears red, whereas aggregates are blue-violet in color. 

DNAzymes are synthetic DNA molecules that can enzymatically split other nucleic acid molecules. The researchers separated a DNAzyme into two inactive halves that both selectively bind to a specific gene segment of the pathogen to be detected. The act of binding reunites the halves and activates them. 

For their test procedure, the scientists produced two sets of GNPs that bind to two different types of DNA strand, type A and type B. In addition, they synthesized a three-part "linker" made of DNA. One end of the linker is the complement to type A DNA; the second end is the complement to type B DNA. The center part is designed to be split by active DNAzymes. 

In a test sample with no pathogen present, the DNAzymes remain inactive and the linkers remain intact. They bind to a GNP at each end and link the GNPs into larger aggregates, causing the solution to turn blue-violet. In contrast, if pathogen DNA is present in the sample, the DNAzymes are activated and proceed to split the linkers. The cleaved linker fragments can each still bind a GNP at one end, but they can no longer bridge two GNPs together, so the solution stays red. Because every activated DNAzyme splits many linkers, it amplifies the signal. 
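The assay logic described above can be sketched as a toy model; the linker counts and per-DNAzyme turnover below are illustrative placeholders, not measured values from the paper.

```python
# Toy model (my own simplification) of the colorimetric readout:
# intact linkers bridge type-A and type-B gold nanoparticles into aggregates
# (blue-violet); active DNAzymes cleave the linkers, so particles stay
# dispersed (red). One activated DNAzyme cleaves many linkers (amplification).

LINKERS_PER_DNAZYME = 1000  # illustrative turnover number, not a measured value

def solution_color(pathogen_dna_present, linkers=10_000, dnazymes=100):
    if pathogen_dna_present:
        # Binding the target gene segment reunites and activates the DNAzyme halves.
        cleaved = min(linkers, dnazymes * LINKERS_PER_DNAZYME)
        linkers -= cleaved
    # Intact linkers aggregate the GNPs; none left leaves them dispersed.
    return "blue-violet (aggregated)" if linkers > 0 else "red (dispersed)"

print(solution_color(pathogen_dna_present=False))  # -> blue-violet (aggregated)
print(solution_color(pathogen_dna_present=True))   # -> red (dispersed)
```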

This new type of test is simple and inexpensive; it can be made to detect every kind of pathogen, as the researchers demonstrated by detecting gonorrhea, syphilis, malaria, and hepatitis B. In a freeze-dried state, the reagents can be stored with no problem – an important requirement for use in the field. 

For more information, please review Zagorovsky, K. and Chan, W.C.W., A Plasmonic DNAzyme Strategy for Point-of-Care Genetic Detection of Infectious Pathogens, Angewandte Chemie International Edition (dx.doi.org/10.1002/anie.201208715). 

Source: Phys.org 

Building a Biochemistry Lab On a Chip

Miniaturized laboratory-on-chip systems promise rapid, sensitive, and multiplexed detection of biological samples for medical diagnostics, drug discovery, and high-throughput screening. Using micro-fabrication techniques and incorporating a unique design of transistor-based heating, researchers at the University of Illinois at Urbana-Champaign are further advancing the use of silicon transistor and electronics into chemistry and biology for point-of-care diagnostics.

[Figure: a cross-section of the device with a droplet. The left side shows an unheated droplet with the DNA FRET construct in the double-stranded form; the right side shows a heated droplet where the FRET construct has denatured, resulting in an increase in fluorescence. Credit: University of Illinois College of Engineering]

Lab-on-a-chip technologies are attractive as they require fewer reagents, have lower detection limits, allow for parallel analyses, and can have a smaller footprint.

"Integration of various laboratory functions onto microchips has been intensely studied for many years," explained Rashid Bashir, an Abel Bliss Professor of electrical and computer engineering and of bioengineering at Illinois. "Further advances of these technologies require the ability to integrate additional elements, such as the miniaturized heating element, and the ability to integrate heating elements in a massively parallel format compatible with silicon technology.

"In this work, we demonstrated that we can heat nanoliter volume droplets, individually and in an array, using VLSI silicon based devices, up to temperatures that make it interesting to do various biochemical reactions within these droplets."

"Our method positions droplets on an array of individual silicon microwave heaters on chip to precisely control the temperature of droplets-in-air, allowing us to perform biochemical reactions, including DNA melting and detection of single base mismatches," said Eric Salm, first author of the paper, "Ultralocalized thermal reactions in subnanoliter droplets-in-air," published in the Proceedings of the National Academy of Sciences (PNAS) on February 12.
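The melting behavior the droplet heaters exploit can be illustrated with a simple two-state model (my own sketch, not the paper's analysis): a single-base mismatch lowers the duplex melting temperature Tm, shifting the curve so matched and mismatched targets can be told apart.

```python
import math

# Two-state DNA melting model: as temperature passes the melting temperature
# Tm, the fraction of denatured strands rises sigmoidally; for a FRET
# construct this appears as an increase in fluorescence. Tm values and the
# transition width below are illustrative, not the paper's measurements.

def fraction_denatured(temp_c, tm_c, width_c=2.0):
    """Logistic melting transition centered at Tm."""
    return 1.0 / (1.0 + math.exp((tm_c - temp_c) / width_c))

tm_match, tm_mismatch = 65.0, 58.0  # a mismatch destabilizes the duplex
for t in (50, 60, 62, 70):
    print(t, round(fraction_denatured(t, tm_match), 2),
             round(fraction_denatured(t, tm_mismatch), 2))
```

At intermediate temperatures the mismatched duplex is mostly melted while the matched duplex is still largely intact, which is the window in which single-base discrimination works.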

According to Salm, approaches to perform localized heating of these individual subnanoliter droplets can allow for new applications that require parallel, time-, and space-multiplexed reactions on a single integrated circuit. Within miniaturized laboratory-on-chips, static and dynamic droplets of fluids in different immiscible media have been used as individual vessels to perform biochemical reactions and confine the products.

"This technology makes it possible to do cell lysing and nucleic acid amplification reactions within these individual droplets -- the droplets are the reaction vessels or cuvettes that can be individually heated," Salm added.

"We also demonstrate that ssDNA probe molecules can be placed on heaters in solution, dried, and then rehydrated by ssDNA target molecules in droplets for hybridization and detection," said Bashir, who is director of the Micro and Nanotechnology Laboratory at Illinois. "This platform enables many applications in droplets including hybridization of low copy number DNA molecules, lysing of single cells, interrogation of ligand-receptor interactions, and rapid temperature cycling for amplification of DNA molecules.

"Notably," Bashir added, "our miniaturized heater could also function as dual heater/sensor elements, as these silicon-on-insulator nanowire or nanoribbon structures have been used to detect DNA, proteins, pH, and pyrophosphates.

"By using microfabrication techniques and incorporating the unique design of transistor-based heating with individual reaction volumes, 'laboratory-on-a-chip' technologies can be scaled down to 'laboratory-on-a-transistor' technologies as sensor/heater hybrids that could be used for point-of-care diagnostics."

In addition to Salm and Bashir, co-authors of the study included Carlos Duarte Guevara, Piyush Dak, Brian Ross Dorvel, and Bobby Reddy, Jr. at the University of Illinois; and Muhammad Ashraf Alam, Birck Nanotechnology Center and the School of Electrical and Computer Engineering at Purdue University.

Source: Science Daily

Wednesday, February 6, 2013

Volume 4 of the Encyclopedia of Rapid Microbiological Methods is Now Available

In January 2013, Volume 4 of the Encyclopedia of Rapid Microbiological Methods was released. The chapters, written mostly by end-users, contain many case studies on the validation and implementation of rapid methods for a wide variety of pharmaceutical microbiology applications.

Volume 4 may be ordered through the PDA Bookstore by Clicking Here, and Volumes 1-3 may be ordered by Clicking Here.

One of the highlights in Volume 4 is an excellent introduction by Dr. Bryan Riley, New Drug Microbiology Staff at FDA’s Center for Drug Evaluation and Research, and the Agency’s expert on rapid method technologies.  Dr. Riley explains that modern approaches to process control (including Process Analytical Technology) require the availability of results in real-time (or at least close to real-time) to enable the operator to use the test results to make process decisions and adjustments. Although real-time results are only currently available for a limited category of microbiological tests, there are many microbiological methods that are significantly more rapid than the traditional test methods.

He continues to state that the rapid methods available today vary a great deal in their mechanisms of operation. Some of these methods still rely on a period of microbial growth using traditional media but reduce their time to result by using an alternate method of microbial detection. Other rapid methods do away with growth entirely and utilize a stain or inherent microbial auto fluorescence to detect microorganisms even down to the level of a single microbial cell. Furthermore, some of the available methods are quantitative, some are qualitative, and they vary in their time to result (from real-time to several days) but all of these methods seem to have found a niche in the pharmaceutical microbiologist’s arsenal. These current rapid microbiological test methods are now able to start providing some of the advantages (from a process control and economic return standpoint) long enjoyed by our colleagues in the clinical and food microbiology labs. Pharmaceutical microbiologists would be well served by considering which of their samples would provide a benefit with a more rapid result and then assessing the current alternate microbiological methods to see if any of them are a good fit for their needs. Dr. Riley concludes that the Encyclopedia of Rapid Microbiological Methods will be an excellent resource to start that assessment.

Volume 4 includes up-to-the-minute advances and details regarding quality control, choosing appropriate methods, future use and technologies, mass spectrometry, genotypic methods for identification, new case studies, application of USP and other guidelines, environmental monitoring, validation, sterility testing, Mycoplasma testing, the application of rapid microbiological methods as they relate to both bio-processing and regulatory considerations, many product-specific method advances and much, much more. Below is a preview of each of the chapters.

Chapter 1 discusses the application of modern microbial methods to the Quality Control testing of probiotics, including master and working cell banks, release and stability testing, viable cell counts, identification and strain typing, absence of bacterial pathogens, antibiotic resistance, adherence to the intestinal wall, and acid and bile resistance.

Chapter 2 provides considerations when aligning a rapid microbiological method with an end-user’s particular needs. Topics include the drivers for rapid methods, time savings, same day results, sample compatibility, automation, using a qualitative method as a screening tool, validation, identification, integration with LIMS and other data management platforms, false positives, false negatives and limit of detection.

In Chapter 3, I look to the future of rapid and automated microbial identification systems. Here, an overview of technologies based on the growth of microorganisms, the detection of cellular components, optical spectroscopy, nucleic acid amplification and MEMS is provided. Examples include the utilization of biochemical and carbohydrate substrates, fatty acid analysis, MALDI-TOF and SELDI-TOF mass spectrometry, FT-IR, elastic and inelastic light scattering, ribotyping, PCR, microfluidics and microarrays.

Chapter 4 provides a more detailed case study when using MALDI-TOF mass spectrometry for the identification of microorganisms. Sample preparation, OQ, PQ, accuracy, precision, robustness and computer validation are some of the topics discussed.

Another case study on microbial identification is provided in Chapter 5, focusing on genotypic methods, amplification of DNA, automation and validation (accuracy, precision, robustness and specificity). Compliance of the new method with GMP principles is also discussed.

In Chapter 6, a supplier of a rapid growth-based microbial identification system provides an overview of enhancements to their existing technology. The workflow and applications are reviewed, in addition to validation considerations.

A case study of a new growth-based rapid microbiological method that detects the presence of specific organisms and provides an estimation of viable cell count is provided in Chapter 7. Sample preparation, the assay workflow, and applicability to a wide range of microorganisms are discussed. Additionally, data from a validation case study, inclusivity and exclusivity testing, and a comparison to USP <61> for aerobic counts, yeast and mold, and Gram-negative bile-tolerant microorganisms are provided. 

Chapter 8 involves an end-user case study that describes an evaluation of a relatively new growth-based rapid method that utilizes a membrane filtration workflow coupled with a viability staining technique. A review of the technology and evaluation results are offered followed by a discussion of the system’s use for monitoring mammalian cell cultures and additional benefits.

In Chapter 9, an optical spectroscopy rapid method supplier describes an environmental monitoring system and how to apply the validation recommendations of USP <1223> to their technology. Accuracy, precision, limit of detection and quantification, linearity, range, ruggedness and robustness are some of the parameters that were examined.

In Chapters 10 and 11, I review a comprehensive evaluation using an optical spectroscopy technology for the real-time and continuous monitoring of airborne microorganisms in cleanroom and isolator environments. Complete with plenty of pictures, the reader will be immersed in a study of real-time monitoring during an aseptic fill, the transfer of sterilized components, interventions and when the integrity of isolator gloves has been breached.

A rapid method for the release testing of both sterile (sterility testing) and non-sterile products (bioburden assessments) is the focus of Chapter 12. Here, an ATP bioluminescence technology is validated, and the authors discuss their qualification workflow and results.

Chapter 13 describes an end-user’s validation approach for a rapid, growth-based detection system as an alternate sterility test for cellular immunotherapy products. Challenges with the conventional method are discussed, followed by their approach to feasibility testing, method validation, and a regulatory path for commercial approval.

Chapter 14 is another case study by an end-user of a rapid, solid-phase cytometry technology. The author describes their validation strategy, including accuracy, precision, limit of detection, robustness, ruggedness and the use of statistical models when comparing the results to the acceptance criteria. Additional considerations are discussed, including the use of stressed cells and matrix effects on the overall validation plan.

Another supplier describes their ATP bioluminescent system as an alternative to the conventional sterility test, in Chapter 15. Validation strategies, the use of challenge microorganisms, system suitability, understanding background values, and how to conduct product specific feasibility testing, are just some of the topics that are discussed.

Chapter 16 focuses on the statistics of validating an alternative sterility test. Subjects include probabilities and multiplicity, limit of detection and a comparison of what is statistically different versus what is statistically equivalent. This is a must read for anyone wanting to validate a rapid sterility test and how to design the studies and use statistics to justify the results.

Chapter 17 provides an overview of a validation approach for a next-generation ATP monitoring technology. The principle of the new method, how to evaluate limit of detection, and validation strategies for a wide range of fluid samples are provided.

A novel qPCR-based system for the detection of specific microorganisms is the focus in Chapter 18. An overview of the technology by the system’s supplier is provided, in addition to a preliminary study on validation parameters, specificity, limit of detection and data analysis.

Chapter 19 addresses the use of rapid methods for the detection of Mycoplasma. These end-users provide an overview of Mycoplasma and the importance of detection in the biopharmaceutical industry, traditional and alternative methods, and a case study using a nucleic acid amplification platform. A review of regulatory authority requirements for nucleic acid amplification systems is also discussed.

A new nucleic acid amplification and microarray-based rapid method for the detection of Mycoplasma is highlighted in Chapter 20. The technology workflow and advantages over traditional methods are discussed, and regulatory guidelines for Mycoplasma detection are examined.

Chapter 21 provides an overview of rapid viral detection methods. Classic versus molecular biology approaches are discussed, in addition to the prospects for viral safety testing. Experiences with Vesivirus, MVM and other public viral incidents are explored, as well as other topics associated with the future directions in using molecular methods for detection.

Speaking of future directions, Chapter 22 investigates the new microbiology technology wave: alternative and rapid methods for the QC laboratory. Regulations, skill sets for pharmaceutical microbiologists, and microbiological curricula are but a few of the topics covered.

Finally, Chapter 23 goes into great depth in discussing the application of rapid microbiological methods for bioprocessing. A review of biopharmaceutical manufacturing, regulations, testing requirements and contamination events sets the stage for considering a wide range of rapid method applications.

Monday, February 4, 2013

Outbreak Detection Since Jack in the Box: A Public Health Evolution

In 1993, 623 people in the western U.S. fell ill with a little-known bacterium called E. coli O157:H7. Ultimately, four children would die from their infections; many others suffered long-term medical complications. The bug was later traced to undercooked hamburger served at Jack in the Box restaurants. This outbreak thrust foodborne illness onto the national stage as a real and present threat, sparking a sea change in the way Americans and the government treat this issue. To commemorate the 20th anniversary of the 1993 Jack in the Box outbreak, Food Safety News has produced a series of retrospective stories chronicling the outbreak itself and how food safety in America has changed since that time.

In some ways, it was good that the 1993 Jack in the Box E. coli O157:H7 outbreak happened in Washington state. Tragic as it was, it could have been much, much worse.

When children started falling seriously ill, it was a mystery, but state health officials were uniquely positioned to crack the case.

To start, the state was pretty good at foodborne illness surveillance. In the 1990s, Washington reported more foodborne illness outbreaks than any other state except for New York – not because it had more outbreaks, but because health officials there were good at detecting them.

State epidemiologists had also been collaborating with the University of Washington and the local children’s hospital on an E. coli O157:H7 study. Long before the pathogen made national headlines, they knew it was turning up in the state and they knew it was a risk in undercooked hamburgers.

Armed with this knowledge, Washington became the first state to make E. coli O157:H7 a reportable disease, which meant doctors had to report their cases to the state health department. (By 1993, only a fifth of states had followed suit).

“We had a strong history of foodborne disease surveillance. That really helped us out,” says John Kobayashi, the head epidemiologist for the Washington State Department of Health at the time. “When you look at 50 to 60 clusters and outbreaks per year, you get really good at it.”

The experience came in handy, because Kobayashi’s team had to rely solely on old-fashioned epidemiology to figure out what was making people so sick.

This was before PulseNet, the national network of public health laboratories that use pulsed-field gel electrophoresis (PFGE) to connect illnesses that have the same pattern, or DNA “fingerprint.”

PFGE technology existed in 1993, but it was too time consuming and expensive to be useful during outbreaks. Having a network like PulseNet would also have been next to impossible without the Internet or access to email.

Even with relatively good surveillance in Washington, it took 39 days to determine that there was a serious outbreak afoot. And this discovery, too, owed something to luck. Phil Tarr, a gastroenterologist at Children’s Hospital who happened to be the nation’s leading pediatric expert on E. coli O157:H7, noticed an uptick in children coming in with bloody diarrhea and immediately alerted Kobayashi at the state health department.

In mid-January 1993, the state’s team of epidemiologists (bolstered by Centers for Disease Control and Prevention fellows serving at the time) sprang into action, alerting emergency rooms to look for symptoms of infection and interviewing patients about where they had eaten recently. They pinpointed undercooked hamburgers served at Jack in the Box as the likely source of the outbreak in less than a week.

Their rapid sleuthing kept 250,000 potentially contaminated hamburgers from being consumed, which is estimated to have prevented another 800 cases and an unknown number of deaths.

The power of PFGE

After the national attention on the outbreak subsided, it took several months to determine the subtype, or fingerprint, of the bacteria that had wreaked havoc in the Pacific Northwest.

To get the PFGE patterns from the clinical and beef isolates, epidemiologists in Washington had to send them to the federal Centers for Disease Control and Prevention (CDC) in Atlanta. When they got the results back over the summer, all the samples matched.

“It became clear that if we’d had the method, we would have been able to detect these more quickly,” says Kobayashi. “We may not have even had a Jack in the Box outbreak, because there were a small number of cases in December that matched the outbreak strain.”

Dr. Bala Swaminathan at CDC, who had recently been tapped to lead the agency’s new food safety and diarrheal disease branch, had the same revelation.

“Why can’t we use this powerful technology [PFGE] to detect outbreaks when they’re happening?” asked Swaminathan. “Not too many people were on board with my idea that this could be done. People didn’t think this could be done in a decentralized fashion.”

The Internet, improvements in technology and software and federal funding eventually came together to make a network like PulseNet possible.

President Clinton’s Food Safety Initiative, launched in the fall of 1997, would provide much of the funding needed to build the capacity for the program, which initially focused on E. coli O157:H7.

Under the initiative, CDC launched the Epidemiology and Laboratory Capacity for Infectious Diseases (ELC) program, which doles out funding to improve foodborne illness outbreak detection at the state and local levels.

“ELC was a great source of funding for PulseNet,” says Swaminathan, who helped build the program from the ground up. “Money started to go out to the states for the equipment they needed.”

In those days, the equipment a state health lab needed to conduct PFGE and participate in PulseNet ran between $50,000 and $60,000, he recalls.

“The Jack in the Box outbreak really played a major role in building PulseNet,” says Swaminathan. “No one was questioning the need for something like this. We didn’t have to sell the program. It sold itself.”

Building a national system

In 1996, when PulseNet first became operational, participating labs uploaded about 350 patterns. By 1999, the program was uploading more than 10,000 and it had been awarded the prestigious Innovations in American Government Award.

By 2001, the network covered all 50 states. Within another year, public health labs were uploading more than 25,000 patterns and the program was recognized as one of the top 15 programs to have ever been awarded the innovations award.

PulseNet was effective from the get-go, in part because a very strict, standardized protocol for submitting PFGE patterns was in place from early on. Previously, variation in lab processes had made it difficult to compare PFGE patterns between different labs or across states. With standardization, comparing and matching patterns became possible.

PulseNet is also more than a network; it’s a community. There’s a lot of online communication and a sense of purpose among the public health experts who contribute to the system.

“It’s a pretty tight knit group,” says Dave Boxrud, the Molecular Epidemiology Supervisor at Minnesota Department of Health, who has been at the health department since 1992.  “It’s been an extremely effective system because of the level of collaboration.”

For the past several years, between 50,000 and 75,000 patterns have been uploaded to PulseNet each year. Using this detection system, health officials are monitoring between 5 and 15 new clusters each week. Today, labs across the country can perform PFGE in 24 hours.

At any given point, according to Peter Gerner-Smidt, who now heads PulseNet at CDC, epidemiologists are dealing with somewhere in the neighborhood of 28 outbreaks or disease clusters of various sizes.

“The beauty of PulseNet is that a large number of people can access a large amount of information in a way that’s useful,” says Hugh Maguire, Program Manager at the Colorado Department of Public Health and Environment, whose team worked on the deadly Listeria cantaloupe outbreak in 2011.

But building the infrastructure that makes PulseNet possible took many years, and now many in public health worry that budget constraints could weaken the system going forward.


The biggest challenge facing PulseNet, says Maguire, is “first and foremost: funding.”

“I know it’s become almost eye roll-inducing to talk about funding, but it’s the reality,” he adds. “If you want to have well trained people doing this testing, you have to have the funding.”

On top of the general pressure to cut government spending while the U.S. economy struggles to recover, it’s hard for states to plan when the federal budget continues to be made up of a patchwork of continuing resolutions instead of yearlong budgets. In Colorado, for example, the state lab relies heavily on CDC funding, with less than 10 percent of its budget coming from general state funds.

The ELC program, which still supports PulseNet, peaked at about $78 million in grant funds in 2002. In 2010, the grant money was down to $52 million and the future funding levels are uncertain, as is CDC’s overall budget. The uncertainty comes as state and local governments are also making public health cuts.

“One thing people need to realize is that because [PulseNet] is a decentralized system, it needs the resources to type these isolates in a timely fashion,” says Swaminathan. “PulseNet is only as strong as its weakest link.”

Source: Food Safety News

Friday, February 1, 2013

Scientists Use PCR and Gene Sequencing to Find Lifeforms Six Miles Above Earth’s Surface

Microorganisms have been found in virtually every corner of the Earth, from deep sea volcanoes to the tops of frozen mountains. They've also been discovered high up in the atmosphere — but scientists haven't been entirely sure about the nature and extent of these elusive high-altitude organisms. Now, new research suggests that there's a surprising amount of bacteria and fungi as high as 30,000 feet. And remarkably, these microbes could be affecting the climate, as well as contributing to the spread of diseases down on Earth.

To reach this conclusion, a team of scientists led by Athanasios Nenes flew several missions aboard a NASA DC-8 aircraft to the middle and upper troposphere above the Caribbean Sea and portions of the Atlantic Ocean. There, at about 6 miles (10 km) above the surface, they took samples of rarefied air using onboard instruments and an array of filters.

The flights, part of the Genesis and Rapid Intensification Processes (GRIP) campaign, were timed so that the researchers could take samples both before and after a pair of hurricanes, Earl and Karl in 2010, swept through the regions below.

Once back at the lab, the scientists used genetic techniques (i.e. polymerase chain reaction (PCR) and gene sequencing) to determine which kinds of microorganisms were present at such extreme altitudes, and in what quantities. According to the researchers, it was the first study of its kind — a paper that has now been published in the Proceedings of the National Academy of Sciences.

What they discovered came as a complete surprise — both in terms of the diversity of microbial life, and also the quantity. The troposphere, it would appear, is home to its own high-altitude microbiome.

Specifically, air masses that originated over the ocean contained mostly marine bacteria, while air masses that originated over land carried mostly terrestrial bacteria. Clearly, the hurricanes were spewing these organisms high up into the atmosphere, creating a dynamic mixture. In terms of numbers, 20% of the particles detected were bacterial cells ranging from 0.25 to 1 micron in diameter; and in terms of distribution, there were about 144 bacterial cells in every cubic foot of air.
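For readers more comfortable with metric units, the reported airborne density converts straightforwardly. A minimal sketch of the arithmetic, using the 144 cells per cubic foot figure from the article and the standard conversion factor (1 m³ ≈ 35.31 ft³):

```python
# Convert the reported airborne bacterial density from cells per cubic
# foot to cells per cubic meter.
CELLS_PER_CUBIC_FOOT = 144      # figure reported in the article
FT3_PER_M3 = 35.3147            # standard conversion: cubic feet per cubic meter

cells_per_m3 = CELLS_PER_CUBIC_FOOT * FT3_PER_M3
print(f"~{cells_per_m3:.0f} bacterial cells per cubic meter of air")  # ~5085
```

In other words, roughly 5,000 bacterial cells per cubic meter of tropospheric air.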

They also found fungi, though in far smaller numbers than the bacteria.

Of the bacteria discovered, there were 17 different types, including some capable of metabolizing atmospheric carbon compounds such as oxalic acid. This may explain how these hardy bacteria are able to survive at such heights.

Beyond that, the researchers also speculate that these particular microorganisms could be affecting the weather. They theorize that these organisms could be contributing to cloud formation via ice accumulation — a kind of bacterial cloud seeding. "[T]he microbiome," write the researchers, "is a dynamic and underappreciated aspect of the upper troposphere with potentially important impacts on the hydrological cycle, clouds, and climate."

As another interesting note, the researchers also discovered Escherichia and Streptococcus bacteria — microbes that are associated with human and animal feces — and by consequence, diseases. The scientists now wonder — because what goes up must come down — if the tropospheric layer may be contributing to the spread of illnesses around the globe.

Finally, because these bacteria are able to survive such harsh conditions, this research is also of significance to the search for extraterrestrial life and the field of astrobiology. It's conceivable, given these observations, that microbial life may be discovered in the atmospheres of exoplanets.

Read the entire study here.

Source: io9.com.

PCR Used to Show that Patients Emit Small, Influenza-Containing Particles Into the Air During Routine Care

A new study suggests that patients with influenza can emit small virus-containing particles into the surrounding air during routine patient care, potentially exposing health care providers to influenza. Published in The Journal of Infectious Diseases, the findings raise the possibility that current influenza infection control recommendations may not always be adequate to protect providers from influenza during routine patient care in hospitals.

Werner E. Bischoff, MD, PhD, and colleagues from the Wake Forest School of Medicine in North Carolina screened 94 patients for flu-like symptoms during the 2010-2011 influenza season. Study participants had been admitted to the emergency department (52 patients) or an inpatient care unit (42 patients) of Wake Forest Baptist Medical Center, where vaccination for influenza is mandatory for health care providers.

Nasopharyngeal swabs were collected from each patient. Samples were analyzed by rapid testing and by PCR analysis. Air samples were obtained by placing three six-stage air samplers at distances of 1 foot, 3 feet, and 6 feet from patients. No aerosol-generating procedures -- such as bronchoscopy, sputum induction, intubation, or cardiopulmonary resuscitation -- were conducted while air sampling took place. During air sampling, the number of patients' coughs and sneezes was counted and assessed for severity. Patients also completed a questionnaire at admission to report symptoms and the number of days they had been sick.

Of the 94 patients enrolled, 61 (65 percent) tested positive for influenza virus. Of those, 26 (43 percent) released influenza virus into the surrounding air, and five of these emitters (19 percent) released up to 32 times more virus than the others. The existence of this group, described by the researchers as "super-emitters," suggests that some patients may be more likely to transmit influenza than others. A high concentration of influenza virus released into the air was associated with high viral loads in nasopharyngeal samples. Patients who emitted more virus also reported greater severity of illness.
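Note that each percentage above is calculated against the previous subgroup rather than against all 94 enrolled patients. A quick sanity check of the arithmetic, using only the counts reported in the study:

```python
# Verify the study's reported percentages, each taken against the
# preceding subgroup (enrolled -> flu-positive -> emitters).
enrolled = 94
flu_positive = 61     # tested positive for influenza virus
emitters = 26         # released virus into the air
super_emitters = 5    # released up to 32x more virus than others

print(round(100 * flu_positive / enrolled))    # 65 (percent of enrolled)
print(round(100 * emitters / flu_positive))    # 43 (percent of flu-positive)
print(round(100 * super_emitters / emitters))  # 19 (percent of emitters)
```

So only about 5 of 94 enrolled patients (roughly 5 percent) fell into the "super-emitter" group overall.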

The current belief is that influenza virus is spread primarily by large particles traveling up to a maximum of 3 to 6 feet from an infected person. Recommended precautions for health providers focus on preventing transmission by large droplets and following special instructions during aerosol-generating procedures. In this study, Dr. Bischoff and his team discovered that the majority of influenza virus in the air samples analyzed was found in small particles during non-aerosol-generating activities up to a 6-foot distance from the patient's head, and that concentrations of virus decreased with distance. The study addressed only the presence of influenza-containing particles near patients during routine care, not the actual transmission of influenza infection to others.

Fitted respirators are currently required for health care providers during aerosol-generating procedures with patients. During routine, non-aerosol-generating patient care, the current precautions recommend that providers wear a non-fitted face mask. Based on their findings, Dr. Bischoff and his fellow investigators are concerned that providers may still be exposed to infectious doses of influenza virus up to 6 feet from patients, with small, wide-spreading particles potentially traveling beyond the currently suggested exposure zones.

These findings suggest that current infection control recommendations may need to be reevaluated, the study authors concluded. The detection of "super-emitters" raises concerns about how individuals with high viral load may impact the spread of influenza, they noted. "Our study offers new evidence of the natural emission of influenza and may provide a better understanding of how to best protect health care providers during routine care activities," the study authors wrote. However, studies of influenza virus transmission will be necessary before the role of super-emitters can be firmly established, they noted.

In an accompanying editorial, Caroline Breese Hall, MD, from the University of Rochester School of Medicine and Dentistry in New York, highlighted the importance of a better understanding of influenza transmission as global travel has increased the likelihood of a rapid worldwide influenza outbreak. Although the study did not show that influenza transmission actually occurred, Dr. Hall noted, the findings "question the traditional belief that influenza is primarily spread by close contact with an infected person or by direct contact with infectious secretions."

While the study adds to the current understanding of the risks of influenza infection among patients and health providers, the findings also help define questions that still need to be answered, Dr. Hall noted. (Editor's Note: Dr. Hall died on Dec. 10, 2012, at the age of 73, shortly after completion of the editorial accompanying this study.)

Whatever protective equipment or infection control practices are used for preventing influenza transmission, vaccination of health providers remains a fundamental and key part of protecting them from influenza, noted Dr. William Schaffner, professor of medicine and chair of the department of preventive medicine at Vanderbilt University School of Medicine in Nashville, Tenn., who was not involved with the study. "Influenza vaccination, although not perfect, is the best tool we have to protect health care workers -- and their patients -- from influenza illness."

Source: Science Daily.