
Is there a minimum load below which an infectious agent will not cause disease?


Suppose a single smallpox virus is injected into a human adult's body. Will it cause disease in the host? Is there a minimum microbial load below which it will not cause disease?


It depends on the pathogen, delivery method, environment, etc. In most cases, a single pathogen is not enough to cause disease -- it may take many millions or billions -- but there are some cases where it does seem that a single pathogen can consistently cause infection. For example:

The minimum infectious dose of ASFV in liquid was 10^0 [i.e. 1] 50% tissue culture infectious dose (TCID50), compared with 10^4 TCID50 in feed. The median infectious dose was 10^1.0 TCID50 for liquid and 10^6.8 TCID50 for feed.

--Infectious Dose of African Swine Fever Virus When Consumed Naturally in Liquid or Feed

If you're asking specifically about smallpox, I don't think anyone has done the experiment (the natural and most susceptible host being humans, I think you'd have some trouble finding volunteers to find the lethal dose). A single virion of the related ectromelia (mousepox) virus can cause death in particularly susceptible mouse strains:

The intranasal lethal mousepox model employing the A/Ncr mouse strain is used to evaluate anti-orthopoxvirus therapies. These infections mimic large droplet transmission and result in 100% mortality within 7-10 days with as little as 1 PFU [plaque-forming unit] of ectromelia virus.

--Mousepox in the C57BL/6 strain provides an improved model for evaluating anti-poxvirus therapies

As a side note, the notion of what a "single virus" actually is turns out to be fuzzier than you might think, but that's a more complicated question; to a rough approximation, a single TCID50 or PFU can be considered a single virus.
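One way to reconcile "a single virion can infect" with widely varying infectious-dose figures is the independent-action (single-hit) model, in which each inoculated particle has a small, independent chance of establishing infection, so the "minimum infectious dose" is really a probability curve rather than a hard threshold. The sketch below is purely illustrative; the model choice, the function name and the reuse of the ASFV liquid ID50 quoted above are my assumptions, not something the cited studies report.

```python
import math

def p_infection(dose: float, id50: float) -> float:
    """Independent-action (single-hit) model: P(infection) = 1 - exp(-k * dose),
    with k chosen so that a dose equal to id50 infects half of exposed hosts."""
    k = math.log(2) / id50          # per-unit-dose "hit" rate implied by the ID50
    return 1.0 - math.exp(-k * dose)

# Using the ASFV-in-liquid median infectious dose of 10^1.0 TCID50 quoted above:
id50_liquid = 10 ** 1.0
for dose in (1, 10, 100):
    print(f"dose {dose:>3} TCID50 -> P(infection) ~ {p_infection(dose, id50_liquid):.2f}")
```

Under this toy model a single TCID50 still infects roughly 7% of the time, which is why a low ID50 does not imply a sharp cutoff below which infection never occurs.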


According to these sources (A, B, C, D), infective dose varies widely among diseases caused by different pathogens. It is reported that as few as 10 cells of enterohemorrhagic Escherichia coli strains can cause an infection.

For an infection to occur, the pathogen must overcome and pass the physical, physiological, immunological and environmental barriers. A higher number of cells or viruses increases the chance of overcoming these barriers. But you shouldn't forget that a pathogen cell or a virus particle doesn't need another one in order to infect the host, reproduce and, as a result, cause disease. A single pathogen has all the necessary properties to do so.

So, with regard to the possibility of infection, we can't say that a single pathogen cannot infect a host and cause disease. But it is apparent that, as a result of the biological properties of the pathogen and the characteristics of the infection mechanism, each pathogen has an observed infective dose that differs from that of other pathogens.

In addition to this consideration, don't confuse the number of pathogens in the environment (outside the host's body) with the number of pathogens injected into the host's body. Via injection, the pathogen has already been transported past some of the barriers.


Does a high viral load or infectious dose make covid-19 worse?

Does being exposed to more coronavirus particles mean you will develop a more severe illness? Rumours circulating on social media suggest that hospital workers or their household members exposed to a higher “viral load” become sicker than the general population. But emerging research indicates the relationship between infection and covid-19 severity may be more complex – and differ from that of other respiratory illnesses.

The average number of viral particles needed to establish an infection is known as the infectious dose. We don’t know what this is for covid-19 yet, but given how rapidly the disease is spreading, it is likely to be relatively low – in the region of a few hundred or thousand particles, says Willem van Schaik at the University of Birmingham, UK.

Viral load, on the other hand, relates to the number of viral particles being carried by an infected individual and shed into their environment. “The viral load is a measure of how bright the fire is burning in an individual, whereas the infectious dose is the spark that gets that fire going,” says Edward Parker at the London School of Hygiene and Tropical Medicine.



If you have a high viral load, you are more likely to infect other people, because you may be shedding more virus particles. However, in the case of covid-19, it doesn’t necessarily follow that a higher viral load will lead to more severe symptoms.

For instance, health workers investigating the covid-19 outbreak in the Lombardy region of Italy looked at more than 5,000 infected people and found no difference in viral load between those with symptoms and those without. They reached this conclusion after tracing people who had been in contact with someone known to be infected with the coronavirus and testing them to see if they were also infected.

Similarly, when doctors at the Guangzhou Eighth People’s Hospital in China took repeated throat swabs from 94 covid-19 patients, starting on the day they became ill and finishing when they cleared the virus, they found no obvious difference in viral load between milder cases and those who developed more severe symptoms.

Although it is difficult to draw firm conclusions at this stage, such studies “may impact our assumptions about whether a high number of viral particles predisposes to a more serious disease”, says van Schaik.


However, a study of patients hospitalised with covid-19 in Nanchang, China, found a strong association between disease severity and the amount of virus present in the nose. “Those with more severe disease had a higher level of virus replication, although we have no evidence to relate the initial exposure dose to disease outcome,” says Leo Poon at Hong Kong University, who was involved in the study. “That rumour is still an open question to me.”

It is early days, but if the initial amount of virus a person is infected by doesn’t correlate with the severity of disease symptoms, this would mark covid-19 out as different from influenza, MERS and SARS.

For influenza, a higher amount of virus at infection has been associated with worse symptoms. It has been tested by exposing volunteers to escalating doses of influenza virus in a controlled setting and carefully monitoring them over several weeks. This hasn’t been done with covid-19, and is unlikely to happen, given its severity.


Animals infected with higher doses of the SARS and MERS coronaviruses also experienced worse outcomes, says van Schaik. “I think we just have to conclude that while this virus is related to SARS, there are also important differences that are currently poorly understood,” he says.

Even if the initial level of virus at infection isn’t related to disease severity, it still pays to try and minimise our exposure to the virus because this will reduce our chances of falling ill in the first place. “We want to be taking every precaution we can to prevent ourselves getting infected, which will also reduce our ability to pass the virus on to others,” says Parker. “Any measures we can take to avoid infection are worth taking.”



Biological Waste Guide

Biological and regulated medical solid waste shall be disposed of through the Biological Solid Waste Stream established by the Department of Environmental Health and Safety. You play an important role in UConn's biological waste program if you generate biological waste in a research, teaching, clinical laboratory or clinical area. This guide will help you dispose of your biological waste in an easy and legal manner. Our program is designed to protect the people who handle, transport and dispose of your waste, to protect the environment and to minimize UConn's regulatory liability.

Some waste generators may attempt to work around this program. These attempts are counterproductive because they place other people and the University at risk. The cost associated with one injury or violation can easily exceed annual operational costs. If you have complaints, concerns or suggestions for program improvement, we would rather have you tell us than have you implement unauthorized procedures. Environmental Health and Safety will continually work to improve this program and to control its costs.

Definitions

At The University of Connecticut, biological waste is defined as infectious waste, pathological waste, chemotherapy waste and the receptacles and supplies generated during its handling and/or storage. This definition is in accordance with the definition of biological waste as defined by the Connecticut Department of Energy and Environmental Protection (DEEP). It is further defined as waste that, because of its quantity, character or composition, has been determined to require special handling.

Infectious waste is defined by seven categories of waste:

  1. Cultures and stocks: Agents infectious to humans and associated biologicals, waste from biological production, live and attenuated vaccines and anything used to contain, mix or transfer agents. This includes but is not limited to petri dishes, pipettes, pipette tips, microtiter plates, disposable loops, eppendorfs and toothpicks.
  2. Human blood, blood products and infectious body fluids: This category includes blood that is not contained by a disposable item or is visibly dripping, serum, plasma, and other blood products or non-glass containers filled with such discarded fluids. It further includes any substance which contains visible blood, semen, vaginal secretions, cerebrospinal fluid, synovial fluid, peritoneal fluid and pericardial fluid. Glass containers filled with such discarded fluids shall be considered sharps. Intravenous bags which did not contain blood or blood products shall not be considered a blood product. Dialysates are not considered blood or body fluids.
  3. Sharps: needles, scalpel blades, hypodermic needles, syringes (with or without attached needles) and needles with attached tubing regardless of contact with infectious agents are considered by EPA and DEP to be regulated medical waste. Other sharps: pasteur pipettes, disposable pipettes, razor blades, blood vials, test tubes, pipette tips, broken plastic culture dishes, glass culture dishes and other types of broken and unbroken glass waste (including microscope slides and cover slips) that may have been in contact with infectious material. Items that can puncture or tear autoclave bags.
  4. Research animal waste: contaminated carcasses, body parts and bedding of animals that were intentionally exposed to infectious agents during research or testing. Animal carcasses and body parts not intentionally exposed to infectious agents during research or testing are disposed of by Inserve and are not picked up by the Biosafety section.
  5. Isolation waste: biological waste and discarded material contaminated with body fluids from humans or animals which are isolated because they are known to be infected with a highly communicable disease (biosafety level 4 agent).
  6. Any material collected during or resulting from the cleanup of a spill of infectious or chemotherapy waste.
  7. Any waste mixed with infectious waste that cannot be considered as chemical hazardous waste or radioactive waste.

Potentially Infectious Material is defined by the OSHA Bloodborne Pathogens Standard as:

  1. Human body fluids: semen, vaginal secretions, cerebrospinal fluid, synovial fluid, pleural fluid, pericardial fluid, peritoneal fluid, amniotic fluid, saliva in dental procedures, any body fluid that is visibly contaminated with blood, and all body fluids in situations where it is difficult or impossible to differentiate between body fluids,
  2. Any unfixed tissue or organ (other than intact skin) from a human (living or dead) including cell or tissue cultures and
  3. HIV-containing cell or tissue cultures, organ cultures and HIV- or HBV-containing culture medium or other solutions and blood, organs, or other tissues from experimental animals infected with HIV or HBV.

"Look - a - Like" infectious waste is defined as: laboratory materials that can be used to contain, transfer or mix infectious agents but has been used with non-infectious agents. For example: disposable micropipette tips may have transferred sterile water or broth, but an identical tip in the same laboratory may have transferred an infectious agent. In the trash you could not distinguish between them. These "look- a -like" materials will be handled as infectious waste if the facility routinely generates infectious or potentially infectious biological waste or is engaged in a temporary project that generates infectious or potentially infectious biological waste.

Disposal Procedures

RADIOACTIVE WASTE IS DISPOSED OF THROUGH THE RADIATION SAFETY SECTION (860) 486-3613. HAZARDOUS CHEMICAL WASTE IS DISPOSED OF THROUGH THE CHEMICAL SAFETY SECTION (860) 486-3613.

  1. Sharps waste: All sharps as described by category 3 must be discarded in an approved sharps container. These containers are provided by Environmental Health & Safety. Some sharps containers may melt if autoclaved, in which case decontamination of the contents may be accomplished by chemical means. If chemical means are used, the liquid must be drained from the containers before they are sealed and placed in the box-bag units. Alternately, untreated sealed sharps containers may be placed in the box-bag units with other untreated biological waste. A University address label provided by the Biological Safety section must be affixed to each sharps container, treated or untreated, that is placed in the box-bag unit. For chemical decontamination, the disinfectant shall be an EPA-registered tuberculocidal agent. An example is standard household bleach (roughly 5.25% sodium hypochlorite, about 52,500 ppm) diluted 1:10 to a final concentration of approximately 5,250 ppm. Fill the leak-proof receptacle with the appropriate dilution of disinfectant and let stand overnight. Empty the liquid, seal and label the receptacle, and put it in the box-bag unit.
  2. Non-sharps: There are three acceptable methods for disposal:
     1. Certain biological waste can be disposed of as non-biohazardous/non-infectious waste, if approved in writing by Biological Safety. The waste must have been decontaminated by autoclave, chemical disinfection or other appropriate decontamination method. If the treatment of choice is a validated decontamination procedure, the waste will be labeled as "non-biohazardous/non-infectious" and can go as regular trash. See below for validation procedures.
     2. If a non-validated decontamination autoclave is available, autoclave the waste in an autoclave bag, affix autoclave indicator tape and place it in an autoclave-safe tray. CT DEEP regulation requires that autoclaves be monitored for effective kill; see paragraphs d, e and f (validation procedure). After autoclaving, once the bag has cooled, drain off any remaining liquid and place the sealed waste in the box-bag unit for pickup. Do not pour liquefied agar media down the drain. See below for box-bag unit instructions.
     3. If an autoclave is not available, the waste may be collected in orange/red autoclave bags, closed with tape and placed in the box-bag unit as untreated biological waste. Environmental Health & Safety will pick up all box-bag units on, at least, a weekly basis.

Do not autoclave containers or other receptacles containing bleach. The combination of bleach and residual cotton and oil (improperly cleaned autoclaves) may result in an explosive combustion within the autoclave.

  3. Liquid waste: The sanitary sewer was designed for the disposal of certain liquid wastes. Use of the sanitary sewer reduces the chance for leaks or spills during transport and reduces disposal costs. Biological liquid waste can be poured down the drain (sanitary sewer) under running water after it has been decontaminated by autoclave or chemical means. Human or animal blood and body fluids do not need to be disinfected before being poured down the drain. The sink should be rinsed well, and disinfected if necessary, after the disposal procedure.
  4. Mixed waste: Use the formulas below to determine the appropriate waste stream (an illustrative routing sketch follows the formulas).

Biological + Radiation = Radiation Waste

Biological + Hazardous Chemical = Chemical Waste
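The two formulas above amount to a simple precedence rule: a radioactive or hazardous-chemical component overrides the biological classification. The sketch below is only an illustration of that rule; the function and stream names are mine, not part of the UConn guide.

```python
def waste_stream(biological: bool, radioactive: bool, hazardous_chemical: bool) -> str:
    """Route a waste item according to the precedence rule stated above."""
    if biological and radioactive:
        return "Radiation Waste"      # Biological + Radiation = Radiation Waste
    if biological and hazardous_chemical:
        return "Chemical Waste"       # Biological + Hazardous Chemical = Chemical Waste
    if biological:
        return "Biological Waste"     # biological only: use the biological waste stream
    return "Consult EH&S"             # anything else: check with Environmental Health & Safety

print(waste_stream(biological=True, radioactive=True, hazardous_chemical=False))   # Radiation Waste
print(waste_stream(biological=True, radioactive=False, hazardous_chemical=True))   # Chemical Waste
```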

Transport and Storage of Biological Waste

The transport of biological waste outside of the laboratory, for decontamination purposes or storage until pick-up, must be in a closed leakproof container that is labeled "biohazard". Labeling may be accomplished by the use of red or orange autoclave bags or biohazard box-bag units. Biological Safety must authorize the transport or transfer of regulated medical waste or biohazardous biological waste through public streets or roadways in order to comply with DOT regulations.

Biological waste must not be allowed to accumulate. Material should be decontaminated and disposed of daily or on a regular basis, as needed. If the storage of contaminated material is necessary, it must be done in a rigid container away from general traffic and preferably in a secured area. Treated biological waste, excluding used sharps, may be stored at room temperature until the storage container or box-bag unit is full, but no longer than 48 hours from the date the storage container is first put into service. It may be refrigerated for up to 1 week from the date of generation. Biological waste must be dated when refrigerated for storage. If biological waste becomes putrescent during storage, it must be moved offsite within 24 hours for processing and disposal. Sharps containers may be used until 2/3-3/4 full, at which time they should be decontaminated, preferably by autoclaving, and disposed of as regulated medical waste.

Biological waste generated at regional campuses is picked up directly by University-contracted biological waste vendors. Coordinate all biological waste pick-ups at regional campuses by calling Environmental Health and Safety at 860-486-3613.

Labeling of Biomedical Waste

Materials that are put into the supplied box-bag units must be labeled with a University of Connecticut address label. Each individual bag or sharps container must have a separate label. The box-bag unit must be labeled with the generator’s building and room number. It must indicate whether the waste in the box is treated or untreated.

When a biological waste pick-up is desired, submit a biowaste pickup/supply delivery form on our web site. A waste inspection and removal approval may be required for some waste. The inspector will seal approved waste and affix an approval label. Normally, the waste will be picked up within 48 hours after the request has been submitted. Non-biohazardous/non-infectious waste (validated decontamination method) will be tagged with labels provided by Biological Safety. Autoclave indicator tape should be used as evidence of decontamination.

Box-Bag Unit Assembly Instructions

Instructions can be found in the presentation Biowaste Management

Validation Procedures for Steam Sterilization Units

Steam treatment units shall subject loads of biological waste to sufficient temperature, pressure, and time to demonstrate a minimum Log 4 kill of Bacillus stearothermophilus spores placed at the center of the waste load, and shall be operated in accordance with the following:

  1. Before placing a steam treatment unit into service, operating parameters such as temperature, pressure, and treatment time shall be determined according to the following:
    • Test loads of biological waste, which consist of the maximum weight and density of biological waste to be treated, shall be prepared. Separate loads of autoclave bags, sharps containers, boxes, and compacted waste shall be prepared if they are to be treated separately.
    • Prior to treatment, Bacillus stearothermophilus spores are placed at the bottom and top of each treatment container, at the front of each treatment container at a depth of approximately one-half of the distance between the top and bottom of the load, in the approximate center of each treatment container, and in the rear of each treatment container at a depth of approximately one-half of the distance between the top and bottom of the load.
    • If the operating parameters used during the treatment of the test loads demonstrate a minimum Log 4 kill of Bacillus stearothermophilus spores at all locations, the steam treatment unit shall operate under those parameters when placed into service. If the operating parameters fail to provide a minimum Log 4 kill of Bacillus stearothermophilus spores at all locations, treatment time, temperature, or pressure shall be increased and the tests must be repeated until a minimum Log 4 kill of Bacillus stearothermophilus spores is demonstrated at all locations. The steam treatment unit shall be operated under those parameters when placed into service. Tests shall be repeated and new parameters established if the type of biological waste to be treated is changed. (A brief illustration of what a Log 4 kill means is sketched after this list.)
  2. When operating parameters have been established and documented using the criteria outlined above, the steam treatment unit may be placed into service.
  3. The steam treatment unit shall be serviced for preventive maintenance in accordance with the manufacturer's specifications. Records of maintenance shall be onsite and available for review.
  4. Unless a steam treatment unit is equipped to continuously monitor and record temperature and pressure during the entire length of each treatment cycle, each package of biological waste to be treated will have a temperature tape or equivalent test material such as a chemical indicator placed on a non-heat conducting probe at the center of each treatment container in the load that will indicate if the treatment temperature and pressure have been reached. Waste shall not be considered treated if the tape or equivalent indicator fails to show that a temperature of at least 250 degrees F (121 degrees C) was reached during the process.
  5. Each steam treatment unit shall be evaluated for effectiveness with spores of Bacillus stearothermophilus at least once each 40 hours of operation for generators who treat their own biological waste. The spores shall be placed at the center of the waste load. Evaluation results shall be maintained onsite and available for review.
  6. A written log shall be maintained for each steam treatment unit. The following shall be recorded for each usage:
    1. The date, time, and operator name
    2. The type and approximate amount of waste treated
    3. The post-treatment confirmation results, by either a. recording the temperature, pressure, and length of time the waste was treated, or b. the temperature and pressure monitoring indicator
    4. Dates and results of calibration and maintenance, and
    5. The results of sterilization effectiveness testing with B. stearothermophilus or equivalent.
    1. Set the parameters, determined from testing, that provide consistent treatment such as exposure time, temperature, and pressure.
    2. Identify the standard treatment containers and placement of the load in the steam treatment unit.
    3. Provide for and conduct an ongoing program of training for all users.
    4. Provide for a quality assurance program to assure compliance with the biological waste management plan.
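As a quick numerical illustration of the "minimum Log 4 kill" requirement referenced in the list above: the surviving spore count must be at least four orders of magnitude (a factor of 10,000) below the starting count. The helper and the spore numbers below are illustrative assumptions, not part of the DEEP validation procedure.

```python
import math

def log10_reduction(spores_before: float, spores_after: float) -> float:
    """Log reduction achieved by a treatment cycle: log10(before / after)."""
    return math.log10(spores_before / spores_after)

# e.g. a biological indicator loaded with 10^6 Bacillus stearothermophilus spores,
# with 50 viable spores recovered after the cycle:
reduction = log10_reduction(1e6, 50)
print(f"log reduction = {reduction:.1f} -> {'pass' if reduction >= 4 else 'fail'} (minimum Log 4 kill)")
```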

    Authorization

    Only individuals trained and authorized by Environmental Health and Safety may sign waste transport manifests.


    Transportation between Campuses (Ground)

    Transportation of biological samples between campuses (i.e., the Boston University Medical Campus (BUMC) and the Charles River Campus (CRC)) is subject to the general conditions described above. Biological materials should not be transported on the Boston University Shuttle (BUS). In addition, because the transportation takes place through the public domain, the following other conditions apply:

    Packaging

    • All biological samples must be packed according to Department of Transportation (DOT)/International Air Transport Association (IATA) regulations; this includes triple-packaging all samples, even exempt materials.
    • The specimen should be placed inside a primary container with a tight-fitting, leak-resistant top (e.g., full round-threaded screw cap with seal or stopper).
    • The primary receptacle or secondary container should be labeled with the universal biohazard symbol if it contains bloodborne pathogen materials, as required under the OSHA Bloodborne Pathogens Standard, 29 CFR 1910.1030.
    • The primary receptacle is placed within a secondary container that must meet the following specifications:
      • Shatter- and leak-resistant.
      • Enough extra space to hold absorbent and cushioning materials around the primary receptacle

      Labeling

      • Label information must include the category of the infectious biological material or agent (i.e. Category A, Category B, or exempt human or animal specimen) and the sending and receiving laboratory identification.
      • Each individual container must have enough label information to identify its contents. In addition, a sheet containing a description of contents should be placed inside the container between the outer and secondary packaging.
      • Any dry ice or other coolant can now be added between the secondary and outer packaging layers. This coolant material should be placed in a shipping box that contains a styrofoam liner or other appropriate material to ensure that the outer box is not damaged by moisture from cold packs or other coolants.
      • All required DOT/IATA labeling and marking information should be on the outside of the package.

      Transportation

      • The BU shuttle system, MBTA, taxi cabs, ZipCars, or other paid transport methods must not be used for transportation of infectious agents or other biohazardous materials.
      • If the package contains exempt human or animal specimens, or materials that fall under the “Category B Infectious Substances” category, the package may be moved over U.S. roadways by a member of the laboratory.
        • This exclusion, called “Materials of Trade (MOT)” by the U.S. Department of Transportation, allows some materials that are exempt or Category B infectious materials to be transported by research or clinical laboratory personnel. Courier services fall under the “Exclusive Use” exemption under DOT.
        • This exclusion does not apply to Category A infectious substances or other categories of Dangerous Goods.
        • This individual must have undergone shipping training in the last two years.
        • This package must follow all requirements as described above.
        • Contact the Office of Research Safety, Environmental Health & Safety (EHS), for further information and questions about these DOT exemptions.

        Transmission of Level 4 Diseases

        Most commonly, level 4 diseases are transmitted to humans through direct contact with urine, fecal matter, or saliva from infected rodents. Ebola virus, for instance, is believed to have originated from the fruit bats of the Pteropodidae family, and was introduced into the human population through close contact with blood and body fluids of the infected bats. Human-to-human transmission occurs when broken skin becomes exposed to blood, semen, breast milk, or other body fluids of an infected person.



        As molecular techniques for identifying and detecting microorganisms in the clinical microbiology laboratory have become routine, questions about the cost of these techniques and their contribution to patient care need to be addressed. Molecular diagnosis is most appropriate for infectious agents that are difficult to detect, identify, or test for susceptibility in a timely fashion with conventional methods.

        The tools of molecular biology have proven readily adaptable for use in the clinical diagnostic laboratory and promise to be extremely useful in diagnosis, therapy, and epidemiologic investigations and infection control (1,2). Although technical issues such as ease of performance, reproducibility, sensitivity, and specificity of molecular tests are important, cost and potential contribution to patient care are also of concern (3). Molecular methods may be an improvement over conventional microbiologic testing in many ways. Currently, their most practical and useful application is in detecting and identifying infectious agents for which routine growth-based culture and microscopy methods may not be adequate (4-7).

        Nucleic acid-based tests used in diagnosing infectious diseases use standard methods for isolating nucleic acids from organisms and clinical material and restriction endonuclease enzymes, gel electrophoresis, and nucleic acid hybridization techniques to analyze DNA or RNA (6). Because the target DNA or RNA may be present in very small amounts in clinical specimens, various signal amplification and target amplification techniques have been used to detect infectious agents in clinical diagnostic laboratories (5,6). Although mainly a research tool, nucleic acid sequence analysis coupled with target amplification is clinically useful and helps detect and identify previously uncultivatable organisms and characterize antimicrobial resistance gene mutations, thus aiding both diagnosis and treatment of infectious diseases (5,8,9). Automation and high-density oligonucleotide probe arrays (DNA chips) also hold great promise for characterizing microbial pathogens (6).

        Although most clinicians and microbiologists enthusiastically welcome the new molecular tests for diagnosing infectious disease, the high cost of these tests is of concern (3). Despite the probability that improved patient outcome and reduced cost of antimicrobial agents and length of hospital stay will outweigh the increased laboratory costs incurred through the use of molecular testing, such savings are difficult to document (3,10,11). Much of the justification for expenditures on molecular testing is speculative (11); however, the cost of equipment, reagents, and trained personnel is real and substantial, and reimbursement issues are problematic (3,11). Given these concerns, a facility's need for molecular diagnostic testing for infectious diseases should be examined critically by the affected clinical and laboratory services. In many instances, careful overseeing of test ordering and prudent use of a reference laboratory may be the most viable options.

        Practical Applications of Molecular Methods in the Clinical Microbiology Laboratory

        Commercial kits for the molecular detection and identification of infectious pathogens have provided a degree of standardization and ease of use that has facilitated the introduction of molecular diagnostics into the clinical microbiology laboratory (Table 1). The use of nucleic acid probes for identifying cultured organisms and for direct detection of organisms in clinical material was the first exposure that most laboratories had to commercially available molecular tests. Although these probe tests are still widely used, amplification-based methods are increasingly employed for diagnosis, identification and quantitation of pathogens, and characterization of antimicrobial-drug resistance genes. Commercial amplification kits are available for some pathogens (Table 1), but some clinically important pathogens require investigator-designed or "home-brew" methods (Table 2). In addition, molecular strain typing, or genotyping, has proven useful in guiding therapeutic decisions for certain viral pathogens and for epidemiologic investigation and infection control (2,12).

        Detection and Identification of Pathogens Without Target Amplification

        Commercial kits containing non-isotopically labeled nucleic acid probes are available for direct detection of pathogens in clinical material and identification of organisms after isolation in culture (Table 1). Use of solution-phase hybridization has allowed tests to be performed singly or in batches in a familiar microwell format.

        Although direct detection of organisms in clinical specimens by nucleic acid probes is rapid and simple, it suffers from lack of sensitivity. Most direct probe detection assays require at least 10^4 copies of nucleic acid per microliter for reliable detection, a requirement rarely met in clinical samples without some form of amplification. Amplification of the detection signal after probe hybridization improves sensitivity to as low as 500 gene copies per microliter and provides quantitative capabilities. This approach has been used extensively for quantitative assays of viral load (HIV, hepatitis B virus [HBV] and hepatitis C virus [HCV]) (Table 1) but does not match the analytical sensitivity of target amplification-based methods, such as polymerase chain reaction (PCR), for detecting organisms.
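To see why target amplification closes the sensitivity gap described above, recall that each PCR cycle multiplies the target roughly (1 + efficiency)-fold. The back-of-the-envelope sketch below is not a description of any particular commercial assay; the efficiency value and the copy numbers are illustrative assumptions.

```python
import math

def cycles_to_threshold(start_copies: float, threshold_copies: float, efficiency: float = 0.9) -> int:
    """Number of PCR cycles needed for start_copies to reach threshold_copies,
    assuming each cycle multiplies the target by (1 + efficiency)."""
    growth = 1.0 + efficiency
    return math.ceil(math.log(threshold_copies / start_copies, growth))

# A specimen with ~10 target copies per microliter versus the ~10^4 copies per microliter
# needed for reliable direct probe detection:
print(cycles_to_threshold(start_copies=10, threshold_copies=1e4))   # ~11 cycles at 90% efficiency
```

In other words, a dozen or so cycles of even imperfect amplification bridges the thousand-fold gap between typical specimen concentrations and direct-probe detection limits.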

        The commercial probe systems that use solution-phase hybridization and chemiluminescence for direct detection of infectious agents in clinical material include the PACE2 products of Gen-Probe and the hybrid capture assay systems of Digene and Murex (Table 1). These systems are user friendly, have a long shelf life, and are adaptable to small or large numbers of specimens. The PACE2 products are designed for direct detection of both Neisseria gonorrhoeae and Chlamydia trachomatis in a single specimen (one specimen, two separate probes). The hybrid capture systems detect human papilloma virus (HPV) in cervical scrapings, herpes simplex virus (HSV) in vesicle material, and cytomegalovirus (CMV) in blood and other fluids. All these tests have demonstrated sensitivity exceeding that of culture or immunologic methods for detecting the respective pathogens but are less sensitive than PCR or other target amplification-based methods.

        The signal amplification-based probe methods for detection and quantitation of viruses (HBV, HCV, HIV) are presented in an enzyme immunoassay-like format and include branched chain DNA probes (Chiron) and QB replicase (Gene-Trak) methods (Table 1). These methods are not as sensitive as target amplification-based methods for detection of viruses; however, the quantitative results have proven useful for determining viral load and prognosis and for monitoring response to therapy (13).

        Probe hybridization is useful for identifying slow-growing organisms after isolation in culture using either liquid or solid media. Identification of mycobacteria and other slow-growing organisms such as the dimorphic fungi (Histoplasma capsulatum, Coccidioides immitis, and Blastomyces dermatitidis) has certainly been facilitated by commercially available probes. All commercial probes for identifying organisms are produced by Gen-Probe and use acridinium ester-labeled probes directed at species-specific rRNA sequences (Table 1). Gen-Probe products are available for the culture identification of Mycobacterium tuberculosis, M. avium-intracellulare complex, M. gordonae, M. kansasii, Cryptococcus neoformans, the dimorphic fungi (listed above), N. gonorrhoeae, Staphylococcus aureus, Streptococcus pneumoniae, Escherichia coli, Haemophilus influenzae, Enterococcus spp., S. agalactiae, and Listeria monocytogenes. The sensitivity and specificity of these probes are excellent, and they provide species identification within one working day. Because most of the bacteria listed, plus C. neoformans, can be easily and efficiently identified by conventional methods within 1 to 2 days, many of these probes have not been widely used. The mycobacterial probes, on the other hand, are accepted as mainstays for the identification of M. tuberculosis and related species (7).

        Nucleic Acid Amplification

        Nucleic acid amplification provides the ability to selectively amplify specific targets present in low concentrations to detectable levels; thus, amplification-based methods offer superior performance, in terms of sensitivity, over the direct (non-amplified) probe-based tests. PCR (Roche Molecular Systems, Branchburg, NJ) was the first such technique to be developed and, because of its flexibility and ease of performance, remains the most widely used molecular diagnostic technique in both research and clinical laboratories. Several different amplification-based strategies have been developed and are available commercially (Table 1). Commercial amplification-based molecular diagnostic systems for infectious diseases have focused largely on systems for detecting N. gonorrhoeae, C. trachomatis, M. tuberculosis, and specific viral infections (HBV, HCV, HIV, CMV, and enterovirus) (Table 1). Given the adaptability of PCR, numerous additional infectious pathogens have been detected by investigator-developed or home-brew PCR assays (5) (Table 2). In many instances, such tests provide important and clinically relevant information that would otherwise be unavailable, since commercial interests have been slow to expand the line of products available to clinical laboratories. In addition to qualitative detection of viruses, quantitation of viral load in clinical specimens is now recognized to be of great importance for the diagnosis, prognosis, and therapeutic monitoring for HCV, HIV, HBV, and CMV (13). Both PCR and nucleic acid strand-based amplification systems are available for quantitation of one or more viruses (Table 1).

        The adaptation of amplification-based test methods to commercially available kits has served to optimize user acceptability, prevent contamination, standardize reagents and testing conditions, and make automation a possibility. It is not clear to what extent the levels of detection achievable by the different amplification strategies differ. None of the newer methods provides a level of sensitivity greater than that of PCR. In choosing a molecular diagnostic system, one should consider the range of tests available, suitability of the method to workflow, and cost (6). Choosing one amplification-based method that provides testing capabilities for several pathogens is certainly practical.

        Amplification-based methods are also valuable for identifying cultured and non-cultivatable organisms (5). Amplification reactions may be designed to rapidly identify an acid-fast organism as M. tuberculosis or may amplify a genus-specific or "universal" target, which then is characterized by using restriction endonuclease digestion, hybridization with multiple probes, or sequence determination to provide species or even subspecies delineation (4,5,14). Although identification was initially applied to slow-growing mycobacteria, it has applications for other pathogens that are difficult or impossible to identify with conventional methods.

        Detecting Antimicrobial-Drug Resistance

        Molecular methods can rapidly detect antimicrobial-drug resistance in clinical settings and have substantially contributed to our understanding of the spread and genetics of resistance (9). Conventional broth- and agar-based antimicrobial susceptibility testing methods provide a phenotypic profile of the response of a given microbe to an array of agents. Although useful for selecting potentially useful therapeutic agents, conventional methods are slow and fraught with problems. The most common failing is in the detection of methicillin resistance in staphylococci, which may be expressed in a very heterogeneous fashion, making phenotypic characterization of resistance difficult (9,15). Currently, molecular detection of the resistance gene, mecA, is the standard against which phenotypic methods for detection of methicillin resistance are judged (9,15,16).

        Molecular methods may be used to detect specific antimicrobial-drug resistance genes (resistance genotyping) in many organisms (Table 3) (8,9). Detection of specific point mutations associated with resistance to antiviral agents is also increasingly important (17,18). Screening for mutations in an amplified product may be facilitated by the use of high-density probe arrays (Gene chips) (6).

        Despite its many potential advantages, genotyping will not likely replace phenotypic methods for detecting antimicrobial-drug resistance in the clinical laboratory in the near future. Molecular methods for resistance detection may be applied directly to the clinical specimen, providing simultaneous detection and identification of the pathogen plus resistance characterization (9). Likewise, they are useful in detecting resistance in viruses, slow-growing or nonviable organisms, or organisms with resistance mechanisms that are not reliably detected by phenotypic methods (9,19). However, because of their high specificity, molecular methods will not detect newly emerging resistance mechanisms and are unlikely to be useful in detecting resistance genes in species where the gene has not been observed previously (19). Furthermore, the presence of a resistance gene does not mean that the gene will be expressed, and the absence of a known resistance gene does not exclude the possibility of resistance from another mechanism. Phenotypic antimicrobial susceptibility testing methods allow laboratories to test many organisms and detect newly emerging as well as established resistance patterns.

        Molecular Epidemiology

        Figure. Pulsed-field gel electrophoresis (PFGE) profiles of Staphylococcus aureus isolates digested with SmaI. A variety of PFGE profiles are demonstrated in these 23 isolates.

        Laboratory characterization of microbial pathogens as biologically or genetically related is frequently useful in investigations (12,20,21). Several different epidemiologic typing methods have been applied in studies of microbial pathogens (Table 4). The phenotypic methods have occasionally been useful in describing the epidemiology of infectious diseases; however, they are too variable, slow, and labor-intensive to be of much use in most epidemiologic investigations. Newer DNA-based typing methods have eliminated most of these limitations and are now the preferred techniques for epidemiologic typing. The most widely used molecular typing methods include plasmid profiling, restriction endonuclease analysis of plasmid and genomic DNA, Southern hybridization analysis using specific DNA probes, and chromosomal DNA profiling using either pulsed-field gel electrophoresis (PFGE) or PCR-based methods (12,20). All these methods use electric fields to separate DNA fragments, whole chromosomes, or plasmids into unique patterns or fingerprints that are visualized by staining with ethidium bromide or by nucleic acid probe hybridization (Figure). Molecular typing is performed to determine whether different isolates give the same or different results for one or more tests. Epidemiologically related isolates share the same DNA profile or fingerprint, whereas sporadic or epidemiologically unrelated isolates have distinctly different patterns (Figure). If isolates from different patients share the same fingerprint, they probably originated from the same clone and were transmitted from patient to patient by a common source or mechanism.
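Fingerprint comparisons of the kind described above are often summarized numerically, for example with a Dice (band-sharing) coefficient between two isolates' band patterns. The sketch below is illustrative only: the band sizes, the matching tolerance and any interpretive cutoff are invented for demonstration and are not criteria from this article.

```python
def dice_coefficient(bands_a, bands_b, tolerance=0.05):
    """Dice band-sharing coefficient: 2 * shared bands / (total bands in both patterns).
    Two bands are treated as shared if their sizes differ by less than `tolerance` (fractional)."""
    matched = 0
    remaining = list(bands_b)
    for a in bands_a:
        for b in remaining:
            if abs(a - b) / max(a, b) < tolerance:
                matched += 1
                remaining.remove(b)
                break
    return 2 * matched / (len(bands_a) + len(bands_b))

# Hypothetical band sizes (kb) for two Staphylococcus aureus isolates:
isolate_1 = [674, 361, 324, 262, 208, 175, 135, 97]
isolate_2 = [674, 361, 324, 262, 208, 175, 135, 80]
print(round(dice_coefficient(isolate_1, isolate_2), 2))  # 0.88: similar, but not identical, patterns
```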

        Molecular typing methods have allowed investigators to study the relationship between colonizing and infecting isolates in individual patients, distinguish contaminating from infecting strains, document nosocomial transmission in hospitalized patients, evaluate reinfection versus relapse in patients being treated for an infection, and follow the spread of antimicrobial-drug resistant strains within and between hospitals over time (12). Most available DNA-based typing methods may be used in studying nosocomial infections when applied in the context of a careful epidemiologic investigation (12,21). In contrast, even the most powerful and sophisticated typing method, if used indiscriminately in the absence of sound epidemiologic data, may provide conflicting and confusing information.

        Financial Considerations

        Molecular testing for infectious diseases includes testing for the host's predisposition to disease, screening for infected or colonized persons, diagnosis of clinically important infections, and monitoring the course of infection or the spread of a specific pathogen in a given population. It is often assumed that in addition to improved patient care, major financial benefits may accrue from molecular testing because the tests reduce the use of less sensitive and specific tests, unnecessary diagnostic procedures and therapies, and nosocomial infections (11). However, the inherent costs of molecular testing methods, coupled with variable and inadequate reimbursement by third-party payers and managed-care organizations, have limited the introduction of these tests into the clinical diagnostic laboratory.

        Not all molecular diagnostic tests are extremely expensive. Direct costs vary widely, depending on the test's complexity and sophistication. Inexpensive molecular tests are generally kit based and use methods that require little instrumentation or technologist experience. DNA probe methods that detect C. trachomatis or N. gonorrhoeae are examples of low-cost molecular tests. The more complex molecular tests, such as resistance genotyping, often have high labor costs because they require experienced, well-trained technologists. Although the more sophisticated tests may require expensive equipment (e.g., DNA sequencer) and reagents, advances in automation and the production of less-expensive reagents promise to decrease these costs as well as technician time. Major obstacles to establishing a molecular diagnostics laboratory that are often not considered until late in the process are required licenses, existing and pending patents, test selection, and billing and reimbursement (22).

        Reimbursement issues are a major source of confusion, frustration, and inconsistency. Reimbursement by third-party payers is confounded by lack of Food and Drug Administration (FDA) approval and Current Procedural Terminology (CPT) codes for many molecular tests. In general, molecular tests for infectious diseases have been more readily accepted for reimbursement; however, reimbursement is often on a case-by-case basis and may be slow and cumbersome. FDA approval of a test improves the likelihood that it will be reimbursed but does not ensure that the amount reimbursed will equal the cost of performing the test.

        Perhaps more than other laboratory tests, molecular tests may be negatively affected by fee-for-service managed-care contracts and across-the-board discounting of laboratory test fees. Such measures often result in reimbursement that is lower than the cost of providing the test. Although molecular tests may be considered a means of promoting patient wellness, the financial benefits of patient wellness are not easily realized in the short term (11). Health maintenance organizations (HMOs) and managed-care organizations often appear to be operating on shorter time frames, and their administrators may not be interested in the long-term impact of diagnostic testing strategies.

        Molecular screening programs for infectious diseases are developed to detect symptomatic and asymptomatic disease in individuals and groups. Persons at high risk, such as immunocompromised patients or those attending family planning or obstetrical clinics, are screened for CMV and Chlamydia, respectively. Likewise, all blood donors are screened for bloodborne pathogens. The financial outcome of such testing is unknown. The cost must be balanced against the benefits of earlier diagnosis and treatment and societal issues such as disease epidemiology and population management.

        One of the most highly touted benefits of molecular testing for infectious diseases is the promise of earlier detection of certain pathogens. The rapid detection of M. tuberculosis directly in clinical specimens by PCR or other amplification-based methods is quite likely to be cost-effective in the management of tuberculosis (7). Other examples of infectious disease that are amenable to molecular diagnosis and for which management can be improved by this technology include HSV encephalitis, Helicobacter pylori infection, and neuroborreliosis caused by Borrelia burgdorferi. For HSV encephalitis, detection of HSV in cerebrospinal fluid (CSF) can direct specific therapy and eliminate other tests including brain biopsy. Likewise, detection of H. pylori in gastric fluid can direct therapy and obviate the need for endoscopy and biopsy. PCR detection of B. burgdorferi in CSF is helpful in differentiating neuroborreliosis from other chronic neurologic conditions and chronic fatigue syndrome.

        As discussed earlier, molecular tests may be used to predict disease response to specific antimicrobial therapy. Detection of specific resistance genes (mecA, vanA) or point mutations resulting in resistance has proven efficacious in managing disease. Molecular-based viral load testing has become standard practice for patients with chronic hepatitis and AIDS. Viral load testing and genotyping of HCV are useful in determining the use of expensive therapy such as interferon and can be used to justify decisions on extent and duration of therapy. With AIDS, viral load determinations plus resistance genotyping have been used to select among the various protease inhibitor drugs available for treatment, improving patient response and decreasing incidence of opportunistic infections.

        Pharmacogenomics is the use of molecular-based tests to predict the response to specific therapies and to monitor the response of the disease to the agents administered. The best examples of pharmacogenomics in infectious diseases are the use of viral load and resistance genotyping to select and monitor antiviral therapy of AIDS and chronic hepatitis (17,18). This application improves disease outcome, shortens length of hospital stay, reduces adverse events and toxicity, and facilitates cost-effective therapy by avoiding unnecessary expensive drugs, optimizing doses and timing, and eliminating ineffective drugs.

        Molecular strain typing of microorganisms is now well recognized as an essential component of a comprehensive infection control program that also involves the infection control department, the infectious disease division, and pharmacy (10,21). Molecular techniques for establishing presence or absence of clonality are effective in tracking the spread of nosocomial infections and streamlining the activities of the infection control program (21,23). A comprehensive infection control program uses active surveillance by both infection control practitioners and the clinical microbiology laboratory to identify clusters of infections with a common microbial phenotype (same species and antimicrobial susceptibility profile). The isolates are then characterized in the laboratory by using one of a number of molecular typing methods (Table 4) to confirm or refute clonality. Based on available epidemiologic and molecular data, the hospital epidemiologist then develops an intervention strategy. Molecular typing can shorten or prevent an epidemic (23) and reduce the number and cost of nosocomial infections (Table 5) (10). Hacek et al. (10) analyzed the medical and economic benefits of an infection control program that included routine determination of microbial clonality and found that nosocomial infections were significantly decreased and more than $4 million was saved over a 2-year period (Table 5).

        The true financial impact of molecular testing will only be realized when testing procedures are integrated into total disease assessment. More expensive testing procedures may be justified if they reduce the use of less-sensitive and less-specific tests and eliminate unnecessary diagnostic procedures and ineffective therapies.

        Dr. Pfaller is professor and director of the Molecular Epidemiology and Fungus Testing Laboratory at the University of Iowa College of Medicine and College of Public Health. His research focuses on the epidemiology of nosocomial infections and antimicrobial-drug resistance.


        Background

        The classification of an infectious agent as airborne and therefore ‘aerosol-transmissible’ has significant implications for how healthcare workers (HCWs) need to manage patients infected with such agents and what sort of personal protective equipment (PPE) they will need to wear. Such PPE is usually more costly for airborne agents (i.e. aerosol-transmissible) than for those that are only transmitted by large droplets or direct contact because of two key properties of aerosols: a) their propensity to follow air flows, which requires a tight seal of the PPE around the airways, and b) for bioaerosols, their small size, which calls for an enhanced filtering capacity.

        Several recent articles and/or guidance, based on clinical and epidemiological data, have highlighted the potential for aerosol transmission for Middle-East Respiratory Syndrome-associated coronavirus (MERS-CoV) [1, 2] and Ebola virus [3, 4]. Some responses to the latter have attempted to put these theoretical risks in a more practical light [4], and this nicely illustrates the quandary of how to classify such emerging or re-emerging pathogens into either the large droplet (short-range) versus airborne (short and possibly long-range) transmission categories. However, this delineation is not black and white, as there is also the potential for pathogens under both classifications to be transmitted by aerosols between people at close range (i.e. within 1 m).

        Definitions

        Strictly speaking, ‘aerosols’ refer to particles in suspension in a gas, such as small droplets in air. There have been numerous publications classifying droplets by particle size over the years [5,6,7,8,9,10]. For example, it is generally accepted that: i) small particles of < 5–10 μm aerodynamic diameter that follow airflow streamlines are potentially capable of short- and long-range transmission; particles of < 5 μm readily penetrate the airways all the way down to the alveolar space, and particles of < 10 μm readily penetrate below the glottis (7); ii) large droplets of diameter > 20 μm follow a more ballistic trajectory (i.e. falling mostly under the influence of gravity), being too large to follow inhalation airflow streamlines; for these particle sizes, surgical masks would be effective, as they act as a direct physical barrier to droplets that are too large to be inhaled into the respiratory tract around the sides of the mask (which are not close-fitting); iii) ‘intermediate particles’ of diameter 10–20 μm share some properties of both small and large droplets, but settle more quickly than particles < 10 μm and potentially carry a smaller infectious dose than large (> 20 μm) droplets.

        ‘Aerosols’ would also include ‘droplet nuclei’, which are small particles with an aerodynamic diameter of 10 μm or less, typically produced through the process of rapid desiccation of exhaled respiratory droplets [5, 6]. However, in some situations, such as where there are strong ambient air cross-flows, for example, larger droplets can behave like aerosols, with the potential to transmit infection via this route (see next section below).

        Several properties can be inferred from this; for example, penetration of the lower respiratory tract (LRT): at diameters greater than 10 μm, penetration below the glottis rapidly diminishes, as does any potential for initiating an infection at that site. Similarly, any such potential for depositing and initiating an LRT infection is less likely above a droplet diameter of 20 μm, as such large particles will probably impact onto respiratory epithelial mucosal surfaces or be trapped by cilia before reaching the LRT [6].

        The Infectious Diseases Society of America (IDSA) has proposed a scheme that is essentially equivalent [7], defining “respirable particles” as having a diameter of 10 μm or less and “inspirable particles” as having a diameter between 10 μm and 100 μm, nearly all of which are deposited in the upper airways. Some authors have proposed the term “fine aerosols”, consisting of particles of 5 μm or less, but this has been in part dictated by constraints from measurement instruments [8]. Several authors lump together transmission by either large droplets or aerosol-sized particles as “airborne transmission” [9], or use “aerosol transmission” to describe pathogens that can cause disease via inspirable particles of any size [10].

        However, we think that it is important to maintain a distinction between particles of < 10 μm and larger particles, because of their significant qualitative differences including suspension time, penetration of different regions of the airways and requirements for different PPE. In this commentary, we use the common convention of “airborne transmission” to mean transmission by aerosol-size particles of < 10 μm.
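        To make these size conventions concrete, the following is a minimal sketch (in Python, for illustration only) that restates the cut-offs used above as a simple classification function; the function name and category labels are ours, not terms from the cited literature.

        def classify_particle(diameter_um: float) -> str:
            """Classify a respiratory particle by aerodynamic diameter (um),
            following the size conventions described above."""
            if diameter_um < 10:
                # Aerosol-sized particles / droplet nuclei: follow airflow streamlines,
                # capable of short- and long-range airborne transmission; particles
                # < 5 um can reach the alveolar space.
                return "small particle (< 10 um, 'airborne')"
            elif diameter_um <= 20:
                # Shares properties of both categories, but settles more quickly
                # than particles < 10 um.
                return "intermediate particle (10-20 um)"
            else:
                # Ballistic trajectory; too large to be inhaled around the sides
                # of a surgical mask.
                return "large droplet (> 20 um)"

        for d in (1, 8, 15, 50):
            print(d, "um:", classify_particle(d))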

        If infected patients produce infectious droplets of varying sizes by breathing, coughing or sneezing, transmission between individuals by both short-range large droplets and airborne small droplet nuclei is possible, depending on the distance from the patient source. Figure 1 illustrates these potential routes of short- and long-range airborne transmission, as well as the downstream settling of such droplets onto surfaces (fomites). From such fomites, pathogens may be picked up and transported by hands and self-inoculated into mucosal membranes (e.g. in the eyes, nose and mouth) to cause infection, depending on the survival characteristics of individual pathogens on such surfaces and the susceptibility (related to available, compatible cell receptors) of the different exposed tissues to infection by these pathogens.

        Figure 1. An illustration of various possible transmission routes of respiratory infection between an infected and a susceptible individual. Both close-range (i.e. conversational) airborne transmission and longer-range (over several meters) transmission routes are illustrated here. The orange head colour represents a source and the white head colour a potential recipient (the bottom right panel indicates that both heads are potential recipients, via self-inoculation from contaminated surface fomite sources). Here ‘expiration’ includes normal breathing exhalation as well as coughing and/or sneezing airflows. Airborne droplets can then settle on surfaces (fomites), from where they can be touched and carried on hands, leading to further self-inoculation routes of transmission.

        For example, when the infectious dose (the number of infectious agents required to cause disease) of an organism is low, and where large numbers of pathogen-laden droplets are produced in crowded conditions with poor ventilation (in hospital waiting rooms, in lecture theatres, on public transport, etc.), explosive outbreaks can still occur, even with pathogens whose airborne transmission capacity is controversial, e.g. the spread of influenza in a grounded plane where multiple secondary cases were observed in the absence of any ventilation [11].

        More mechanistic approaches to classifying which pathogens are likely to transmit via the airborne route (i.e. arguing from the fundamental physical and dynamic behaviour of small versus larger particle and droplet sizes, in the absence of any biological interactions) have been published in various forms over the years [12,13,14,15,16,17]. However, they may have to be considered in combination with epidemiological and environmental data to make a convincing argument about the airborne transmissibility of any particular agent, not least because the number of possible exposure scenarios is virtually unlimited.

        The importance of ambient airflows and the relative nature of aerosols

        One should note that “aerosol” is essentially a relative and not an absolute term. A larger droplet can remain airborne if ambient airflows sustain its suspension for long enough, e.g. in some strong cross-flow or natural-ventilation environments, where ventilation-induced airflows can carry suspended pathogens effectively enough to cause infection at a considerable distance from the source.

        One of the standard rules (Stokes’ law) applied in engineering calculations to estimate the suspension times of droplets falling under gravity with air resistance was derived assuming several conditions, including that the ambient air is still [13,14,15,16,17]. Actual suspension times will therefore be far longer where there are significant cross-flows, which is often the case in healthcare environments, e.g. with doors opening, bed and equipment movement, and people walking back and forth constantly. Conversely, suspension times, even for smaller droplet nuclei, can be greatly reduced if they encounter a significant downdraft (e.g. if they pass under a ceiling supply vent). In addition, the degree of airway penetration for different particle sizes also depends on the inhalation flow rate.
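        As a rough worked example of why particle size matters so much for suspension time, the sketch below applies Stokes’ law, v = ρ g d² / (18 μ), to estimate how long droplets of different sizes take to settle through still air. The physical constants are standard textbook approximations (our assumptions, not values from the cited references), and the calculation ignores evaporation, cross-flows and the breakdown of Stokes’ law for the largest droplets.

        # Minimal sketch: settling of respiratory droplets in *still* air under Stokes' law.
        G = 9.81            # gravitational acceleration, m/s^2
        MU_AIR = 1.8e-5     # dynamic viscosity of air, Pa*s (approximate)
        RHO_DROPLET = 1000  # density of a (mostly water) droplet, kg/m^3

        def settling_velocity(diameter_um: float) -> float:
            """Terminal settling velocity (m/s) of a droplet in still air."""
            d = diameter_um * 1e-6  # convert micrometres to metres
            return RHO_DROPLET * G * d**2 / (18 * MU_AIR)

        def time_to_fall(diameter_um: float, height_m: float = 1.5) -> float:
            """Seconds for a droplet to settle from a given height in still air."""
            return height_m / settling_velocity(diameter_um)

        for d in (5, 10, 20, 100):
            print(f"{d:>3} um: ~{time_to_fall(d):.0f} s to fall 1.5 m in still air")

        Even this crude estimate reproduces the qualitative point made above: a 10 μm particle takes several minutes to settle from head height in still air, whereas a 100 μm droplet falls within seconds, and any ambient cross-flow or downdraft can change these times dramatically.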

        In the fields of dentistry and orthopedics, where high-powered electric tools are used, even bloodborne viruses (such as human immunodeficiency virus (HIV), hepatitis B and hepatitis C viruses) can become airborne when they are contained in high-velocity blood splatter generated by these instruments [18, 19]. Yet whether they can cause efficient transmission via this route is more debatable. This illustrates another point: although some pathogens can become airborne in certain situations, they may not necessarily transmit infection and cause disease via this route.

        Outline

        Over time, for a pathogen with a truly predominant airborne transmission route, sufficient numbers of published studies will eventually demonstrate its true nature [13]. If there are ongoing contradictory findings across multiple studies (as with influenza virus), it is more likely that the various transmission routes (direct/indirect contact, short-range droplet, and long- or even short-range airborne droplet nuclei) predominate in different settings [16, 20], making the airborne route for that particular pathogen an opportunistic pathway rather than the norm [21]. Several examples may make this clearer.

        The selected pathogens and supporting literature summarized below are for illustrative purposes only, to demonstrate how specific studies have impacted the way we consider such infectious agents as potentially airborne and ‘aerosol-transmissible’. This is not intended to be a systematic review, but rather to show how our thinking may change with additional studies on each pathogen, and how the acceptance of “aerosol transmission” for different pathogens has not always followed a consistent approach.


        Clinical Use of PVL Testing

        A number of studies have compared the prognostic values of PVL testing and other traditional markers of risk for acquired immunodeficiency syndrome (AIDS). A study conducted in a large group of HIV-infected men found that PVL was the single best predictor of clinical outcome, followed (in order of predictive value) by CD4+ T-lymphocyte counts, neopterin levels, β2-microglobulin levels, and thrush or fever.11 A similar study in HIV-infected women also demonstrated an association between PVL and disease prognosis.17

        Other studies concluded that the combination of PVL and CD4+ cell counts provided more prognostic information than either factor alone.18,19 These investigations also confirmed the ability of baseline PVL and CD4+ cell counts to independently predict clinical outcome and noted that, after the initiation of antiretroviral drug therapy, changes in these markers can predict outcome. Each 0.5-log reduction in PVL has been associated with a 30 percent reduction in the risk of clinical progression, whereas each 10 percent increase in CD4+ cell count has been associated with a 15 percent reduction in risk.10 Moreover, at least in pregnant HIV-infected women, the PVL predicts transmission risk.20
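        To put these figures in absolute terms: a 0.5-log reduction corresponds to dividing the viral load by 10^0.5 ≈ 3.2 (for example, from 100,000 to roughly 32,000 copies per mL), while a full 1-log reduction is a 10-fold drop (these example values are arbitrary illustrations, not data from the cited studies).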

        All of the widely used guidelines for the management of HIV-infected patients have incorporated PVL testing for staging disease and determining prognosis.

        STARTING ANTIRETROVIRAL DRUG THERAPY

        Multiple analyses in more than 5,000 patients who participated in approximately 18 antiretroviral drug trials have shown a significant association between a decrease in PVL and improved clinical outcome.6 Therefore, the U.S. Department of Health and Human Services and the Henry J. Kaiser Foundation,6 as well as the International AIDS Society–USA Panel,7 currently suggest that the results of PVL testing should be an essential parameter in decisions on initiating or changing antiretroviral drug therapy. Measurements of PVL and CD4+ cell count should be performed periodically throughout the course of HIV infection (Table 1).6-9

        Given the inherent variability in PVL assays, testing should be performed on at least two separate samples, using the same type of assay and preferably the same laboratory, before treatment decisions are made. Because recent illness or vaccination can lead to transient changes in PVL and CD4+ cell count, assays should be avoided at such times.

        The major guidelines vary slightly in the PVL and CD4+ cell cutoff values that are used for recommendations on starting, considering or deferring antiretroviral drug therapy (Table 2).6,7 PVL measurements ranging from 10,000 to 30,000 copies per mL and CD4+ cell counts of less than 350 to 500 per mm³ (0.35 to 0.50 × 10⁹ per L) are cited as indications of the need to initiate antiretroviral drug therapy in most patients.

        PVL and CD4+ Cell Cutoff Values for Antiretroviral Therapy

        PVL: > 10,000 to 20,000 copies per mL

        CD4+ cell count: < 500 per mm³ (0.50 × 10⁹ per L)

        CD4+ cell count: < 350 per mm³ (0.35 × 10⁹ per L)

        PVL: 5,000 to 30,000 copies per mL

        CD4+ cell count: 350 to 500 per mm³

        CD4+ cell count: > 500 per mm³

        CD4+ cell count: > 500 per mm³

        PVL = plasma viral load; CD4+ = presence of CD4 cell marker on T lymphocytes; NR = not reported.

        Information from Guidelines for the use of antiretroviral agents in HIV-infected adults and adolescents (January 28, 2000). Retrieved September 2000, from: http://www.hivatis.org/guidelines/adult/text/index/htr , and Carpenter CC, Cooper DA, Fischl MA, Gatell JM, Gazzard BG, Hammer SM, et al. Antiretroviral therapy in adults: updated recommendations of the International AIDS Society–USA Panel. JAMA 2000;283:381-90.

        Concerns about treatment complexities, adverse effects, possible emergence of viral resistance and limitation of future options are just as important as specific numeric cutoffs in decisions regarding antiretroviral drug therapy. Not all patients will be able to achieve the goal of durable viral suppression, and treatment regimens need to be individualized. The substantial cost, complexity and side effects of long-term therapy require careful attention to the patient's preferences about treatment.

        Of note, the viral load appears to be lower in women than in men early in HIV infection, but as immune deficiency advances, gender differences generally disappear.21 Thus, treatment recommendations are the same for women and men.

        ASSESSING THE EFFECTIVENESS OF ANTIRETROVIRAL DRUG THERAPY

        PVL testing also has a role in optimizing antiretroviral drug therapy. This application has the potential to improve clinical outcomes and decrease the use of antiviral agents that are no longer effective, thereby limiting the emergence of drug-resistant HIV strains. According to current recommendations, the preferred initial antiretroviral drug regimen is one that is most likely to reduce and maintain plasma HIV RNA below the level of detection.6 , 7

        CD4+ cell counts and HIV RNA levels are important tools for evaluating treatment response. As mentioned previously, a minimum of two CD4+ cell counts and PVL measurements should be obtained on separate visits before treatment is changed.22 Ideally, the HIV RNA level should decline rapidly after antiretroviral drug therapy is initiated. Guidelines on the expected PVL reductions vary. A typical goal is a 1- to 2-log reduction within four to eight weeks (e.g., from 50,000 copies per mL to 500 copies per mL).6,7 Failure to achieve the target level of less than 50 copies per mL after 16 to 24 weeks of treatment should prompt consideration of drug resistance, inadequate drug absorption or poor compliance. Maximal viral suppression often takes longer in patients with higher baseline HIV RNA levels (e.g., greater than 100,000 copies per mL). HIV RNA levels should be obtained periodically during antiretroviral drug therapy, although precise data are not available on the optimal frequency of such monitoring (Table 1).6-9
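        A minimal sketch of the arithmetic behind these response targets follows; the patient values are hypothetical, and the 1- to 2-log and < 50 copies per mL figures are simply those quoted above.

        import math

        def log_reduction(baseline_copies: float, current_copies: float) -> float:
            """Log10 drop in plasma HIV RNA between two measurements."""
            return math.log10(baseline_copies / current_copies)

        baseline, week8 = 50_000, 500              # copies per mL (hypothetical patient)
        drop = log_reduction(baseline, week8)      # 2.0 logs
        print(f"{drop:.1f}-log reduction by week 8")              # meets the 1- to 2-log target
        print("Below the 50 copies per mL target?", week8 < 50)   # not yet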

        For patients in whom a PVL below the detectable level has been achieved, a general guideline is to change antiretroviral drug therapy if the plasma HIV RNA concentration is found to be increasing. Ideally, any confirmed detectable plasma HIV RNA is an indication to change therapy in order to prevent the emergence of drug-resistant viral mutants. In some patients, it may be reasonable to wait to change treatment until there is a documented increase in the plasma HIV RNA level to greater than 2,000 to 5,000 copies per mL. In patients with an initially significant decrease in HIV RNA (but not to below the detection level), a confirmed increase to greater than 5,000 to 10,000 copies per mL suggests the need for a treatment change.7
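        Purely to show how the numeric cut-offs quoted above relate to one another, here is a sketch of that logic in code. It is illustrative only, not clinical guidance; the function name, the use of 50 copies per mL as the detection limit (taken from the target mentioned earlier), and the choice between the lower and upper ends of each range are our assumptions.

        def suggests_regimen_change(confirmed_copies_per_ml: float,
                                    previously_undetectable: bool,
                                    conservative: bool = False) -> bool:
            """Illustrative sketch of the PVL thresholds described above.

            previously_undetectable: PVL had been suppressed below the detection
            limit (~50 copies per mL) on the current regimen.
            conservative: wait for a larger documented rise before changing therapy.
            """
            if previously_undetectable:
                # Ideally any confirmed detectable PVL prompts a change; in some
                # patients it may be reasonable to wait for > 2,000-5,000 copies/mL.
                threshold = 5_000 if conservative else 50
            else:
                # Initial significant decrease without full suppression: a confirmed
                # rise above ~5,000-10,000 copies per mL suggests a change.
                threshold = 10_000 if conservative else 5_000
            return confirmed_copies_per_ml > threshold

        print(suggests_regimen_change(400, previously_undetectable=True))    # True
        print(suggests_regimen_change(400, previously_undetectable=True,
                                      conservative=True))                    # False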

        Caution should be exercised in interpreting the results of PVL tests. Intra-assay and biologic variability may affect the findings, and concomitant illness or vaccination may cause transient HIV RNA elevations. In addition, all specimens must be processed promptly (ideally, within two to four hours). Because of the rapid pace of viral replication in vivo, patients who miss even a few doses of anti-retroviral drugs before their visit may already be experiencing viral rebound, and their anti-retroviral drug therapy could be incorrectly judged to be failing.4

        Before ordering a PVL test, the physician should review the patient's adherence to the antiretroviral drug regimen and should postpone testing if recent doses have been missed. If the HIV RNA level has fallen to near the lower limit of detection by week 24 but is not yet below the detection level, it is not yet clear whether an attempt to change or add to (i.e., intensify) the regimen is indicated. Because lack of adherence to a complete regimen is often the primary reason for treatment failure, alteration of a failing regimen may not directly address the underlying problem.7

        Although PVL testing is important, it is not the only factor to consider in evaluating an antiretroviral drug regimen and making decisions on treatment changes. A change in anti-retroviral drug therapy should also be considered if the CD4+ cell count is declining, clinical disease is progressing, medications have unacceptable toxicity or intolerable side effects, or the patient is not adhering to the treatment regimen.7

        Numerous clinical guidelines are available to guide physicians and patients through the complicated process of finding the optimal treatment regimen. Skillful selection of initial therapy is important, as the failure of some medications can compromise the subsequent use of other antiretroviral drugs, and a number of medications are more effective when used in specific combinations. The reference list for this article includes several up-to-date sources that can provide guidance in the selection of antiretroviral drug therapy.


        Epidemic theory (effective & basic reproduction numbers, epidemic thresholds) & techniques for analysis of infectious disease data (construction & use of epidemic curves, generation numbers, exceptional reporting & identification of significant clusters)

        Communicable disease control is considered in detail in a separate section of the MFPH Part A syllabus – See Section 2G: Communicable Disease. This page covers the basic principles of epidemic theory.

        Basic reproduction number (R0)

        The basic reproduction number (R0) is used to measure the transmission potential of a disease. It is the average number of secondary infections produced by a typical case of an infection in a population where everyone is susceptible. 1 For example, if the R0 for measles in a population is 15, then we would expect each new case of measles to produce 15 new secondary cases (assuming everyone around the case was susceptible). R0 excludes new cases produced by the secondary cases.

        The basic reproductive number is affected by several factors:

        • The rate of contacts in the host population
        • The probability of infection being transmitted during contact
        • The duration of infectiousness.

        In general, for an epidemic to occur in a susceptible population, R0 must be > 1, so that the number of cases increases. 1

        In many circumstances not all contacts will be susceptible to infection. This is measured by the effective reproduction number (R).

        Effective reproductive number (R)

        A population will rarely be totally susceptible to an infection in the real world. Some contacts will be immune, for example due to prior infection which has conferred life-long immunity, or as a result of previous immunisation. Therefore, not all contacts will become infected and the average number of secondary cases per infectious case will be lower than the basic reproduction number. The effective reproductive number (R) is the average number of secondary cases per infectious case in a population made up of both susceptible and non-susceptible hosts. If R>1, the number of cases will increase, such as at the start of an epidemic. Where R=1, the disease is endemic, and where R<1 there will be a decline in the number of cases.

        The effective reproduction number can be estimated as the product of the basic reproduction number and the fraction of the host population that is susceptible (x). So:

        R = R0 × x

        For example, if R0 for influenza is 12 in a population where half of the population is immune, the effective reproductive number for influenza is 12 x 0.5 = 6. Under these circumstances, a single case of influenza would produce an average of 6 new secondary cases. 1

        To successfully eliminate a disease from a population, R needs to be less than 1.
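        A minimal numerical sketch of the relationship just described, reusing the influenza figures from the example above (the function and variable names are ours):

        def effective_reproduction_number(r0: float, susceptible_fraction: float) -> float:
            """R = R0 * x, where x is the fraction of the population that is susceptible."""
            return r0 * susceptible_fraction

        r0_influenza = 12       # basic reproduction number from the example above
        x = 0.5                 # half of the population is susceptible
        R = effective_reproduction_number(r0_influenza, x)
        print(R)                            # 6.0: each case produces ~6 secondary cases
        print("Cases increasing?", R > 1)   # True; elimination requires R < 1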

        Herd immunity

        Herd immunity occurs when a significant proportion of the population (or the herd) have been vaccinated (or are immune by some other mechanism), resulting in protection for susceptible (e.g. unvaccinated) individuals. The larger the number of people who are immune in a population, the lower the likelihood that a susceptible person will come into contact with the infection. It is more difficult for diseases to spread between individuals if large numbers are already immune as the chain of infection is broken.

        The herd immunity threshold is the proportion of a population that needs to be immune in order for an infectious disease to become stable in that community. If this threshold is reached, for example through immunisation, then each case leads to a single new case (R=1) and the infection will become stable within the population.

        If the threshold for herd immunity is surpassed, then R<1 and the number of cases of infection decreases. This is an important measure used in infectious disease control and immunisation and eradication programmes.
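        The threshold itself follows from the formula above: setting R = R0 × x = 1 gives a susceptible fraction x = 1/R0, so the immune proportion required is 1 - 1/R0. A short sketch using the R0 values quoted earlier:

        def herd_immunity_threshold(r0: float) -> float:
            """Immune proportion at which R = R0 * (1 - immune_fraction) falls to 1."""
            return 1 - 1 / r0

        print(herd_immunity_threshold(15))   # measles, R0 = 15 -> ~0.93 (about 93% immune)
        print(herd_immunity_threshold(12))   # influenza example above -> ~0.92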

        An epidemic is defined as an increase in the frequency of occurrence of a disease in a population above its baseline, or expected level, in a given time period. 2 The term is used broadly and the number of cases and time period are often unspecified. It is generally more widespread than an outbreak, which usually implies two or more epidemiologically linked cases, although the two terms have been used interchangeably. Additionally, the term has also been used to describe increasing levels of non-communicable disease, such as an ‘epidemic of cardiovascular disease.’

        The definition above is general, but the term has been defined quantitatively for certain infections and a threshold is selected above which the term ‘epidemic’ is applied. For example, in England levels of influenza are routinely monitored drawing on data from GP consultations and lab diagnoses. The Royal College of General Practitioners (RCGP) has defined the baseline threshold for ‘normal seasonal activity’ in England as 30 to 200 GP consultations for influenza-like illness per week per 100,000 population. The epidemic threshold would be reached if the number of consultations surpassed 200 per week per 100,000. 3
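        A small sketch of how such a threshold is applied in practice: convert the weekly consultation count to a rate per 100,000 population and compare it with the RCGP bands quoted above (the function name and example numbers are illustrative).

        def flu_activity_level(weekly_consultations: int, population: int) -> str:
            """Classify weekly GP influenza-like-illness activity per 100,000 population."""
            rate = weekly_consultations / population * 100_000
            if rate > 200:
                return f"epidemic threshold exceeded ({rate:.0f} per 100,000)"
            if rate >= 30:
                return f"normal seasonal activity ({rate:.0f} per 100,000)"
            return f"below baseline ({rate:.0f} per 100,000)"

        print(flu_activity_level(1_300, 500_000))   # 260 per 100,000 -> epidemic
        print(flu_activity_level(400, 500_000))     # 80 per 100,000 -> normal seasonal activity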

        Other thresholds are used in epidemic theory. The Critical Community Size (CCS) is the total population size needed to sustain an outbreak once it has appeared, and the Outbreak Threshold is the number of infected individuals that are needed to ensure that an outbreak is unlikely to go extinct without intervention. 4

        Epidemic curves

        An epidemic curve is a graph that shows the distribution of new cases of an infectious disease according to the time of onset of illness. The time interval used for grouping cases by onset will be determined by the incubation period (see “Definitions including: incubation, communicability and latent period susceptibility, immunity, and herd immunity” in Section 2G for a definition of this and related terms).

        Epidemic curves are a useful tool in outbreak investigations, helping to:

        • Determine the type of epidemic (continuous source, point source, propagated)
        • Determine the difference between the maximum and minimum incubation period
        • Estimate the likely time of exposure, and thus help focus investigation on a particular time period
        • Determine the incubation period in cases where the time of exposure is known
        • Identify outliers (below)

        Examples of the different types of epidemic curve can be found in reference 5.
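        A minimal sketch of how an epidemic curve is constructed from a line list of onset dates; the dates are invented for illustration, and in practice the counting interval would be chosen with the incubation period in mind, as noted above.

        from collections import Counter
        from datetime import date

        # Hypothetical line list: one onset-of-illness date per case
        onsets = [date(2024, 3, d) for d in (3, 4, 4, 5, 5, 5, 6, 6, 7, 9)]

        # Tally cases per day and print a crude text histogram (the epidemic curve)
        curve = Counter(onsets)
        for day in sorted(curve):
            print(day.isoformat(), "#" * curve[day])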

        Index Case and Generation Time

        The original case of an outbreak is labelled the primary case. Secondary cases contract the infection from primary cases, tertiary cases contract theirs from secondary cases, and so on. The index case is the term given to the first recognised case, or cases, in an outbreak. Note that the index case may not turn out to be a primary case, and the primary case of an outbreak may only be identified on further investigation, if at all. The generation time describes the duration from the onset of infectiousness in the primary case to the onset of infectiousness in a secondary case (infected by the primary case).

        Exception Reporting

        Infectious disease surveillance ensures that the frequency of certain diseases or symptoms is monitored. If there is an abrupt increase in the frequency of a particular disease, outside of predefined limits, it will be flagged as an “exception”, which functions as an early indicator that further investigation is required.
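        One simple way such predefined limits might be implemented is sketched below; the choice of “historical mean plus two standard deviations” is an illustrative assumption, not a rule stated here, and real surveillance systems use more sophisticated methods.

        from statistics import mean, stdev

        def is_exception(current_count: int, historical_counts: list, n_sd: float = 2.0) -> bool:
            """Flag a weekly count that exceeds the historical mean by more than n_sd SDs."""
            limit = mean(historical_counts) + n_sd * stdev(historical_counts)
            return current_count > limit

        weekly_history = [4, 6, 5, 7, 5, 6, 4, 5]   # invented baseline weeks
        print(is_exception(14, weekly_history))     # True -> flag for further investigation
        print(is_exception(6, weekly_history))      # False -> within expected limits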

        Significant Clusters

        A cluster, or significant cluster, is an aggregation of cases related in time or place that is suspected to be greater than the number expected (although the “expected” number may not be known). The term can relate to both communicable and non-communicable disease. Clusters can be identified using spot maps (where each case is represented on a map by a coloured dot), although such maps may show apparent “clusters” in areas that are densely populated (and thus would have a higher number of expected cases). Alternatively, maps which colour areas in different shades depending on the rate of disease in each area can be used, although if the defined areas are too large, real clusters may be masked.


        13.1 Controlling Microbial Growth

        Roberta is a 46-year-old real estate agent who recently underwent a cholecystectomy (surgery to remove painful gallstones). The surgery was performed laparoscopically with the aid of a duodenoscope, a specialized endoscope that allows surgeons to see inside the body with the aid of a tiny camera. On returning home from the hospital, Roberta developed abdominal pain and a high fever. She also experienced a burning sensation during urination and noticed blood in her urine. She notified her surgeon of these symptoms, per her postoperative instructions.


        To prevent the spread of human disease, it is necessary to control the growth and abundance of microbes in or on various items frequently used by humans. Inanimate items, such as doorknobs, toys, or towels, which may harbor microbes and aid in disease transmission, are called fomites. Two factors heavily influence the level of cleanliness required for a particular fomite and, hence, the protocol chosen to achieve this level. The first factor is the application for which the item will be used. For example, invasive applications that require insertion into the human body require a much higher level of cleanliness than applications that do not. The second factor is the level of resistance to antimicrobial treatment by potential pathogens. For example, foods preserved by canning often become contaminated with the bacterium Clostridium botulinum, which produces the neurotoxin that causes botulism. Because C. botulinum can produce endospores that can survive harsh conditions, extreme temperatures and pressures must be used to eliminate the endospores. Other organisms may not require such extreme measures and can be controlled by a procedure such as washing clothes in a laundry machine.

        Laboratory Biological Safety Levels

        For researchers or laboratory personnel working with pathogens, the risks associated with specific pathogens determine the levels of cleanliness and control required. The Centers for Disease Control and Prevention (CDC) and the National Institutes of Health (NIH) have established four classification levels, called “biological safety levels” (BSLs). Various organizations around the world, including the World Health Organization (WHO) and the European Union (EU), use a similar classification scheme. According to the CDC, the BSL is determined by the agent’s infectivity, ease of transmission, and potential disease severity, as well as the type of work being done with the agent. 2

        Each BSL requires a different level of biocontainment to prevent contamination and spread of infectious agents to laboratory personnel and, ultimately, the community. For example, the lowest BSL, BSL-1, requires the fewest precautions because it applies to situations with the lowest risk for microbial infection.

        BSL-1 agents are those that generally do not cause infection in healthy human adults. These include noninfectious bacteria, such as nonpathogenic strains of Escherichia coli and Bacillus subtilis, and viruses known to infect animals other than humans, such as baculoviruses (insect viruses). Because working with BSL-1 agents poses very little risk, few precautions are necessary. Laboratory workers use standard aseptic technique and may work with these agents at an open laboratory bench or table, wearing personal protective equipment (PPE) such as a laboratory coat, goggles, and gloves, as needed. Other than a sink for handwashing and doors to separate the laboratory from the rest of the building, no additional modifications are needed.

        Agents classified as BSL-2 include those that pose moderate risk to laboratory workers and the community, and are typically “indigenous,” meaning that they are commonly found in that geographical area. These include bacteria such as Staphylococcus aureus and Salmonella spp., and viruses like hepatitis, mumps, and measles viruses. BSL-2 laboratories require additional precautions beyond those of BSL-1, including restricted access, required PPE (including a face shield in some circumstances), and the use of biological safety cabinets for procedures that may disperse agents through the air (called “aerosolization”). BSL-2 laboratories are equipped with self-closing doors, an eyewash station, and an autoclave, which is a specialized device for sterilizing materials with pressurized steam before use or disposal. BSL-1 laboratories may also have an autoclave.

        BSL-3 agents have the potential to cause lethal infections by inhalation. These may be either indigenous or “exotic,” meaning that they are derived from a foreign location, and include pathogens such as Mycobacterium tuberculosis, Bacillus anthracis, West Nile virus, and human immunodeficiency virus (HIV). Because of the serious nature of the infections caused by BSL-3 agents, laboratories working with them require restricted access. Laboratory workers are under medical surveillance, possibly receiving vaccinations for the microbes with which they work. In addition to the standard PPE already mentioned, laboratory personnel in BSL-3 laboratories must also wear a respirator and work with microbes and infectious agents in a biological safety cabinet at all times. BSL-3 laboratories require a hands-free sink, an eyewash station near the exit, and two sets of self-closing and locking doors at the entrance. These laboratories are equipped with directional airflow, meaning that clean air is pulled through the laboratory from clean areas to potentially contaminated areas. This air cannot be recirculated, so a constant supply of clean air is required.

        BSL-4 agents are the most dangerous and often fatal. These microbes are typically exotic, are easily transmitted by inhalation, and cause infections for which there are no treatments or vaccinations. Examples include Ebola virus and Marburg virus, both of which cause hemorrhagic fevers, and smallpox virus. There are only a small number of laboratories in the United States and around the world appropriately equipped to work with these agents. In addition to BSL-3 precautions, laboratory workers in BSL-4 facilities must also change their clothing on entering the laboratory, shower on exiting, and decontaminate all material on exiting. While working in the laboratory, they must either wear a full-body protective suit with a designated air supply or conduct all work within a biological safety cabinet with a high-efficiency particulate air (HEPA)-filtered air supply and a doubly HEPA-filtered exhaust. If wearing a suit, the air pressure within the suit must be higher than that outside the suit, so that if a leak in the suit occurs, laboratory air that may be contaminated cannot be drawn into the suit (Figure 13.2). The laboratory itself must be located either in a separate building or in an isolated portion of a building and have its own air supply and exhaust system, as well as its own decontamination system. The BSLs are summarized in Figure 13.3.
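        Finally, as a compact restatement of the scheme described above, the sketch below summarises the four BSL tiers as a simple lookup table; the wording condenses the text of this section and is not the CDC’s official phrasing.

        BIOSAFETY_LEVELS = {
            "BSL-1": {
                "example_agents": ["nonpathogenic Escherichia coli", "Bacillus subtilis"],
                "key_precautions": "standard aseptic technique, open bench, PPE as needed",
            },
            "BSL-2": {
                "example_agents": ["Staphylococcus aureus", "Salmonella spp."],
                "key_precautions": "restricted access, biological safety cabinet for "
                                   "aerosol-generating work, autoclave, eyewash station",
            },
            "BSL-3": {
                "example_agents": ["Mycobacterium tuberculosis", "Bacillus anthracis"],
                "key_precautions": "respirator, all work in a biological safety cabinet, "
                                   "directional non-recirculated airflow, double self-closing doors",
            },
            "BSL-4": {
                "example_agents": ["Ebola virus", "Marburg virus", "smallpox virus"],
                "key_precautions": "full-body positive-pressure suit or HEPA-filtered cabinet, "
                                   "shower on exit, dedicated air supply and decontamination systems",
            },
        }

        print(BIOSAFETY_LEVELS["BSL-3"]["key_precautions"])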