DIMACS International Conference on Computational and Mathematical Epidemiology

June 28 - July 2, 2002
DIMACS Center, Rutgers University, Piscataway, NJ

Organizers:
Simon Levin, Princeton University, slevin@eno.princeton.edu
Fred S. Roberts, Rutgers University, froberts@dimacs.rutgers.edu
Presented under the auspices of the Special Focus on Computational and Mathematical Epidemiology.

Co-sponsored by DIMACS and the Alfred P. Sloan Foundation.


Abstracts:



1.

Poster: A Mathematical Model for Lung Cancer: The Effects of
Second-Hand Smoke and Education

Carlos Acevedo-Estefania
University of Texas

In the United States, lung cancer is the leading cause of cancer
deaths. Currently, cigarette smoking causes 85 percent of lung
cancer deaths. In this study, a non-linear system of differential
equations is used to model the dynamics of a population which includes
smokers. The parameters of the model are obtained from data published
by cancer institutes, health and government organizations. The average
number of individuals who become smokers and the reduction of this
average by an education program are determined. The long-term impact
of educating a susceptible class before they enter the population
model and the effect it has on the epidemic is also studied.
Simulations using realistic parameters are carried out to illustrate
our theoretical results.

2.

Poster: A Multicity Epidemic Model

Julian Arino and Pauline van den Driessche
University of Victoria

Most epidemic models consider the spread of an epidemic in a homogeneous population. Spatial models have also been considered, in the form of PDE systems. However, in settings like Western Canada or the Canadian Arctic, such models are somewhat inadequate since the diffusive component is nonexistent. A model is presented in which n cities are linked through short-term migrations. The disease can behave differently in the different cities, and the interplay between migration and the disease characteristics is studied. The basic reproduction number R0 is computed, giving conditions under which the disease can establish itself in the city network.

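The standard next-generation-matrix computation of R0 for a multicity model of this kind can be sketched for a two-city special case; every parameter value below is an illustrative assumption, not a figure from the poster.

```python
import numpy as np

# Hypothetical two-city SIR-with-migration sketch (values are assumptions).
# beta[i]: transmission rate in city i; gamma: recovery rate;
# m[i][j]: short-term migration rate from city j to city i.
beta = np.array([0.3, 0.5])
gamma = 0.25
m = np.array([[0.0, 0.1],
              [0.2, 0.0]])

# Linearize at the disease-free equilibrium: F holds new infections,
# V holds transitions out of the infected classes (recovery + emigration,
# minus immigration of infectives from the other city).
F = np.diag(beta)
V = np.diag(gamma + m.sum(axis=0)) - m

# R0 is the spectral radius of the next-generation matrix F V^{-1}.
R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
print(R0)
```

Here migration couples the cities, so a city whose local parameters alone would be subcritical can still sustain the disease through the network.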
3.

Poster: Explaining Rain Forest Diversity: The Role of Competition

Caroline Bampfylde
Oxford University

Rain forests exhibit enormous tree species diversity, but the mechanisms for establishing and maintaining such diversity are unknown. To explore this phenomenon, we have developed model frameworks which allow us to investigate the interaction of different plausible ecological processes. We first examine a competition-colonisation model, consisting of a coupled system of non-linear ordinary differential equations. The model describes the time evolution of the population density of different tree species competing for light and space and their interactions, including recruitment, establishment, growth and death. The aim of this simple model is to try to identify the mechanisms that drive species diversity. In the second model, competition is removed entirely and tree species are assumed to colonise sites depending on whether or not they are present, rather than on their position in the competition hierarchy. Mathematical analysis shows that neither model can exhibit species diversity for realistic parameter values. Hence we need to search for an additional mechanism. Another mechanism observed in the rain forests of South-East Asia is random mast fruiting events. Tree species do not all flower and fruit annually; instead, gregarious fruiting events occur when the majority of species fruit at the same time. The frequency of the events is connected to the El Niño Southern Oscillation, which occurs about every 3-11 years. In order to incorporate this effect into the models, the colonising ability of a tree species is made to decay exponentially with time after a fruiting event. Field observations indicate that there is a trade-off between producing many fast-growing seedlings and fewer long-lived seedlings. Within this model framework, we find that the inclusion of random fruiting events is the crucial factor necessary to successfully predict species coexistence and mimic correctly the field observations of rain forest tree dynamics.

4.

Game Theory and Risk Analysis in Counterterrorism

David Banks
FDA

Traditional game theory is a poor guide to human decision-making. This talk explores ways in which the standard method can be made more realistic through the use of statistical risk analysis, different payoff tables for different players, non-minimax rules, and conditional strategies. To illustrate the ideas, we focus upon decision-making in the context of bioterrorism, and evaluate various defense strategies that arise in the context of a smallpox threat.

5.

Poster: Noisy Determinism in Childhood Diseases

Chris Bauch and David J.D. Earn
McMaster University

Understanding complex incidence patterns of childhood diseases during the twentieth century has been a major goal of mathematical modelling of epidemics. The salient features of the incidence patterns of most pertussis and rubella time series can be explained by stability analysis of the standard (unforced) SEIR model; however, this approach fails for measles and chicken pox. Conversely, the observed incidence patterns of measles and chicken pox can be explained as asymptotic states of a seasonally forced variant of the SEIR model; however, this fails for pertussis and rubella. Here we show that the seasonally forced SEIR model can explain the incidence patterns of all four diseases if an appropriate stability analysis of the model is also carried out, in addition to the asymptotic analysis. An implication is that the dynamics of all four diseases can be explained in terms of noisy deterministic cycles as opposed to noise-perturbed fixed points or chaotic effects. In general, our approach provides a quantitative method to investigate the effects of noise-sustained transient dynamics in ecological systems.

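A seasonally forced SEIR model of the kind analyzed here can be sketched numerically; the simple cosine forcing and all parameter values below are illustrative assumptions, not the authors' fitted values.

```python
import math

# Minimal Euler integration of a seasonally forced SEIR model in
# proportions (S, E, I); R is implicit. All rates are per year and
# are hypothetical round numbers, not fitted to any disease.
mu, sigma, gamma = 0.02, 365 / 8, 365 / 5   # birth/death, 1/latent, 1/infectious
beta0, alpha = 1250.0, 0.08                 # mean transmission, forcing amplitude

def step(t, S, E, I, dt):
    beta = beta0 * (1 + alpha * math.cos(2 * math.pi * t))  # annual forcing
    dS = mu - beta * S * I - mu * S
    dE = beta * S * I - (sigma + mu) * E
    dI = sigma * E - (gamma + mu) * I
    return S + dt * dS, E + dt * dE, I + dt * dI

S, E, I = 0.06, 1e-4, 1e-4
steps = 40_000
dt = 20 / steps                             # integrate 20 years
for n in range(steps):
    S, E, I = step(n * dt, S, E, I, dt)
print(S, E, I)
```

Sweeping the forcing amplitude and initial conditions in such a sketch is how the coexisting asymptotic states of the forced SEIR model are typically explored.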
6.

Poster: Relating Lattice Models to Field Data via Point-Process Models

Chris T. Bauch and Alison P. Galvani
McMaster University

Spatial effects are fundamental to ecological interactions and epidemiological processes, yet the incorporation of space into models is potentially complex. Lattice models are widely used to study spatial processes, but make unrealistic assumptions about spatial scales and structure. It can also be difficult to parameterize lattice models with empirical data. In spatial point-process models, individuals are distributed across continuous space and interact through a dispersal kernel. Thus point-process models are more realistic but also more complex than lattice models. Here we use moment closure techniques to illustrate a method which allows one to define a lattice model which is equivalent to a point-process model for predicting prevalence of an SIS epidemic. By showing that such an equivalence exists, our analysis supports the view that results from lattice models are relevant to real-world systems. Our method relies only on conventional moment closure arguments and hence should be applicable to other types of spatial point-process models as well.

7.

Chaotic Epidemic Outbreaks: Deterministic or Random?

Lora Billings
Montclair State University

Many diseases that occur in large populations tend to have oscillatory behavior, where the amplitudes of the number of cases appear to vary randomly. Examples are malaria, measles, influenza, and pertussis, just to name a few. These diseases are influenced by external environments, such as climate, as well as social factors, such as opening and closing of schools. In contrast, most deterministic population models predict regular, or periodic behavior. This talk will identify a global mechanism in a class of population models that induces chaos by stochastic perturbations, or population noise, where chaos does not naturally occur. Through a combination of computational and analytic techniques, this talk will present the necessary elements for this generic noise-induced chaotic bifurcation. Based on the apparent random mixing properties of large and small epidemic outbreaks, large outbreaks may be predicted before they occur. From such predictions arise novel pulsed vaccination strategies, which will be demonstrated to control and prevent future outbreaks.

8.

The Demos in Epidemiology: Individual-based Epidemic Models

Donald Burke (with Joshua Epstein)
Johns Hopkins University

Epidemics (literally, epi "upon" and demos "the people") are usually modeled without regard to individuals or social networks. We report here our progress in development of an individual-based ("agent-based") model of smallpox epidemiology. The model, written in Java-based software with all assumptions adjustable by the user, is designed to study the dynamics of the introduction and subsequent epidemic spread of the virus in human societies, and to examine the impact of public health interventions such as vaccination, isolation, and/or quarantine. In a typical model run, an infected individual is introduced into a social meta-network of towns, with each town in turn composed of multiple household, workplace, and school sub-networks. Town networks are interconnected by hospitals and by a variable number of "commuters." In the model, infected individuals come into contact with other individuals in their own networks, and newly infected persons progress from asymptomatic infection, to symptoms and transmissibility, then death or recovery. Evolving epidemics are displayed on the monitor screen as infected individuals change colors as they become infected and progress through the illness. Simultaneously, population-level statistical analyses (numbers infected, dead, etc.) are continuously calculated and displayed. Basic model parameters are assigned and calibrated from data from known epidemic introductions into Europe and the USA in the era before smallpox eradication. Simulated epidemics in the model are highly stochastic, with some introductions failing to progress beyond one or a few cases, and others progressing to affect most of the population. Preliminary results suggest that pre-emptive "voluntary" vaccination, implemented before the virus is introduced, can have preventive effects at three levels: (1) the expected protection of vaccinated individuals, (2) a predictable additional protection of some portion of the unvaccinated individuals through "herd immunity", and (3) protection of the entire population through an increased probability - albeit highly stochastic - of abortive epidemics. We also demonstrate that reactive vaccination strategies targeted to hospital personnel can markedly attenuate epidemics, and that simple isolation strategies can also have profound epidemic-attenuating effects. We propose that individual-based computational simulations can provide a powerful tool for exploration of public health policy options.

9.

Models for the Transmission of Cultural Traits and Their Impact on Cultural Norms: The Case of Terrorists

Carlos Castillo-Chavez
Cornell University

Epidemiological approaches are used to model the impact of peer pressure on individual behavior at the population level. Results are used to suggest a series of questions and approaches where mathematics can be used to reduce the impact of extreme groups.

10.

Poster: Measured Response: A Homeland Security Simulation (Joint work with Shailendra Mehta)

Alok Chaturvedi
Purdue e-business Research Center, Purdue University

Measured Response (MR) is a Homeland Security simulation developed at Purdue University. MR is built on the innovative Synthetic Environment for Analysis and Simulation (SEAS) technology. SEAS allows the creation of fully functioning synthetic economies that mirror the real one in its key aspects. SEAS simulations are based on traditional military war-gaming and enable the participants to see the consequences of their decisions and actions in real time. Measured Response is a scenario of a bio-terrorist attack in a mid-western city. In this scenario, several hundred thousand artificial agents mimic the behavior of the citizens of the US. Over a dozen human players make decisions representing the various government agencies at the local, state, and federal levels such as the Office of Homeland Security, Health and Human Services, the Department of Transportation, CDC, FBI, DoD, the Coast Guard, and the National Guard. Citizens' interest groups such as the Red Cross and the private sector are also included. The scenario is calibrated with real data to allow the participants to identify and act upon key issues relating to preparedness, coordination, response and recovery. Players make decisions and test their effectiveness in an environment where there is no fear of adverse consequences. As a result, they gain a better understanding of crisis situations and learn how to prioritize, communicate, delegate and coordinate various actions. The simulation also showcases several technologies. It provides an example of the power and flexibility of the SEAS simulation platform. The artificial agents run on a distributed tera-scale grid computing environment comprising IBM SP2 supercomputers at Purdue and Indiana Universities that are connected by the I-Light Gigabit network. Wireless handheld devices allow the human players to interface with the environment while being fully mobile. High-resolution graphics displays allow the participants to obtain a high-level overview as well as a detailed account of the data generated during the exercise. The actual simulation as well as the workshops structured around it help contribute to the dialogue currently underway that seeks to identify key issues in the area of Homeland Security pertaining to both research and practice, in the presence of key participants from various branches of government, academia, corporations and funding agencies. This research is funded by the National Science Foundation and the Indiana State Research and Technology Fund.

11.

Possible Mechanisms in the Evolution of Influenza A

Freddy Bugge Christiansen
University of Aarhus, Denmark

Influenza A renews its virulence through antigenic drift, where major antigens of the virus change through point mutations, and through occasional antigenic shifts, where parts of the viral genome are exchanged through reassortment with virus from another species. During periods of antigenic drift, prevailing variants of influenza A are very closely related, joined by a recent common ancestor. A related phenomenon characterizes antigenic shifts, in that the novel type prevails and the old type goes extinct. Drift substitutions may be caused by cross-immunity among contemporaneous viruses, and I suggest a similar explanation for the interaction between the old and the new variants in an episode of antigenic shift.

12.

FMD 2001: Using Statistics and Mathematics for Outbreak Control and Eradication

Christl Donnelly
Imperial College

A statistical analysis of the recent foot and mouth disease epidemic in Great Britain is presented. A mathematical model of disease transmission that captures the differing spatial contact patterns between farms before and after the imposition of movement restrictions is used to estimate parameters, make predictions of future incidence and simulate the impact of control strategies. The talk will describe the political context as well as scientific issues surrounding this work.

13.

Group Testing in Medical Examination

Ding-Zhu Du
University of Minnesota

Group testing was originally motivated by a possible application to blood testing. However, this application has been realized in the real world only recently. Due to the fast spread of HIV, blood testing of large populations is required in many countries. Group testing has been suggested for this purpose, with a restriction on the size of groups. In this talk, a survey of this subject will be given.

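The classical two-stage (Dorfman) scheme behind pooled blood testing can be sketched as follows; the prevalence value and the pool-size search range are assumptions chosen for illustration, and the talk's size-restricted variants are not modeled.

```python
# Dorfman two-stage group testing: test pools of size g, then retest
# every individual in a positive pool. With prevalence p, the expected
# number of tests per person is 1/g + 1 - (1-p)^g.
p = 0.01                     # assumed prevalence of infection

def expected_tests_per_person(g, p):
    # one pooled test shared by g people, plus g retests if the pool is positive
    return 1 / g + 1 - (1 - p) ** g

best = min(range(2, 51), key=lambda g: expected_tests_per_person(g, p))
print(best, expected_tests_per_person(best, p))
```

At 1% prevalence the optimal pool size is around 11, cutting the number of tests to roughly a fifth of individual testing; a cap on group size simply restricts the range of the minimization.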
14.

Poster: The Transmissibility of Pneumonic Plague

Ray Gani and Steve Leach
The Centre for Applied Microbiology & Research (CAMR)

Infection with Yersinia pestis, the causative agent of plague, is considered to pose a potential threat to civilian populations through an aerosolised release by bioterrorists or through importation from plague endemic areas. Y. pestis is primarily transmitted to humans through the bites of fleas that have previously bitten plague-infected rodents, which may lead to bubonic plague. However, if the lungs become infected, Y. pestis has the potential to be transmitted person-to-person as pneumonic plague. There has been much speculation on the transmissibility of pneumonic plague, with qualitative estimates varying widely. Here we present a quantitative assessment of the transmission rate for primary pneumonic plague based on historical data, and conclude that on average, one would expect 0.4 secondary cases from each primary case.

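In a simple branching-process reading of this estimate, a mean of 0.4 secondary cases per primary case is subcritical, so introductions produce small, self-limiting outbreaks with expected total size 1/(1 - R), a standard identity. Only R = 0.4 is taken from the poster; the Poisson offspring distribution and the simulation are our assumptions.

```python
import math, random

random.seed(1)

def poisson(lam):
    # Knuth's method, adequate for small lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def outbreak_size(R):
    # total progeny of a branching process started by one primary case
    total = active = 1
    while active:
        new = sum(poisson(R) for _ in range(active))
        total += new
        active = new
    return total

R = 0.4
mean_size = sum(outbreak_size(R) for _ in range(20000)) / 20000
print(mean_size)     # close to 1/(1 - R) = 1.67 cases per introduction
```

The simulation also shows the heavy stochastic spread around that mean: most introductions stop at a single case, while a few chains run several generations.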
15.

Variability, Invasion and Persistence of Crop Disease in the Landscape

Chris Gilligan
University of Cambridge

Agriculture is changing fast and with it the landscape through which disease spreads. The change is stimulated by vast investment in molecular biology to introduce novel crops, as well as changing economic and environmental pressures leading to a bimodality in enterprise size with a mosaic of large-scale intensive growers interspersed with islands of low-input crop production. The geometry, size, diversification, cropping patterns and pesticide regimes of farms within the landscape are evolving to systems in which the epidemiological risks are untested but the consequences of invasion by a virulent parasite or a pesticide resistant form are severe. Starting with a brief summary of current threats from plant disease to agricultural production and semi-natural communities, I shall present a theoretical framework for the dynamics of crop disease that extends from the behaviour in individual fields through coupled fields and farms via metapopulations up to the regional invasion of new diseases. Some of the features that characterise the similarities and differences in the dynamics between animal and crop disease will be presented. These include: 1) the balance between primary and secondary infection in quenched epidemic systems; 2) temporal heterogeneities associated with pulsed disturbance as crops are sown and harvested; 3) spatial heterogeneity in the landscape; 4) the dynamics of control and containment policies including biological control; and 5) transient dynamics and the evolution of variability in epidemics. Some areas for future work at the interface between experimentation, modelling and parameter estimation for the analysis of disease risk will be identified.

16.

En Route to Reliable Policymaking Tools: Mathematical Models as Hypotheses (Joint work with C.E. Le Baron, R.L. Berkelman and B. Schwartz)

John Glasser
CDC

Public health policymakers cannot identify optimal strategies by experimenting with human populations. Thus, at the Centers for Disease Control and Prevention, models are assisting increasingly in the design, evaluation and improvement of health policy. En route to reliable policymaking tools, mathematical epidemiologists must capture pathophysiology, estimate parameters, and replicate historical observations, ideally in disparate settings. But problematic diseases often are poorly understood, with published results and informed opinions inconsistent, if not contradictory. Crucial observations may be unavailable, and rarely can well-designed studies be executed before decisions must be reached. Moreover, existing observations generally are required for parameter estimation, leaving a paucity of information for model validation. In such circumstances, scientists learn to frame hypotheses and seek natural or design artificial experiments capable of disproving them. Hypotheses found wanting are revised and reevaluated or abandoned in favor of alternatives, which are evaluated in turn; others are scrutinized more closely. But mathematics is more explicit than other languages available for theorizing, and myriad quantitative methods are available for analyzing and simulating equations. These attributes of mathematical models, in turn, transform the ethical constraints that prevent medical epidemiologists from experimenting with human populations into assets insofar as they motivate realistic modeling instead. The arbitrary precision of quantitative results facilitates evaluation, and causes of disparate predictions are relatively easily diagnosed and remedied, whereupon improved models can be reevaluated at rates that would leave traditional experimentalists breathless. But, to increase our understanding of natural phenomena, model states and functional relations must correspond to the elements and processes responsible. We will illustrate this theoretical approach to such vaccine-preventable diseases as measles, pertussis, rubella, and smallpox or, simultaneously, varicella and zoster, which we have modeled realistically to design, evaluate and improve vaccination policy at home and abroad.

17.

Data Privacy and Epidemiological Research

Harry Guess
Merck

In this talk, I will discuss how current data privacy regulations affect the ability to conduct epidemiological research using large medical record systems. Included in the talk will be a brief discussion of the implications of these issues for mathematical research.

18.

New Vaccination Strategies for Pertussis

Herbert Hethcote
University of Iowa

Both disease-acquired and vaccine-acquired immunity to pertussis (whooping cough) wane with time, so that several infections can occur in an individual's lifetime. The severity of a pertussis infection depends on how much the immunity has declined since the previous vaccination, infection, or exposure. In the United States five DTaP (diphtheria-tetanus-acellular pertussis) vaccinations are recommended at ages 2, 4, 6, 15-18 months, and 4-6 years. The new acellular pertussis vaccine (aP) has fewer side effects, so that it is safe for adults. New strategies involving aP vaccinations of adolescents and adults are being considered that will reduce pertussis incidence in infants, who have more complications and deaths. These strategies include combining the aP vaccine with the current Td (tetanus-diphtheria) booster that is now recommended every ten years, giving the aP vaccine to adolescents at age 12 years or to young adults at age 20 years, and vaccinating families of newborn infants. The effects of these new vaccination strategies are analyzed using computer simulations of an age-structured vaccination model.

19.

The Effects of Spatial Scale and Spatial Clumping in the Infection Process on the Spread of Macroparasites

Valerie Isham
University College, London

For mathematical modellers, understanding the effects of spatial structure on the transmission dynamics of infectious diseases and making appropriate allowance for this structure in their models represents an important challenge. In this talk, I will describe some joint work with Stephen Cornell and Bryan Grenfell (Department of Zoology, University of Cambridge) in which the focus is on macroparasitic infections within a managed animal population, illustrated by gastrointestinal nematodes in a herd of sheep. In this context, spatial effects are caused by the spatial scale of the system (represented by the size of the host population) which has significant effects over and above host density, the spatial clumping of the infecting parasites, and the need for the parasite to mate within the host in order to reproduce. Such spatial structure will be shown to have important influences on the persistence/extinction of a parasite population and the enhanced invasion of treatment-resistant strains. These questions will be addressed through the use of a stochastic model representing the physical processes involved, as well as two simplified generic metapopulation models that seek to focus on particular aspects of the process, using a combination of analytic techniques and simulation.

20.

Medical Expenditures During the Last Year of Life: Findings from the 1992-96 Medicare Current Beneficiary Survey

Donald Hoover
Rutgers University

Medical expenditures in the last year of life for the elderly (>=65 years old) have recently become of great concern due to the potential that large amounts of money are being spent on hopelessly ill people. But estimation of these costs is complicated by the fact that most sources aggregate and report costs by calendar year. Unless the person dies on December 31st of the year, the total costs in the last year of life are not known. Estimation is further complicated by the fact that most available samples consist of individuals sampled through a complex hierarchical scheme. To overcome these problems, we estimated mean medical expenditures for the last year of life using robust covariance polynomial models, applied to the 1992-96 Medicare Current Beneficiary Survey (MCBS), which collects data from about 10,000 elderly persons each year, about 5% of whom die in the calendar year. Based on these estimated means and the proportions of persons dying in a calendar year, we were then able to estimate the percentage of total medical costs devoted to the last year of life and the mean costs for persons not in the last year of life. Variances and 95% confidence intervals were obtained through conservative application of the delta method. From 1992-96, mean annual medical expenditures (1996 dollars) for persons 65 and older were $37,581 during the last year of life versus $7,365 for non-terminal years. Mean total last-year-of-life expenditures did not differ greatly by age at death. However, non-Medicare last-year-of-life expenditures were higher and Medicare last-year-of-life expenditures were lower for those dying at older ages. Last-year-of-life expenses constituted 22% of all medical, 26% of Medicare, 18% of all non-Medicare and 25% of Medicaid expenditures. While health services delivered near the end of life will continue to consume large portions of medical dollars, the portion paid by non-Medicare sources will likely rise as the population ages. Policies promoting improved allocation of resources for end-of-life care may not affect non-Medicare expenditures, which disproportionately support chronic and custodial care.

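The reported cost shares can be roughly checked from the abstract's own figures; the arithmetic below ignores survey weighting and the delta-method variance work, so it is only a back-of-the-envelope check.

```python
# Share of total elderly medical spending that falls in the last year of
# life, using the abstract's figures: ~5% of the elderly die in a given
# calendar year, with mean spending $37,581 (last year of life) versus
# $7,365 (non-terminal years).
p_die = 0.05
last_year, other = 37581, 7365

share = p_die * last_year / (p_die * last_year + (1 - p_die) * other)
print(round(share, 3))   # ~0.21, in line with the reported 22%
```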
21.

Poster: Network-based Infectious Disease Epidemiology Modeling

Matthew N. Johnson and J. Randall Good
Institute for Defense Analyses

Like much of public health practice, computer models of infectious disease (ID) epidemiology are based on populations and non-specific interactions between members of that population. The output of these models is a temporal estimation of the gross morbidity and mortality resulting from the disease. These models fail to provide spatial representation of the disease spread or the likelihood of disease penetration to specific subpopulations. We have developed a new computational tool for the modeling of ID epidemiology that integrates network analysis, infectious agent and disease characteristics, and real world intelligence. The output of this model is the development of a comprehensive knowledge base that can be further analyzed by medical, scientific, and public health professionals. Future versions of the model will have refined identity characteristics for the subpopulations and individuals as well as integrated disease interactions and environmental factors.

22.

Modeling Bioterror Response Logistics: The Case of Smallpox (Joint presentation with David Craft and Larry Wein)

Edward Kaplan
Yale University

To evaluate existing and alternative proposals for emergency response to a deliberate smallpox attack, we embed the key operational features of such interventions into a smallpox disease transmission model. Such modeling highlights the importance of variables such as the number of personnel available for contact tracing and vaccination, the rates with which the population can be vaccinated, and the accuracy of contact tracing in addition to standard epidemiological parameters such as the reproductive rate of infection and disease progression rates. We explicitly model the tracing/vaccine queues and quarantine requirements that would result from the existing policy for smallpox response, in addition to the number of smallpox cases, deaths, and persons vaccinated that result from this and alternative proposals. The use of probabilistic reasoning within an otherwise deterministic epidemic framework is featured throughout.

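The central role of tracing capacity can be illustrated with a minimal fluid-queue sketch: contacts arrive faster than a fixed staff can trace them, and a backlog builds. The growth rate, contacts per case, and daily tracing capacity below are hypothetical values, not parameters from the study.

```python
# Fluid-queue sketch of a contact-tracing backlog. Cases grow
# exponentially (hypothetical 10%/day); each case generates 50 contacts
# (assumed); staff can trace at most `capacity` contacts per day.
growth_rate = 0.1        # per day
contacts_per_case = 50
capacity = 2000          # contacts traceable per day

cases, queue = 1.0, 0.0
for day in range(60):
    cases *= 1 + growth_rate
    arrivals = cases * contacts_per_case
    traced = min(queue + arrivals, capacity)
    queue = queue + arrivals - traced
print(round(queue))      # untraced-contact backlog after 60 days
```

The backlog stays at zero for weeks and then grows explosively once daily arrivals cross capacity, which is why staffing levels enter the model on an equal footing with epidemiological parameters.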
23.

Epidemiology of Antimicrobial Resistance: Models, Data, and Questions

Marc Lipsitch
Harvard University

The increasing prevalence of resistance to antimicrobial agents (antibacterial, antiviral and antiparasitic drugs) represents a growing threat to the efficacy of treatment in a wide range of infections, including HIV, malaria, tuberculosis, hospital-acquired infections, and a number of community-acquired bacterial infections. Mathematical models are one useful tool in the effort to understand why resistance is increasing, why it is increasing faster in some pathogen/drug combinations than others, and (in some cases) how much further increase we can expect in the near future. Models can help to elucidate mechanisms for observations about antimicrobial resistance at the individual patient and group level that may be otherwise difficult to understand. In addition, models are useful tools to "test" potential interventions to reduce resistance (such as reduced antibiotic use, improved treatment compliance, or infection control measures) in a wider range of settings than are practical for separate clinical trials, to set mechanistically based criteria for the success and failure of interventions, and to predict whether the success or failure of a particular intervention in one setting is generalizable to other settings. Finally, stochastic models of transmission of resistant pathogens can be useful to generate simulated data for the evaluation of study designs for measuring the effects of interventions to control resistance. These roles of models will be illustrated with examples from hospital- and community-acquired infections, with emphasis on (1) the relationship between individual-level measures and group (population- or unit-wide) measures of risk of resistant infections, (2) the differences in the dynamics of resistance in hospital- and community-acquired infections, and (3) the challenges of study design for evaluating interventions to control resistance when, because of transmission, patients do not represent independent outcomes.

24.

Text Categorization in the Health Sciences: A Review and Some New Results (joint work with David D. Lewis)

David Madigan
Rutgers University

Text categorization concerns the assignment of documents to predefined categories. Traditionally, librarians and human indexers have carried out such categorization tasks, sometimes on a large scale. For example, the US National Library of Medicine engages over 100 human indexers to assign medical subject headings to 400,000 medical articles a year. Applications such as e-mail filtering, pornography detection, medical coding, and news filtering are creating a growing demand for automated text categorization, especially for categorization algorithms that can learn from examples. The statistical challenges revolve around issues of scale - the number of predictor variables can run to the tens of thousands - and model structure. In recent empirical evaluations, support vector machines and boosting algorithms have overtaken more traditional probabilistic classifiers like Naive Bayes. This talk will describe these approaches. The talk will also present tractable variants of the probabilistic approach that perform well predictively.

25.

Poster: Diseases with Chronic Stage in a Population with Varying Size

Maia Martcheva
Polytechnic University

An epidemiological model of hepatitis C with a chronic infectious stage and variable population size is introduced. A non-structured baseline ODE model which supports exponential solutions is discussed. The normalized version, where the unknown functions are the proportions of the susceptible, infected, and chronic individuals in the total population, is analyzed. It is shown that sustained oscillations are not possible and the endemic proportions either approach the disease-free or an endemic equilibrium. The expanded model incorporates the chronic age of the individuals. Partial analysis of this age-structured model is carried out. The global asymptotic stability of the infection-free state is established, as well as local asymptotic stability of the endemic non-uniform steady state distribution under some additional conditions. A numerical method for the chronic-age-structured model is introduced. It is shown that this numerical scheme is consistent and convergent of first order. Simulations based on the numerical method suggest that in the structured case the endemic equilibrium may be unstable and sustained oscillations are possible. A closer look at the reproductive number reveals that treatment strategies directed towards speeding up the transition from acute to chronic stage in effect contribute to the eradication of the disease.

26. Strategic Concerns in Malaria Control F. Ellis McKenzie NIH In a series of discrete-event models of Plasmodium falciparum transmission dynamics, spontaneous local extinctions of the parasite sometimes occurred under steady, perennial-transmission conditions, but only when the duration of human infection-blocking immunity was set at its maximum value, and, simultaneously, vector survivorship and the duration of human infectivity were set at their minimum values; extinctions increased with seasonal transmission, and decreased with the emergence of recombinant genotypes. This talk will examine the combined influences of seasonality, genotype cross-reactivity, meiotic recombination, and human population turnover on parasite persistence.
27. The Economics of Planning and Preparing for Bioterrorist Events and the Next Influenza Pandemic Martin Meltzer CDC, Office of Surveillance, Office of the Director Recently it has been determined that there is a credible risk of smallpox being used as a biological weapon. There is, then, a need to plan, prepare and practice preparations to limit and prevent the spread of smallpox after a deliberate release. Planning is done by a wide variety of organizations, each with differing data needs. I will briefly present 3 simple mathematical models that emphasize key concepts that planners need to concentrate on when considering options for response to a bioterrorist event. The first model is a risk-benefit model analyzing the value of pre-exposure smallpox vaccinations, the second examines the economics of quarantine, and the third examines the optimization of logistics to maximize the number of persons given pre- or post-exposure prophylaxis. The results from these models will then be compared and contrasted with 2 previously published economic models examining the economics of responding to an anthrax attack and planning for the next influenza pandemic. The overall conclusion from these models is that simple models that focus on a limited number of questions can provide policy makers with valuable information. The simplicity of the models also helps reduce the "black box" syndrome, allowing users to readily appreciate what the models do and don't analyze. This can start a dialog between modelers and policy makers, allowing the introduction of more and more sophisticated models into the decision-making process.
28. Small Worlds and Giant Epidemics Denis Mollison Heriot-Watt University, Edinburgh Key problems for models of disease spread relate to threshold, velocity of spread, final size and control. All of these depend crucially on the network structure of individual interactions. Networks of interest range from the local extreme, where interactions are only between nearest neighbours in some low-dimensional space, to the infinite-dimensional `mean-field' extreme, where all interact equally with all. Intermediate cases of practical interest include `small-world' and meta-population models. I shall discuss the various structures of such models, their similarities and differences, and some approximations to them. The main aim is to identify what features of contact structure need to be captured when formulating a model for any specific problem of disease spread.
29. Towards a Theoretical (and Practical) Framework for Prodromic Surveillance (Joint work with Adam Karpati) Farzad Mostashari NYC Department of Health When syndromic surveillance seeks to detect an increase in mild symptoms that presages an outbreak of potentially fatal illness, this can be termed "prodromic surveillance"; it has particular applicability to the timely detection of new and emerging infectious diseases, whether naturally occurring or intentional. Based on our experience with developing, implementing, and evaluating prodromic surveillance for biologic terrorism over the last 3 years, we conclude that:
  1. Prodromic Surveillance is only one component of bioterrorism surveillance
    • Prodromic surveillance aims to identify population-level increases in mild illness, not one or several cases of unusual severe illness.
  2. Data sources must be carefully considered. Many possibilities have been examined (e.g., pharmaceutical sales, school and work absenteeism, nurse's hotline calls, ambulance dispatches, emergency department visits). Ideal characteristics include:
    • Routinely collected for other purposes, imposing no additional burden on data collectors.
    • "Syndromic" can be categorized into illness categories (e.g. respiratory)
    • Includes geographic information (e.g., zip code).
    • Secure and timely electronic transmission is possible.
  3. Outbreak detection algorithms are needed that are epidemiologically informed and statistically sound, such that:
    • We move away from "the eyeball".
    • Algorithms detect clustering in space as well as time.
    • Different analytic approaches are used, depending on potential confounders and the availability of baseline data.
    • Alarm thresholds are set according to response and investigation capacity.
  4. Prodromic surveillance must be complemented by a rapid epidemiologic response capability:
    • The health department must be able to support investigations 365 days/year.
    • Statistical and epidemiologic clues must be used to distinguish between true increases in illness and natural variability.
    • We must be able to quickly move from syndromic "signal" to diagnosis.
  5. Validation of syndromic surveillance systems requires mathematical modeling and simulated outbreaks:
    • In the absence of large-scale bioterrorist attacks, validation of these systems is difficult. Approaches must include testing against naturally occurring outbreaks (e.g., influenza or gastrointestinal outbreaks) but also use of standardized validation datasets "spiked" with simulated outbreaks in time and space.
    • Mathematical and agent-based modeling of a variety of bioterrorist release scenarios must be considered.
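Point 3 calls for detection algorithms that move beyond "the eyeball". One standard purely temporal choice is a one-sided CUSUM on standardized daily counts; the sketch below is a generic illustration, not the NYC system's algorithm, and the visit counts, baseline window, and thresholds are invented.

```python
import statistics

def cusum_alarms(counts, baseline_days=14, k=0.5, h=4.0):
    """Flag days where a one-sided CUSUM of standardized daily counts
    exceeds threshold h (both k and h are in baseline standard deviations)."""
    base = counts[:baseline_days]
    mu = statistics.mean(base)
    sd = statistics.stdev(base) or 1.0
    s, alarms = 0.0, []
    for day, c in enumerate(counts[baseline_days:], start=baseline_days):
        z = (c - mu) / sd
        s = max(0.0, s + z - k)   # accumulate only excess above mu + k*sd
        if s > h:
            alarms.append(day)
            s = 0.0               # reset after sounding an alarm
    return alarms

# Simulated respiratory-syndrome visit counts with a spike on days 17-19
counts = [20, 22, 19, 21, 20, 23, 18, 21, 20, 22, 19, 21, 20, 22,
          21, 20, 23, 35, 38, 41]
print(cusum_alarms(counts))  # → [17, 18, 19]
```

In practice the alarm threshold `h` would be tuned to the response capacity noted in point 3 (a lower `h` detects earlier but triggers more field investigations), and a spatial scan would complement this purely temporal view.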

30. Mathematics and Epidemiology: Friends but not Intimate David Ozonoff Boston University "Modern" mathematics and epidemiology grew up during the same time period (mid-19th century to the present) but, with some exceptions, have influenced each other very little. This talk will discuss some highlights of this arms-length relationship, distinguishing various historical strands (statistical applications, dynamical systems/demography, morphology/biophysics), and look ahead to how the many branches of mathematical thought might become more engaged with, interested in, and useful to epidemiology, and how epidemiologists might begin to realize the utility of mathematics to their discipline and even present new and interesting problems with mathematical content.
31. The Use of Interpoint Distances in Biosurveillance Data (Joint work with M. Bonnetti, K.D. Mandl, K. Olson, and B. Reis) Marcello Pagano Harvard University This talk describes the analysis of data obtained from an existing surveillance network which virtually integrates multiple hospital emergency department databases in real time. This network provides a real-time picture of regional population patterns of disease. Current surveillance methods are extended to incorporate the use of interpoint distances to study these patterns and to increase the power to detect an abnormal outbreak.
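The idea behind interpoint-distance surveillance can be sketched as comparing the distribution of all pairwise distances among case locations against a baseline: a cluster of cases produces an excess of short distances. This is a simplified illustration of the general idea, not the authors' statistic, and the case coordinates below are invented.

```python
import itertools
import math

def interpoint_distances(points):
    """All pairwise Euclidean distances among case locations."""
    return [math.dist(p, q) for p, q in itertools.combinations(points, 2)]

def distance_histogram(points, bins, max_d):
    """Proportion of pairwise distances falling in each of `bins` equal bins
    over [0, max_d]."""
    d = interpoint_distances(points)
    hist = [0] * bins
    for x in d:
        hist[min(int(x / max_d * bins), bins - 1)] += 1
    n = len(d)
    return [h / n for h in hist]

# Invented (x, y) case locations: a scattered baseline week vs. a week
# containing a tight cluster of four cases near (3, 3)
baseline  = [(1, 1), (4, 5), (7, 2), (2, 6), (8, 8), (5, 1)]
clustered = [(3, 3), (3.2, 3.1), (2.9, 3.3), (3.1, 2.8), (8, 8), (5, 1)]

hb = distance_histogram(baseline, 5, 12.0)
hc = distance_histogram(clustered, 5, 12.0)
print(hb[0], hc[0])  # excess mass in the shortest-distance bin signals clustering
```

A test statistic can then be built on the discrepancy between the two histograms; comparing whole distance distributions rather than raw counts is what lets such methods pick up spatial clustering that a purely temporal count series would miss.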
32. Modeling Influenza Infection and Vaccination Alan Perelson Los Alamos This talk will focus on the dynamics of influenza infection and the effects of vaccination. I will present data obtained during experimental influenza infection of humans and evaluate how well various dynamic models can explain the data. I will then present an agent-based simulation model of the humoral response to influenza and influenza vaccination, and use this model to illustrate how we can understand the potential interference effects that arise from repeated vaccination, a phenomenon called "original antigenic sin". Lastly, I will use this model to illustrate how one can develop vaccination strategies that minimize interference effects and hence improve vaccine efficacy in high-risk groups that receive annual influenza vaccination.
33. Aspects of the Ecology and Evolution of Influenza A Jonathan Dushoff Princeton University Simon Levin Princeton University Joshua Plotkin Institute for Advanced Study Continual mutations to the hemagglutinin gene of influenza A generate novel antigenic strains that cause annual epidemics. Because of interactions among these strains, an understanding of influenza dynamics must address both its evolution and its epidemiology. In this talk we summarize some outstanding questions about influenza dynamics, describe an empirical study of flu's genomic evolution, and discuss some current approaches to modelling multi-strain interactions. Using a database of 560 viral RNA sequences, we have studied the time series of hemagglutinin evolution over the past two decades. We detect a critical length scale, in amino-acid space, at which viral sequences aggregate into clusters. We compare the spatio-temporal distribution of viral clusters to the flu vaccines recommended by the World Health Organization. We also investigate the relationship between cluster structure and the antibody-combining regions of the hemagglutinin protein. In light of these empirical results, we discuss results from a family of models for multistrain evolution and epidemiology. We contrast diffusion models with stochastic, individual-based models that allow for the integration of multiple epitopes and can produce phylogenetic patterns typical of influenza.
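Clustering sequences at a length scale in amino-acid space can be illustrated with single-linkage clustering under Hamming distance: two sequences share a cluster if a chain of pairs, each within the cutoff, connects them. This sketch uses a union-find structure and invented toy fragments, not the study's actual 560-sequence HA data.

```python
def hamming(a, b):
    """Number of positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def single_linkage_clusters(seqs, cutoff):
    """Group sequence indices: i and j share a cluster if linked by a chain
    of pairs each within `cutoff` substitutions (union-find)."""
    parent = list(range(len(seqs)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            if hamming(seqs[i], seqs[j]) <= cutoff:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(len(seqs)):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Toy epitope-region fragments (hypothetical strings, not real HA sequences)
seqs = ["KGSTYPVL", "KGSTYPVI", "KGSTFPVI", "RESNHPVL", "RESNHPVM"]
print(single_linkage_clusters(seqs, 1))  # → [[0, 1, 2], [3, 4]]
```

Sweeping the cutoff and watching when the cluster structure changes abruptly is one simple way to expose a critical length scale of the kind the abstract describes.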
34. Computational and Mathematical Epidemiology: Challenges Fred S. Roberts DIMACS, Rutgers University This talk will outline the plans for the DIMACS Special Focus on Computational and Mathematical Epidemiology and describe some of the challenges for research.
35. Optimal-Regime Estimation James Robins Harvard School of Public Health A physician must dynamically choose a treatment strategy for his HIV-positive patients, in the sense that he must, at monthly visits, choose the drugs and dosages to prescribe based on each patient's past clinical, laboratory, and treatment history. I will describe recent methods developed for estimation of the optimal treatment strategy from either observational or randomized trial data. I will show that this method can also provide an improved solution to the high-dimensional Markov decision process problems studied in the engineering, artificial intelligence, and operations research literature.
36. Adaptive Design of Urban Malaria Control Programs: A case study of Dar es Salaam, Tanzania Burton Singer Princeton University Successful malaria control programs in the past have, virtually without exception, involved the use of multiple interventions applied simultaneously. Adaptive tuning of the package of interventions over time was an essential feature of program implementation that ultimately led to substantially reduced incidence and prevalence rates. Urban malaria poses special problems for control because of the extensive ecological variation within cities and the fact that the process of urbanization itself has countervailing influences on malaria risk over time. We review a successful urban malaria control program from the British colonial period in Northern Rhodesia. The essential features of this program are formalized as an adaptive optimization problem. This structure is then used in a spatially explicit adaptive design for malaria control in contemporary Dar es Salaam, Tanzania. The multiple conflicting "success" criteria that arise in formally evaluating such programs are emphasized.
37. Phylogenetics and its Role in Epidemiology Mike Steel University of Canterbury, New Zealand Since Charles Darwin, biologists have been using trees to represent the historical relationships between different species. As the data and analysis have become more sophisticated (and molecular-based), so too has the underlying mathematical and computational theory. The field of 'phylogenetics' is now a flourishing area of interaction between at least four disciplines: mathematics, statistics, computer science and biology. The techniques developed have been applied in numerous biological arenas, including epidemiology - for example, to classify and study the origin and evolution of rapidly evolving viruses. Phylogenetics is now an important tool to an investigator interested in such questions as the origin, diversity and evolution of HIV, or the selection of influenza strains for future vaccines. However, some of the particular features of epidemiological data suggest that new techniques and approaches should be developed. In this overview talk I will briefly describe some of the key concepts in phylogenetics, and how they have been usefully applied in epidemiology. I will also mention some of the central mathematical and computational challenges that have arisen.
38. Poster: A Model Describing the Evolution of West Nile-like Encephalitis in New York City Diana Thomas Montclair State University Encephalitis is a viral disease carried by mosquitoes and transmitted to humans and birds. Mosquitoes and birds do not show any signs of the virus, and the main life cycle is carried between the two. Humans contract the virus from infected mosquitoes, and the results can be deadly. In the summer of 1999, New York City and the surrounding area were struck with a West Nile strain of encephalitis. We develop a difference equation model describing the evolution of the virus. The model incorporates a control variable that accounts for pesticide sprayed to influence mosquito populations in New York City. Using the theory of asymptotically autonomous difference equations, we arrive at parameter calculations that will eradicate the disease.
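A difference-equation model with a spraying control can be sketched as follows. This is a hypothetical toy system, not the authors' model: it tracks mosquito density M, the infected fraction of mosquitoes m, and the infected fraction of birds b, with the control u giving the fraction of mosquitoes killed per time step; all parameter values are invented for illustration.

```python
def simulate(u, steps=500):
    """Iterate a toy discrete-time mosquito-bird transmission model.
    u in [0, 1) is the fraction of mosquitoes killed per step by spraying.
    Returns the infected fraction of birds after `steps` iterations."""
    M, m, b = 1.0, 0.01, 0.01      # mosquito density, infected fractions
    r, mu_M = 1.0, 0.2             # mosquito recruitment and natural death
    a, d = 0.2, 0.1                # bird-to-mosquito transmission, mosquito turnover
    c, g = 0.05, 0.05              # mosquito-to-bird transmission (per density), bird recovery
    for _ in range(steps):
        M = (1 - mu_M) * (1 - u) * M + r        # spraying depresses mosquito density
        m = m + a * (1 - m) * b - d * m
        b = b + c * M * m * (1 - b) - g * b
    return b

no_spray = simulate(0.0)
spray = simulate(0.5)
print(round(no_spray, 3), round(spray, 3))
```

In a model of this shape, the disease-free state becomes asymptotically stable once the control holds mosquito density below a threshold, which is the kind of parameter calculation for eradication the abstract refers to.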
39. Understanding Disease Clusters Daniel Wartenberg UMDNJ-Robert Wood Johnson Medical School The investigation of disease clusters, aggregations of a few to several cases of disease, remains a controversial issue in epidemiology and public health. They raise issues of personal tragedy, blame, limited data and precautionary public health actions. With clusters reported at a rate of more than 3 per day nationally, active response requires substantial resources. Given the different goals stated by the different groups involved, strategies vary greatly for responding to reports from community residents, evaluating preliminary data, and conducting follow-up in the field. This presentation considers whether scientists or public health officials should investigate disease clusters, when they should do so, and if so, how. We review methods used for preliminary analyses, noting limitations. Some of the most heated debates occur over the interpretation of statistical probability values and the validity of etiologic inference based on these analyses. This paper suggests that generation of specific hypotheses, rather than etiologic inference, is best achieved from preliminary analyses, and subsequent follow-up must be prioritized based on our best assessments. To meet the needs of the public while accommodating the limited resources of public health officials and some of the concerns of epidemiologists, an active surveillance program, using newly developed methods, with occasional investigation is recommended.
40. Poster: Magnification of Bias in Ecologic Epidemiology Tom Webster Boston University School of Public Health Individual-level epidemiologic studies analyze exposure, outcome and other variables for each subject. Ecologic studies analyze variables at the group level, e.g., average exposure and disease risk by town. The loss of information caused by aggregation can lead to potentially severe bias. I am interested in understanding the magnitude of the bias in ecologic studies relative to individual-level studies. To simplify the problem, I estimated risk differences using population-weighted least squares. It can be shown that the reduction in exposure variance caused by aggregation magnifies the ecologic bias due to group-level confounding or effect measure modification. In this way, tiny amounts of bias at the individual level can be greatly increased. Other things held constant, ecologic studies based on continuous exposures have less bias than studies based on binary exposures. Semi-individual studies (designs that measure exposure at the group level but other variables at the individual level) are subject to some of the same problems as fully ecologic studies. Under certain conditions, the magnitude of the bias is intermediate between that of the fully ecologic and individual-level studies.
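The variance-reduction mechanism can be seen in a tiny worked example with invented numbers (two equal-size towns, so population weighting is trivial and ordinary least squares suffices). The true exposure effect is 1, and an omitted town-level confounder biases the individual-level slope to 0; aggregating to town means shrinks the exposure variance and magnifies the same confounding into a slope of -1.

```python
def ols_slope(x, y):
    """Least-squares slope of y on x: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Two towns of 4 people; y = 1*x + 2*z, where z is a town-level confounder
# (true individual-level slope = 1)
x_a, z_a = [0, 1, 0, 1], 1
x_b, z_b = [1, 2, 1, 2], 0
x = x_a + x_b
y = [xi + 2 * z_a for xi in x_a] + [xi + 2 * z_b for xi in x_b]

ind_slope = ols_slope(x, y)                            # z omitted: biased to 0
eco_slope = ols_slope([sum(x_a) / 4, sum(x_b) / 4],    # town means only
                      [sum(y[:4]) / 4, sum(y[4:]) / 4])
print(ind_slope, eco_slope)  # → 0.0 -1.0
```

Within-town exposure variation cancels the confounding only partially at the individual level, but it vanishes entirely in the two town means, so the between-town confounding dominates the ecologic estimate.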

Document last modified on June 28, 2002.