
Fall 2005 Newsletter

Perspective

 



 

A Note from the Director

Welcome back to the Methodology Center Perspective! Since our last newsletter, there have been some important events and changes affecting The Methodology Center.

I am happy to say that it looks like we have five more years of P50 center grant funding for The Methodology Center. The new grant is called the Center for Prevention and Treatment Methodology, reflecting our increasing interest in methods for treatment and health services research. Most of the people reading this newsletter have a sense of the tremendous effort involved in pulling together a P50 application. This one was truly a team effort, with me in the role of both coach and player. I feel incredibly fortunate to be able to work with Stephanie, Runze, Susan, Joe, and Michael. They put up with my whip-cracking with good humor and turned out a fine product. Thanks, guys! Most importantly, we are excited about the work we proposed and can’t wait to begin.

 

The close relationship between The Methodology Center and Penn State’s Prevention Research Center for the Promotion of Human Development has resulted in a training grant on the integration of prevention science and statistical methods. The training grant, which was launched this summer, has slots for both predoctoral and postdoctoral trainees. Elsewhere on this page there is an announcement about postdoctoral fellowships. Be sure to send qualified applicants our way. We welcomed our first postdoctoral trainee, Robert Petrin, this fall.

 

Methodology Center scientist E. Michael Foster has left Penn State for a position in the Department of Maternal and Child Health at the University of North Carolina, Chapel Hill. He is one of the principal investigators on the center grant and remains an indispensable part of our research team. We are looking forward to gaining some new colleagues at UNC. We are also thinking creatively about ways to maintain center synergy across three universities.

 

It is always a bittersweet event when a trainee leaves to start a professional career. We wish them the best and share their excitement, but at the same time, we hate to see them go. Three former graduate students and one post-doc recently left to take academic jobs at great universities. Brian Flaherty is now in the Psychology Department at the University of Washington. Hwan Chung is now in the Department of Epidemiology at Michigan State University. Mildred Maldonado-Molina is now in the Department of Epidemiology and Health Policy Research at the University of Florida. Theodore Walls is now in the Department of Psychology at the University of Rhode Island. We look forward to continuing collaboration with these talented individuals.

 

In response to comments from other scientists, we have added a new section to our Web site. It is called “Getting Started,” and as the name implies, it contains material aimed at people who are new to a particular area of our center’s research. There is introductory material as well as suggestions for further reading. We hope you find it helpful.

 

Best regards,

 

Linda M. Collins, Ph.D.

Director, The Methodology Center

Professor, Human Development and Family Studies

Professor, Statistics

Penn State University

 


 


 

Announcing the Prevention and Methodology Training (PAMT) Program at Penn State

The Methodology Center, directed by Linda Collins, and the Prevention Research Center, directed by Mark Greenberg, are pleased to announce the new interdisciplinary PAMT program, funded by the National Institute on Drug Abuse. The goal of this collaborative training program is to produce scientists trained in the integration of prevention science and statistical methodology. The program is currently accepting applications for postdoctoral fellows. These positions present a unique opportunity for highly motivated recent Ph.D.s to continue their training in a synergistic environment that includes highly qualified prevention scientists and methodologists. The centers will work together to train researchers in the development and application of cutting-edge research methods to the study of prevention programs for substance abuse and co-morbid conditions among children, youth, families, and communities.

 


 


 

Featured Pilot Project: Engineering Control Approaches for the Design and Analysis of Adaptive Interventions

Control engineering is the field that examines how to transform the behavior of systems over time from undesirable conditions to desirable ones. Cruise control in automobiles, the home thermostat, and the insulin pump are all examples of control engineering principles at work. In the last few decades, significant improvements in computing and information technologies have enabled the extensive application of control engineering concepts to physical systems. Although control engineering principles should be just as meaningful for problems in the prevention of drug and substance abuse, their application to this field remains largely unexplored.

 

Daniel Rivera of the Department of Chemical and Materials Engineering at Arizona State University has been funded by a supplement to the P50 center to examine the role that control engineering principles can play in the development of adaptive interventions. A K25 grant application devoted to the topic is currently under review. Adaptive interventions systematically individualize therapy through decision rules that assign intervention dosages and forms of treatment based on repeated measurements of tailoring variables over time. Daniel’s efforts to date have established that adaptive interventions constitute a form of feedback control system in the context of behavioral health. Consequently, ideas drawn from control engineering have the potential to significantly inform the analysis, design, and implementation of adaptive interventions, leading to improved adherence, better management of limited resources, fewer negative effects, and more effective interventions overall. The potential benefits are particularly significant for adaptive interventions involving multiple components and co-morbidities, conditions that challenge conventional clinical practice.

 

The basic conceptual framework for this project draws from the paper by Collins et al. (2004) and is described in the technical report by Rivera et al. (2005). The report presents a simulation study of a hypothetical adaptive, time-varying intervention based on the Fast Track program for preventing conduct disorders in at-risk children. The results of explicit decision rules (similar to those proposed by Collins et al., 2004) are compared with a proportional-integral-derivative (PID)-type control system designed using engineering control principles. In light of this analysis and the simulation study, the report proposes a series of systems technologies that could shape future research on this problem, including dynamical modeling using system identification and decision rules based on model predictive control.
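To make the feedback-control idea concrete, the following is a minimal sketch of a discrete PID decision rule for assigning an intervention dosage. It is an illustration only, assuming a toy participant model; the gains, variable names, and dynamics are hypothetical and are not taken from the Fast Track simulation in the technical report.

```python
# A toy PID decision rule: map the history of (goal - measured outcome)
# errors to a nonnegative intervention dosage. Gains are hypothetical.

def pid_dosage(errors, kp=1.0, ki=0.2, kd=0.5, dt=1.0):
    e = errors[-1]                          # proportional term
    integral = sum(errors) * dt             # integral term
    derivative = 0.0
    if len(errors) > 1:
        derivative = (errors[-1] - errors[-2]) / dt   # derivative term
    return max(0.0, kp * e + ki * integral + kd * derivative)

# Hypothetical feedback loop: the outcome decays on its own, and each
# unit of dosage pushes it back toward the goal.
goal, outcome, errors = 10.0, 4.0, []
for t in range(8):
    errors.append(goal - outcome)
    dosage = pid_dosage(errors)
    outcome = 0.9 * outcome + 0.4 * dosage  # toy participant response
    print(f"t={t}: dosage={dosage:.2f}, outcome={outcome:.2f}")
```

In a real adaptive intervention the "outcome" would be a measured tailoring variable and the "dosage" a treatment component, with the controller design informed by system identification rather than hand-picked gains.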

 

Collins, L.M., Murphy, S.A., & Bierman, K.L. (2004). A conceptual framework for adaptive preventive interventions. Prevention Science, 5 (3), 185-196.

Rivera, D.E., Pew, M.D., Collins, L.M., & Murphy, S.A. (2005). Engineering control approaches for the design and analysis of adaptive, time-varying interventions (Technical Report 05-73). The Methodology Center, Penn State University.

 


 


 

Featured Scientist

Runze Li

Runze Li is an associate professor of statistics and an investigator in the Methodology Center at Penn State. Li works in a variety of research areas, including longitudinal data analysis, survival analysis, design and modeling for computer experiments, nonparametric regression, and variable selection.

 

Li’s current primary research interest is the analysis of intensive longitudinal data. Intensive longitudinal data have many closely spaced measurement occasions and, usually, many variables. With the advent of modern data collection devices and vast data storage capacity, intensive longitudinal data are increasingly collected in drug use studies. For instance, the method of ecological momentary assessment (EMA) emphasizes real-time assessment on multiple occasions in real-world settings and administers assessments at random intervals in order to produce a representative sample of each person’s experience. EMA uses hand-held computers to manage the sampling and assessment schedule as well as to present assessments and record the data, and therefore allows thousands of data points to be collected easily on each of several hundred individuals.

 

In theory, intensive longitudinal data can provide answers to important questions such as: How does the subjective sensation of withdrawal vary over a day or a week, and how does it vary according to environmental cues? What is the relationship between mood and drug use? How does this relationship change over the process of smoking cessation? What environmental cues trigger smoking? In practice, it is not immediately clear what statistical procedures can address such questions. For example, the sensation of withdrawal may go up and down repeatedly over a day or a week. Commonly used procedures, such as hierarchical linear models (HLM), are not well suited to modeling this kind of phenomenon. The direction and strength of the relationship between mood and drug use may vary within individuals over time; this is an example of a time-varying effect.

 

In recent work, Li developed functional hierarchical linear models (FHLMs) for intensive longitudinal data. FHLMs allow the effects in hierarchical linear models to change over time. Li is now developing semi-varying coefficient models for intensive longitudinal data. These models allow some effects to be time-varying and others to be time-invariant, and therefore can be viewed as a combination of FHLM and HLM. Li is working on efficient estimation procedures and powerful hypothesis tests for semi-varying coefficient models in the setting of intensive longitudinal data.
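As a concrete illustration of the varying-coefficient idea, the sketch below simulates an outcome whose regression coefficient drifts over time and then recovers that time-varying effect with a simple polynomial basis fit by least squares. This is a toy example with simulated data, not Li’s actual estimation procedure; in practice one would use richer bases (e.g., splines) and account for within-person correlation.

```python
# Toy varying-coefficient model: y(t) = b0(t) + b1(t) * x(t) + error,
# with b0(t) and b1(t) approximated by cubic polynomials in t and fit
# jointly by ordinary least squares. All data are simulated.

import numpy as np

rng = np.random.default_rng(0)
n = 300
t = np.sort(rng.uniform(0.0, 1.0, n))     # measurement times, scaled to [0, 1]
x = rng.normal(size=n)                     # e.g., momentary mood rating
b1_true = np.sin(2 * np.pi * t)            # the effect of mood drifts over time
y = 0.5 + b1_true * x + 0.3 * rng.normal(size=n)

basis = np.column_stack([t**k for k in range(4)])   # 1, t, t^2, t^3
X = np.hstack([basis, basis * x[:, None]])          # columns for b0(t) | b1(t)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

b1_hat = basis @ coef[4:]                  # recovered time-varying effect
# A cubic only roughly tracks a full sine cycle; splines would do better.
print("max |b1_hat - b1_true| =", float(np.max(np.abs(b1_hat - b1_true))))
```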

 

Runze Li, Ph.D.

Phone: 814-865-1555

rli@stat.psu.edu

 


 


 

Recent Activity in the Center

Schafer Gives Keynote Address at the International Meeting of the Psychometric Society

Joseph L. Schafer, associate professor of statistics at Penn State and investigator at the Methodology Center, presented a keynote address titled Missing Data, Multiple Imputation, and Causal Inference on July 5, 2005, in Tilburg, the Netherlands.

 

Murphy Delivers Clifford Clogg Memorial Lecture at Penn State

The Clifford C. Clogg Memorial Lecture in Sociology and Statistics honors the late Clifford C. Clogg, a distinguished professor of sociology and professor of statistics at Penn State from 1979 to 1995. Susan A. Murphy, H.E. Robbins Professor of Statistics at the University of Michigan and investigator at the Methodology Center, presented this year’s lecture, titled Meeting the Future in Managing Chronic Disorders: Individually Tailored Interventions, on October 17, 2005 at Penn State.

 

Collins Contributes Chapter to 2006 Annual Review of Psychology

A chapter by Linda M. Collins titled Analysis of longitudinal data: The integration of theoretical model, temporal design, and statistical model will appear in the Annual Review of Psychology. The chapter is currently available at http://www.annualreviews.org (go to the Psychology series, then select 2006 – Reviews in Advance).

 

2005 Summer Institute on Longitudinal Methods

Paul D. Allison of the University of Pennsylvania presented at the Summer Institute on Longitudinal Methods, supported by a grant from NIDA. This workshop, held June 1-3, 2005 at Penn State, covered the theory and practice of survival analysis, with an emphasis on Cox regression.

 

The New Face of Fortran

Developing Statistical Software in Fortran 95, by David Lemmon and Joseph Schafer of the Methodology Center, is currently ranked third by Google Directory among books on the Fortran 95 language. Though it is written mainly for statistical programmers, it is also an excellent general reference on the language for software developers working in any field. It provides a practical, step-by-step development paradigm that engineers, mathematicians, physicists, and many others will find very useful. The book also provides in-depth explanations, not found in most texts, of the most commonly used Fortran 95 language constructs. For more information go to: /fortranbook.

 

Published in October 2005: Design and Modeling for Computer Experiments

Design and Modeling for Computer Experiments, by Kai-Tai Fang of Hong Kong Baptist University, Runze Li of the Methodology Center and the Department of Statistics at Penn State, and Agus Sudjianto of Bank of America, is a new volume out this fall. With the advent of computing technology and numerical methods, engineers and scientists frequently use computer simulations to study actual or theoretical physical systems. To simulate a physical system, one needs a mathematical model that represents its physical behavior. In many situations, it is difficult or even impossible to study such systems through traditional physical experiments; thus, design and modeling are two key issues in computer experiments. The authors’ motivation in writing the book was to share their experience of blending modern, sound statistical approaches with extensive practical engineering applications. The book consists of three parts: Part I presents an overview of existing design and modeling procedures for computer experiments; Part II systematically introduces various space-filling designs and their constructions; and Part III presents the most useful modeling techniques for computer experiments.

 

Models for Intensive Longitudinal DataForthcoming in February 2006: Models for Intensive Longitudinal Data

Models for Intensive Longitudinal Data, edited by Theodore Walls of the University of Rhode Island and Joseph Schafer of Penn State, is a new volume featuring state-of-the-art statistical modeling for intensive longitudinal data. Social scientific studies increasingly use technological devices, such as handheld computers, beepers, or Web interfaces, in data collection. This volume is intended for investigators designing new social scientific studies that will produce intensive longitudinal data, applied statisticians working on related models, and practicing methodologists. The volume covers ten statistical approaches to the analysis of this type of data, including multilevel modeling, generalized estimating equations, item response theory, functional data analysis, time series analysis, point process modeling, state space modeling, dynamical systems modeling, and control systems modeling. For ordering information, watch the Oxford University Press Web site (http://www.oup.com/us/).

 


 


 

Ask a Methodologist

If I know the cost associated with administering my substance abuse intervention program, how do I determine whether my program was cost-effective? — Signed, Worried about Bottom Line

 

Dear Worried,

You have done a great deal of the work already. My hope is that you measured the costs of your program consistently with the principles of economics, such as clearly stating the perspective from which costs are measured. As you know, the perspective has implications for which resources to count as costs and the value placed on them. (For a review of these issues, see some of my recent papers in this area, available at www.unc.edu/~emfoster/index.html#new. The work of Michael French and colleagues is also quite relevant; e.g., French et al., 2002.) There are three ways to weigh the costs of the intervention relative to the benefits.

The first option (requiring the least of your research resources) is a cost-effectiveness analysis (CEA). While non-economists use this term to refer to the field of economic analysis in general, CEA refers to something quite specific: the analysis of incremental cost-effectiveness ratios, known as ICERs, or some derivative of them. ICERs involve measures such as the dollars spent per case of substance abuse averted. For this type of analysis, you need only your measures of costs and the program’s benefits. Unfortunately, the problems with ICERs are twofold. First, it is quite likely that you will get different ICERs for different outcome measures. Of course, non-economists also encounter this problem in assessing effectiveness, but it does create ambiguity (Sindelar et al., 2004). Second, it is difficult at times to know what constitutes a big or small ICER; big is in the eye of the beholder. For example, is $10,000 per case of substance abuse averted big or small? ICERs are, however, a good way to rank programs: a program with an ICER of $10,000 is clearly preferred to one with an ICER of $15,000 (see, e.g., Drummond & McGuire, 2001, for more detail).

The second possibility is a true benefit-cost analysis (BCA). Again, many refer to a variety of analyses as BCA, but the meaning in economics is quite specific: the costs of the program, generally measured from a social perspective, are weighed against society’s willingness to pay for the outcome. A nice feature of BCA is that both the program’s costs and benefits are measured in dollars, so one ends up with a measure of social “profits.” However, measuring society’s willingness to pay for a given outcome can be difficult.

The third possibility is a hybrid of some sort. One example would be an analysis of the impact of a program on public costs. While not a true BCA, such an analysis can assess the impact of an intervention on the costs of an illness. It can tell us, for example, whether a portion of the expenditures on better mental health services is recouped by reductions in juvenile justice costs (Foster et al., 2004).

Any of these three choices can provide useful information. Choosing among them will reflect the target audience and your research budget.
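To make the ICER arithmetic and ranking logic concrete, here is a minimal sketch in Python; all program names, costs, and effect counts below are hypothetical.

```python
# ICER = (cost of program - cost of comparator)
#      / (effect of program - effect of comparator)
# All numbers are hypothetical illustrations.

programs = [
    # (name, total cost in dollars, cases of substance abuse averted)
    ("Program A", 250_000, 25),
    ("Program B", 300_000, 20),
]
base_cost, base_effect = 50_000, 5   # comparator, e.g., usual care

for name, cost, effect in programs:
    icer = (cost - base_cost) / (effect - base_effect)
    print(f"{name}: ${icer:,.0f} per case averted")

# Program A ($10,000 per case) outranks Program B ($16,667 per case):
# lower ICERs are preferred, matching the ranking logic above.
```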

Drummond, M.F., & McGuire, A. (Eds.). (2001). Economic evaluation in health care: Merging theory with practice. Oxford: Oxford University Press.

Foster, E.M., Qaseem, A., & Connor, T. (2004). Can better mental health services reduce the risk of juvenile justice system involvement? American Journal of Public Health, 94(5), 859-865.

French, M.T., Salome, H.J., Sindelar, J.L., & McLellan, A.T. (2002). Benefit-cost analysis of addiction treatment: Methodological guidelines and empirical application using the DATCAP and ASI. Health Services Research, 37(2), 433-455.

Sindelar, J.L., Jofre-Bonet, M., French, M.T., & McLellan, A.T. (2004). Cost-effectiveness analysis of addiction treatment: Paradoxes of multiple outcomes. Drug and Alcohol Dependence, 73(1), 41-50.

 
