RSD Latest News

May's VJC Article
by Jennifer Kelley - Wednesday, May 2, 2018, 11:40 AM
 

The next article for the RSD virtual journal club is "Experimental Evidence on Reducing Nonresponse Bias through Case Prioritization: The Allocation of Interviewers" (Gummer & Blumenstiel, 2018). Be sure to post your comments and questions in the VJC forum!


Available courses

Half-day Workshop

Course Date: June 18th, 2018

Course Time: 9:00 a.m. - 12:00 p.m.

Instructor: Mick Couper

Topics covered: Randomized Controlled Trials (RCTs) are an important tool for tests of internal validity of causal claims in both the health and social sciences. In practice, however, inattention to crucial details of data collection methodology can compromise the internal validity test. One crucial example is recruitment and retention of participants: though randomized to treatment, unequal reluctance to participate or unequal attrition from the RCT jeopardizes the internal validity of comparisons within the RCT design. Another is the interaction of treatment and measurement: if the measures themselves change in response to the RCT treatment, then observed treatment and control differences may reflect these measurement differences rather than treatment differences. In both cases, specific tools from survey methodology can be used to maximize the internal validity of the RCT design.

This course will focus on the survey methodology topics most important for maintaining the internal validity of RCT studies, with specific examples of applications to RCTs. One set of tools will focus on maximizing participation and minimizing attrition. Core survey methodology tools for encouraging participation in both pre-treatment measurement and the treatment itself, as well as tools for minimizing the loss of participants to follow-up measures, will be featured. These tools include incentives, tailored refusal conversion, mode switching, and tracking strategies; links to RSD will also be made. A second set of tools will focus on measurement construction to reduce the chance of interaction with treatment. These tools include mode options, questionnaire design issues, and special instruments (such as life history calendars) to minimize reporting error. Each portion of the course will feature examples applying each specific tool to RCT studies.

Please note: this course is not available for remote participation. All students must attend in person.


Course Date: June 19th, 2018

Course Time: 9:00 a.m. - 4:00 p.m.

Instructors: James Wagner, Brady West, and Andy Peytchev

This course will provide participants with an overview of the primary concepts underlying RSD. This will include discussion of the uncertainty in survey design; the role of paradata, or data describing the data collection process, in informing decisions; and potential RSD interventions. These interventions include timing and sequence of modes, techniques for efficiently deploying incentives, and combining two-phase sampling with other design changes. Interventions appropriate for face-to-face, telephone, web, mail, and mixed-mode surveys will be discussed. Using the Total Survey Error (TSE) framework, the main concepts behind these designs will be explained with a focus on how these principles are designed to simultaneously control survey errors and survey costs. Examples of RSD in both large and small studies will be provided as motivation. For more information, please visit the Summer Institute in Survey Research Techniques website.



Course Date: June 20th, 2018 

Course Time: 9:00 a.m. - 4:00 p.m.

Instructors: Brady West, William Axinn, and Barry Schouten

This course will explore several well-developed examples of RSD. Dr. West will serve as a moderator of the course, and also introduce a case study from the National Survey of Family Growth (NSFG). The instructors will then provide independent examples of the implementation of RSD in different international surveys. All case studies will be supplemented with discussions of issues regarding the development and implementation of RSD. Case studies will include the NSFG, the Relationship Dynamics and Social Life (RDSL) survey, the University of Michigan Campus Climate (UMCC) Survey, and the Netherlands Survey of Consumer Satisfaction, among others. This variety of case studies will reflect a diversity of survey conditions. The NSFG (West) is a cross-sectional survey that is run on a continuous basis with in-person interviewing. The RDSL (Axinn) is a panel survey that employed a mixed-mode approach to collecting weekly journal data from a panel of young women. The UMCC Survey is a web survey of students at UM that employed multiple modes of contact across the phases of the design. The Netherlands Survey of Consumer Satisfaction (Schouten) is a mixed-mode survey combining web and mail data collection with telephone interviewing. Each case study will focus on the RSD features of the survey in question. The focus of the course will be on practical tools for implementing RSD in a variety of conditions, including small-scale surveys. For more information, please visit the Summer Institute in Survey Research Techniques website.


One-day Workshop

Course Date: June 20th, 2018

Course Time: 9:00 a.m. - 4:00 p.m.

Instructors: William Axinn and Stephanie Coffey

Topics covered: Web surveys can be an inexpensive method for collecting data. This is especially true for designs that repeat measurement over several time periods. However, these relatively low-cost data collections may result in reduced data quality if the problem of nonresponse is ignored. This course will examine methods for using RSD to effectively deploy scarce resources in order to minimize the risk of nonresponse bias. Recent experience with the University of Michigan Campus Climate Survey and the National Survey of College Graduates is used to illustrate this point. These surveys are defined by phased designs and multiple modes of contact. This approach produced relatively high response rates and used alternative contact methods in later phases to recruit sample members from subgroups that were less likely to respond in earlier phases. In the case of the UM Campus Climate Survey, all of this was accomplished on a very small budget and with a small management team. Lessons from these experiences can be directly applied in many similar settings.


Course Date: June 21st, 2018 

Course Time: 9:00 a.m. - 4:00 p.m.

One-day Workshop

Instructor: Brad Edwards

Topics covered: This course will cover basic concepts for the design and use of "dashboards" for monitoring survey data collection. We will begin with a detailed discussion of how to design dashboards from an RSD perspective. This will include concrete discussion of how relevant data may be collected and summarized across a variety of production environments. We will also discuss how these dashboards can be used to implement RSD interventions on an ongoing basis, and demonstrate these points using examples from actual dashboards. We will briefly explore methods for modeling incoming paradata in order to detect outliers. We will then consider practical issues associated with the development of dashboards, including software alternatives, and demonstrate how to update dashboards using data reflecting the results of ongoing fieldwork. Students will be provided with template spreadsheet dashboards as discussed earlier.
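The outlier-detection step mentioned above can be sketched with a simple z-score rule. This is a hypothetical illustration only, not course material: the function name, threshold, and data are all invented.

```python
# Hypothetical sketch: flagging outliers in incoming paradata (e.g.,
# interviewer-level contact rates) for a monitoring dashboard.
import statistics

def flag_outliers(rates, threshold=2.0):
    """Return indices of values more than `threshold` population
    standard deviations from the mean."""
    mean = statistics.fmean(rates)
    sd = statistics.pstdev(rates)
    if sd == 0:
        return []
    return [i for i, r in enumerate(rates) if abs(r - mean) / sd > threshold]

# Nine interviewers near a 50% contact rate and one at 95%:
print(flag_outliers([0.5] * 9 + [0.95]))  # [9]
```

In a production dashboard the same rule would typically run each night against the updated paradata, with flagged cases surfaced for supervisor review.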

One-day Workshop

Course Date: June 21st, 2018 

Course Time: 9:00 a.m. - 4:00 p.m.

Instructors: Barry Schouten and Natalie Shlomo

Topics covered: The response rate has been shown to be a poor indicator for data quality with respect to nonresponse bias. Several alternatives have been proposed – the fraction of missing information (FMI), R-Indicators, subgroup response rates, etc. This course will explore the use of these indicators as guides for data collection when working within an RSD framework. We also explore optimization techniques that may be useful when designing a survey to maximize these alternative indicators. The consequences of optimizing a survey to other indicators will be explored. We will also consider how the response rate fits into this approach. We will end with a brief discussion of methods for post data collection evaluation of data quality.
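To make one of these indicators concrete: the R-indicator is commonly defined as one minus twice the standard deviation of estimated response propensities, so identical propensities give the maximal value of 1. A minimal sketch, with invented function name and data:

```python
# Hypothetical sketch: the R-indicator as 1 - 2 * SD of estimated
# response propensities. Values near 1 suggest a more representative
# respondent pool; smaller values suggest uneven response.
import statistics

def r_indicator(propensities):
    return 1 - 2 * statistics.pstdev(propensities)

print(r_indicator([0.5, 0.5, 0.5, 0.5]))   # 1.0 (uniform propensities)
print(round(r_indicator([0.2, 0.8]), 2))   # 0.4 (propensities vary widely)
```

In practice the propensities would be estimated from a response model (e.g., logistic regression on frame and paradata variables) rather than known.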

Course Date: June 22nd, 2018

Course Time: 9:00 a.m. - 4:00 p.m. 

Instructors: Heidi Guyer, Joe Murphy, and Shonda Kruger-Ndiaye

This course will cover issues associated with implementation of RSD to manage field work. Instructors will provide concrete instruction on active monitoring of key indicators across a variety of environments: small-scale surveys, large-scale surveys, and web, telephone, face-to-face, and mixed-mode surveys. Methods for implementing RSD interventions in a diversity of production environments will be discussed. RSD will be presented within the framework of the principles of project management, with a particular focus on risk management. A checklist of steps for implementing RSD will be discussed in detail. This course will draw upon a semester-long graduate course in survey management, which includes sections on RSD. For more information, please visit the Summer Institute in Survey Research Techniques website.


One-day Workshop

Course Date: June 22nd, 2018

Course Time: 9:00 a.m. - 4:00 p.m. 

Instructor: Daniel Almirall

Topics covered: The effective treatment and management of a wide variety of health disorders often requires individualized, sequential decision making, whereby treatment is adapted over time based on the changing disease state or specific circumstances of the patient. Adaptive interventions (also known as dynamic treatment regimens) operationalize this type of individualized treatment decision making using a sequence of decision rules that specify whether, how, for whom, or when to alter the intensity, type, or delivery of pharmacological, behavioral, and/or psychosocial treatments. There has been a surge of scientific interest in constructing adaptive interventions via the sequential multiple assignment randomized trial (SMART) design. SMART is a type of multi-stage randomized trial design, developed specifically for the purpose of collecting high-quality data for building optimal adaptive interventions. SMARTs are still new to the great majority of behavioral and social science investigators. In this course, we will introduce adaptive interventions and SMARTs (including basic design principles), present cutting-edge analytic methods (e.g., Q-learning) for SMART data, and discuss how these ideas can guide responsive and adaptive survey designs.
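To give a flavor of the backward-induction idea behind Q-learning for a two-stage design: value each stage-2 option given the history, then value stage-1 actions by the best expected downstream outcome. Everything below (action names, outcomes, response rates) is invented for illustration and is not from the course.

```python
# Toy sketch: valuing a two-stage adaptive intervention by backward
# induction. Stage 2: pick the tactic with the best mean outcome given
# the stage-1 action and responder status. Stage 1: pick the action
# with the best expected downstream value. (All numbers are invented.)

# q2[(stage1_action, responded, stage2_action)] -> mean outcome
q2 = {
    ("A", True, "continue"): 8, ("A", True, "augment"): 6,
    ("A", False, "continue"): 3, ("A", False, "augment"): 5,
    ("B", True, "continue"): 7, ("B", True, "augment"): 7,
    ("B", False, "continue"): 4, ("B", False, "augment"): 6,
}
p_respond = {"A": 0.6, "B": 0.5}  # assumed response rates to stage-1 actions

def stage2_value(a1, responded):
    # Best achievable mean outcome at stage 2 for this history.
    return max(q2[(a1, responded, a2)] for a2 in ("continue", "augment"))

def stage1_value(a1):
    p = p_respond[a1]
    return p * stage2_value(a1, True) + (1 - p) * stage2_value(a1, False)

best = max(("A", "B"), key=stage1_value)
print(best)  # A
```

In a real SMART analysis, the stage-2 values would be estimated by regression on trial data rather than tabulated, but the backward ordering of the two steps is the same.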

One-day Workshop

Course Date: June 25th, 2018

Course Time: 9:00 a.m. - 4:00 p.m. 

Instructor: James Wagner

Topics covered: Two-phase sampling is an important tool for RSD. In this course, we will review the theoretical underpinnings of the method, and elaborate on the use of this method for controlling costs and errors in the context of RSD by combining two-phase sampling with other (often more expensive) design changes. We will also discuss implementation issues, such as timing of the sample across various modes and designs and the development and use of appropriate sample weights. Examples from several studies will be included.
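As a concrete, hypothetical illustration of the weighting involved: in double sampling for nonresponse, phase-1 nonrespondents are subsampled at some fraction f, and their values are weighted by 1/f in the estimator. The function name and numbers below are invented.

```python
# Hypothetical sketch: a two-phase (double sampling) estimator of the
# mean. Phase 1: n_total cases are sampled and some respond. Phase 2:
# a fraction f of the nonrespondents is followed up (often in a more
# expensive mode), and their values are weighted by 1/f.

def two_phase_mean(respondent_vals, followup_vals, n_total, f):
    return (sum(respondent_vals) + sum(followup_vals) / f) / n_total

# 6 phase-1 respondents, 4 nonrespondents of whom half (f = 0.5) are
# followed up; the followed-up cases stand in for all nonrespondents.
print(two_phase_mean([1, 1, 1, 1, 1, 1], [2, 2], n_total=10, f=0.5))  # 1.4
```

The cost advantage comes from reserving the expensive follow-up protocol for the subsample rather than the full set of nonrespondents.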



One-day Workshop

Course Date: June 26th, 2018

Course Time: 9:00 a.m. - 4:00 p.m. 

Instructor: Brady West

Topics covered: This course will discuss a variety of potential RSD interventions. Many of these have been implemented experimentally, and the course will include evaluations of those experiments. The importance of experimental evaluation in the early phases of RSD will be discussed, along with methods for implementing interventions, including experiments aimed at evaluating new interventions. Strategies for implementing these interventions with both interviewer-mediated and self-administered (e.g., web and mail) surveys will be covered, as will methods for evaluating the results of interventions (experimental and otherwise), including measures of both costs and errors.


One-day Workshop

Course Date: June 27th, 2018

Course Time: 9:00 a.m. - 4:00 p.m. 

Instructors: Peter Miller, Ben Reist, and Stephanie Coffey

Topics covered: This course will provide an overview of challenges and successes experienced in the development of adaptive survey design at the U.S. Census Bureau, including illustrations from the National Survey of College Graduates, the National Health Interview Survey, and the Survey of Income and Program Participation. The presentation includes a brief history of the evolution of adaptive design capabilities at the Bureau. We also discuss the development of a protocol for adaptive survey design that guides implementation and transparent documentation. The three case studies show applications of adaptive design in surveys with different designs (cross-sectional vs. longitudinal, single- vs. multi-mode) and different cost/quality objectives. We discuss successes and failures in these applications, as well as factors that will shape future uses of adaptive design.