2000). Other evidence also suggests that the DI questions were not used strictly as a nonresponse
follow-up tool. Several 2004 panel interview observation reports (Bruun, 2005; Davis, 2005;
Gilbert, 2005; Moore, 2004) note interviewers’ tendency to use the DI follow-ups as a means to
“peek ahead” at the answers reported in the last interview, and to help respondents answer the
amount questions without even giving them a chance to report on their own. Thus, the greater
than anticipated seam effect reduction may have been the result of interviewers’ tendency to
transform the intended reactive DI follow-ups into proactive-style questions.
The consistently lower off-seam month-pair change rates in 2004 compared to 2001 also deserve
comment. We suspect that this effect is mostly due to instrument programming, not respondent
behavior in response to DI. In cases where DI is invoked in 2004, one of two paths produces a
monthly total. The respondent might verify the fed-forward amount, in which case it is assigned
to each month of the current wave. Or he or she might not verify the old amount, but offer a
corrected one in its place, in which case it, too, is allocated to each month in the current wave.
Either way, there is no chance of variation in an off-seam month-pair. Thus, the reduced level
of month-to-month earnings changes within a single wave's reference period is a function not of
respondents' responses to dependent questions, but rather of how the survey's designers have
chosen to implement them.
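
To make the instrument logic described above concrete, the following sketch (a hypothetical simplification in Python; the function and variable names are ours, not the actual SIPP instrument's) shows how either DI path assigns a single amount to every month of the current wave, which mechanically rules out within-wave (off-seam) change:

    # Hypothetical sketch of the 2004 DI allocation logic described above.
    # Names and structure are illustrative, not the actual SIPP instrument code.

    WAVE_MONTHS = 4  # a SIPP wave covers a four-month reference period

    def allocate_monthly_amounts(fed_forward_amount, verified, corrected_amount=None):
        """Return the monthly earnings totals recorded for the current wave.

        Path 1: the respondent verifies the fed-forward amount, which is then
                assigned to each month of the wave.
        Path 2: the respondent rejects the old amount and offers a corrected one,
                which is likewise allocated to each month of the wave.
        Either way, all months in the wave receive the same value, so no
        off-seam (within-wave) month-to-month change can be recorded.
        """
        amount = fed_forward_amount if verified else corrected_amount
        return [amount] * WAVE_MONTHS

    # Example: both paths yield a flat series across the wave's four months.
    print(allocate_monthly_amounts(2500, verified=True))
    # [2500, 2500, 2500, 2500]
    print(allocate_monthly_amounts(2500, verified=False, corrected_amount=2700))
    # [2700, 2700, 2700, 2700]
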
5. Conclusions and Discussion
Despite the limitations noted earlier, we find the results quite encouraging with regard to the
quality of month-to-month change data from the new SIPP questionnaire. They offer strong and
consistent evidence, across many diverse characteristics, of the significant positive impact of
dependent interviewing (DI) on the measurement of month-to-month transitions. In the earnings
amount results we even find evidence of “byproduct” positive effects, where nonresponse
reduction, not improvement in transition data, was the primary intent. The new, more precise and
focused dependent interviewing procedures employed in the 2004 SIPP panel with the specific
intent of improving data on transitions appear to have reduced reports of change at the seam, and
to have increased reports of off-seam changes. Both trends address what have been shown to be
the major error tendencies in the measurement of change in longitudinal surveys – overreporting
of change at the seam, and underreporting of off-seam changes (Moore and Marquis, 1989). As a
result, the likelihood of recording a transition at the interview seam in the current SIPP panel is,
for virtually every characteristic examined, significantly more in line with what would be
expected in the absence of measurement error than is the case with the previous panel. Despite
the significant improvements, however, much seam bias still remains.
Fortunately, the results presented here also highlight an additional area in which there is still
much untapped potential for further improvements: “no-to-yes” changes at the seam. DI as it has
been introduced in SIPP focuses exclusively on the presence of some characteristic – being
enrolled in school, receiving Food Stamps, etc. – in the last months of the prior wave’s reference
period. A previously-identified, likely-to-continue spell is carefully addressed in the new post-
wave-1 questionnaire; the same attention is not paid, however, to the onset of a new spell at the