
Item Response Analysis

The primary use of item response analysis was to determine whether different questionnaires produced different response patterns, which may, in turn, affect the labor force estimates. Unedited data were used for this analysis. Statistical tests were conducted to ascertain whether differences among the response patterns of different questionnaire versions were statistically significant. The statistical tests were adjusted to account for the use of a nonrandom clustered sample, repeated measures over time, and multiple persons in a household.

Response distributions were analyzed for all items on the questionnaires. The response distribution analysis indicated the degree to which new measurement processes produced different patterns of responses. Data gathered using the other methods outlined above also aided interpretation of the response differences observed. (Response distributions were calculated on the basis of people who responded to the item, excluding those whose response was "don't know" or "refused.")
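To make the procedure concrete, the sketch below tabulates one item's response distribution with "don't know" and "refused" cases dropped, then compares two questionnaire versions with a Pearson chi-square statistic deflated by a design effect. The report does not specify the exact adjustment used for the clustered, repeated-measures design; the design-effect correction, the deff value, and the function names here are illustrative assumptions, not the actual procedure.

```python
# A minimal sketch of the item response analysis described above,
# under assumed data layouts; not the Census Bureau's actual method.
from collections import Counter

from scipy.stats import chi2

def response_distribution(answers):
    """Tabulate answers to one item, dropping DK/refused cases
    (assumes at least one valid response)."""
    kept = [a for a in answers if a not in ("don't know", "refused")]
    counts = Counter(kept)
    total = len(kept)
    return {a: n / total for a, n in counts.items()}, counts

def adjusted_chi_square(counts_a, counts_b, deff=1.5):
    """Pearson chi-square on a 2 x k table of response counts, deflated
    by an assumed design effect (deff) as a crude stand-in for the
    clustering adjustment the report alludes to."""
    categories = sorted(set(counts_a) | set(counts_b))
    n_a, n_b = sum(counts_a.values()), sum(counts_b.values())
    stat = 0.0
    for c in categories:
        pooled = (counts_a.get(c, 0) + counts_b.get(c, 0)) / (n_a + n_b)
        for counts, n in ((counts_a, n_a), (counts_b, n_b)):
            observed = counts.get(c, 0)
            expected = n * pooled
            stat += (observed - expected) ** 2 / expected
    stat /= deff                     # deflate for the clustered design
    df = len(categories) - 1
    return stat, chi2.sf(stat, df)  # statistic and p-value
```

For example, `adjusted_chi_square(Counter(answers_v1), Counter(answers_v2))` returns the corrected statistic and p-value for testing whether the two questionnaire versions produce the same response distribution for that item.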

Respondent Debriefings

At the end of the test interview, respondent debriefing questions were administered to a sample of respondents to measure respondent comprehension and response formulation. These data yielded indicators of how respondents interpret and answer the questions, as well as some measures of response accuracy.

The debriefing questions were tailored to the respondent and depended on the path the interview had taken. Two forms of respondent debriefing questions were administered: probing questions and vignette classification. Question-specific probes were used to ascertain whether respondents understood certain words, phrases, or concepts in the manner intended (Esposito et al., 1992). For example, those who did not indicate in the main survey that they had done any work were asked the direct probe "LAST WEEK did you do any work at all, even for as little as 1 hour?" An example of the vignettes respondents received is "Last week, Amy spent 20 hours at home doing the accounting for her husband's business. She did not receive a paycheck." Individuals were asked to classify the person in the vignette as working or not working based on the wording of the question they received in the main survey (e.g., "Would you report her as working last week, not counting work around the house?" if the respondent received the unrevised questionnaire, or "Would you report her as working for pay or profit last week?" if the respondent received the current, revised questionnaire) (Martin and Polivka, 1995).
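As an illustration of how vignette responses might be summarized, the sketch below computes the share of respondents who classified the person in one vignette as working, separately by questionnaire version. The record layout and function name are hypothetical; the comparison logic is an assumption for exposition, not the published analysis.

```python
# Hypothetical sketch of the vignette analysis: for one vignette (e.g.,
# Amy's unpaid accounting work), compare the share of respondents
# classifying her as "working" under each questionnaire version.
def working_share_by_version(records):
    """records: iterable of (version, classified_as_working) pairs,
    where version is 'unrevised' or 'revised'."""
    tallies = {"unrevised": [0, 0], "revised": [0, 0]}  # [working, total]
    for version, working in records:
        tallies[version][1] += 1
        if working:
            tallies[version][0] += 1
    return {v: w / t for v, (w, t) in tallies.items() if t}

# e.g. working_share_by_version([("unrevised", False), ("revised", True)])
# -> {"unrevised": 0.0, "revised": 1.0}; a gap between the two shares
# would indicate that the wording change shifts how borderline cases
# such as unpaid family work are classified.
```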

Behavior Coding

Behavior coding entails monitoring or audiotaping interviews and recording significant interviewer and respondent behaviors (e.g., minor/major changes in question wording, probing behavior, inadequate answers, requests for clarification). During the early stages of testing, behavior coding data were useful in identifying problems with proposed questions. For example, if interviewers frequently reworded a question, this may indicate that the question was too difficult to ask as worded; respondents' requests for clarification may indicate that they were experiencing comprehension difficulties; and interruptions by respondents may indicate that a question was too lengthy (Esposito et al., 1992).

During later stages of testing, the objective of behavior coding was to determine whether the revised questionnaire improved the quality of interviewer/respondent interactions, as measured by accurate reading of the questions and adequate responses by respondents. Additionally, results from behavior coding helped identify areas of the questionnaire that would benefit from enhancements to interviewer training.
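The sketch below shows one way behavior-coding results of this kind might be summarized: per-question rates of each coded behavior, with a simple threshold for flagging questions that may need attention. The code labels follow the examples in the text, but the data layout and the cutoff value are illustrative assumptions.

```python
# Assumed summary of behavior-coding data: per question, the rate of
# each coded behavior across monitored interviewer/respondent exchanges.
from collections import defaultdict

def behavior_code_rates(codings):
    """codings: iterable of (question_id, code) pairs, one per coded
    exchange, e.g. ('Q25', 'major_wording_change')."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for qid, code in codings:
        counts[qid][code] += 1
        totals[qid] += 1
    return {qid: {code: n / totals[qid] for code, n in by_code.items()}
            for qid, by_code in counts.items()}

def flag_problem_questions(rates, code="major_wording_change", cutoff=0.15):
    """Flag questions where a behavior occurs in more than `cutoff` of
    exchanges -- e.g. frequent rewording suggesting an awkward question."""
    return [qid for qid, by_code in rates.items()
            if by_code.get(code, 0.0) > cutoff]
```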

Interviewer Debriefings

The primary objective of interviewer debriefing was to identify areas of the revised questionnaire or interviewer procedures that were problematic for interviewers or respondents. The information collected was used to identify questions that needed revision and to modify initial interviewer training and the interviewer manual. A secondary objective was to obtain information about the questionnaire, interviewer behavior, or respondent behavior that would help explain differences observed in the labor force estimates from the different measurement processes.

Two different techniques were used to debrief interviewers. The first was the use of focus groups at the centralized telephone interviewing facilities and in geographically dispersed regional offices. The focus groups were conducted after interviewers had at least 3 to 4 months of experience using the revised CPS instrument. Approximately 8 to 10 interviewers, selected to represent different levels of experience and ability, participated in each focus group.

The second technique was the use of a self-administered standardized interviewer debriefing questionnaire. Once problematic areas of the revised questionnaire were identified through the focus groups, a standardized debriefing questionnaire was developed and administered to all interviewers. See Esposito and Hess (1992) for more information on interviewer debriefing.

HIGHLIGHTS OF THE QUESTIONNAIRE REVISION

A copy of the questionnaire can be obtained from the Internet at .

