Current Population Survey Design and Methodology - Census Bureau

ate, which was 7.70 percent in July 2004.⁵ For item nonresponse, a few of the average allocation rates in July 2004, by topical module, were: 0.52 percent for household, 1.99 percent for demographic, 2.35 percent for labor force, 9.98 percent for industry and occupation, and 18.46 percent for earnings. (See Chapter 16 for discussion of various quality indicators of nonresponse error.)
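An item allocation rate of this kind is simply the share of item responses that had to be allocated (imputed) out of all item responses collected. A minimal sketch, using hypothetical counts rather than the actual July 2004 microdata:

```python
def allocation_rate(allocated_items: int, total_items: int) -> float:
    """Percent of item responses that were allocated (imputed)."""
    if total_items <= 0:
        raise ValueError("total_items must be positive")
    return 100.0 * allocated_items / total_items

# Hypothetical counts for a single topical module:
rate = allocation_rate(allocated_items=1_846, total_items=10_000)
print(f"{rate:.2f} percent")  # 18.46 percent
```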

CONTROLLING NONRESPONSE ERROR

Field Representative Guidelines⁶

Response/nonresponse rate guidelines have been developed for FRs to help ensure the quality of the data collected. Maintaining high response rates is of primary importance, and the guidelines were developed with this in mind. When used in conjunction with other sources of information, they are intended to assist supervisors in identifying FRs needing performance improvement. An FR whose response rate, household noninterview rate (Type A), or minutes-per-case falls below the fully acceptable range based on one quarter's work is considered in need of additional training and development. The CPS supervisor then takes appropriate remedial action. National and regional response performance data are also provided to permit the RO staff to judge whether their activities are in need of additional attention.
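The quarterly screening described above can be sketched as a simple flagging rule. The thresholds below are hypothetical (the actual fully acceptable ranges are part of the performance evaluation system discussed in Appendix D), and the sketch assumes that low response rates, high Type A rates, and high minutes-per-case are what trigger review:

```python
from dataclasses import dataclass

@dataclass
class QuarterlyPerformance:
    response_rate: float     # percent of eligible cases interviewed
    type_a_rate: float       # household noninterview (Type A) rate, percent
    minutes_per_case: float  # average interviewing time per case

# Hypothetical thresholds standing in for the fully acceptable ranges.
MIN_RESPONSE_RATE = 90.0
MAX_TYPE_A_RATE = 8.0
MAX_MINUTES_PER_CASE = 45.0

def needs_development(p: QuarterlyPerformance) -> bool:
    """Flag an FR for additional training based on one quarter's work."""
    return (p.response_rate < MIN_RESPONSE_RATE
            or p.type_a_rate > MAX_TYPE_A_RATE
            or p.minutes_per_case > MAX_MINUTES_PER_CASE)
```

In practice such a flag would be one input among several; the text notes that supervisors combine it with other sources of information before taking remedial action.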

Summary Reports

Another way to monitor and control nonresponse error is the production and review of summary reports. Produced by headquarters after the release of the monthly data products, they are used to detect changes in historical response patterns. Because they are distributed throughout headquarters and the ROs, other indications of data quality and consistency can also be examined. The contents of some of the summary report tables are: noninterview rates by RO for both the basic CPS and its supplements, monthly comparisons to the prior year, noninterview-to-interview conversion rates, resolution status of computer-assisted telephone interview cases, interview status by month-in-sample, daily transmittals, percent of personal-visit cases actually conducted in person, allocation rates by topical module, and coverage ratios.
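One of the metrics listed above, the noninterview-to-interview conversion rate, can be illustrated with a short sketch. The counts are hypothetical, and the definition (converted cases as a percent of initial noninterviews) is an assumption for illustration:

```python
def conversion_rate(converted: int, initial_noninterviews: int) -> float:
    """Percent of initial noninterviews later converted to interviews."""
    if initial_noninterviews == 0:
        return 0.0
    return 100.0 * converted / initial_noninterviews

# Hypothetical monthly counts for one regional office:
print(f"{conversion_rate(converted=35, initial_noninterviews=140):.1f}%")  # 25.0%
```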

⁵ In April 2004, CPS started phasing in the new 2000-based sample.

⁶ See Appendix D for a detailed discussion, especially in terms of the performance evaluation system.

Current Population Survey TP66
U.S. Bureau of Labor Statistics and U.S. Census Bureau

Headquarters and Regional Offices Working as a Team

As detailed in a Methods and Performance Evaluation Memorandum (Reeder, 1997), the Census Bureau and the Bureau of Labor Statistics formed an interagency work group to examine CPS nonresponse in detail. One goal was to share possible reasons and solutions for the declining CPS response rates. A list of 31 questions was prepared to help the ROs understand CPS field operations, to solicit and share the ROs' views on the causes of the increasing nonresponse rates, and to evaluate methods to decrease these rates. All of the answers provide insight into the CPS operations that may affect nonresponse and follow-up procedures for household noninterviews. A few are:

1. The majority of ROs responded that there is written documentation of the follow-up process for CPS household noninterviews.

2. The standard process is that an FR must let the RO know about a possible household noninterview as soon as possible.

3. Most regions attempt to convert confirmed refusals to interviews under certain circumstances.

4. All regions provide monthly feedback to their FRs on their household noninterview rates.

5. About half of the regions responded that they provide specific region-based training/activities for FRs on converting or avoiding household noninterviews.

Much research has been circulated regarding whether and how to convince someone who has refused to be interviewed to change their mind. Most offices use letters in a consistent manner to follow up with noninterviews. Most ROs also include informational brochures about the survey with the letters, and these materials are tailored to the respondent.

SOURCES OF RESPONSE ERROR

The survey interviewer asks a question and collects a response from the respondent. Response error exists if the response is not the true answer. Reasons for response error can include:

1. The respondent misinterprets the question, does not know the true answer and guesses (e.g., recall effects), exaggerates, has a tendency to give an answer that appears more 'socially desirable,' or chooses a response randomly.

2. The interviewer reads the question incorrectly, does not follow the appropriate skip pattern, misunderstands or misapplies the questionnaire, or records the wrong answer.

3. A proxy responder (i.e., a person who answers on someone else's behalf) provides an incorrect response.

4. The data collection modes (e.g., personal visit and telephone) elicit different responses.

Sources <strong>and</strong> Controls on Nonsampling Error 15–5
