
INTERNET-BASED SURVEYS

packages are discussed at http://www.tucows.com/, which lists and reviews a range of packages, while http://www.my3q.com/misc/register/register.phtml provides free online survey software.

For presentational matters, Dillman and his colleagues (1998a; 1999) make the point that in a paper-based survey the eyes and the hands are focused on the same area, while in a web-based survey the eyes are focused on the screen while the hands are either on the keyboard or on the mouse, so completion is more difficult. This is one reason to avoid asking respondents to type in many responses to open-ended questions, replacing these instead with radio buttons or a mouse click that automatically inserts a tick into a box (Witte et al. 1999: 139).
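By way of illustration only, the short TypeScript sketch below renders one such closed item as radio buttons using standard browser DOM calls, so that a single mouse click records the answer; the question wording, option labels and element names are invented for the example and are not drawn from Witte et al. or Dillman et al.

```typescript
// Minimal sketch: render one closed question as radio buttons instead of an
// open text box, so a single mouse click records the answer.
// Question text, option labels and ids are illustrative only.

function renderRadioQuestion(
  container: HTMLElement,
  name: string,
  questionText: string,
  options: string[],
): void {
  const fieldset = document.createElement("fieldset");
  const legend = document.createElement("legend");
  legend.textContent = questionText;
  fieldset.appendChild(legend);

  options.forEach((label, i) => {
    const id = `${name}-${i}`;

    const input = document.createElement("input");
    input.type = "radio";   // clicking inserts the 'tick' automatically
    input.name = name;      // a shared name groups the buttons into one item
    input.id = id;
    input.value = label;

    const labelEl = document.createElement("label");
    labelEl.htmlFor = id;
    labelEl.textContent = label;

    const row = document.createElement("div");
    row.append(input, labelEl);
    fieldset.appendChild(row);
  });

  container.appendChild(fieldset);
}

// Illustrative usage: a single question with four fixed response categories.
renderRadioQuestion(
  document.body,
  "q1",
  "How often do you use the internet for study?",
  ["Daily", "Weekly", "Monthly", "Rarely or never"],
);
```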

Further, some respondents may have less developed computer skills than others. They suggest a mixed mode of operation (paper-based together with web-based versions of the same questionnaire). The researchers also found that ‘check-all-that-apply’ lists of factors to be addressed had questionable reliability, as respondents would tend to complete those items at the top of the list and ignore the remainder. Hence they recommend avoiding the use of check-all-that-apply questions in a web-based survey.

Similarly, they advocate keeping the introduction to the questionnaire short (no more than one screen), informative (e.g. of how to move on) and free of long lists of instructions. Further, as the first question in a survey tends to raise a particular mind-set in respondents’ minds, care is needed in setting the first question, to entice participants and not to put them off participating (e.g. not too difficult, not too easy, interesting, straightforward to complete, avoiding drop-down boxes and scrolling).

Dillman et al. (1998a; 1998b; 1999) make specific recommendations about the layout of the screen, for example keeping the response categories close to the question for ease of following, and using features like brightness, large fonts and spacing for clarity in the early parts of the survey. They also suggest following the natural movement of the eyes from the top left of the screen (the most important part, hence the part in which the question is located) to the bottom right quadrant (the least important part, which might contain the researcher’s logo). They comment that the eye tends to read prose unevenly, with the risk of missing critical words, and that this is particularly true of long lines; hence they advocate keeping lines and sentences short (e.g. by inserting a hard break in the text or by using table-editing features to locate the text in a table frame). Taking this further, they also advocate using some marker to indicate to the respondent where he or she has reached in the questionnaire (e.g. a progress bar or a table that indicates what proportion of the questionnaire has been completed so far).
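A completion marker of this kind is straightforward to script. The TypeScript sketch below is a minimal illustration, assuming a single form and a progress element on the page; these element choices, and the way answered items are counted, are assumptions made for the example rather than part of Dillman et al.’s recommendations.

```typescript
// Minimal sketch of a completion marker: count the distinct named items on
// the current screen and how many already have an answer, then show the
// proportion in a <progress> bar. Element choices are illustrative only.

function updateProgress(form: HTMLFormElement, bar: HTMLProgressElement): void {
  const seen = new Set<string>();      // distinct question names on the screen
  const answered = new Set<string>();  // names with at least one response

  form.querySelectorAll("input, select, textarea").forEach((el) => {
    if (
      !(el instanceof HTMLInputElement ||
        el instanceof HTMLSelectElement ||
        el instanceof HTMLTextAreaElement) ||
      el.name === ""
    ) {
      return;
    }
    seen.add(el.name);
    if (el instanceof HTMLInputElement && (el.type === "radio" || el.type === "checkbox")) {
      if (el.checked) answered.add(el.name);
    } else if (el.value.trim() !== "") {
      answered.add(el.name);
    }
  });

  bar.max = Math.max(seen.size, 1);  // avoid a zero-length bar
  bar.value = answered.size;
}

// Illustrative usage: recompute the bar whenever any item changes.
const form = document.querySelector("form")!;
const bar = document.querySelector("progress")!;
form.addEventListener("input", () => updateProgress(form, bar));
updateProgress(form, bar);
```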

Respondents may not be familiar with web-based questionnaires (e.g. with radio buttons, scroll bars, the use of the mouse, drop-down menus, or where to insert open-ended responses), and the survey designer must not overestimate the capability of the respondent to use the software, though Roztocki and Lahri (2002) suggest that there is no relationship between perceived level of computer literacy and preference for web-based surveys. Indeed, the use of such features may have to be explained in the survey itself. Dillman et al. (1999) suggest that the problem of differential expertise in computer usage can be addressed in three ways:

 having the instructions for how to complete the item next to the item itself (not all placed together at the start of the questionnaire)
 asking the respondents at the beginning about their level of computer expertise, and, if they are more expert, offering them the questionnaire with certain instructions omitted and, if they are less experienced, directing them to instructions and further assistance (a sketch of this kind of branching is given below)
 having a ‘floating window’ that accompanies each screen and which can be maximized for further instructions.
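As a rough illustration of the second of these, the TypeScript sketch below shows how an opening question about computer expertise might switch per-item instructions on or off; the ‘item-help’ class, the ‘#expertise’ selector and the two-level scale are assumptions made for the example, not details given by Dillman et al. (1999).

```typescript
// Minimal sketch of branching on self-reported expertise: per-item help text
// (assumed to be marked with the class 'item-help' and placed next to each
// item) is hidden for experienced respondents and shown for the rest.

type Expertise = "experienced" | "less-experienced";

function applyExpertiseLevel(level: Expertise): void {
  document.querySelectorAll<HTMLElement>(".item-help").forEach((help) => {
    help.hidden = level === "experienced"; // experts get a leaner screen
  });
}

// Illustrative usage: read the answer to an assumed opening question
// (a <select id="expertise"> whose options include "experienced").
const opening = document.querySelector<HTMLSelectElement>("#expertise");
if (opening) {
  opening.addEventListener("change", () => {
    applyExpertiseLevel(
      opening.value === "experienced" ? "experienced" : "less-experienced",
    );
  });
}
```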

Some web-based surveys prevent respondents from proceeding until they have completed all the items on the screen in question. While this might ensure coverage, it can also anger respondents – such that they give up and abandon the survey – or prevent them from having a

