
Various software packages are discussed at http://www.tucows.com/, which lists and reviews a range of packages, while http://www.my3q.com/misc/register/register.phtml provides free online survey software.

For presentational matters, Dillman and his colleagues (1998a; 1999) make the point that in a paper-based survey the eyes and the hands are focused on the same area, while in a web-based survey the eyes are focused on the screen while the hands are either on the keyboard or on the mouse, so completion is more difficult. This is one reason to avoid asking respondents to type in many responses to open-ended questions, replacing these instead with radio buttons or boxes into which a mouse click automatically inserts a tick (Witte et al. 1999: 139). Further, some respondents may have less developed computer skills than others. They suggest a mixed mode of operation (paper-based together with web-based versions of the same questionnaire).

The researchers also found that 'check-all-that-apply' lists of factors had questionable reliability, as respondents would tend to complete those items at the top of the list and ignore the remainder. Hence they recommend avoiding the use of check-all-that-apply questions in a web-based survey. Similarly, they advocate keeping the introduction to the questionnaire short (no more than one screen), informative (e.g. of how to move on), and avoiding a long list of instructions. Further, as the first question in a survey tends to establish a particular mind-set in respondents, care is needed in setting the first question, to entice participants and not to put them off participating (e.g. not too difficult, not too easy, interesting, straightforward to complete, avoiding drop-down boxes and scrolling). Dillman et al. 
(1998a; 1998b; 1999) make specific recommendations about the layout of the screen, for example keeping the response categories close to the question for ease of following, and using features like brightness, large fonts and spacing for clarity in the early parts of the survey. They also suggest following the natural movement of the eyes from the top left quadrant of the screen (the most important part, hence the part in which the question is located) to the bottom right quadrant (the least important part, which might contain the researcher's logo). They comment that the natural movement of the eye is to read prose unevenly, with the risk of missing critical words, and that this is particularly true on long lines; hence they advocate keeping lines and sentences short (e.g. by inserting a hard break in the text, or by using table-editing features to locate the text in a table frame). Taking this further, they also advocate the use of some marker to indicate to the respondent where he or she has reached in the questionnaire (e.g. a progress bar, or a table that indicates what proportion of the questionnaire has been completed so far). Respondents may not be familiar with web-based questionnaires, e.g. with radio buttons, scroll bars, the use of the mouse, the use of drop-down menus, or where to insert open-ended responses, and the survey designer must not overestimate the capability of the respondent to use the software, though Roztocki and Lahri (2002) suggest that there is no relationship between perceived level of computer literacy and preference for web-based surveys. Indeed the use of such features may have to be explained in the survey itself. Dillman et al. 
(1999) suggest that the problem of differential expertise in computer usage can be addressed in three ways:

- having the instructions for how to complete the item next to the item itself, rather than placing them all together at the start of the questionnaire
- asking respondents at the beginning about their level of computer expertise and, if they are more expert, offering them the questionnaire with certain instructions omitted, while directing the less experienced to instructions and further assistance
- having a 'floating window' that accompanies each screen and which can be maximized for further instructions.

Some web-based surveys prevent respondents from proceeding until they have completed all the items on the screen in question. While this might ensure coverage, it can also anger respondents – such that they give up and abandon the survey – or prevent them from making a deliberate non-response (e.g. if they do not wish to reveal particular information, if the question does not in fact apply to them, or if they do not know the answer). Hence the advice of Dillman et al. (1999) is to avoid this practice. One way to address this matter is to give respondents the opportunity to answer an item with 'prefer not to answer' or 'don't know'. A related point is that it is much easier for participants in a web-based survey to abandon the survey – a simple click of a button – so more attention has to be given to keeping them participating than in a paper-based survey.

Redline et al. (2002) suggest that branching instructions (e.g. 'skip to item 13', 'go to item 10', 'if "yes" go to item 12, if "no" then continue') can create problems in web-based surveys, as respondents may skip over items and series of questions that they should have addressed. Part of the problem concerns the location of the instruction (e.g. to the right of the item, underneath the item, to the right of the answer box). Locating the instruction too far to the right of the answer box (e.g. more than nine characters of text to the right) can mean that it falls outside the foveal view (2 degrees) of the respondent's vision and, hence, can be overlooked. Further, they report that setting a branching instruction in the same font size and colour as the rest of the text can result in it being regarded as unimportant, not least because respondents frequently expect the completion of a form to be easier than it actually is. Hence they advocate making the instruction easier to detect by locating it within the natural field of vision of the reader, printing it in a larger font to make it bolder, and using a different colour. They report that, for the most part, branching instruction errors occur because the instructions are overlooked and respondents are unaware of them, rather than deliberately disregarded (Redline et al. 2002: 18). 
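The 'prefer not to answer' / 'don't know' advice above can be implemented so that the software distinguishes a deliberate opt-out from an accidentally skipped item, removing any need for forced completion. The following is a minimal illustrative sketch (the labels, type names and function names are the author's assumptions, not from the text):

```typescript
// Classify a response so that forced completion is unnecessary:
// deliberate opt-outs count as answered, and only true blanks are flagged.
type ItemStatus = "answered" | "opted-out" | "blank";

const OPT_OUT = new Set(["prefer not to answer", "don't know"]);

function classify(response: string | null): ItemStatus {
  // A missing or empty response is a genuine blank.
  if (response === null || response.trim() === "") return "blank";
  // Opt-out choices are legitimate answers, not omissions.
  return OPT_OUT.has(response.trim().toLowerCase()) ? "opted-out" : "answered";
}
```

With this distinction available, the survey can gently prompt respondents about blank items without blocking them from proceeding, which avoids both the coverage problem and the anger-and-abandonment problem described above.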
The researchers also investigated a range of other variables that impacted on the success of using branching programmes, and reported the following:

- The number of words in the question has an impact on the respondent: the greater the number of words, the lower the likelihood of correct branching by the reader, as the respondent is too absorbed with the question rather than with the instructions.
- Using large fonts and verbal design strategies to draw attention to branching instructions leads to greater observance of these instructions.
- The number of answer categories can exert an effect on the respondent: with more than seven categories the respondent may make errors and also overlook branching instructions.
- Having to read branching instructions at the same time as looking at answer categories results in overlooking the branching instructions.
- Locating the branching instruction next to the final category of a series of answer boxes is a much safer guarantee of it being observed than placing it further up a list; this may mean changing the order of the list of response categories, so that the final category naturally leads to the branching instruction. Branching instructions should be placed where they are to be used and where they can be seen.
- Response-order effects operate in surveys, such that respondents in a self-administered survey tend to choose earlier items in a list rather than later items (the primacy effect), thereby erroneously acting on branching instructions that appear with later items in a list.
- Questions with alternating branches (i.e. more than one branch) may be forgotten by the time they need to be acted upon, after respondents have completed an item.
- If every answer has a branch, then respondents may overlook the instructions for branching, as all the branches appear to be similar.
- If respondents are required to write an open-ended response, this may cause them to overlook a branching instruction, as they are so absorbed in composing their own response, and the branching instruction may be out of their field of vision when writing in their answer.
- Items that are located at the bottom of a page are more likely to elicit a non-response than items further up a page, hence if branching
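The findings above suggest that, where the survey software allows it, skip patterns are safer enforced programmatically than left to printed instructions that respondents may overlook. A minimal sketch of such a skip-logic table follows; the item numbers mirror the examples quoted from Redline et al., but the rule format and function names are illustrative assumptions, not a description of any actual survey package:

```typescript
// Illustrative skip-logic table: instead of printing 'if "yes" go to item 12'
// and relying on the respondent noticing it, the software computes the next item.
interface BranchRule {
  item: number;
  branches?: Record<string, number>; // answer -> item number to jump to
}

const rules: BranchRule[] = [
  { item: 10, branches: { yes: 12, no: 11 } }, // 'if "yes" go to item 12, if "no" then continue'
  { item: 11 },
  { item: 12 },
];

function nextItem(current: number, answer: string): number {
  const rule = rules.find(r => r.item === current);
  if (rule?.branches && answer in rule.branches) {
    return rule.branches[answer]; // follow the branch for this answer
  }
  return current + 1; // no branch applies: continue to the next item
}
```

When routing is computed by the program, the errors catalogued above (overlooked instructions, primacy effects, forgotten branches) cannot occur, though at the cost of removing the respondent's overview of the questionnaire as a whole.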

