

full access to all the data on which our analysis is based. By electronically linking our summaries and interpretations to the relevant data, our audience could then check for themselves any doubtful (or especially interesting) points in our analysis. So long as producing an account depends only on traditional forms of publication, we have to accept limitations which in principle may be overcome following the advent of desktop computing. However, this vision of a future in which the research community exchanges disks as well as papers, and accounts can be validated in a fully interactive medium, cannot be realized without the development and standardization of the relevant software.

Meantime let us return to our present problems of validation, and consider the issues posed by ‘construct’ validity. These refer to the fit (or lack of fit) between the concepts used in our account and those already established in the relevant field. If we have used concepts which are congruent with those employed successfully in other analyses, our audience may have greater confidence in the validity of our account. If we have spurned the conceptual tools currently available in favour of inventing our own, we can expect a more sceptical response. Even the scientific community likes to keep originality on a lead—unless its problems have become so pressing that a complete shift in paradigm is required. Before we dedicate ourselves to revolutionizing current paradigms, however, we ought to recognize the circumstances in which such changes can occur. Einstein’s relativity theory explained empirical discrepancies which were inexplicable within the framework of Newtonian physics, and made some fresh predictions which could be tested against evidence. Theories in social science do not have such explanatory and predictive power. Often in place of explanation and prediction, we have to make do with insight and speculation. To our audience, these qualities, valuable though they may be, will rarely constitute an overwhelming case for changing the way they think. There is much to be said, therefore, for working with established rather than original concepts. The task of testing and honing these concepts through empirical enquiry is no less valuable than that of creating new conceptual tools.

In practice, qualitative analysis may well involve a mix of these two tasks, depending on the fit between our data and the concepts we employ at the outset. To validate new concepts, we can still consider their congruence with established thinking. If our concepts are inconsistent with established thinking, we have to accept a sterner test of their validity, if not in terms of their explanatory and predictive power, then at least in terms of the significant insights and understanding they afford. Much the same point applies to ‘criterion’ validity. If our observations are inconsistent with the results produced through other measures, then we have to be particularly careful to ensure that our confidence in them is not misplaced.

Qualitative analysis is often castigated as being too subjective, and as Patton comments:
