Issue 17, Summer 1997
Social Research Update is published quarterly by the Department of Sociology, University of Surrey, Guildford GU2 7XH, England. Subscriptions for the hardcopy version are free to researchers with addresses in the UK. Apply by email to firstname.lastname@example.org.
Open and Closed Questions
Stephen Farrall is a Research Officer at the Centre for Criminological Research, University of Oxford. His main interests include the fear of crime and desistance from offending. He is currently undertaking research on the probation service. Jon Bannister (a Lecturer in Social Policy, University of Glasgow) has undertaken research and published several papers on both the fear of crime and, more recently, CCTV. Jason Ditton is Professor of Criminology at the Faculty of Law, Sheffield University. His most recent research interests include the fear of crime, CCTV and the use of drugs. Elizabeth Gilchrist is a lecturer at the School of Psychology at Birmingham University and has undertaken research into the fear of crime.
Research on the fear of crime has grown substantially in recent years. From its inception, this field has relied almost exclusively upon quantitative surveys, which have suggested that the fear of crime is a prevalent social problem. However, doubts about the nature of the instruments used to investigate this phenomenon, and in particular the use of closed questions, have raised the possibility that the fear of crime has been significantly misrepresented. This Update suggests that our understanding of the fear of crime is a product of the way it has been researched rather than the way it is.
When designing survey questions we talk of face validity, which in essence means whether a question looks as if it is up to the required job. However, more complex assessments of a measure's validity are rarely undertaken. This is partly because it is a formidable task, but also, as Brewer and Hunter (1989:41) suggest, 'Perhaps studies evaluating commonly used measures of social science concepts are relatively rare because they so often seem to end on a depressing note.'
One of the most thorough reports of such a validity check is by Belson (1986), who attempted to assess the validity of quantitative tools by qualitatively re-interviewing respondents who had completed a quantitative interview. In one instance, respondents were asked about their chocolate consumption during the previous week using a quantitative tool, and were then immediately re-interviewed in depth by a second researcher, the aim being to assess the extent to which the answers to the first set of questions were correct. Belson (1986:64) reports that:
...the number of bars, etc. claimed in the first interview was about a fifth larger than the total number finally agreed in the intensive interview (which is interpreted as being nearer the truth). [emphasis added].
In this Update, we discuss our own efforts to assess the validity of some measures in an area of social research which has seen rapid expansion in the last two decades: people's anxieties about crime.
However, several commentators have raised doubts about the validity of the instruments used to generate these findings (see, inter alia, Bernard, 1992; Bowling, 1993; Fattah, 1993; Schneider, 1981; Skogan, 1981; and Zauberman, 1985). A range of methodological problems has been identified which cumulatively raise the possibility that the incidence of the fear of crime has been significantly misrepresented. Chief amongst these is the criticism that there may be great variation in reported fear of crime levels due to the nature of the question used to measure it. Bernard (1992:66), Fattah (1993:53) and Yin (1982:242) all note that closed questions produce greater reported levels of fear than open questions. Undertaking analyses similar to those employed by Belson (quantitative interviews followed up qualitatively), we shall explore this issue.
Table 1: Type of mismatch by seriousness

Type of mismatch                          Catastrophic  Serious  Mild  Total
Open or closed questions                       21          17      8     46
Worry variously interpreted                     5          10      1     16
Formless or concrete fears                      5           6      2     13
Interpretation of question by respondent        2           2      -      4
Each mismatch was coded not just for its seriousness but also for its possible cause. This entailed a great deal of interpretation on the part of the researchers: each mismatch was recorded, and notes were made about preceding comments that could have explained it.
Table 1 crosstabulates the seriousness of mismatches by the explanation given for each. The most common mismatches were judged to be the result of using an open as opposed to a closed question (n=46, 40%). The next most common explanations were differences resulting from referring to specific contexts in the qualitative interviews (n=22, 19%); the use of the word 'worry' as a surrogate for other words (n=16, 14%); and the measuring of formless as opposed to concrete fears (n=13, 11%).
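The crosstabulation behind Table 1 can be reproduced in a few lines. The sketch below re-enters the cell counts from the table; the overall base of 115 mismatches is our inference from the text's 'n=46, 40%', not a figure the article states directly:

```python
from collections import Counter

# Implied base of all coded mismatches: the text reports n=46 as 40%,
# which implies a total of roughly 115 (an inference, not a stated figure).
TOTAL = 115

# Cell counts from Table 1, keyed by (cause, seriousness).
cells = {
    ("Open or closed questions", "catastrophic"): 21,
    ("Open or closed questions", "serious"): 17,
    ("Open or closed questions", "mild"): 8,
    ("Worry variously interpreted", "catastrophic"): 5,
    ("Worry variously interpreted", "serious"): 10,
    ("Worry variously interpreted", "mild"): 1,
    ("Formless or concrete fears", "catastrophic"): 5,
    ("Formless or concrete fears", "serious"): 6,
    ("Formless or concrete fears", "mild"): 2,
    ("Interpretation of question by respondent", "catastrophic"): 2,
    ("Interpretation of question by respondent", "serious"): 2,
}

# Row totals: number of mismatches attributed to each cause.
row_totals = Counter()
for (cause, _seriousness), n in cells.items():
    row_totals[cause] += n

# Report each cause as a share of all coded mismatches.
for cause, n in row_totals.most_common():
    print(f"{cause}: n={n} ({100 * n / TOTAL:.0f}%)")
```

Run against the assumed base of 115, this recovers the percentages quoted in the text (40%, 14% and 11%) for the causes shown in the table.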
The majority of the catastrophic mismatches were between open and closed questions (n=21). For example, when one respondent was asked at the quantitative interview how much he worried about being robbed or assaulted, he replied by placing himself in the middle of a 5-point scale. At the qualitative interview, when asked again about how much he worried about this type of offence, he said 'No, no, 'cos as I say, since I got done the first time [during the late 1970s], I'm very careful.' So he was a '5', is now a '1', and personally averages himself as a '3'.
Are we over-estimating the fear of crime by using closed questions or are we under-estimating the fear of crime by using open questions? Not all mismatches applied to worry about crime (five applied to risk assessments and one to worries about having a fire in the home), but of the 40 open and closed mismatches that did relate to worry about crime, 37 confirmed the finding that closed questions generate apparently higher levels of fear.
Closed questions, it is argued, sensitise and direct respondents toward the set of answers offered, and the empirical evidence supports this. A Harris poll (Harris & Associates, 1975), for example, employed a closed question to evaluate whether an elderly population was concerned about crime. They found that 23 per cent of the sample considered crime to be a serious personal problem. Yin (1982), employing an open question, found that only 1 per cent of a comparable elderly population considered crime to be a serious personal problem (reported in Fattah, 1993:53-54). Bernard (1992:66) reports research undertaken by Schuman and Presser (1981). When they asked about the most serious problems in the USA today, 16 per cent cited crime and violence; but when they asked 'Do you think that crime and violence is a serious problem today?', 35 per cent of respondents replied positively.
It would appear that measurements of the extent of the fear of crime are grossly sensitive to the nature of the question asked. Given that open questions allow respondents the opportunity to provide their own answers, uncontaminated by research priorities, we conclude that closed questions greatly over-estimate the incidence of the fear of crime. This matters because closed questions are the staple of the vast majority of crime and fear of crime surveys, which may therefore be over-estimating the fear of crime by as much as 23 times. It seems that such surveys will tell us little about the fear of crime, but a lot about the nature of the questions used.
If Belson's conclusion that qualitative interviews are closer to the truth and quantitative questions over-estimate is correct, quantitative fear of crime measures will consistently over-estimate levels of fear. Given that many social surveys rely upon closed questions to assess attitudes towards a range of issues, this is indeed a worrying finding.
Crime is a very emotive topic that provokes strong reactions, presumably due to the inherent unpleasantness of being a victim. Two of the most commonly used terms in this field ('fear' and 'worry') are emotive but vague. Other complex issues which arouse vague emotions in people, and about which strong opinions are held, may similarly generate over-estimates. If this does prove to be the case, we need to be careful in our use of survey research methods to measure some issues. In addition, many questions designed to measure anxiety about crime are asked out of context (for example, in a one-off interview which does not allow time for qualification or reflection). Other issues, such as attitudes to the environment, fear of unemployment and views about local services, which are commonly measured using similar techniques, may also yield exaggerated responses.
It would be foolish to conclude from this that crime studies should abandon survey research and go qualitative. For one thing, surveys have to have a large number of respondents reporting criminal victimisations if the results of the analysis are to be meaningful. However, it is possible to envisage new developments in this field. It may, for example, be possible to re-calibrate results downwards, though only after further research has assessed the exact extent of the over-estimation. This could be achieved by asking differently worded questions and subsequently assessing them by analysis similar to that discussed here. The exercise would involve asking respondents questions using different words in place of 'worry', asking them which best described their feelings about crime, and then comparing the best new measure against the old measure. This would amount to a large-scale test-retest study aimed at refining measures.
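The comparison step of such a test-retest exercise is straightforward to sketch. In the illustration below both the paired responses and the idea of a single 'candidate' re-wording are entirely hypothetical; the point is simply how an old and a new measure might be compared respondent by respondent:

```python
from statistics import mean

# Hypothetical paired responses on a 1-5 worry scale for the same
# respondents: the established 'worry' item vs. a candidate re-wording.
old_wording = [3, 4, 2, 5, 3, 1, 4, 2]
new_wording = [2, 3, 1, 4, 3, 1, 3, 2]

# Mean shift: a positive value means the old wording elicits higher
# reported fear, i.e. the over-estimation the article describes.
shift = mean(o - n for o, n in zip(old_wording, new_wording))

# Exact agreement: proportion of respondents giving the same answer twice.
agreement = sum(o == n for o, n in zip(old_wording, new_wording)) / len(old_wording)

print(f"mean shift: {shift:.2f}, exact agreement: {agreement:.0%}")
```

Repeating this for each candidate wording, and validating the winner against qualitative re-interviews as in the study reported here, would give the kind of re-calibration factor discussed above.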
Re-conceptualising the dependent variable may also lead to improvements in its measurement. Anxieties about crime have been conceptualised as being unidimensional: fear and little else. By incorporating other words or expressions, other dimensions (perhaps less prone to over-estimation) may be measured. For example, cognitive aspects (such as thinking about crime and how to avoid it) could also be assessed. Respondents could be asked when they last thought about criminal victimisation (perhaps by relying upon diary methods). Whatever the solution, conceptual advances in this field are long overdue.
An alternative method may be to stop asking about the fear of crime in a direct manner and instead ask about it indirectly. Some researchers (e.g. van der Wurff et al, 1989) have already started along this road. Van der Wurff et al used vignettes which described everyday events that may or may not have involved the threat of victimisation. Respondents were asked to interpret the situation and to predict what might happen next and what they would do. This meant that respondents reflected their anxieties about crime through their interpretations of everyday events. Clearly this still relies upon qualitative data, but makes its use in survey research easier.
Belson, W. A. (1986) Validity in Survey Research, Gower Publishing, Aldershot.
Bernard, Y. (1992) North American and European Research on Fear of Crime, Applied Psychology: An International Review, 41(1):65-75.
Bowling, B. (1993) Racial Harassment and the Process of Victimisation, British Journal of Criminology, 33(2):231-250.
Brewer, J. & Hunter, A. (1989) Multimethod Research: A Synthesis of Styles, Sage, London.
Chambers, G. & Tombs, J. (1984) The British Crime Survey: Scotland, London, HMSO.
Fattah, E. A. (1993) Research on Fear of Crime: Some Common Conceptual and Measurement Problems, in Bilsky et al (eds), Fear of Crime and Criminal Victimisation, Ferdinand Enke Verlag, Stuttgart.
Harris, L. & Associates, (1975) Myth and Reality of Aging in America, Washington D. C., National Council on Aging.
Hough, M. (1995) Anxiety About Crime: Findings from the 1994 British Crime Survey, Home Office Research Study No. 147, Home Office, London.
Hough, M. & Mayhew, P. (1983) The British Crime Survey: First Report, London, HMSO.
Hough, M. & Mayhew, P. (1985) Taking Account of Crime: Key Findings From The Second British Crime Survey, London, HMSO.
Kinsey, R. & Anderson, S. (1992) Crime and The Quality Of Life: Public Perceptions and Experiences of Crime in Scotland, Central Research Unit Paper, Edinburgh, Scottish Office.
Mayhew, P., Elliot, D. & Dowds, L. (1989) The 1988 British Crime Survey, London, HMSO.
Maxfield, M. (1987) Explaining the Fear of Crime: Evidence From The 1984 British Crime Survey, Home Office Research Paper No. 41, Home Office, London.
Payne, D. (1992) Crime In Scotland: Findings From The 1988 British Crime Survey, Central Research Unit Paper, Edinburgh, Scottish Office.
Schneider, A. L. (1981) Methodological Problems in Victim Surveys and Their Implications For Research in Victimology, Journal Of Criminal Law and Criminology, 72(2):818-838.
Schuman, H. & Presser, S. (1981) Questions And Answers in Attitude Surveys, New York, Academic Press.
Selltiz, C., Wrightsman, L. S. & Cook, S. W. (1976) Research Methods in Social Relations, 3/e, Holt, Rinehart & Winston, New York.
Skogan, W. (1981) Issues in the Measurement of Victimisation, U. S. Dept of Justice, Bureau of Justice Statistics, Washington D. C., U. S. Government Printing Office.
Skogan, W. (1990) The Police and Public in England and Wales: A British Crime Survey Report, Home Office Research Study No. 117, London, HMSO.
van der Wurff, A., van Staalduinen, L. and Stringer, P. (1989) Fear of Crime in Residential Environments: Testing a Social Psychological Model, Journal of Social Psychology, 129(2):141-160.
Yin, P. (1982) Fear of Crime As A Problem For the Elderly, Social Problems, 30(2):240-245.
Zauberman, R. (1985) Sources of Information About Victims and Methodological Problems in This Field, in Research on Victimisation, Council of Europe, Vol. XXII.
Social Research Update is published by: Department of Sociology, University of Surrey, Guildford GU2 7XH, England.
Telephone: +44 (0)1483 300800
Fax: +44 (0)1483 689551
Edited by Nigel Gilbert.
Summer 1997 © University of Surrey
Permission is granted to reproduce this issue of Social Research Update provided that no charge is made other than for the cost of reproduction and this panel acknowledging copyright is included with all copies.