Issue 3 | March 1993
Social Research Update is published quarterly by the Department of Sociology, University of Surrey, Guildford GU2 7XH, England. Subscriptions for the hardcopy version are free to researchers with addresses in the UK. Apply by email to sru@soc.surrey.ac.uk.
Computer Assisted Personal Interviewing
Roy Sainsbury and Sandra Hutton are Research Fellows at the Social Policy Research Unit (SPRU) at the University of York. They have both worked on a number of projects about social security using large-scale surveys, secondary data analysis, and qualitative techniques. John Ditch is Assistant Director of SPRU and is responsible for the social security programme.
CAPI, or Computer Assisted Personal Interviewing, is a simple idea. Instead of collecting data on paper questionnaires, interviewers use portable computers to enter data directly via a keyboard. Computer-assisted interviewing has been used in the past for telephone surveys, for example, but it is only in the last five years or so that it has been applied to face-to-face interviews.
Most of the early applications of CAPI in this country have been to large-scale, continuous surveys for government and the commercial sector. For example, in the public sector, OPCS has converted the Labour Force Survey, which interviews 25,000 households a year, to CAPI, and the Family Resources Survey has used CAPI from its outset in 1992. In the commercial sector, British Telecom's annual customer satisfaction survey has been increasing its use of CAPI since it started in 1990. The National Readership Survey is also CAPI-based.
In contrast, the number of one-off social surveys using CAPI has been small. The reasons for this mostly derive from the relative infancy of CAPI in social research. CAPI involves large initial costs, such as the purchase of lap-top computers and the training of programmers and fieldwork staff. It is not surprising, therefore, that companies able to provide CAPI have initially sought large contracts such as those listed above. However, the number of large continuous surveys is limited and we should now expect more companies to seek contracts for one-off surveys.
In this article we examine the advantages and disadvantages of CAPI in principle and compare these with our own experiences at the Social Policy Research Unit of using CAPI in a survey of over 1,100 income support recipients. This survey, carried out on our behalf by Research Services Limited (RSL) of Harrow, sought to quantify, over a six-month period, changes in claimants' circumstances and the resulting impact on their lives and on their social security benefits. The survey was part of a project commissioned by the Department of Social Security to investigate the effects of changing circumstances on benefit recipients and on the administration of income support. At the time of writing the project is still in progress but we expect to publish the results in the second half of 1994.
The main part of the article looks at aspects of CAPI which anyone considering using it will need to weigh against traditional paper and pencil techniques, including the quality of the data, speed of delivery and cost. Following this we present a summary of the availability of CAPI from the major research companies in the United Kingdom.
CAPI is claimed to enhance the quality of survey data in a number of ways:
The way CAPI handles routeing is one of its most impressive features. Instead of interviewers having to decipher routeing instructions during an interview, the computer program takes them automatically to the next appropriate question. This is particularly important when the questionnaire includes complex routeing (as ours did). Similarly, if a set of questions has to be asked a number of times (for example, for everyone in a household), the computer will automatically repeat the questions (go round the 'loop') the correct number of times and then move on. CAPI's routeing capabilities have two main advantages over paper and pencil techniques. First, the possibility of error from interviewers failing to follow routeing instructions is eliminated; they cannot follow a wrong route and ask inappropriate questions, nor can they inadvertently skip questions. Secondly, the interview flows much more smoothly since the interviewer does not have to keep referring to earlier answers to establish the correct route through the questions.
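The logic involved can be pictured with a short sketch. The fragment below is purely illustrative: it is not taken from our survey program or from any of the CAPI packages discussed later, and the questions and codes are invented. It simply shows how the program, rather than the interviewer, can decide which question comes next.

```python
# A minimal sketch of automated routeing, for illustration only and not based
# on any particular CAPI package. Each question names the question that
# follows it, possibly depending on the answer just given; the question texts
# and codes are invented.

QUESTIONS = {
    "Q1": {"text": "Does anyone in the household receive child benefit? (y/n)",
           "route": lambda ans: "Q2" if ans.strip().lower() == "y" else "Q3"},
    "Q2": {"text": "How many children does the benefit cover?",
           "route": lambda ans: "Q3"},
    "Q3": {"text": "Has anyone started or left a job since 1 August 1992? (y/n)",
           "route": lambda ans: None},
}

def run_interview():
    answers = {}
    current = "Q1"
    while current is not None:
        question = QUESTIONS[current]
        answers[current] = input(question["text"] + " ")
        # The program, not the interviewer, decides which question comes next,
        # so a wrong route cannot be followed and no question can be skipped.
        current = question["route"](answers[current])
    return answers

if __name__ == "__main__":
    print(run_interview())
```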
Interviewing is also made easier by the 'customising' of questions. The computer program can recall a piece of data from its memory, such as a name or a date, and insert it in the appropriate place in a question. For example, it is common for a paper questionnaire to include questions such as: "How often do/does (you/NAME) use (TYPE OF TRANSPORT)?". Using CAPI, interviewers do not have to keep a check on which member of the household and which type of transport they are asking about. Instead they are faced with a series of questions like "How often does Bill use the train?". In this way both the accuracy of the question and the smoothness of the interview are improved. The data recalled need not be text alone; it can also be the result of a calculation based on several earlier pieces of data, for example a single figure for disposable income computed from a number of sources.
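Again, a simple sketch may help to convey the idea. The household members, transport types and income figures below are invented and the fragment is not drawn from our questionnaire; it merely shows how stored answers can be slotted into question text or combined into a calculated figure.

```python
# A minimal sketch of question 'customising': the program recalls earlier
# answers and inserts them into the question text. The names, transport types
# and income figures are invented assumptions.

household = [
    {"name": "Bill", "transport": "the train", "earnings": 120.0, "benefit": 34.8},
    {"name": "Mary", "transport": "the bus",   "earnings": 0.0,   "benefit": 56.1},
]

for person in household:
    # Interviewers see the finished question, not a template full of codes.
    print(f"How often does {person['name']} use {person['transport']}?")

# Recalled data can also be the result of a calculation on earlier answers,
# for example a single disposable income figure built from several sources.
disposable_income = sum(p["earnings"] + p["benefit"] for p in household)
print(f"You told me the household has about £{disposable_income:.2f} a week "
      "coming in. Is that broadly right?")
```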
Another of the major advantages of CAPI is its ability to spot inadmissible or inconsistent responses that could be the result of either interviewer or respondent error. For example, 'range checks' can be carried out to ensure that an answer falls within an acceptable range. In our questionnaire we asked respondents for the dates of changes happening to them in a six-month period beginning 1 August 1992. If they gave a date before then, or an earlier date was keyed in, the computer would display an error message, allowing the interviewer to identify the source of the error and enter the correct answer. CAPI also allows 'logic checks' that can identify inconsistent or contradictory responses. For example, it is inconsistent for the date of an increase in child benefit to precede the date given for the birth of a child. The computer can raise an error message allowing the interviewer to investigate the inconsistency. Range and logic checks are powerful features of CAPI which improve the quality of the data at source.
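In outline, such checks are small tests run as each answer is keyed in. In the illustrative sketch below, the 1 August 1992 reference date is taken from our survey, but the field names, example dates and wording of the messages are invented.

```python
# A minimal sketch of range and logic checks applied as answers are keyed in.
# Only the 1 August 1992 reference date comes from the survey described above;
# everything else here is an invented illustration.

from datetime import date

REFERENCE_START = date(1992, 8, 1)  # changes before this date are out of range

def range_check(change_date):
    """Flag a date of change that falls before the reference period."""
    if change_date < REFERENCE_START:
        return "Date is before 1 August 1992 - please check with the respondent."
    return None

def logic_check(date_of_birth, date_of_benefit_increase):
    """An increase in child benefit should not precede the child's birth."""
    if date_of_benefit_increase < date_of_birth:
        return "Child benefit increase is dated before the birth - please check."
    return None

# The interviewer sees these messages on screen and can resolve the problem at
# once, rather than the error surfacing at the data-cleaning stage.
print(range_check(date(1992, 7, 15)))
print(logic_check(date(1992, 10, 3), date(1992, 9, 20)))
```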
Because CAPI interviewers enter data directly into a computer, the separate process of data entry, familiar in paper and pencil surveys, is unnecessary. This eliminates one source of error and saves time and money. Responses to open-ended questions can also be typed in directly. There is no need for separate transcription later.
Before leaving the topic of data quality we should put the claims and promises of CAPI in perspective. Paper and pencil surveys can, and do, produce high quality data. The claim of CAPI to produce higher quality data clearly needs to be tested empirically in a way that allows us to measure the magnitude of any improvement.
The effect of CAPI on the timing of a survey is twofold. First, the process of converting a paper questionnaire into a computer program is time-consuming. The timetable for a survey using CAPI should, therefore, allow ample time between the design of a questionnaire and the start of fieldwork. This is particularly important if the fieldwork has to be carried out between specified dates. In our survey we needed the fieldwork to be completed before the annual uprating of benefits at the beginning of April. Allowing eight weeks for fieldwork meant that we had to have a finished CAPI version of the questionnaire at the beginning of February 1993. Our timetable, we thought, allowed sufficient time to achieve this, but problems identified during piloting in December 1992 resulted in a rush of last-minute changes, one or two of which led to some minor problems with the data later.
As mentioned earlier, CAPI eliminates a separate process of data entry. As a result, the time between the end of fieldwork and the supply of a clean data set is reduced. However, there may still be a need for some data cleaning, albeit on a smaller scale than for a paper and pencil survey. For example, despite the best intentions of the program designers, some of the possible range or logic checks may not have been included. Hence, some inconsistent or invalid data may result and this will need checking in the normal time-consuming way.
CAPI, compared with paper surveys, generates both extra costs and savings. If we compare set-up costs only, CAPI will always be more expensive because of the time needed to convert a paper questionnaire into a computer version. This time, and therefore the cost, will be greater for complex questionnaires than for simpler designs. Because each paper questionnaire has an associated data entry cost, the savings generated by CAPI (which has no separate data entry) increase as the size of the survey population increases. The costs of cleaning data are also higher for paper surveys since, in a CAPI interview, respondent and interviewer errors are rectified during the interview itself. Administration costs for paper surveys, which include the printing and distribution of questionnaires, also tend to be higher than for CAPI surveys.
Clearly the costs of any survey will depend, among other things, on the length and complexity of the questionnaire and the size of the target population. Though we cannot state definitively that CAPI is cheaper or more expensive than paper surveys, it does seem that, in general, the extra set-up costs required for CAPI are more likely to be offset when the survey population is large and the questionnaire design is complex. This is certainly our experience. When we invited tenders for our survey, three companies offered traditional paper and pencil methods and one offered CAPI. To our surprise, the CAPI quote was not the most expensive.
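The arithmetic behind this offsetting is simple: the extra set-up cost is fixed, while the data entry saving grows with every interview completed. The figures in the sketch below are invented and are emphatically not a costing of our survey or of CAPI in general; they only show the shape of the calculation.

```python
# A rough break-even sketch. Every figure here is an invented assumption used
# only to illustrate the trade-off: CAPI carries a fixed extra set-up cost
# (programming and testing the questionnaire) but saves a per-interview data
# entry and editing cost.

capi_extra_setup = 8000.0     # assumed extra programming/testing cost (£)
saving_per_interview = 9.0    # assumed data entry cost saved per interview (£)

break_even = capi_extra_setup / saving_per_interview
print(f"Break-even at roughly {break_even:.0f} interviews")

for n in (200, 1100, 5000):
    net = n * saving_per_interview - capi_extra_setup
    label = "net saving" if net >= 0 else "net extra cost"
    print(f"{n} interviews: {label} of £{abs(net):.0f}")
```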
In designing any questionnaire researchers need to decide how long they want an average interview to last. The interview needs to be long enough to gather the required data but must not impose too heavily on the time of the respondent. And the longer the interview, the greater the cost of interviewer time. As well as these considerations, researchers contemplating using CAPI must be aware of the memory capacity of the portable computers used by the interviewers. This imposes a separate constraint on the length and complexity of the questionnaire, which was clearly demonstrated in our own project. When the paper version of the questionnaire was first converted into a CAPI program it just exceeded the capacity of the interviewers' lap-tops. We therefore deleted some questions and cut down the number of loops for others (for example, by asking detailed questions about the first seven members of the household instead of all of them). In so doing we had to examine each question very carefully to assess whether it could be omitted or cut down in some way. Although this was a useful exercise in itself, allowing us to refine a number of questions, we were concerned that the integrity of the questionnaire would have been jeopardised had we been required to make greater cuts. Furthermore, we were obliged to leave out some of the range and logic checks incorporated into the program.
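A capped household loop of the kind described above might look something like the sketch below. The cap of seven members follows our example, but the question asked and the way the data are stored are invented.

```python
# A minimal sketch of capping a household loop so that the questionnaire fits
# within the memory of the interviewers' machines. The cap of seven members
# follows the example in the text; the question and data layout are invented.

MAX_DETAILED_MEMBERS = 7

def ask_household(members):
    responses = {}
    # The full block of detailed questions is repeated only for the first
    # seven members; any further members are recorded but not taken through
    # the detailed loop.
    for name in members[:MAX_DETAILED_MEMBERS]:
        responses[name] = input(f"Does {name} have any earnings from work? (y/n) ")
    responses["not_asked_in_detail"] = members[MAX_DETAILED_MEMBERS:]
    return responses

if __name__ == "__main__":
    print(ask_household(["Bill", "Mary", "Tom"]))
```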
As the process of amendment and refinement continues there is a danger of researchers losing control of the questionnaire. To avoid this it is important that they keep in close touch with CAPI program writers and that regular up-to-date paper versions of the CAPI questionnaire are supplied.
Face-to-face interviews rarely follow the course intended by researchers or interviewers. Respondents change their minds about earlier answers, or suddenly remember something relevant later in the interview. In addition, range or logic checks may reveal inconsistencies in previous answers that need to be changed. Interviewers therefore need to be able to move back and forth easily within the questionnaire. At present, however, this is not one of the strengths of CAPI, although improvements can be expected in the future.
No matter how careful the preparation of a questionnaire, there will always be occasions when interviewers struggle to fit a respondent's answer into its structure. In paper and pencil surveys the interviewers can make notes on the questionnaire and the researchers then decide how to deal with them. In contrast, the facilities for CAPI interviewers to make notes are not yet well developed, though again we can expect improvements as CAPI is refined.
An early concern of ours was the possibility that respondents might be put off or intimidated by an interviewer armed with a lap-top computer, although earlier applications of CAPI had suggested that such a fear was groundless. We were reassured to learn from the pilot exercise (in which we participated) that respondents generally have no problems when faced with this new technology. Indeed, for some it was of interest in itself and contributed to the rapport between them and the interviewer. CAPI was also popular with the interviewers. Among other things, they liked the ease with which they could progress through the questionnaire and the air of professionalism that using computers bestowed upon them.
CAPI is provided through specialised software. The most common packages in use in the UK at present are BLAISE, QUANCEPT, MICROTAB and BV Solo, each of which has its merits according to the type of survey being carried out. Of the major survey organisations, Research Services Limited (RSL) (using QUANCEPT), Social and Community Planning Research (SCPR) (using BLAISE), MORI (again using BLAISE) and BMRB (using BV Solo) all offer CAPI for one-off surveys. At the time of writing NOP are developing a CAPI capability (using MICROTAB) and expect to be able to offer a service sometime in 1994. (The addresses of these survey agencies can be found at the end of this article.) For the more ambitious, who may wish to design their own CAPI programs, it is possible to buy some packages direct from the suppliers. For example, BLAISE (the choice of the DSS for its Family Resources Survey) is available from its Dutch manufacturers (address below).
The use of CAPI in social research is in its infancy and its quantitative impact on the quality of data is largely unknown. Nevertheless, in our view, CAPI already has a great deal to offer as an alternative to traditional paper and pencil techniques, particularly where the length or complexity of a questionnaire suggests that there is a possible risk to the quality of data. At present CAPI is largely used as a straight replacement for paper questionnaires but the computer's existing potential for using graphics and sound (let alone the possibilities created by artificial intelligence) could lead to new forms of questionnaire that could transform the nature and potential of the research interview.
BLAISE, Central Bureau of Statistics, Hoofdafdeling M3, PO Box 959, 2270 AZ Voorburg, The Netherlands. Tel: +31 70 694341
British Market Research Bureau (BMRB), Hadley House, 79-81 Uxbridge Road, Ealing, London W5 5SU. Tel: 081 566 6000
Market Opinion Research International (MORI), 95 Southwark Street, London SE1 0HX. Tel: 071 222 0232
National Opinion Polls (NOP), Tower House, Southampton Street, London WC2E 7HN. Tel: 071 836 1511
Research Services Limited (RSL), Research Services House, Elmgrove Road, Harrow HA1 2QG. Tel: 081 861 6000
Social and Community Planning Research (SCPR), 35 Northampton Square, London EC1V 0AX. Tel: 071 250 1866
Social Research Update is published by:
Department of Sociology
University of Surrey
Guildford GU2 7XH
England
Telephone: +44 (0)1483 300800
Fax: +44 (0)1483 689551
Edited by Nigel Gilbert.
March 1993 © University of Surrey
Permission is granted to reproduce this issue of Social Research Update provided that no charge is made other than for the cost of reproduction and this panel acknowledging copyright is included with all copies.