Issue 6, March 1993
Social Research Update is published quarterly by the Department of Sociology, University of Surrey, Guildford GU2 7XH, England. Subscriptions for the hardcopy version are free to researchers with addresses in the UK. Apply by email to email@example.com.
Computer simulation of social processes
Nigel Gilbert is Professor of Sociology at the University of Surrey. He became interested in the potential of computer simulation through his involvement with several interdisciplinary projects bringing together computer scientists and social scientists. He began a series of international workshops on 'Simulating Societies' in 1992. The proceedings of the first of these have been published as Simulating Societies: the computer simulation of social phenomena (UCL, 1994), and the second is forthcoming as Artificial Societies (UCL, 1995). A third workshop will be held in Florida in September 1995. He has also edited books on research methods (Researching Social Life, Sage, 1993) and authored works on the sociology of science and on statistics. He is Head of the Department of Sociology at Surrey.
Most social research either develops or uses some kind of theory or model: for instance, a theory of deviance or a model of the class system. Generally, such theories are stated in discursive English. Sometimes the theory is represented as a structural equation (for example, when doing regression). Quite recently, researchers have begun to explore the possibilities of expressing theories as computer programs. The great advantage is that one can then simulate the social processes of interest on the computer and, in some circumstances, even carry out 'experiments' that would otherwise be quite impossible.
This Update reviews the current status of computer simulation of social processes and phenomena and suggests further reading for additional detail and advice. Although it helps to have some knowledge of computer programming to develop simulations, no such experience is needed to understand what simulations aim to do or to follow this Update. In practice, many researchers involved closely with simulations have colleagues with computer science backgrounds who do the actual programming for them.
The logic underlying the methodology of simulation is not very different from the logic underlying statistical modelling. In statistical modelling, a specification of a model is constructed (for example, in the form of a regression equation) through a process of abstraction from what are theorised to be the social processes that exist in the 'real world' (Gilbert 1993). By means of some statistical technique (e.g. ordinary least squares), the model is used to generate some expected values which are compared with actual data. The main difference between statistical modelling and simulation is that a simulation model can be 'run' to produce output, while a statistical model requires a statistical analysis program to generate expected values.
Although the simulation of social dynamics has a long history in the social sciences (Inbar & Stoll 1972), the advent of much more powerful computers, more powerful computer languages and the greater availability of data have led to increased interest in simulation as a method for developing and testing social theories (see Chapter 3 of Whicker & Sigelman 1991 for an historical review).
Simulation comes into its own when the phenomenon to be studied is either not directly accessible or difficult to observe directly. For example, simulation has been used to investigate the emergence of increased social complexity amongst hunter-gatherers in Upper Palaeolithic France, 20 000 years ago (more about this study below). Instead of studying the society (the target) itself, it is often useful to study a model of the target. The model will be more accessible and smaller scale, but sufficiently similar to the target to allow conclusions drawn from the model to be (tentatively) generalised to the target. The model might be statistical, mathematical or symbolic (based on logic or a computer program). The important point about a model is that it must be designed to be similar to the target in structure and behaviour.
Generally, a model is defined in terms of a mathematical or logical specification (Doran & Gilbert 1993). Sometimes it is possible to derive conclusions about the model analytically, by reasoning about the specification (for example, with mathematical proof procedures). Often, however, this is either difficult or impossible, and one performs a simulation. The simulation consists of 'animating' the model. For example, if the model is expressed as a computer program, the simulation consists of running the program with some specified inputs and observing the program's outputs.
Paradoxically, one of the main advantages of simulation is that it is hard to do. To create a simulation model, its theoretical presuppositions need to have been thought through with great clarity. Every relationship to be modelled has to be specified exactly, for otherwise it will be impossible to run the simulation. Every parameter has to be given a value. This discipline means that it is impossible to be vague about what is being assumed. It also means that the model is potentially open to inspection by other researchers in all its detail. These benefits of clarity and precision also have disadvantages, however. Simulations of complex social processes involve the estimation of many parameters and adequate data for making the estimates can be difficult to come by.
Another, quite different benefit of simulation is that it can, in some circumstances, give insights into the 'emergence' of macro level phenomena from micro level action (Conte & Gilbert 1995). Thus, a simulation of interacting individuals may reveal clear patterns of influence when examined on a societal scale. A simulation by Nowak and Latané (1993), for example, shows how simple rules about the way in which one individual influences another's attitudes can yield results about attitude change at the level of a society, and one by Axelrod (1995) demonstrates how patterns of political domination can arise from a few rules followed by simulated nation states.
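The micro-to-macro idea can be illustrated with a toy sketch. The following is not Nowak and Latané's actual model; it is a much simpler invented example in which agents on a grid repeatedly adopt the majority attitude of their neighbours, so that local clusters of like-minded agents can emerge from a purely individual-level rule:

```python
import random

# Toy sketch of emergence: agents on a wrap-around grid hold a binary
# attitude (0 or 1) and, when updated, adopt the majority attitude of
# their eight neighbours. All rules here are invented for illustration.

SIZE = 20
random.seed(1)
grid = [[random.choice([0, 1]) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbours(i, j):
    """The eight surrounding cells, with wrap-around edges."""
    return [grid[(i + di) % SIZE][(j + dj) % SIZE]
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0)]

for step in range(50):
    i, j = random.randrange(SIZE), random.randrange(SIZE)  # random agent
    ns = neighbours(i, j)
    if sum(ns) > len(ns) // 2:      # most neighbours hold attitude 1
        grid[i][j] = 1
    elif sum(ns) < len(ns) // 2:    # most neighbours hold attitude 0
        grid[i][j] = 0

share = sum(map(sum, grid)) / SIZE ** 2
print(f"share holding attitude 1 after updating: {share:.2f}")
```

Run for long enough, updates like this tend to produce spatial clustering of attitudes at the grid (societal) level, even though each rule refers only to an individual's immediate neighbours.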
A problem which has to be faced in all simulation work is the difficulty of validating the model. Ideally, a simulation should produce outputs which match those of the target for all possible inputs which can be envisaged to occur in reality, and should fail to produce output in all other circumstances. In practice, it is neither feasible to examine all input combinations, nor is it possible to assess whether the outputs from a wide range of inputs do indeed match those of the target, because the target may only be observable for some rather limited range of conditions. Sometimes a statistical solution to these problems is advocated (e.g. Bratley et al. 1983), but in practice it is hard to abide by the kinds of assumptions which conventional statistical tests require. Nevertheless, simulation always has a valuable role in helping to clarify ideas and theories, even if complete validation cannot be carried out.
The above advantages and disadvantages will be illustrated with some examples taken from recent work. The first uses an approach which has come to be called dynamic micro-simulation. Dynamic micro-simulation is used to simulate the effect of the passing of time on individuals and, often, on households (Harding 1990). Data from a large, usually random sample from some population (the 'base data set') is used to characterise the initial features of the simulated individuals. For example, there may be data on the age, sex, income, employment status and health of several thousand people. A set of transition probabilities is used to simulate how the characteristics of these individuals will change over a time period such as one year. For instance, there will be a probability that someone who is employed at the start becomes unemployed during a simulated year. These transition probabilities are applied to the data set for each individual in turn, and repeatedly re-applied for a number of simulated time periods (years). In some simulations, it is also important to model births, i.e. the addition of new members to the data set, and marriage, death and the formation and dissolution of households, in order that the data set remains representative of the target population.
The adequacy and value of such simulation depends on the availability of two kinds of data: a representative sample of the target population to form the base data set, and a sufficiently complete and valid set of transition probabilities. In the simplest simulations, these probabilities consist of an array of constant values, each indicating the chance of some specific change occurring given the current state of an individual. In more complex models, the coefficients can be made to vary according to the situation of other members of the individual's household or the wider social context.
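A dynamic micro-simulation of this kind can be sketched in a few lines. The sample, the employment-status variable and the two annual transition probabilities below are all hypothetical, invented for illustration rather than estimated from any real data set:

```python
import random

# Toy dynamic micro-simulation: a base data set of simulated individuals
# is aged year by year, with invented transition probabilities applied
# to each person's employment status in turn.

random.seed(42)

P_LOSE_JOB = 0.05   # employed -> unemployed within one simulated year
P_FIND_JOB = 0.30   # unemployed -> employed within one simulated year

# Base data set: a hypothetical sample of 1000 individuals.
population = [{"age": random.randint(18, 64),
               "employed": random.random() < 0.9}
              for _ in range(1000)]

for year in range(10):                  # ten simulated years
    for person in population:
        person["age"] += 1
        if person["employed"]:
            if random.random() < P_LOSE_JOB:
                person["employed"] = False
        else:
            if random.random() < P_FIND_JOB:
                person["employed"] = True

employed = sum(p["employed"] for p in population)
print(f"employed after 10 simulated years: {employed} of {len(population)}")
```

In a more realistic model the probabilities would be estimated from panel data and would vary with age, household circumstances and the wider context, and births, deaths and household formation would also be simulated, as described above.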
The model can then be used to simulate developments in the future, for example to predict the number of those retired compared with those in work, and to explore the long term effect of social policy options. Of course, the accuracy of such predictions depends on the adequacy of the model and the validity of the implied assumption that there will not be major social changes at the macro level.
In conventional micro-simulations, the behaviour of each simulated individual is regarded as a 'black box'; that is, behaviour is modelled by probabilities and no attempt is made to justify these in terms of individual preferences, decisions or plans. Moreover, each simulated person is considered individually without regard to their interaction with others. The remaining examples of simulation to be described focus specifically on the simulation of individual cognitive processes and on communication between people, using techniques drawn from artificial intelligence (AI). AI is a discipline devoted to the design and construction of computer software that has some of the characteristics commonly ascribed to human intelligence. Simulation based on distributed artificial intelligence uses many AI programs, each representing an 'agent', which interact with each other and with a simulated environment (Bond & Gasser 1988).
A computer 'agent' typically has three components: a memory, a set of goals, and a set of rules. The memory is required so that the agent can remember past experience and plan ahead on this basis. The agent's objectives are defined by its goals, which may be as simple as to survive in a hostile environment in the face of depleting food or energy reserves, or may be more complex involving conflicts between alternative goals. The rule set defines the agent's behaviour and consists of condition-action rules. The condition part of each rule is matched against the contents of memory and input from environmental 'sensors'. If there is a match, the corresponding action is taken: this may be 'internal', affecting only the state of the agent's memory, or 'external', affecting the environment, for example, the sending of a message through the environment to another agent. The simulation works by cycling through each agent in turn, collecting messages sent from other agents, updating the agent's internal state by checking for any applicable rules, deciding on an action for the agent to take and finally communicating messages and the effects of the action to the environment, which then responds appropriately. This is repeated for every agent and these cycles continue indefinitely until the simulation is stopped or all the agents have 'died'.
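The agent cycle described above can be sketched as follows. The rule format, the 'energy' goal and the environment are all invented for this illustration and are far simpler than any real distributed-AI system:

```python
# Minimal sketch of a condition-action agent: each agent has a memory,
# a goal (here, to survive on a dwindling energy reserve) and a set of
# rules matched once per cycle. All details are invented for illustration.

class Agent:
    def __init__(self, name, energy=10):
        self.name = name
        self.memory = []          # record of past perceptions and messages
        self.energy = energy      # goal: keep energy above zero
        self.inbox = []           # messages received from other agents
        # condition-action rules: (condition on agent, action on agent)
        self.rules = [
            (lambda a: a.energy < 5, lambda a, env: a.act_forage(env)),
            (lambda a: a.inbox,      lambda a, env: a.act_read(env)),
        ]

    def act_forage(self, env):
        """External action: take any food present in the environment."""
        found = env.pop("food", 0)
        self.energy += found
        self.memory.append(("foraged", found))

    def act_read(self, env):
        """Internal action: move incoming messages into memory."""
        self.memory.extend(self.inbox)
        self.inbox.clear()

    def cycle(self, env):
        """One simulation cycle: fire the first rule whose condition matches."""
        self.energy -= 1          # living costs energy each cycle
        for condition, action in self.rules:
            if condition(self):
                action(self, env)
                break

env = {"food": 3}
agents = [Agent("a"), Agent("b")]
agents[0].inbox.append(("greeting", "b"))   # a message from agent b
for _ in range(6):                          # run for six cycles
    for agent in agents:
        if agent.energy > 0:                # 'dead' agents take no part
            agent.cycle(env)

print([(a.name, a.energy) for a in agents])
```

Cycling through each agent in turn, collecting its messages, matching its rules and applying the chosen action to the environment mirrors, in miniature, the simulation loop described in the text.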
An example of a simulation in this style is the work that Jim Doran, Mike Palmer, Paul Mellars and I have done studying the 'Emergence of Social Complexity' amongst hunter-gatherers in Upper Palaeolithic South-west France (Doran et al. 1993). Many archaeologists believe that at that time there was a change from an egalitarian, low-density society, in which people lived in small, migratory groups with little political organisation and a simple division of labour by gender, to a somewhat more complex society involving larger concentrations of people, some status differentials, role differentiation and more centralised decision-making. Associated with these changes were changes in burial patterns and the emergence of cave art and various symbolic artefacts.
The question which the simulation explored is what caused this change. One theory centres on the effect of glaciation in concentrating food resources in particular locations (e.g. the migratory routes of reindeer) in a predictable annual cycle (Mellars 1985). As people gathered in these locations, there was 'crowding', causing logistical problems (Cohen 1985). The growth of social complexity was a solution to this, as means were found to schedule activities so that there were not too many people attempting to secure the same resources at the same time; so that there was an appropriate division of labour; and so that people could relate to other people through stereotypical roles rather than on an individual basis.
We simulated this theory using agents that have the ability to plan their actions depending on the situation they find themselves in, to recruit 'followers' into groups and to communicate with other agents. The simulation was used to investigate issues such as whether the formation of groups increases the chances of survival of the agents.
Another example of an AI-based approach to simulation is the work of Bousquet et al. (1994). They were concerned with the practical problem of understanding the reasons for the depletion of fish stocks in the Niger river. The model they developed included agents representing fishermen from two ethnic groups, each with different traditions of decision-making and different fishing methods, as well as agents representing the fish and the river itself. With this model, they were able to investigate the consequences for fish stocks, over periods of several years, of different individual attitudes towards risk (e.g. choosing a fishing strategy that could bring high rewards but at high cost, versus a safer strategy), of the imitation and learning of good strategies, and of agreements to share fishing grounds.
The basic method of simulation involves a number of steps:
• Since no social phenomenon can be examined in its entirety, the first step is to select those aspects which are of interest. The selection must be influenced by theoretical preconceptions about which features are significant.
• The modelling approach to be adopted is chosen. As well as the micro-simulation and AI-based approaches mentioned above, there are simulations based on techniques drawn from operational research (e.g. Bulgren 1982; Gottfried 1984; Pooch & Wall 1993) and on the construction of differential equations relating the rate of change of quantities to other parameters (e.g. Spriet & Vansteenkiste 1982). A further approach uses symbolic logic or symbol manipulation as the basis of the model (Widman et al. 1989; Zeigler 1990; Gilbert & Doran 1993).
• Whichever approach is adopted, a further decision has to be made about the appropriate level of abstraction for the model. An important aspect of this is the level of aggregation selected for the units. For example, one might model the world economy using the major power blocks, individual countries, or (less practically) individual people as the units.
• It is then necessary to select the form in which the model is to be represented. If the model is to be a computer program, the decision mainly concerns the choice of computer language, although there will also be choices about how the program should be structured. Languages commonly used are BASIC, C, Prolog and Smalltalk (the latter two having their origins in artificial intelligence and object-oriented programming research).
• Once all these preliminaries have been decided, the model can be constructed, the simulation run and the output examined.
• In practice, there is likely to be a period of modification and testing during which the model is gradually improved. The simulation will be run a number of times, each time with a slightly 'better' model.
• Once the model is considered to be satisfactory, it is important to carry out sensitivity analyses. These examine the effect of small changes in the parameters of the model on its output. If small changes make large differences, one needs to be concerned about the accuracy with which the parameters have been measured; it is possible that the output is an artefact of the particular values chosen for the parameters.
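A sensitivity analysis of the kind described in the last step can be sketched as follows. The 'model' here is a deliberately trivial logistic growth process, invented for illustration; the point is only the procedure of perturbing a parameter slightly and comparing the outputs:

```python
# Sketch of a simple sensitivity analysis: run the same model with small
# perturbations of one parameter and compare the outputs. The model is a
# toy logistic growth process, purely illustrative.

def model(growth_rate, periods=50, start=0.01):
    """Logistic growth: x is repeatedly updated by x + r * x * (1 - x)."""
    x = start
    for _ in range(periods):
        x = x + growth_rate * x * (1 - x)
    return x

baseline = model(0.5)
for delta in (-0.05, 0.05):          # perturb the parameter by +/- 10%
    perturbed = model(0.5 + delta)
    print(f"rate {0.5 + delta:.2f}: output {perturbed:.4f} "
          f"(change {perturbed - baseline:+.4f})")
```

If small perturbations like these produced large changes in the output, one would need to worry, as the text notes, about how accurately the parameter had been measured and whether the results were an artefact of the particular values chosen.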
References

Axelrod, R. 1995. A model of the emergence of new political actors. In Artificial Societies: the computer simulation of social life, G. N. Gilbert & R. Conte (ed.), London: UCL.
Bond, A. H. & L. Gasser 1988. Readings in Distributed Artificial Intelligence. San Francisco: Morgan Kaufmann.
Bousquet, F., C. Cambier, C. Mullon, et al. 1994. Simulating fishermen's society. In Simulating Societies: the computer simulation of social phenomena, N. Gilbert (ed.), London: UCL.
Bratley, P., L. Fox, L. E. Schrage 1983. A guide to simulation. New York: Springer-Verlag.
Bulgren, W. G. 1982. Discrete system simulation. Englewood Cliffs, N.J.: Prentice-Hall.
Cohen, M. N. 1985. Prehistoric hunter-gatherers: the meaning of social complexity. In Prehistoric hunter-gatherers: the emergence of cultural complexity, T. Douglas-Price & J. A. Brown (ed.), 99-119. New York: Academic.
Conte, R. & G. N. Gilbert 1995. Introduction. In Artificial Societies, G. N. Gilbert & R. Conte (ed.), London: UCL.
Doran, J. & G. N. Gilbert 1993. Simulating societies: an introduction. In Simulating societies, G. N. Gilbert & J. Doran (ed.), London: UCL Press.
Doran, J., M. Palmer, N. Gilbert, P. Mellars 1993. The EOS Project: modelling Upper Palaeolithic change. In Simulating societies, G. N. Gilbert & J. Doran (ed.), London: UCL Press.
Gilbert, G. N. 1993. Analyzing Tabular Data: loglinear and logistic models for social researchers. London: UCL Press.
Gilbert, G. N. & J. Doran (ed.) 1993. Simulating Societies: the computer simulation of social processes. London: UCL Press.
Gottfried, B. S. 1984. Elements of stochastic process simulation. Englewood Cliffs, N.J.: Prentice-Hall.
Harding, A. 1990. Dynamic microsimulation models: problems and prospects. Discussion Paper 48. London School of Economics Welfare State Programme.
Inbar, M. & C. S. Stoll 1972. Simulation and gaming in social science. New York: Free Press.
Mellars, P. 1985. The Ecological Basis of Social Complexity in the Upper Palaeolithic of Southwestern France. In Prehistoric hunter-gatherers: the emergence of cultural complexity, T. Douglas-Price & J. A. Brown (ed.), 271-97. New York: Academic.
Nowak, A. & B. Latané 1993. Simulating the emergence of social order from individual behaviour. In Simulating Societies: the computer simulation of social phenomena, N. Gilbert & J. Doran (ed.), London: UCL Press.
Pooch, U. W. & J. A. Wall 1993. Discrete event simulation: a practical approach. New York: CRC Press.
Spriet, J. A. & G. C. Vansteenkiste 1982. Computer-aided modelling and simulation. New York: Academic.
Whicker, M. L. & L. Sigelman 1991. Computer simulation applications. Applied Social Research Methods Series. Newbury Park: Sage.
Widman, L. E., K. A. Loparo, N. R. Nielson 1989. Artificial intelligence, simulation and modelling. New York: Wiley.
Zeigler, B. P. 1990. Object oriented simulation with hierarchical, modular models. New York: Academic.
Social Research Update is published by: Department of Sociology, University of Surrey
Telephone: +44 (0) 1 483 300800
Fax: +44 (0) 1 483 689551
Edited by Nigel Gilbert.
March 1993 © University of Surrey
Permission is granted to reproduce this issue of Social Research Update provided that no charge is made other than for the cost of reproduction and this panel acknowledging copyright is included with all copies.