* Abstract

This paper briefly reviews some philosophy of science that might be relevant to simulating the social processes of science. It also includes a couple of examples from the sociology of science, because these are inextricable from the philosophy.

Keywords:
Philosophy, Science, Simulation, Social Processes, Evolutionary Models, Sociology

* Introduction

1.1
The Philosophy of Science has not been a natural "bedfellow" of social simulation. Philosophy has tended to seek eternal truths, truths outside time, and hence has not concerned itself much with dynamics. It has tended to be normative in flavour, stressing how science should work[1] rather than how it does work—focussing on the context of justification of knowledge rather than its context of discovery. It has tended towards "neat" abstractions rather than taking into account the messy details of science as it occurs. It has a record of being obsessed with its own concerns and has not, on the whole, sought evidence about what it is considering. It has not generally taken the social aspects of science seriously, except as a criticism of its rationality.

1.2
However, there are some philosophers of science who have attempted to describe what happens in science as part of their argument, including aspects with social implications and the effects of social processes on science. This brief[2] review takes a somewhat arbitrary selection of philosophers and seeks to summarise what they have suggested about the processes of science. It does not seek to provide a comprehensive account of their philosophies or even an adequate bibliography, but rather to extract some relevant ideas and lessons that might be relevant to a simulation of science, giving a single indicative citation for each one. It also "slips in" two small chunks of sociology because these are so important in understanding the philosophy. The interested reader can then go and explore from this point, gaining a more nuanced and detailed understanding of the philosophies that are summarised below.

* Descartes

2.1
René Descartes' philosophy of science was decidedly normative in tone, since what he proposed was then novel and definitely not what was being done by his colleagues. However, what Descartes argued for has now become so accepted that it does resemble in many ways what scientists do now.

2.2
In regard to science, perhaps his clearest and most succinct formulation comes in the Discourse on the Method (Descartes 1637). He describes his four precepts as follows:
The first was never to accept anything for true which I did not clearly know to be such; that is to say, carefully to avoid precipitancy and prejudice, and to comprise nothing more in my judgment than what was presented to my mind so clearly and distinctly as to exclude all ground of doubt. The second, to divide each of the difficulties under examination into as many parts as possible, and as might be necessary for its adequate solution. The third, to conduct my thoughts in such order that, by commencing with objects the simplest and easiest to know, I might ascend by little and little, and, as it were, step by step, to the knowledge of the more complex; assigning in thought a certain order even to those objects which in their own nature do not stand in a relation of antecedence and sequence. And the last, in every case to make enumerations so complete, and reviews so general, that I might be assured that nothing was omitted.

2.3
The first of these was not simply to accept what one is told, but to subject each statement to sceptical judgement; this would rule out a pure contagion model of belief propagation. The second is an analysis phase whereby the component parts of the system under consideration are established[3]. The third stage is one of synthesis, building up the solution via an inference from the parts (nicely anticipating simulation). The last is that, before coming to a conclusion, one should thoroughly establish the behaviours of the whole.
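
As an illustration of the difference the first precept makes, here is a minimal, hypothetical sketch contrasting a pure contagion rule, where agents accept a claim on mere exposure, with a "Cartesian" rule, where agents accept it only if it survives their own sceptical scrutiny. All parameters and the "doubt test" are invented for illustration.

```python
import random

# Hypothetical sketch: contagion vs. sceptical (Cartesian) acceptance of a claim.
random.seed(1)

N_AGENTS = 100
P_MEET = 0.1          # chance per round of hearing the claim from any given believer
P_PASSES_DOUBT = 0.3  # chance the claim survives an agent's own sceptical scrutiny

def spread(sceptical: bool, rounds: int = 50) -> int:
    """Return how many agents end up believing the claim."""
    believes = [False] * N_AGENTS
    rejected = [False] * N_AGENTS
    believes[0] = True                      # one initial proponent
    for _ in range(rounds):
        for i in range(N_AGENTS):
            if believes[i] or rejected[i]:
                continue
            heard = any(believes[j] and random.random() < P_MEET
                        for j in range(N_AGENTS) if j != i)
            if heard:
                if not sceptical or random.random() < P_PASSES_DOUBT:
                    believes[i] = True      # contagion: accept on exposure
                else:
                    rejected[i] = True      # scepticism: judged once and found wanting
    return sum(believes)

print("pure contagion:", spread(sceptical=False), "believers")
print("sceptical     :", spread(sceptical=True), "believers")
```

Under the sceptical rule some agents permanently reject the claim after scrutiny, so an unsupported claim no longer sweeps the whole population as it does under pure contagion.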

2.4
This is a largely asocial picture of science, even suggesting a certain resistance (scepticism) towards others' opinions, but then Descartes could not observe communities of scientists in action.

* Mill

3.1
John Stuart Mill argued that individual humans are fallible, prone to false, partial and context-dependent beliefs. Thus, to obtain something like justified true beliefs, critical discussion is necessary (Mill 1859). Unobstructed opportunity for such discussion is therefore important for the development of knowledge. In this view knowledge is a social matter and not something that pertains particularly to an individual.

* Suppes

4.1
Patrick Suppes argued that the relationship between a theory and the world is not direct, but mediated through a hierarchy of models (Suppes 1962). Even a set of data gained by measurement or observation is a kind of model. Each model will have its own assumptions and relationships with other models. For example, a model of an ideal gas formed of randomly moving particles might get some of its support from an approximation that gives the gas laws, which in turn are related to data models derived from measuring the pressure, temperature and volume of some gases in a laboratory. Since each part of these model chains might have been proposed and justified by a different researcher, this necessitates social cooperation in order to do science.
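
A hypothetical sketch of such a model chain for the gas example might look like the following; the measurement noise, tolerances and layering are invented, and each layer could in principle be maintained by a different researcher.

```python
import random

# Hypothetical sketch of a Suppes-style hierarchy of models for the gas example.
random.seed(2)
N_MOL = 1.0      # moles of gas (assumed)
R = 8.314        # gas constant, J/(mol K)

# 1. Data model: a cleaned-up table of (pressure, volume, temperature) readings,
#    here fabricated with 2% measurement noise instead of taken in a laboratory.
def measure(temperature_k, volume_m3):
    true_pressure = N_MOL * R * temperature_k / volume_m3
    return true_pressure * random.gauss(1.0, 0.02), volume_m3, temperature_k

data = [measure(t, v) for t in (280.0, 300.0, 320.0) for v in (0.01, 0.02, 0.05)]

# 2. Phenomenological model: the empirical gas law says P*V/T should be a constant.
ratios = [p * v / t for (p, v, t) in data]
mean_ratio = sum(ratios) / len(ratios)

# 3. Theoretical model: the random-particle (kinetic) picture predicts that this
#    constant is n*R; agreement between the layers is what lends the theory support.
print(f"empirical  P*V/T : {mean_ratio:.2f}")
print(f"theoretical n*R  : {N_MOL * R:.2f}")
```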

* Duhem-Quine thesis

5.1
The "Duhem-Quine thesis" is that it is impossible to test any scientific hypothesis in isolation since it rests on a web of other hypotheses (called "auxiliary hypotheses"). Thus the hypothesis alone makes no testable predictions. For Quine this is a special case of a more general underdetermination of theory by evidence (Quine 1951), so that one always has a substantial choice left as to your theory.

* Popper

6.1
Karl Popper pointed out (Popper 1959) that hypotheses cannot be inductively justified but they might be falsified by evidence. He expressed the progress of science in the following pseudo-formula:

PS1 → TT1 → EE1 → PS2

which is interpreted as follows. A problem situation (PS1) leads to a set of tentative theories (TT1), from which the bad ones are eventually falsified via a process of error elimination (EE1). This process leads by repeated stages to better theories that can be applied to new problem situations (PS2). This process is not feasible if the candidate hypotheses are not amenable in principle to being shown false by evidence, hence the crucial importance of falsifiability. The social aspects of this are the criticisability of conclusions and their openness to critique by a wider community of scientists. Reliable and useful theories are not reached via individual reason but by a community that continually criticises, rejects, posits tentative theories and applies them.
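
A minimal sketch of this cycle, with invented details: "theories" are guessed parameter values, "error elimination" discards those falsified by noisy evidence, and the survivors define the next, narrower problem situation.

```python
import random

# Minimal, hypothetical rendering of Popper's PS1 -> TT1 -> EE1 -> PS2 cycle.
random.seed(3)
TRUE_VALUE = 0.7                      # the unknown the community is trying to capture

def observe():
    return TRUE_VALUE + random.gauss(0, 0.05)   # noisy evidence

problem_situation = (0.0, 1.0)        # PS1: only a broad range is known

for generation in range(1, 5):
    lo, hi = problem_situation
    tentative_theories = [random.uniform(lo, hi) for _ in range(20)]   # TT
    evidence = [observe() for _ in range(10)]
    # EE: a theory is eliminated if it deviates too far from any observation.
    survivors = [t for t in tentative_theories
                 if all(abs(t - e) < 0.15 for e in evidence)]
    if survivors:
        # PS2: the surviving theories define a narrower problem situation.
        problem_situation = (min(survivors), max(survivors))
    print(f"generation {generation}: {len(survivors)} survivors, "
          f"range {problem_situation[0]:.2f}-{problem_situation[1]:.2f}")
```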

* Merton

7.1
Robert Merton is counted not as a philosopher but as a sociologist[4]. However, he has influenced subsequent philosophers of science and his approach fits in well with later descriptive philosophers like Kuhn or Giere, so I include him here. Merton (1973) thought about why science might be different from other social institutions, and came to the conclusion that its social norms were what was critical in this. He identified a distinctive ethos in science which includes the following[5]:
  • Communalism—the common ownership of scientific discoveries and knowledge, according to which scientists give up individual intellectual property and specific reward for their discoveries in exchange for recognition and esteem[6];
  • Universalism—according to which claims to truth are evaluated in terms of impersonal criteria, and not on the basis of the group membership (e.g. race, class, gender, religion, or nationality);
  • Disinterestedness—scientists are rewarded for acting in ways that outwardly appear to be selfless, for example making public the weaknesses of their own research efforts;
  • Organized Scepticism—all ideas must be tested and are subject to rigorous, structured community scrutiny.

7.2
Taken together these can be read as ways of ensuring the maximum communication and cooperation among scientists, but they also imply a certain economic structure and reward system that stresses reputation and status rather than monetary reward. This implies a social process whereby results and hypotheses are freely distributed for impartial and widespread criticism and further use. It is the original "open source" community. It also points out the crucial importance of reputation and status mechanisms both to individual scientists and to the social organisation of science.

7.3
However, Merton also pointed out that this picture was not entirely true. He identified the "Matthew Effect", whereby famous scientists often receive disproportionate credit for their contributions, whereas lesser-known scientists receive less credit than their contributions actually merit. This shows that individual outcomes and behaviour conflict with elements of the "official" ethos (in this case universalism). It is common knowledge that academics tend to organise themselves into groups and fields so that it is, in fact, hard for an outsider to get published in a field without a reasonable period of acculturation within it first.
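
The Matthew Effect is naturally read as a cumulative-advantage process. The hypothetical sketch below allocates a share of the credit for each new contribution in proportion to existing credit rather than to the actual author; all parameters are illustrative.

```python
import random

# Hypothetical cumulative-advantage sketch of the Matthew Effect; all parameters
# are illustrative. Underlying merit is equal: contributions arrive uniformly.
random.seed(4)
N_SCIENTISTS = 50
P_MISATTRIBUTE = 0.4                 # how often recognition follows fame, not authorship
credit = [1.0] * N_SCIENTISTS        # everyone starts equally (in)visible

for _ in range(2000):
    author = random.randrange(N_SCIENTISTS)
    if random.random() < P_MISATTRIBUTE:
        # Recognition allocated in proportion to existing credit (fame breeds fame).
        recipient = random.choices(range(N_SCIENTISTS), weights=credit)[0]
    else:
        recipient = author           # credit goes where it is merited
    credit[recipient] += 1.0

credit.sort(reverse=True)
top_share = sum(credit[:5]) / sum(credit)
print(f"share of all credit held by the top 10% of scientists: {top_share:.0%}")
```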

* Campbell

8.1
Donald Campbell considered how creativity was possible in both cognitive and social terms and formed a model of this process called "Blind Variation and Selective Retention" (Campbell 1974). This is a generalisation of biological evolution. Campbell thus extended and characterised Popper's vision as a full-blown evolution of hypotheses, called "Evolutionary Epistemology".

8.2
In this there are two essential processes:
  1. Variation: the generation of a sufficient variety of new hypotheses
  2. Selective Retention: the removal of some of the bad hypotheses

8.3
Under this view it does not matter how the hypotheses were formed as long as there was a sufficient variety to include some better ones. The important step was the effective "shooting-down" (not his phrase) of bad hypotheses so that, over time, the ones that were left could be increasingly relied upon. Since the most effective critics are usually one's competitors, this implies a social process of producing rival hypotheses and trying to disprove the others.

8.4
The "Blindness" of the Blind Variation and Selective Retention model is that he considered it important that the generation of hypotheses should not be done with a view to "second-guessing" the selection phase, since this would eliminate the creativity of the overall process and make it less likely that unexpected and novel hypotheses might result. In terms of a simulation of science it would be important that hypotheses might be tested in ways unforeseen by their authors.

* Feyerabend

9.1
Paul Feyerabend basically pointed out that science progresses not in well-regulated ways but in a messy, anarchic fashion (Feyerabend 1975). By considering famous case studies from the history of science he showed that all of the so-called principles of science were ignored at some stage or other, and to the good. In other words, he argued that science is (and should be) fundamentally anarchic. This can be seen as fitting in with the variation phase of the evolutionary model proposed by Campbell and Popper—anything goes when constructing new theories. He is less clear on the selection side, and does not claim that one should not reject hypotheses on the basis of evidence. However, he does point out cases where apparent falsification by evidence was rightly ignored. A possible lesson for a simulation is that any over-neat picture might be missing a lot of what is most creative and, at times, productive in science.

* Kuhn

10.1
Thomas Kuhn, after studying some crucial stages in some fields of science, concluded that science does not always progress smoothly (Kuhn 1962). Up to then there had been a widespread assumption that science progressed "brick-by-brick", each brick of knowledge being carefully constructed and checked before being added to the wall of knowledge in a cooperative process. Kuhn pointed out that science seemed to have two stages: "normal science", which proceeds in a cooperative manner much as in the brick-wall analogy, and "revolutionary science", where many existing assumptions are thrown out and a fundamentally new approach is adopted.

10.2
During a revolution in science people are not converted to the new approach gradually, because the new and old approaches are so different that it is hard for people who adopt one to understand the other. Thus two camps of followers are involved, with little dialogue between them but rather intense competition and even animosity. At the start it is outsiders and new researchers who join the new camp, but once the strength of evidence and opinion grows for the new approach, the received view "flips" to the new one, with a mass changing of minds (or at least with dissenters keeping their unchanged opinions to themselves). Thus this is very much a "tribal" affair, with competing camps fighting for supremacy.

10.3
Part of the reason for the camps is that people are very selective about what evidence they see. That is, our expectations filter what we pay attention to, so that if our web of related conceptions does not predict something we do not look for it. Kuhn called this phenomenon "theoretical spectacles" to dramatise the way we perceive and consider evidence "through" our web of assumptions and beliefs. This very clearly paints a social process, driven by needs for access and acceptance and exacerbated by the difficulty of communicating across paradigms.
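
One hedged way to capture the "flip" in a simulation is a threshold model: researchers switch only when the (evidence-weighted) share of colleagues already in the new camp exceeds their personal resistance, with newcomers seeding the new paradigm. The sketch below uses invented numbers and is only one of many possible renderings of Kuhn's picture.

```python
import random

# Hypothetical threshold sketch of a Kuhnian paradigm flip; all numbers invented.
random.seed(6)
N = 200
thresholds = [random.uniform(0.1, 0.6) for _ in range(N)]   # personal resistance to switching
adopted = [i < 10 for i in range(N)]                        # a few outsiders start the new camp

for year in range(20):
    # A few new researchers enter each year and join the new paradigm.
    holdouts = [i for i in range(N) if not adopted[i]]
    for i in random.sample(holdouts, min(4, len(holdouts))):
        adopted[i] = True
    # The new paradigm's evidence slowly accumulates, amplifying its perceived support.
    evidence_weight = 1.0 + 0.1 * year
    perceived_support = min(1.0, (sum(adopted) / N) * evidence_weight)
    adopted = [a or perceived_support > thresholds[i] for i, a in enumerate(adopted)]
    if year % 4 == 0:
        print(f"year {year:2d}: {sum(adopted) / N:.0%} in the new paradigm")
```

The adoption curve stays flat for a while and then cascades, which is the qualitative signature of the "flip" rather than a gradual conversion.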

* Lakatos

11.1
Imre Lakatos refined the picture painted by Kuhn as well as criticising Popperian falsifiability (Lakatos 1970). He pointed out that fields are jealous of their defining beliefs and will not reject them when faced with apparently contradictory evidence, but will instead reject or adapt some of the protective belt of "auxiliary hypotheses". In this tribal picture there are competing "programmes" being pursued by tribes of researchers, each of which is characterised by a set of core beliefs. The programmes progress in a normal scientific manner with respect to all but their core beliefs, which define their collective identity. Instead of the evolutionary process operating on all hypotheses, as envisioned by Campbell and Popper, some of the selection happens at the level of whole programmes, in the sense that some are more successful than others. He called the successful ones "progressive programmes" and the unsuccessful ones "degenerative programmes". A progressive research programme can be recognised by its growth, its discovery of novel facts, the development of new techniques, etc.; a degenerating programme by its stagnation, or by an elaboration of the protective belt of assumptions that saves its cherished core beliefs but does not lead to new discoveries of significance (outside the programme).
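
A hypothetical data-structure sketch of a Lakatosian programme: the hard core is never revised, anomalies are absorbed by adjusting the protective belt, and the programme counts as progressive only while those adjustments keep yielding novel predictions. The bookkeeping and the classic Uranus/Neptune example are illustrative only.

```python
# Hypothetical sketch of a Lakatosian research programme; the bookkeeping is invented.
class ResearchProgramme:
    def __init__(self, name, hard_core):
        self.name = name
        self.hard_core = list(hard_core)      # defining beliefs, immune from rejection
        self.protective_belt = []             # auxiliary hypotheses, freely revisable
        self.novel_predictions = 0
        self.ad_hoc_patches = 0

    def face_anomaly(self, anomaly, new_auxiliary, predicts_something_new):
        # Blame is always deflected away from the hard core onto the belt.
        self.protective_belt.append((anomaly, new_auxiliary))
        if predicts_something_new:
            self.novel_predictions += 1       # content-increasing ("progressive") adjustment
        else:
            self.ad_hoc_patches += 1          # face-saving ("degenerating") adjustment

    def is_progressive(self):
        return self.novel_predictions > self.ad_hoc_patches


newton = ResearchProgramme("Newtonian mechanics", ["F = ma", "universal gravitation"])
newton.face_anomaly("irregularities in the orbit of Uranus",
                    "there is an undiscovered outer planet",
                    predicts_something_new=True)    # the adjustment that led to Neptune
print(newton.name, "progressive so far?", newton.is_progressive())
```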

* Hull

12.1
David Hull further refines the evolutionary process inherited from Campbell and Popper (Hull 1988). For such a process, Hull identified that you need:
  1. Replicators—something akin to the gene whose structure is passed on faithfully to future incarnations almost all the time.
  2. Interactors—the equivalent of the body, through which a set of replicators interacts with an environment to different degrees of success, thus (on the whole) causing the preferential replication of those replicators that contributed to the more successful interactors.
This is an obvious and direct translation from the biological genotype-phenotype distinction. The result of this is lineages of replicators that persist in tree-like structures through time. Thus a successful simulation of science should show these "tree of life" style lineages for its replicators.
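
A hypothetical sketch of how replicators, interactors and lineages might be represented in such a simulation is given below; "credit" stands in for the success of the interactors, and the parent links give the tree-like lineages Hull describes.

```python
import random

# Hypothetical sketch of Hull's replicator/interactor distinction, with invented details.
random.seed(7)

class Replicator:
    def __init__(self, content, parent=None):
        self.content = content          # e.g. a formal model, passed on near-verbatim
        self.parent = parent            # lineage link

def lineage(r):
    chain = []
    while r is not None:
        chain.append(r.content)
        r = r.parent
    return list(reversed(chain))

# Two rival model lineages, each carried by interactors (the scientists using them).
population = [Replicator("model-A"), Replicator("model-B")]
credit = {id(r): random.random() for r in population}      # success of their interactors

for generation in range(5):
    # Preferential replication: higher-credit replicators are copied more often.
    parent = max(population, key=lambda r: credit[id(r)])
    child = Replicator(parent.content + "'", parent=parent)  # faithful copy, small change
    credit[id(child)] = credit[id(parent)] + random.uniform(-0.1, 0.2)
    population.append(child)

best = max(population, key=lambda r: credit[id(r)])
print("lineage of the most successful replicator:", " -> ".join(lineage(best)))
```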

12.2
Hull thought that these replicators are ideas of different kinds, including theories, methods and goals. However, it is not clear that these are propagated with sufficient faithfulness over the generations, since ideas are reinterpreted against the cultural and intellectual background and so are constantly changing. There are, though, some other candidates in the shape of formal models (mathematical and simulation), which are passed down largely without error from generation to generation and so might actually form such lineages.

12.3
Hull envisaged that the relative success of the interactors would be determined via the "credit" that scientists accumulate for their contributions. This credit is loosely associated with academic reputation and might be gained by falsifying a theory, or by being cited a lot. Thus the motivation in Hull's picture does not have to be altruistic, but can be driven by the selfish pursuit of the respect and status accorded by one's peers.

* Toulmin

13.1
Stephen Toulmin (1972) argued that absolutism lacks practical value and that what matters is practical argument. He pointed out that whilst some arguments are "field-dependent" (special to the field and only persuasive there), others are "field-invariant". He analysed an argument as having several different parts (rendered as a simple data structure in the sketch after this list):
  • Claim: this is the statement whose merit is being established.
  • Evidence (Data): the facts one appeals to in support of the claim.
  • Warrant: statement authorising the move from evidence to claim.
  • Backing: in many cases the warrant will not be convincing enough on its own, thus extra credentials might be added.
  • Rebuttal: recognitions of legitimate restrictions on the claim, caveats and exceptions.
  • Qualifier: words like "probably", "almost all the time" etc.
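
Rendered as a data structure (a hypothetical sketch, with an invented example argument), Toulmin's layout would let simulated scientists attack specific parts of an argument, such as its warrant or backing, rather than the claim as a whole.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical rendering of Toulmin's argument layout; the example argument is invented.
@dataclass
class Argument:
    claim: str                       # what is being established
    evidence: List[str]              # the data appealed to in support of the claim
    warrant: str                     # what licenses the step from evidence to claim
    backing: str = ""                # extra credentials for the warrant, if needed
    rebuttals: List[str] = field(default_factory=list)   # recognised exceptions
    qualifier: str = "presumably"    # strength of the claim

arg = Argument(
    claim="this reagent is contaminated",
    evidence=["the control runs failed twice", "a new batch behaves normally"],
    warrant="repeated control failures that vanish with a new batch indicate contamination",
    backing="standard laboratory quality-control practice",
    rebuttals=["unless the incubator temperature drifted on both occasions"],
    qualifier="probably",
)
print(f"{arg.qualifier}, {arg.claim} (unless: {'; '.join(arg.rebuttals)})")
```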

13.2
Toulmin criticised Kuhn's picture of incommensurable paradigms and revolution, pointing out that as well as the theories, the concepts were also evolving in a Darwinian fashion, so that the understandings of the theories within a group of like-minded scientists will change over time. Avoiding either absolutist or relativist positions, Toulmin favoured a process of (social) comparison to determine whether a concept will provide better explanations than its rivals.

* The Strong Programme

14.1
The Strong Programme sought to apply qualitative sociological methods to case studies of science in practice (e.g. Latour and Woolgar 1979). It thus attacked idealisations about the processes of science with detailed qualitative observations of scientists at work. This was not philosophy but sociology. However, it was very effective at shifting the focus away from over-neat "arm-chair" theories of science, showing that science often works in very human, messy and social ways. For example, it showed how intense rivalries between competing scientists could be settled in ways that were not rooted in evidence.

14.2
People have interpreted the results of the Strong Programme in different ways, but what it did do was provide lots of qualitatively detailed accounts of science in practice. To a large extent it formed the backdrop with which subsequent philosophy of science had to grapple; in particular, it set the scene for many post-positivist philosophies of science and thus for those philosophers who wanted to defend science.

* Rescher

15.1
Nicholas Rescher stresses the limitations and fallibility of human cognition, whilst still arguing that, in the long run, science produces increasingly better, but still imperfect, theories about the world, e.g. (Rescher 2000). He describes science as progressing through a process of "erotetic propagation": every answered question introduces new presumptions that in turn raise further questions, so that science will never end. He also argued that there is a law of diminishing returns, so that each move to a new theory takes an increasingly large amount of effort in terms of data collection, experimentation, calculation, etc. He also propounded a coherence theory of truth.
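
A toy sketch of these two claims together, with purely illustrative growth and cost numbers: each stage's answers raise further questions, while the effort needed per answer keeps rising.

```python
# Toy sketch of erotetic propagation plus diminishing returns; numbers are illustrative.
open_questions = 1
answered = 0
effort_per_answer = 1.0
total_effort = 0.0

for stage in range(1, 7):
    newly_answered = open_questions          # suppose this stage answers what is open
    answered += newly_answered
    total_effort += newly_answered * effort_per_answer
    open_questions = 2 * newly_answered      # each answer raises further questions
    effort_per_answer *= 1.5                 # the next round of answers costs more
    print(f"stage {stage}: {answered:3d} answered, {open_questions:3d} still open, "
          f"cumulative effort {total_effort:7.1f}")
```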

* Thagard

16.1
Paul Thagard mostly focuses on the individual scientist and how they might choose what to believe. However, the model he chooses is highly amenable to a social input. Thagard argues that people do not (at least not primarily) logically reason their way to their beliefs; rather, it is the coherence/dissonance between beliefs that constrains which sets of beliefs it is possible to hold. This is thus a coherence model of belief choice. However, the set of things that are required to be coherent with each other can include more than beliefs: it can also include personal goals, emotions and the beliefs of others (Thagard 2006)—this is what gives it a potentially social dimension.
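
A small constraint-satisfaction sketch in this spirit (much simpler than Thagard's own coherence networks, and with invented elements and weights) is given below; in this toy case the agent ends up discounting the anomalous result because retaining the hypothesis coheres better with its career goal and its colleague's endorsement.

```python
from itertools import product

# Toy coherence-based belief choice. Positive links are satisfied when both elements
# are accepted or both rejected; negative links when exactly one is accepted; "given"
# elements (data, goals, peers' views) carry a weight for being accepted at all.
# All elements and weights are invented; this is not Thagard's own algorithm.
elements = ["my-hypothesis", "anomalous-result", "career-goal", "colleague-endorsement"]
positive = {("my-hypothesis", "career-goal"): 1.5,
            ("my-hypothesis", "colleague-endorsement"): 1.2}
negative = {("my-hypothesis", "anomalous-result"): 2.0}
given = {"anomalous-result": 1.0, "career-goal": 1.0, "colleague-endorsement": 1.0}

def coherence(accepted):
    score = sum(w for (a, b), w in positive.items() if (a in accepted) == (b in accepted))
    score += sum(w for (a, b), w in negative.items() if (a in accepted) != (b in accepted))
    score += sum(w for e, w in given.items() if e in accepted)
    return score

# Brute-force search over all accept/reject assignments for the most coherent one.
best = max((frozenset(e for e, keep in zip(elements, pattern) if keep)
            for pattern in product([True, False], repeat=len(elements))),
           key=coherence)
print("accepted:", sorted(best))
print("rejected:", sorted(set(elements) - best))
```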

* Cartwright

17.1
Nancy Cartwright observed that there are two different kinds of laws/theories that scientists use: phenomenological laws and explanatory laws (Cartwright 1983). Phenomenological laws literally relate data derived from observation (e.g. the classic "gas laws" that relate pressure, temperature and volume), but they don't tell you much about why these quantities are related. Explanatory models show how and why things happen but are abstract and do not relate directly to the data (e.g. the random-particle model of a gas). Science uses and needs both kinds of theory—both kinds are stated and discussed in scientific papers.

17.2
However, how the two are connected, what Cartwright calls the "bridging rules", is often not written down but is something that one learns in practice as one is trained in a particular discipline or field of science. Thus each discipline might well have its own bridging rules, and these might well change over time within these disciplines.
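
A toy sketch of the distinction for the gas example: a phenomenological constant fitted directly to (here fabricated) pressure readings, an explanatory random-particle formula, and an explicit bridging rule identifying temperature with mean kinetic energy. The numbers and the drastic simplification of the kinetic model are mine, not Cartwright's.

```python
import random

# Toy sketch of Cartwright's two kinds of law for a gas, plus an explicit bridging rule.
random.seed(8)
K_B = 1.380649e-23                 # Boltzmann constant, J/K
N = 6.02e23                        # roughly a mole of particles
m = 4.8e-26                        # mass of one nitrogen-ish molecule, kg
T, V = 290.0, 0.024                # temperature (K) and volume (m^3) of interest

# Phenomenological law: a constant fitted directly to measured (P, V, T) readings
# (here simulated with 1% noise); it relates the data but explains nothing.
measurements = [(1.0e5 * random.gauss(1.0, 0.01), V, T) for _ in range(20)]
fitted_constant = sum(p * v / t for p, v, t in measurements) / len(measurements)
p_phenomenological = fitted_constant * T / V

# Explanatory model: N point particles with mean squared speed <v^2> in a box
# exert pressure P = N * m * <v^2> / (3 * V).
def kinetic_pressure(n_particles, mass_kg, mean_sq_speed, volume_m3):
    return n_particles * mass_kg * mean_sq_speed / (3.0 * volume_m3)

# Bridging rule (learned in practice, rarely written down): temperature corresponds
# to mean kinetic energy, (1/2) m <v^2> = (3/2) k_B T.
def mean_sq_speed_at(temperature_k, mass_kg):
    return 3.0 * K_B * temperature_k / mass_kg

p_explanatory = kinetic_pressure(N, m, mean_sq_speed_at(T, m), V)
print(f"phenomenological law : P = {p_phenomenological / 1e3:.1f} kPa")
print(f"explanatory model    : P = {p_explanatory / 1e3:.1f} kPa")
```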

* Giere

18.1
Ronald Giere[7] considered what a theory consisted of, and saw that it seemed to be represented by a cluster of closely related (but more specific) models (Giere 1988). His "cognitive" approach to understanding what scientists do leads him to choose an "agent-based" conception of modelling, whereby the goals and purposes of an agent are an essential aspect of any model. He also espouses a scientific "Perspectivism": theories of the world are always from a certain perspective, which may change and is (somewhat) socially determined, but this does not make the theories "fictions". Here science involves much more than one individual's perspective; it involves social cognition. He makes a distinction between "collective cognition" and "distributed cognition". Collective cognition involves humans doing things like sharing knowledge or engaging in collective problem solving: the goal is achieved collectively. But modern science is more than just collective action; it involves interaction with the world mediated by sometimes very complex instrumentation. Bringing the instrumentation into the process results in (fully) distributed cognition.

* Concluding Discussion

19.1
Science seems to be much messier than many philosophers would desire. In particular it is the outcome of many social processes and mechanisms that are common in many non-scientific spheres. However science has its own distinctive features and "flavour" and is not quite the same as any other human institution. Indeed each field in science is distinctive so one cannot assume that all of these can be lumped together simply as "just another science"!

19.2
As you can see from this short survey, philosophers have only just touched on the possible social processes of science, mainly via an evolutionary analogy. Partly due to their different ways of looking at the world[8], there has been relatively little integration between the history of science, the philosophy of science and the sociology of science (Hull 2000). This separation has led to an impoverished study of science, with many in the sociology of science uncritically accepting strong versions of post-positivist critiques of science, and philosophers not coming to grips with the full social mess that is part of the processes of science.

19.3
This might not be surprising if one accepts Zammito's (2004) point that the philosophy of language is a poor resource for studying the practice of science. I see no reason why messy and intricate social processes should result in knowledge that is less reliable than that produced by the processes of reason implemented by equally messy and intricate processes between neurones, but it might be so. The only way to examine and tease out the various and apparently contradictory effects of these social processes will be to simulate them, letting the results emerge from the interaction; in other words, without a prior commitment regarding the soundness of their ultimate outcomes.


* Notes

1Usually what the particular author thinks should be the case.

2In terms of philosophical norms, this is a very very brief survey. Even 'brief surveys' in philosophy are quite lengthy—words like 'brief' etc. in philosophy are defensive epithets, since if you summarise to any extent you are bound to be 'wrong'!

3If we were to apply this to the simulation of social processes (as opposed to seeing what it directly suggested about the processes of science) this would suggest an agent-based simulation, since the social embeddedness implies that the local interactions of scientists would be significant in shaping the aggregate outcomes.

4The difference between sociologists who think about the processes of science and philosophers who put weight on observing how scientists actually behave seems moot to me, ending up more an indication of a person's academic roots rather than method or content.

5Some also include originality (novelty of research) in this list.

6Not to mention an interesting, reasonably well paid and secure job!

7Some of the sense and text of this paragraph is taken from a personal correspondence with Giere; however the interpretation is mine. Giere remains one of my favourite modern philosophers, showing a good degree of common sense and knowledge of science, see http://www.tc.umn.edu/~giere/.

8A difference reflected in their evaluations of science.


* References

CAMPBELL, D. T. (1974). Evolutionary epistemology. In Schilpp, P. A. (Ed.) The Philosophy of Karl Popper. Open Court Publishing.

CARTWRIGHT, N. (1983). How the Laws of Physics Lie, Oxford University Press. [doi:10.1093/0198247044.001.0001]

DESCARTES, R. (1637). Discourse on the Method of Rightly Conducting the Reason and Seeking Truth in the Sciences. Reprinted in: Cottingham, J., Stoothoff, R. and Murdoch, J. (1988). The Philosophical Writings Of Descartes, 3 vols. Cambridge: Cambridge University Press.

FEYERABEND, P. (1975). Against Method. London: Verso Books.

GIERE, R. N. (1988). Explaining Science: A Cognitive Approach. University of Chicago Press. [doi:10.7208/chicago/9780226292038.001.0001]

HULL, D. L. (1988). Science as a Process: An Evolutionary Account of the Social and Conceptual Development of Science. Chicago: University of Chicago Press. [doi:10.7208/chicago/9780226360492.001.0001]

HULL, D. L. (2000). The Professionalization of Science Studies: Cutting Some Slack. Biology and Philosophy, 15(1), 61-91. [doi:10.1023/A:1006547510796]

KUHN, T. S. (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.

LAKATOS, I. (1970). Falsification and the Methodology of Scientific Research Programmes. In Lakatos, I. & Musgrave, A. (Eds.), Criticism and the Growth of Knowledge. Cambridge: Cambridge University Press, pp. 91-196.

LATOUR, B. & WOOLGAR, S. (1979). Laboratory Life: The Social Construction of Scientific Facts. Beverly Hills: Sage.

MERTON R. K. (1973). The sociology of science: Theoretical and empirical investigations. University of Chicago Press.

MILL, J. S. (1859). On Liberty. London: John W. Parker and Son.

POPPER, K. R. (1959). The Logic of Scientific Discovery. Routledge. [doi:10.1063/1.3060577]

QUINE, W. V. O. (1951). Two Dogmas of Empiricism. The Philosophical Review 60, 20-43. [doi:10.2307/2181906]

RESCHER, N. (2000). Nature and Understanding: A Study of the Metaphysics of Science. Oxford: Oxford University Press.

SUPPES, P. (1962). Models of Data. In Nagel, E. et al. (Eds.) Logic, Methodology and Philosophy of Science: Proceedings of the 1960 International Congress. Stanford University Press, pp. 252-261.

THAGARD, P. (2006). Hot Thought: Mechanisms and Applications of Emotional Cognition. MIT Press.

TOULMIN, S. (1972). Human Understanding: The Collective Use and Evolution of Concepts. Princeton: Princeton University Press.

ZAMMITO, J. H. (2004). A Nice Derangement of Epistemes: Post-positivism in the Study of Science from Quine to Latour. University of Chicago Press.
