Session 3 notes
There were four speakers in this session, whose talks are briefly summarised here:
David Chapman talked about the way information is handled in telecommunications and applied those ideas to art. His main focus was the work of Claude Shannon, the engineer whose equation expressing the information capacity of a communication channel founded the field of information theory. The equation is concerned with the reduction of uncertainty about which message, from a set of possible messages, is being sent, so that information can be transmitted more efficiently. Shannon’s focus was not on the meaning of the information carried by these channels, but meaning is crucial – it depends on the context in which the information is received, and it is not possible to disentangle information and meaning. In applying these ideas to art, David discussed work on the relationship between pleasure and complexity, which forms a bell curve – the greatest pleasure in art is found where there is some, but not too much, complexity. A different way of thinking about information in art is to consider the JPEG compression of a digital image, which reduces the image to its ‘information content’, yet the real information the picture contains is much more than that content.
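For reference, the standard forms of the equations alluded to here are the entropy of a source emitting symbols with probabilities p_i, and the Shannon–Hartley capacity of a channel of bandwidth B and signal-to-noise ratio S/N (which of these David had in mind is not recorded in these notes):

    H = -\sum_i p_i \log_2 p_i \quad \text{(bits per symbol)}

    C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{(bits per second)}

The first quantifies the uncertainty removed when one message out of a set of possible messages is received; the second sets the maximum rate at which a noisy channel can carry that information.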
Paul Piwek talked about logic-based models of information exchange, deriving from the field of natural language semantics (within computer science). The field aims at the systematic study of the informational content of natural language (e.g. English) sentences – in this field, to talk about the informational content of a sentence is to discuss its meaning. Meaning is understood through entailment, the logical relationship between sentences – if you understand the meaning of a sentence, you should be able to work out, for that sentence and any other sentence, whether one entails the other. Further, to understand the meaning of a sentence we need to understand its truth-conditions – the circumstances under which it is true. Paul discussed the work of several key thinkers who have built theories of logic to analyse this meaning, in particular that of Hans Kamp, who argued that sentences contribute to a discourse and it is the discourse, rather than any individual sentence, which has truth-conditions.
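As a rough illustration of the truth-conditional view of meaning sketched here, the toy propositional-logic check below treats a sentence’s truth-conditions as the set of situations (valuations) in which it is true, and tests entailment by set inclusion. It is a minimal sketch added for this write-up only – far simpler than the natural-language models Paul described – and the function names are purely illustrative.

    from itertools import product

    # Toy propositional-logic sketch (illustrative only): a sentence's "meaning"
    # is modelled by its truth-conditions, i.e. the set of valuations
    # (assignments of True/False to atomic propositions) under which it is true.

    def truth_set(sentence, atoms):
        """Return the set of valuations (as tuples) under which `sentence` is true."""
        result = set()
        for values in product([True, False], repeat=len(atoms)):
            valuation = dict(zip(atoms, values))
            if sentence(valuation):
                result.add(values)
        return result

    def entails(premise, conclusion, atoms):
        """`premise` entails `conclusion` iff every valuation that makes the
        premise true also makes the conclusion true."""
        return truth_set(premise, atoms) <= truth_set(conclusion, atoms)

    # Example: "it is raining and cold" entails "it is raining", but not vice versa.
    atoms = ["raining", "cold"]
    raining_and_cold = lambda v: v["raining"] and v["cold"]
    raining = lambda v: v["raining"]
    print(entails(raining_and_cold, raining, atoms))  # True
    print(entails(raining, raining_and_cold, atoms))  # False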
Tony Nixon talked about quantum computing, comparing quantum information to classical information. In classical information theory, a bit is 1 or 0. In quantum computing, a quantum bit (qubit) is only 1 or 0 when it is observed; otherwise it is in a superposition of the two states. A quantum computer – which has probably yet to be implemented – could be used for modelling the quantum world (e.g. in pharmaceuticals), for cracking difficult encryption problems such as RSA, or for large-scale database searching. However, quantum computers will prove very difficult to engineer, as quantum laws apply at very small scales with respect to both time and space. One American company claims to be close to producing one. Quantum information provides an interesting mechanism for thinking about creativity in organisations (a point which arose further in the discussion below).
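For reference, the standard way of writing the state of a qubit (not necessarily the notation Tony used) is as a superposition

    |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,

where observation yields 0 with probability |\alpha|^2 and 1 with probability |\beta|^2, after which the superposition is lost.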
Kirstie Ball talked about information in the employment relationship, arising from a study she conducted of two outsourced call centres (one in the UK, the other in South Africa). She was seeking to find out about the sources of variance in information in call centres. Information about what people did and when they did it existed at several different levels; what happened at each layer depended on the tightness of its coding. In the South African case, she observed a ‘mad patchwork’ in the use of information, of people making it up as they went along. In considering what this tells us about the nature of control, she found a set of layers superimposed upon one another, rather than the binary relationships (such as between normative and rational control) that are often found in organisation theory. New experiences are inscribed in procedures at each level. She drew analogies with the systems theory distinction between open and closed systems, and asked what the semiotics of standardisation might be.
Discussion
The discussion in this session was very wide-ranging, given the breadth of perspectives exhibited by the four speakers, and tended to focus on the ideas of one speaker at a time (and at times simply to operate in a question-and-answer fashion). The account below will name the speakers but not those making other contributions; I have attempted to record people’s views faithfully, but what I present is usually a paraphrase and thus may not always correctly represent what was intended.
We began with a discussion of Kirstie’s concept of layers, with a question as to whether they represent a scale from informal to formal information. She replied that they represent different kinds of information, being applied in pursuit of various objectives, arising from the need by call centre supervisors to exhibit a range of different skills in their work. The information they write down day to day does not necessarily find its way into formal settings, although in some instances it may do so, such as if a staff member deviates too far from the norm or is particularly successful and their ideas become embedded in the processes used by other teams.
Continuing on the question of layers, someone asked why Kirstie had used that language, which suggested a hierarchical relationship, from top to bottom. She replied that this is not necessarily so, as the layers are complementary and supplementary. She distinguished the concept of closure found in systems theory, which is all about hierarchies and embedding, from the way that actor-network theory treats closure, which is all about black-boxing and an ontological flatness. She was unsure which was more appropriate in this case, but was wary of the fact that systems theory frequently talks of organisms, which she finds inappropriate as a metaphor for discussing organisation. A further contribution suggested that we might think of the way geographical information systems use layers, with no sense of hierarchy but instead a great interest in the aggregate, looking down on all the layers at once. This use could be considered the process equivalent of GIS, but it was unclear how actors fitted into that.
Talk of metaphor led to a question for Paul – how does his logical system cope with metaphor? Paul replied that the logical system as such does not treat metaphor, but that it is possible to build on top of it. It is clear that the literal meaning may not make sense in itself, so you need to build additional assumptions which will allow the utterance to make sense.
This led to a question for David about meaning – is it so clear that Shannon really intended to exclude meaning? In the more popular work by Warren Weaver, which was published as a book along with Shannon’s paper, it is fairly clear that Weaver is extrapolating into meaning. David was unsure that Shannon had thought through the distinction, noting that in the original paper he talks of language, crossword puzzles and the like.
A further contributor considered the extension of Shannon’s model to art and aesthetics, and discussed two approaches which in the past had sought to define high culture (the music of Beethoven in one case, and classical art in another) as somehow better than popular culture through its Shannon-information content – more complex or less formulaic – but this is very dependent on what you measure. David recalled a study of the novelty value of classical music concerts based on the frequency with which composers appeared (Mozart being predominant), which seems very bizarre. A contemporary version of this was discussed by another contributor, who had read a recent article in Nature which used MRI scanning to suggest that whether we find a particular piece of art interesting can be seen from which parts of our brain are stimulated, and thus that some kinds of visual structure (a Zen garden in the article) may tap into the deep structure of our brains.
We moved on to discuss the way Shannon treats receivers, with one contributor saying that a key flaw with Shannon is his assumption that all people will receive information in the same way, whereas it is clear that they will interpret it quite differently. The issue (as David and another contributor said) is that Shannon’s concern was with telecommunications systems, rather than the interpretation of messages, making it difficult to model the capacity to add new messages. As one contributor noted, there are multiple ways in which channels may be used – since Shannon, many others have added sophistications to his model, although one can think of each as an individual signal on top of Shannon’s, so that what we have is a composite of many different channels.
The mention of multiple channels led us back to quantum computing and its multiple possible states. As Tony observed, the quantum metaphor is that all possibilities simultaneously exist even though they’re incompatible with one another. The metaphor allows us simultaneously to hold mutually exclusive ideas/positions on things, to sit on the fence, provided they’re not observed (if they are observed, you have to take a single position). One contributor argued that the trouble with this approach was that it discussed the superposition of states rather than of possibilities, which implies something is in many states at once. In reality what happens is that future possibilities are laid out, different futures that might happen, rather than actual states which have happened or will happen. Tony replied that quantum physicists tend not to talk about possibilities or probabilities because the mathematics that this would entail would prevent the states from being observable. More than a technicality, this gives the approach its richness. We need to be careful how far we push the use of quantum theory as a metaphor, but there are ways in which observation stifles creativity. We all know that when you observe things, the observation alters the system, but quantum mechanics goes much deeper and says that observation fundamentally alters the system, such that you won’t get anywhere near the behaviour you would get if the system were isolated.
Magnus was reminded of the Hawthorne effect, very influential in social research and arising from workplace studies in the 1920s, where a series of interventions within a factory all showed greater productivity simply because people were being observed. Kirstie observed that the key factor was that investigators formed social bonds with the people being studied, so that the Hawthorne studies show the power of informal organisation and the power of social norms in the workplace. Magnus suggested that, given the Hawthorne effect, quantum computing might be an example (contrary to the classic myth of new ideas arising in physics and being transmitted to other disciplines) of the quantum physicists catching up with what the social scientists knew seventy years earlier. It’s not really like that, said a contributor – the ideas are in the zeitgeist; lots of disciplines move in similar directions at roughly the same time, but we don’t hear about the other disciplines until a lot later, so we think ours was first. However, another contributor observed that senior managers are more concerned with managing exceptions than with managing routine, and perhaps we should follow Popper in saying that confirming information doesn’t add anything to what you currently know and therefore in a sense isn’t information, whereas information which tells you the world isn’t what you thought is crucial (the black swan example from earlier in the day).
On a different theme, Paul was asked about his concept of truth. He talked about truth as if it were a 1 or a 0, but often truth is relative – how does his model cope? Paul replied that just because you assign something a truth value doesn’t mean that it’s independent of the context, and the models can cope with that. Truth and falsity have relevance for the time span of that discourse but not beyond it. Alternative models of logical truth raised by Paul and others included fuzzy logic, multi-valued logic and even quantum logic, where something can be true and false at the same time.
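To give a flavour of one of these alternatives, the short Python sketch below uses the standard Zadeh operators of fuzzy logic, in which truth values lie anywhere in [0, 1] rather than being just 1 or 0. It is illustrative only, not part of Paul’s own models, and the function names are invented for this note.

    # Fuzzy logic with the standard Zadeh operators: conjunction is min,
    # disjunction is max, negation is 1 - x. Truth values lie in [0, 1].

    def f_and(a: float, b: float) -> float:
        return min(a, b)

    def f_or(a: float, b: float) -> float:
        return max(a, b)

    def f_not(a: float) -> float:
        return 1.0 - a

    # "The coffee is hot" might be true to degree 0.7, "the coffee is sweet" to 0.4.
    hot, sweet = 0.7, 0.4
    print(f_and(hot, sweet))      # 0.4 -- "hot and sweet"
    print(f_or(hot, f_not(hot)))  # 0.7 -- the excluded middle need not come out as 1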
A closing thought arose from the influential article by Nicholas Carr about the diminishing value of IT in organisations once it has become universal. When we look at the nature of information, we need to ask what’s added, where’s the value? Early in the day, emotion had been mentioned but not followed up, and maybe that’s one of the areas of value that we could be exploring.