Friday, October 2, 2015

What's it like to be a Computer?

I recently audited a Philosophy of Mind seminar led by John Searle at UC Berkeley. We read Thomas Nagel's What is it like to be a Bat? and it got me thinking about computer science. Nagel made me realize that we computer scientists are missing out on an essential aspect of what it means to be a computer. We know that most computers get input from the outside world through keyboards, cameras, and microphones. We know that they represent that world via objects, databases, logic, and ultimately collections of bits. Clearly this is far different from our human methods for perceiving and representing the world. What Nagel says about bats, we must presumably say about computers:
there is no reason to suppose that it is subjectively like anything we can experience or imagine. This appears to create difficulties for the notion of what it is like to be a [computer]. We must consider whether any method will permit us to extrapolate to the inner life of the [computer] from our own case, and if not what alternative methods there may be for understanding the notion.
As is probably obvious, I don't really think we need to do anything more to understand what it's like to be a computer. My point is that Nagel's argument for why we need to wonder what it is like to be a bat seems as insubstantial as my juxtaposition for the computer.

I think what is going on here is an example of what Chomsky describes [Chomsky 1996, 2008] as trivial questions such as "do submarines swim?" In English submarines don't swim; in Japanese they do. But the question is not considered a conundrum for marine biologists; it's simply a matter of linguistic convention. So in English (at least so far) few people wonder "what it is like" to be a computer. But we do wonder what it is like to be a bat. It is common in literature for people to turn into bats and frogs. We have a common sense idea that identity is not necessarily tied to a human brain. But common sense and intuition are not science; at best they are a starting point for science. So that is how we should evaluate Nagel's question: are there any actual scientific issues he is getting at?

One of his primary criticisms is that consciousness can't be studied by a "materialist" or "physicalist" approach. I agree that a strictly materialist approach to studying consciousness won't work, though not for the reasons that Nagel advocates. As Chomsky points out [Chomsky 2012], the mind-body distinction ceased to make sense when Newton destroyed the mechanistic worldview on which it was based. This is even more true in the modern world, where the fundamental building blocks of "matter" are not sub-microscopic particles but wave functions.

Or consider fields such as computer science or computational linguistics. The concepts we deal with are grammars, languages, transformations, logic, state machines, Turing machines, ontologies, interfaces, etc. These aren't material except in the mundane sense that they can describe things and processes in the real world. They certainly aren't materialistic concepts about electrical currents on silicon; indeed, most of these concepts can be implemented in highly diverse ways. A state machine can describe a software program or the call-response language of various mammals [Hauser 2003]. Several years ago I saw a fascinating paper presented by researchers at Stanford [Myers 2012] who showed that they could use DNA to store information exactly as one would store it on a computer. They demonstrated this by showing how the PDF of their own paper was stored and retrieved via DNA in their lab. These examples show that Nagel's view of materialism is outdated and not relevant to what many people who study computation and cognition are doing. The modern sciences of cognition are "materialistic" only in the most trivial sense.
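To make the multiple-realizability point concrete, here is a minimal sketch of a finite state machine defined purely as data. The states and inputs (a toy "call-response" exchange) are invented for illustration; the point is that nothing in the definition mentions silicon, neurons, or DNA, so any medium that can hold and update the table counts as implementing the same machine.

```python
# A finite state machine as pure data: a transition table mapping
# (current state, input symbol) -> next state. The substrate that
# stores and updates this table is irrelevant to the machine itself.
CALL_RESPONSE = {
    ("waiting", "call"): "heard_call",
    ("heard_call", "response"): "waiting",
    ("heard_call", "silence"): "waiting",
}

def run(machine, start, inputs):
    """Step the machine through a sequence of inputs, returning the states visited."""
    state = start
    trace = [state]
    for symbol in inputs:
        # Stay in the current state if no transition is defined for this input.
        state = machine.get((state, symbol), state)
        trace.append(state)
    return trace

print(run(CALL_RESPONSE, "waiting", ["call", "response", "call", "silence"]))
# ['waiting', 'heard_call', 'waiting', 'heard_call', 'waiting']
```

The same table could just as well describe a software protocol or an animal signaling exchange; only the interpretation of the labels changes.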

Now let us consider Nagel's emphasis on "reduction". How can we possibly even begin to think about reducing a scientific theory of mind to biological concepts when we don't yet have a mature scientific theory of mind? As Chomsky points out [Chomsky 2002], we can't even map the neural correlates of consciousness for animals such as bees, whose behavior is several orders of magnitude less complex than ours. Why should we tell scientists working on the far harder problem of human cognition that if they can't perform such a reduction their work is not worth doing?

This brings us to Nagel's general viewpoint on science and philosophy. He is in essence a science denier: if science leads to a conclusion that is uncomfortable, he prefers to reject the science. For example, in his book Mind and Cosmos [Nagel 2012], referring to materialistic and evolutionary theories, he says on pages 26-27: "but the explanations they propose are not re-assuring enough". A few pages later, on page 29, he says: "Everything we believe, even the most far flung cosmological theories has to be based ultimately on common sense and on what is plainly undeniable".

The goal of science is not to re-assure us or to validate our common sense intuitions. Indeed, the history of science shows that some of the most important discoveries were resisted because they challenged the prevailing worldview and made us re-evaluate the place of humans in the universe. People still resist Darwin because they find it offensive to think that humans evolved from other primates. The "far flung cosmological" theory of quantum entanglement undeniably violates our common sense notion of causality.

Based on the history of science, I think it would be surprising if, when we ultimately do have a mature scientific theory of mind, it didn't make people uncomfortable by forcing us to rethink common sense notions of consciousness such as free will.

Finally, I wish to close with a quote from a rather unrelated text. I'm also auditing a quite different class on the philosophy of mathematics, and for that class today I was reading Frege's Foundations of Arithmetic. I hope this doesn't seem overly harsh; I have great regard for Nagel, and he is clearly a very influential philosopher. But as I was reading the introduction to Frege I couldn't help but think of Nagel as I read the following:
If Frege goes too far... he is certainly on the side of the angels when he espouses as a model for philosophy the defense of objective scientific truth in matters of conceptual clarification. He is surely right to oppose the supine subjectivism that seems to think we can say whatever we want merely by articulating unargued opinions in the course of creating a literary creative writing exercise. That is not philosophy for Frege... [Jacquette 2007]
Amen, brother.

Bibliography

Chomsky, Noam (1996) Language and Thought: Some Reflections on Venerable Themes: Excerpted from Powers and Prospects.

Chomsky, Noam (2002) On Nature and Language. p. 56.

Chomsky, Noam (2008) Chomsky and His Critics. p. 279.

Chomsky, Noam (2012) The machine, the ghost, and the limits of understanding: Newton's contribution to the study of Mind. Lecture at the University of Oslo.

Hauser, Marc and Mark Konishi (2003) The Design of Animal Communication.

Jacquette, Dale (2007). Introduction and Critical Commentary to Foundations of Arithmetic by Gottlob Frege.

Myers, Andrew (2012) Totally RAD: Bioengineers create rewritable digital data storage in DNA. Stanford press release. Note: this is not the research I saw presented which was over ten years ago and unfortunately I can't recall that specific paper but the concept here is the same.

Nagel, Thomas (2012) Mind and Cosmos. 

7 comments:

  1. I can't reconcile the idea of bat life being incomprehensible to us with the notion that science should always conform to common sense. Those two suggestions seem incompatible.

    I obviously don't agree with Nagel that science must be comfortable and easy to grasp, and in a bar fight with the philosophers I would totally have your back, but I do think that the value of writers like Nagel and Searle is that they force the issue of the subjective aspects of consciousness that the computationalists would like to sweep under the carpet. Actually, I think it is not important *what* they are saying - what is important is that they *are* saying it.

    In a non-dualist world, we need a physics that can account for subjective experience; and that physics, yet to be realised, is the foundation on which computationalism and neuroscience are to be constructed. Subjective experience is not a subject matter for higher-level theory, it is completely fundamental and primary. In a monistic science, we cannot simply brush it aside as is the Cartesian's prerogative.

    On this point, I think physicists like Freeman Dyson and Roger Penrose have some interesting ideas on consciousness, and I am tending to feel that this is the way forward... though I am horrified at the thought of myself as a layman propounding 'quantum consciousness' or some such notion.

    1. I think you have a good point about the common sense notion. I kind of merged my critiques of two different works by Nagel, and it was the second one, Mind and Cosmos, where he really harps on common sense. Here is what I was trying to get at with the bat issue: it seems to me that he is trying to say there is some issue that science isn't addressing when it comes to the consciousness of bats and humans. But IMO he is never clear on exactly WHAT that problem is, in either case. That is, what specific problem is the demand that we account for "what it's like" meant to address? I think when you strip away the verbiage he is making a rather mundane and obvious point: that to some extent consciousness has to be subjective. I agree. But so what? You can say similar things about vision. I can never know for sure that what I see as "red" is the same as what you see. But we find ways to work around that and still do a science of vision, and (although it's exponentially harder) I see no reason we can't eventually do the same for consciousness.

    2. I disagree that "we need a physics that can account for subjective experience; and that physics, yet to be realised". I mean, it MAY be true that some new discoveries in physics will be required to completely understand consciousness; I certainly wouldn't rule it out. But I see no evidence that it's a requirement, and frankly I would be surprised if it turns out to be the case. We already understand a fairly good deal about cognition and information processing, to the point where we can hook up computer chips in prosthetic devices directly to the human nervous system. And none of it required any changes or new discoveries in physics.

      Pinker has written some good stuff on this topic that I (as usual) completely agree with. One problem is that people view consciousness as the "secret sauce" of psychology, so they look for simple answers of the form "Consciousness = X", which leads to embracing things like Mirror Neurons, Quantum Computing, and, earlier, Chaos Theory. Actually, in Searle's class someone mentioned how quantum physics would probably be required to understand consciousness, and (I'm always struggling not to say too much in that class) without intending to I made a groan that clearly communicated my thought: Bullshit!

      There is zero, nada, bupkis, no evidence at all that the human brain can interact at the quantum level. It's rampant speculation of the worst kind.

      Patricia Churchland had an excellent short article on a related topic, Neural Correlates of Consciousness (NCC). It's in a book called This Idea Must Die, in which various thinkers attack ideas they think are no longer useful. I agree with what she said; quoting from the end of her short article:

      "...we could peer inside someone's brain and find out which processes were the conscious ones and which the unconscious ones. But this is all nonsense. All we'll find are the neural correlates of thoughts, perceptions, memories, and the verbal and attentional processes that lead us to think we're conscious. When we finally have a better theory of consciousness to replace these popular delusions, we'll see that there's no hard problem, and no NCCs."

    3. I'm afraid we don't see eye-to-eye at all on the subject of consciousness. For the life of me, I can't see how you can hold it to be so unproblematic... and now you say it's so small a thing that it's not even a thing at all! What a slap in the face for Descartes!

      I will air my grievances in turn:

      As a schoolboy, I remember the notion of other people having different perceptions of qualia being quite profound. Nowadays I think it's a likelihood, and a rather mundane one at that. I cannot say that the notion of consciousness - the very existence of qualia themselves - has mellowed with age in the same way. The two are not comparable. It's like comparing black holes with manholes. Incidentally, I would be interested in hearing what workarounds have needed to be made in order to deal with the 'hard problem' of differential perception of red, as the issue is a new one to me.

      You say that we know a great deal about cognition and information processing, and have managed to connect prosthetics to the human nervous system without any new discoveries in physics; but this is simply because the difficulty of this procedure has been a purely technical one for a very long time. Dr. Quinn, Medicine Woman would have been able to explain to you in principle how to do it. On the other hand, were you to ask a modern, Nobel Prize-winning researcher to design a prosthetic consciousness, her pencil would never even touch paper. I cannot understand how you manage to draw such comparisons. And computational neuroscience, the field that is making all the headway in cognition and information processing nowadays, is sadly not making much headway on the topic of consciousness. In fact, many practitioners in the field simply sweep it under the carpet and want little to do with it, as I have pointed out before, and as Patricia Churchland clearly demonstrates, along with Pinker.

    4. However, regarding your scepticism of 'quantum physical solutions' to the problem, I know exactly where you are coming from and I will explain why I make such a suggestion. I will admit in advance that it is an amateurish suggestion, made only because it is the best one that I have come up with. Firstly, the mind is a machine, and all machines function through movement of some kind - be it a pneumatic drill or an atomic clock. Consciousness, then, must be produced by some form of movement, and consciousness appearing to be quite a specialist product, I imagine a specialist mechanism producing it. Secondly, as waveform collapse seems to depend on observation (consciousness), I wonder if they are two parts of the same elephant. And so I envisage some vibration (another buzzword you will hate) in the brain having some effect on waveform collapse and at the same time generating consciousness.

      At the moment, this is far from a tidy solution and it poses more questions than it answers. And having next to no knowledge of quantum physics, I would have kept it to myself; but a couple of days after thinking about it, I learned that Roger Penrose and Freeman Dyson, two eminent physicists, are making suggestions along similar lines. So I feel it is an avenue worth exploring, as we are making little progress in our other explorations.

      Although I understand your reluctance to accept quantum theories from strangers, when you say there is no evidence that the brain operates at the quantum level, I can't help but imagine Scottie from Star Trek proclaiming that some piece of alien machinery "seems to operate at the quantum level...", and cue the dramatic music, and pause to let it sink in for the viewers how advanced this race must be...

      But the computer that you are reading this on works at the quantum level, as does any other electronic device in the room. As far as the human body is concerned, we know that our sense of smell relies on quantum tunnelling - why must the brain be a quantum free zone? I agree with you on the absurdity of any proclamation that quantum science MUST be at the bottom of consciousness - but I see no reason to dismiss the idea out of hand. While we might think that quantum physics is kept by the universe in a glass case, to be broken 'in case of technological emergency', this is only because it is new to us and we don't understand it. Quantum mechanics is no guarantee of profundity, and if you are a universe, quantum mechanics is the same kind of vanilla physics as classical mechanics. To be honest, considering the complexity of the human organism, and particularly the brain, I'd be surprised if it hadn't collected a few quantum knick-knacks over the years.

      The idea of consciousness being an illusion seems to be philosopher-speak for "mañana, mañana". I challenge you to get a sack and fill it full of batteries, wires and chemicals, and shake it. I don't care how long you shake it, and your children shake it, and their children shake it... you will never make it experience itself. But now you're telling me that you might give it the 'illusion' that it is experiencing itself? I'll be honest and say that I don't even know what that means.

  2. I don't think I'm communicating what I mean very well. I think consciousness is a very difficult and interesting problem. In fact, one of the things I'm trying to emphasize is just how hard it is. I'm skeptical that we even have a good definition of what the problem is, let alone what a solution would be.

    That is why I agree with that quote from Churchland's article on why the NCC idea must die. Although I differ with her slightly in the emphasis. She says that the problem of consciousness will just go away (i.e. be irrelevant) once we understand all the other problems she lists. I think rather that we need to have some decent answers for those questions, i.e., what is an intention, memory, belief, before we can answer what is consciousness. To me trying to answer what is consciousness before we answer those other questions is like trying to figure out how the universe started before we understand Newton's laws of motion.

  3. OK. It seemed that you were shooing off consciousness as poppycock.

    The potential problem with a computation-first approach to understanding consciousness is the possibility that consciousness is involved in the formative process of intentionality. For instance, Daniel Dennett sees consciousness as 'multiple drafts' at various stages of computation. I'm not sure if he sees consciousness as affecting the processes of computation or simply as observing them, but if it does have an effect, then our lack of understanding of consciousness would put the brakes on our understanding of intentionality.

    I agree with you that even defining consciousness is a hard problem.
