Tuesday 10 November 2015

Face it, whether your brain is a computer or not, no one has a clue


There is nothing like disagreement to motivate you to action. Several articles have given me the motivation to reactivate this blog. The first comes from the New York Times and has the bold title of “Face it, your brain is a computer.” The author, Professor Gary Marcus, is concerned that people are losing sight of the big picture when considering the brain: neuroscientists are too focussed on “narrow, measurable phenomena” instead of addressing the larger picture. That the author is discouraging scientists from focusing on measurable phenomena is worrying, even if you acknowledge that the emphasis is on the phenomena being narrow rather than measurable.[1]


Putting this concern aside, Marcus is deploying a common strategy used by those who advocate a computational theory of mind. It is summarised with the almost rhetorical question “If the brain is not a serial algorithm-crunching machine, though, what is it?” This Fodorian, only-game-in-town-like challenge has become all too familiar to those who have studied this debate. The answer, unsurprisingly, is that of course the brain must be a computer; why else would Marcus raise it? However, as I will show, what was once meant to be a rallying cry for a reasoned conclusion has become more and more like the 3am challenge of “wanna fight” issued by a freshly ejected drunk.


This is not to say that Marcus doesn’t put forward a case; he does, in part. His argument revolves around dismissing three common reasons for rejecting the mind as a computer. The first is captured under the slogan ‘brains are parallel, but computers are serial.’ Marcus rightly points out that modern conceptions of computers are no longer of simple serial machines; such a view is “woefully out of date.” Instead we have multi-parallel processing and modular components, and this seems to be just the start of such developments. I readily accept the point that being computer-like does not demand a single CPU through which everything is processed.
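As a throwaway concrete illustration of that uncontroversial point (my own sketch, not anything from Marcus; the count_primes function and chunk sizes are invented for the demonstration), the same job can be split into independent chunks and handed to separate cores on perfectly ordinary hardware:

```python
# A minimal illustration that ordinary modern machines are not serial:
# one job, split into chunks, run across processor cores in parallel.
from concurrent.futures import ProcessPoolExecutor

def count_primes(lo, hi):
    """Crude prime count over [lo, hi) -- just something worth parallelising."""
    def is_prime(n):
        return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(is_prime(n) for n in range(lo, hi))

if __name__ == "__main__":
    starts = range(0, 100_000, 25_000)
    ends = [s + 25_000 for s in starts]
    with ProcessPoolExecutor() as pool:      # one worker per chunk of the range
        partials = pool.map(count_primes, starts, ends)
    print(sum(partials))                     # same answer as a serial loop would give
```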


The second argument he dismisses comes with another slogan: ‘brains are analog, while computers are digital.’ Once again I am happy to agree with Marcus’ conclusion: the brain could be operating in a digital format, or it could not, or it could be a combination of both. What is not explicitly said, but underpins the argument, is the claim that, provided the brain is an algorithm-processing machine, it doesn’t matter whether the processing is done digitally or not. As long as algorithms are being processed, the computational criteria are met. Expressed in this way, the issue is a little hazier. For example, how can you have algorithm processing that isn’t discrete and therefore digital? But I’ll put that aside for now.


This takes us to the third argument: that computers are unable to generate emotions while the brain can. This is an argument I have encountered more and more in the university undergraduate classroom, and one that does seem to be a fallback position for many who encounter artificial intelligence through pop culture. Once again, I agree with Marcus: the parts of the brain that have been identified as core areas of emotional processing are no different from other parts of the brain, so it is very likely that they operate in a similar way. This means that if we can reproduce those other parts in a computer, a chess-playing one for example, then we should be able to reproduce the emotional parts too. Again, as I have argued many times in the classroom, emotions seem no more special than other parts of our behaviour/mind, so they do not make the mind some special, potentially non-physical thing. It is quite rewarding each year to change a few students’ minds on this point.


So with all this agreement, why did I include the above comparison to a drunk trying to pick a fight? The reason is that, just like the drunk only challenging those whom she thinks she can beat in a fight, Marcus has only presented easy pickings. Note the disparity between the central claim, that the brain is an algorithm-crunching machine, and the arguments presented: none of them challenge this idea in the slightest, or even discuss it. The anti-computationalist approach, driven by the parallel-processing ideas of the 1980s and 1990s, is one in which this core idea is questioned. Through puzzles like the frame problem, people like Hubert Dreyfus (1992) claim that algorithm processing simply fails to explain human intelligence. Alternatively, the roboticist Rodney Brooks has just moved on, turning intelligence into an engineering problem and leaving the high-level picture behind.


While Marcus most likely had a particular audience in mind when he wrote the piece, I cannot help but feel it is aimed too low (e.g. at the philosophy undergrad) and not at those actively arguing and reasoning for an anti-computationalist account. In doing so he is not strengthening his case.


To be fair, there is a positive proposal presented by Marcus, and I agree that the “real payoff in subscribing to the idea of a brain as a computer would come from using that idea to profitably guide research.” So what is the payoff? Marcus and his colleagues talk of the brain being made up of something like field-programmable gate arrays (FPGAs): flexible clumps of logic gates that can be configured and reconfigured to perform a range of tasks. The move here seems to be away from an idea of central processing towards a conception more consistent with neuroscience, that different parts of the brain perform different functions. In short, rather than the brain being a serial algorithm-crunching machine, it is a parallel algorithm-crunching machine, one with several areas all performing their own tasks.
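To give a rough feel for what ‘configured and reconfigured’ means here, the toy sketch below is my own illustration in Python, not anything drawn from Marcus’s paper (the GateArray and Cell names are invented): a handful of programmable Boolean cells are wired up as a half adder, then the very same cells are rewired to compute something else.

```python
# Toy illustration of the FPGA-style idea: a small array of "cells", each
# holding a configurable two-input Boolean function, that can be wired up
# for one task and then reconfigured for another. My own sketch only.

GATES = {
    "AND":  lambda a, b: a and b,
    "OR":   lambda a, b: a or b,
    "XOR":  lambda a, b: a != b,
    "NAND": lambda a, b: not (a and b),
}

class Cell:
    def __init__(self, gate="AND"):
        self.gate = gate                 # which Boolean function this cell computes

    def compute(self, a, b):
        return GATES[self.gate](a, b)

class GateArray:
    """A row of cells plus a wiring plan saying which signals feed each cell."""
    def __init__(self, n_cells):
        self.cells = [Cell() for _ in range(n_cells)]
        self.wiring = []                 # list of (cell_index, input_a, input_b)

    def configure(self, gates, wiring):
        """Reprogram the whole array: a new gate per cell and a new wiring plan."""
        for cell, gate in zip(self.cells, gates):
            cell.gate = gate
        self.wiring = wiring

    def run(self, inputs):
        """Evaluate cells in order; outputs become signals named c0, c1, ..."""
        signals = dict(inputs)
        for idx, a, b in self.wiring:
            signals[f"c{idx}"] = self.cells[idx].compute(signals[a], signals[b])
        return signals

array = GateArray(2)

# Configuration 1: a half adder (sum = XOR, carry = AND).
array.configure(["XOR", "AND"], [(0, "x", "y"), (1, "x", "y")])
print(array.run({"x": True, "y": True}))   # c0=False (sum), c1=True (carry)

# Configuration 2: the same physical cells rewired as x NAND (x OR y).
array.configure(["OR", "NAND"], [(0, "x", "y"), (1, "x", "c0")])
print(array.run({"x": True, "y": False}))  # c1=False
```

The only point of the toy is that the same physical components can be repurposed by changing their configuration, which seems to be the property doing the work in the analogy.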


However, there is a real problem here, especially if we adopt the high-level analysis suggested by Marcus. Are Marcus and his colleagues simply suggesting that the brain might be modular, with parts capable of performing different tasks as the need arises? If so, then their allegiance to Fodor is clear, though I suspect they may want to catch up on some reading that made this point a little while ago (Fodor 1983).


If they are making the stronger claim, that the modularity of mind is best explained by separate parts of the brain each being individual algorithm-crunching machines, then we have a more interesting position.[2] However, on closer inspection, there is little evidence to back this up. The academic paper referred to in the New York Times piece is noticeably short and lacks clear justification for its claims. Disappointingly, the argument boils down to: the old conception of the mind as a computer is wrong, so why not try out this parallel conception? There is no support other than a few studies that the theory just happens to work alongside, and the authors conclude before anyone has time to ask for “measurable phenomena.” They seem to be trying the old throw-it-against-the-wall-and-see-if-it-sticks method of explaining the brain.


While it is true that I may be taking out my frustration with this topic a little heavily on Marcus and his own attempt to add something small to a long-standing debate, to me this article epitomises the problem with the discussion in general. Too much is assumed, beating the easy challenges is seen as a win, and attempts at being different land about as far away as a gentle breeze carries an apple falling from a tree. For me, the mysteries of the brain are going to require more creativity and new ideas than many are ready to allow for. Perhaps, ultimately, it is something much simpler that I pine for: not having to argue with so many drunks at conferences as I try (mostly in vain) to present a genuine alternative to the only game in town.


[1] For full disclosure of the author’s intentions, here is the entire passage:

“A lot of neuroscientists are inclined to disregard the big picture, focusing instead on understanding narrow, measurable phenomena (like the mechanics of how calcium ions are trafficked through a single neuron), without addressing the larger conceptual question of what it is that the brain does.”

[2] Though once again not that different from claims made by others.
