April 12, 2019


Do you compute?: We're certainly on to something when we say the brain is a computer - even if we don't yet know what exactly we're on to (Kevin Lande, 4/12/19, aeon)

[T]he claim that the brain is a computer is not just a metaphor. The cognitive sciences are full of hypotheses to the effect that the brain computes such-and-such in so-and-so a way. Many of our perceptual capacities, for example, are understood in computational terms, and there aren't any viable alternatives around. Here are a handful of widely accepted hypotheses about what the brain computes, though I will leave out the details:

Direction of sound from interaural time difference: if a loud crash occurs directly in front of you, its soundwaves will reach your left and right ears at the same time. If it occurs to your left, its soundwaves will reach your left ear slightly before they reach your right ear. When you hear a loud crash as being to your left or your right, your auditory system is computing, according to trigonometric principles, an estimate of that crash's direction on the basis of the difference in times between when the sound waves arrived at your right and your left ears.
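The trigonometric estimate described above can be sketched in a few lines. In the standard far-field model, the interaural time difference (ITD) relates to the source's azimuth by ITD = (d/c)·sin(θ), where d is the distance between the ears and c the speed of sound; the head width, speed of sound, and the model itself are illustrative assumptions here, not claims about the auditory system's actual parameters.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, approximate speed of sound in air (assumed)
EAR_SEPARATION = 0.215   # m, illustrative distance between the ears (assumed)

def direction_from_itd(itd_seconds):
    """Estimate a sound source's azimuth (radians; 0 = straight ahead,
    positive = toward the ear the sound reaches first) from the
    interaural time difference, by inverting ITD = (d / c) * sin(theta)."""
    sin_theta = itd_seconds * SPEED_OF_SOUND / EAR_SEPARATION
    sin_theta = max(-1.0, min(1.0, sin_theta))  # clamp against measurement noise
    return math.asin(sin_theta)

# A crash whose sound waves arrive at both ears simultaneously
# is estimated to be directly ahead:
print(direction_from_itd(0.0))  # 0.0
```

The clamp matters in practice: a noisy ITD estimate can imply |sin θ| > 1, which the arcsine would reject.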

Depth from disparity (or stereopsis): most things reflect light to both your eyes. Take one of your fingers and hold it at arm's length away from you, and take another finger and hold it halfway between the farther finger and your face. Now fix your gaze on the closer finger. Your farther finger will reflect light to a different part of your left eye (relative to its centre) than it will to your right eye (relative to its centre). To see this, keep fixating on your closer finger. Close one eye and pay attention to the space that's visible between your nearer and farther finger. Now switch the eyes - open one and close the other. You'll notice that the visible space between your fingers is different. If you now bring your farther finger a bit nearer to you and repeat the eye-closing experiment, the effect is less dramatic. When you see one thing as twice as far away as another, part of what is happening is that your visual system computes an estimate of depth by first computing which retinal cells are responding to the same point in the world, and then determining the relative difference ('disparity') in the positions or coordinates of those retinal cells (greater disparities = greater depth).
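Once matching retinal cells have been paired up, the depth estimate itself is a triangulation. A minimal sketch, using the standard pinhole relation Z = f·b/δ (focal length times baseline over disparity); the baseline and focal-length values are illustrative assumptions, not measurements of the human eye:

```python
def depth_from_disparity(disparity, baseline=0.065, focal_length=0.017):
    """Triangulate the distance Z (m) to a point from its binocular
    disparity (m, on the image plane): Z = f * b / d.
    baseline and focal_length are assumed, illustrative values."""
    return focal_length * baseline / disparity

def depth_interval(disparity_near, disparity_far):
    """Depth separation between two points (e.g. the two fingers in the
    demonstration), recovered from their disparities. A larger relative
    disparity between the points corresponds to a larger separation in
    depth - the sense in which greater disparities mean greater depth."""
    return depth_from_disparity(disparity_far) - depth_from_disparity(disparity_near)
```

Bringing the farther finger closer shrinks the relative disparity, which is why the eye-closing effect becomes less dramatic.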

Contour integration: when looking at the outline shape of an object in a cluttered scene, your visual system initially registers a bunch of tiny, individual segments of lines or contours (imagine lots of dashed lines). The visual system has to determine which line segments go with each other - which segments are parts of a common object's outline, and which belong to different ones. The visual system computes outlines from line segments on the basis of, among other things, how close together those segments are, how similar in orientation they are, and whether they form an approximately straight line.
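The three grouping cues named above - proximity, orientation similarity, and good continuation - can be captured in a small linking rule. This is a toy sketch of an 'association field' style heuristic; the thresholds and the representation of a segment as (x, y, orientation) are illustrative assumptions, not a model the article specifies:

```python
import math

def _angle_diff(a, b):
    """Smallest difference between two orientations (radians, mod pi)."""
    d = abs(a - b) % math.pi
    return min(d, math.pi - d)

def should_link(seg_a, seg_b, max_dist=1.5, max_angle=math.radians(30)):
    """Decide whether two contour segments, each given as
    (x, y, orientation), belong to the same outline:
    they must be close (proximity), similarly oriented (similarity),
    and roughly collinear (good continuation).
    Thresholds are illustrative, not measured values."""
    (xa, ya, ta), (xb, yb, tb) = seg_a, seg_b
    if math.hypot(xb - xa, yb - ya) > max_dist:
        return False                          # too far apart
    if _angle_diff(ta, tb) > max_angle:
        return False                          # orientations disagree
    toward = math.atan2(yb - ya, xb - xa)     # direction from a to b
    return _angle_diff(ta, toward) <= max_angle  # a points roughly at b

# Two nearby, collinear horizontal segments group together:
print(should_link((0, 0, 0.0), (1, 0, 0.0)))  # True
```

Note that two parallel segments side by side fail the collinearity check - they look like pieces of two different outlines, not one approximately straight contour.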

Surface colour from illumination: the light that reaches your eye from a surface is a product of that surface's colour and the colour of the illumination. So, your white shoes will reflect different types of light depending on whether it is daytime or dusk, or whether you are on the dance floor or in a fluorescent-lit bathroom. Still, you can usually tell that your shoes are white under these different conditions. When you see something as having a certain colour, your visual system is computing an estimate of the object's colour by taking into account the nature of the illumination. The reason some people saw that dress as blue and black, and others saw it as white and gold, is that their visual systems are computing colours from different estimates of what the illumination is like.
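Discounting the illuminant can be sketched as a per-channel division: the light reaching the eye is (roughly) the surface colour scaled by the illumination, so dividing by an estimate of the illuminant recovers the surface. This is a von Kries-style correction on made-up linear RGB triples, an illustrative assumption rather than the visual system's actual algorithm:

```python
def estimate_surface_color(measured_rgb, illuminant_rgb):
    """Recover an estimate of a surface's colour by dividing the light
    that reached the eye (channel by channel) by an estimate of the
    illumination - a von Kries-style correction on linear RGB."""
    return tuple(m / i for m, i in zip(measured_rgb, illuminant_rgb))

# White shoes under warm dusk light: the eye receives warm-biased light,
# but dividing out the estimated illuminant recovers white, (1, 1, 1).
light_at_eye    = (0.9, 0.6, 0.3)  # illustrative values
dusk_illuminant = (0.9, 0.6, 0.3)  # the visual system's illumination estimate
print(estimate_surface_color(light_at_eye, dusk_illuminant))  # (1.0, 1.0, 1.0)
```

The dress follows directly: feed the same measured light through two different illuminant estimates (bluish daylight versus warm indoor light) and the division returns two different surface colours.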

Progress in cognitive science regularly consists in saying with mathematical precision exactly what is being computed - what direction should be estimated from some interaural time difference? - and exactly how the computation is performed. Hypotheses concerning these details can be and are tested against experimental observations, both of how people perform on tests (point to the loud noise, please) and of how populations of neurons respond to stimuli. There's pretty stable agreement about what would count as evidence for or against hypotheses of this sort. Nobody has any real idea of how else to understand our abilities to, for example, perceive the locations of sounds or the depths, outlines and colours of objects.

That's a level of clarity and commitment to a premise that is uncharacteristic of metaphorical claims.

Posted by at April 12, 2019 6:11 PM