Tuesday, February 22, 2011

What Computers Can't Do

Stanley Fish on Watson

I really like reading Stanley Fish's articles in the NYT, mostly because they're completely Heideggerian and Wittgensteinian without admitting it.  I don't really know who Stanley Fish is, but most of the articles I've read by him point back to a general claim: meaning holism is probably correct in some broad fashion (that's basically what he argues in this article):
...as the philosopher Hubert Dreyfus explained almost 40 years ago, a “computer is not in a situation” (“What Computers Can’t Do”); it has no holistic sense of context and no ability to survey possibilities from a contextual perspective; it doesn’t begin with what Wittgenstein terms a “form of life,” but must build up a form of life, a world, from the only thing it has and is, “bits of context-free, completely determinate data.” And since the data, no matter how large in quantity, can never add up to a context and will always remain discrete bits, the world can never be built.
Dreyfus is maybe the modern-day teacher of Heideggerian thought (well, he explains it much better than anybody else I've run across).  I even have a link to some YouTube videos where Dreyfus explains Heidegger's place in the history of philosophy fairly well--see my sidebar for them.  I believe Dreyfus (in that book) also thought that computers would never be able to master chess, but this isn't an argument against his more general point.  You could argue that chess is actually the epitome of being able to apply strict rules to a given situation regardless of the larger, holistic context.  As long as you know the rules and can run a large number of predictions on any given move an opponent might make, you can win at chess.  But what Fish is arguing is that we build up a context of meaning ("a world") from first living in it; the "discrete bits" take their meaning from that world--as Wittgenstein would have said, from their "use" within that world.

What computers can’t do, we don’t have to do because the worlds we live in are already built; we don’t walk around putting discrete items together until they add up to a context; we walk around with a contextual sense — a sense of where we are and what’s at stake and what our resources are — already in place; we inhabit worldly spaces already organized by purposes, projects and expectations.
For example, there's the hammer/nail analogy in Heidegger's "Being and Time".  A carpenter hammers a nail just like a computer manufacturing a car could hammer a nail or screw a bolt.  But what's missing in the computer's case is what Heidegger would call an "equipmental totality."  We don't just hammer nails for the fun of it: a carpenter hammers nails because he or she is building a house, a couch, or a desk, so as to get paid for the work and then eat after going home--you get the idea.  The machine has no such "equipmental totality."  It's simply following a program, not determining how many nails it should hammer, or why it's building cars in the first place (presuming it's building cars), or how many cars it should make.  It simply does it--it's a piece of equipment itself, embedded in a larger context of car-making.  Now, it's not hard to imagine another computer deciding in some technocratic fashion how many cars should be made, maybe based on carbon emissions (also programmed and decided).  But we still never get to the crux of the issue, which is what Fish addresses in the quote above.  Our worlds are "organized by purposes, projects and expectations."  These are things that are part of being human on some pre-conscious level, I would argue.  They are part of simply being in a world at all.  (Enter a meditation on animals here--but leave out the computers.)

In an interesting turn, too, Heidegger adopts a new word for "human": Dasein.  Literally it means "there-being." For Heidegger's project this becomes interesting because he's describing what it's like to be in a world at all without all the garbage associated with trying to define "the human."  He's describing the ways in which we simply are in the world prior to our conscious selves.  We already inhabit a world.  What becomes interesting is describing how we inhabit it.

Computers, so the argument would go, can retain information and even use this information in limited, rule-following contexts (like statistical analysis).  But these rule-following contexts crucially miss part of what it means to be a person: the ability to be faced with the unfamiliar.  Because we ourselves already inhabit a world, we're already given over to it, just as a computer with a large set of rules is given over to its rules.  But the unexpected always shows up; and crucially, Fish argues, these unexpected cases will "always [outrun] the efforts to take account of them, and after a while you’ve reached the point when every situation will require a rewriting of the rule, which means that there will no longer be a rule at all."  Basically, because we're "built" with the ability to "be in the world," we can adjust to new worlds and new contexts.  Not to say this is easy.  But computers, Fish argues, can't do this.  This adaptive ability is what's unique to us, and it's why Fish argues that Watson's achievement is purely "formal."
