Hyping Artificial Intelligence, Yet Again

The New Yorker:

According to the Times, true artificial intelligence is just around the corner. A year ago, the paper ran a front-page story about the wonders of new technologies, including deep learning, a neurally inspired A.I. technique for statistical analysis. Then, among others, came an article about how I.B.M.’s Watson had been repurposed into a chef, followed by an upbeat post about quantum computation. On Sunday, the paper ran a front-page story about “biologically inspired processors,” “brainlike computers” that learn from experience.

This past Sunday’s story, by John Markoff, announced that “computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.” The deep-learning story, from a year ago, also by Markoff, told us that “advances in an artificial intelligence technology that can recognize patterns offer the possibility of machines that perform human activities like seeing, listening and thinking.” For fans of “Battlestar Galactica,” it sounds like exciting stuff.

As a cognitive scientist, I share the skepticism of neuroscientists like Henry Markram. Old-school behaviorist psychologists, and now many A.I. programmers, seem focussed on finding a single powerful mechanism—deep learning, neuromorphic engineering, quantum computation, or whatever—with which to induce everything from statistical data. This is much like what the psychologist B. F. Skinner imagined in the early nineteen-fifties, when he concluded that all human thought could be explained by mechanisms of association; the whole field of cognitive psychology grew out of the ashes of that oversimplified assumption.

At times like these, I find it useful to remember a basic truth: the human brain is the most complicated organ in the known universe, and we still have almost no idea how it works. Who said that copying its awesome power was going to be easy?

Read the whole story: The New Yorker
