"No computer has ever been designed that is ever aware of what it's doing; but most of the time, we aren't either"
About this Quote
Minsky lands the punchline where it hurts: not on machines, but on us. The first clause flatters human exceptionalism - computers execute, they don't "know". Then he swivels, casually, to the far more destabilizing claim that our own self-awareness is intermittent at best. It's a joke with teeth, the kind scientists tell when they want to smuggle philosophy into the lab without sounding like they've left their discipline.
The intent is partly corrective. In AI debates, people reach for consciousness as a bright line: machines calculate; humans understand. Minsky, a founding figure in cognitive science and early AI, treats that line as politically comforting and scientifically suspicious. The subtext is his long-running thesis that mind is not a magical essence but an arrangement of processes. If "awareness" is something that comes and goes for humans, maybe it isn't a prerequisite for intelligent behavior at all. That reframes the whole argument: the question stops being "Can machines be conscious?" and becomes "How much of what we call consciousness is just a story we tell about automatic routines?"
Context matters. Minsky worked in an era when AI swung between grand promises and public disappointment. By undercutting human certainty, he also undercuts hype: if we barely grasp our own cognition, we should be cautious about demanding a tidy definition of awareness from machines. The line works because it collapses the comfortable hierarchy without denying difference. Computers may not be aware, but neither are we, much of the time - and that is less an insult than a diagnosis of how cognition actually operates: mostly on autopilot, with occasional flashes of narration we mistake for control.
Quote Details
| Topic | Artificial Intelligence |
|---|---|