"The nervous system and the automatic machine are fundamentally alike in that they are devices, which make decisions on the basis of decisions they made in the past"
About this Quote
Wiener is doing something sly here: he collapses the distance between the most intimate thing we have (a nervous system) and the most alienating thing mid-century life was busily multiplying (the “automatic machine”). The hook is the word “devices.” It’s not just a metaphor; it’s a demotion. Minds are not airy souls but engineered systems, and machines are not dumb tools but entities with a kind of memory. In 1940s and 50s America, with radar, anti-aircraft predictors, and early computers reshaping warfare and work, that framing carried a quiet provocation: if both brains and machines “make decisions on the basis of decisions they made in the past,” then the line we draw around “human judgment” is thinner than our pride admits.
The sentence also smuggles in cybernetics’ core idea without fanfare: feedback. Past decisions loop back, train the system, and bias the next output. That’s the subtextual bridge between neurons and algorithms, between habit and automation. Wiener isn’t praising “automatic” intelligence; he’s warning that intelligence, wherever it appears, is path-dependent. Systems learn, but they also get stuck.
Read now, it sounds like a premonition of machine learning and the politics of data: decisions made yesterday harden into “knowledge” tomorrow. Wiener’s intent isn’t to declare humans replaceable; it’s to insist that both people and machines are shaped by their own histories, and that whoever controls the feedback loops controls future behavior.
Quote Details
| Topic | Artificial Intelligence |
|---|---|