"If you think the technology is infeasible, you don't worry about what it might do and what its potential is"
About this Quote
Merkle’s line is a sly diagnosis of how people dodge responsibility in tech: disbelief doubles as moral anesthesia. If a technology feels “infeasible,” you’re excused from thinking through its consequences, because consequences only belong to things you believe will exist. The point isn’t just that skeptics are wrong; it’s that skepticism has a convenient political and psychological function. Calling something impossible is a way to opt out of the hard work of governance, design restraint, and ethical imagination.
The context matters: Merkle is a cryptography pioneer who spent years arguing for ideas many contemporaries treated as science fiction. Public-key crypto, molecular nanotechnology, even some of today’s AI trajectories all lived in that awkward phase where feasibility was contested. In that phase, “it can’t be done” isn’t neutral technical analysis; it’s a social permission slip to ignore the future. Merkle is pointing at a recurring pattern: institutions and experts often demand “proof of inevitability” before they will engage with risk, but by the time inevitability arrives, the choices have narrowed.
The subtext is a warning about our timing problem. We are best positioned to steer a technology when it’s still uncertain, when standards, norms, and safety constraints can be baked in. Yet uncertainty is exactly when people feel licensed to look away. Merkle’s sentence reads like a reminder that feasibility debates are never just engineering debates; they’re also debates about accountability, attention, and who gets to say “not my problem” until it is.
Quote Details
| Topic | Technology |
|---|---|