"However, writing software without defects is not sufficient. In my experience, it is at least as difficult to write software that is safe - that is, software that behaves reasonably under adverse conditions"
About this Quote
Venema is quietly dismantling a comforting myth in engineering culture: that “correct” is the finish line. The first clause concedes the obvious virtue of defect-free code, then immediately demotes it. “Not sufficient” lands like a rebuke to teams that treat bug counts as a proxy for trust. He’s drawing a boundary between software that works in the lab and software that deserves to exist in the wild.
The subtext is security-minded, but broader than security. “Safe” is defined not as moral virtue or compliance checklists, but as comportment under pressure: adversarial inputs, partial failures, weird edge cases, exhausted disk space, corrupted state, hostile networks, human error. The phrase “behaves reasonably” is doing heavy lifting. It implies humility about prediction and a preference for graceful degradation over brittle perfection. It also smuggles in an argument about priorities: users don’t experience software as a proof; they experience it as a system that either contains damage or amplifies it.
Context matters: Venema is a scientist and security engineer associated with hard-nosed, real-world Unix infrastructure (Postfix, TCP Wrapper). His world is one where “adverse conditions” aren’t rare anomalies; they’re Tuesday. The phrase “in my experience” isn’t anecdotal hand-waving; it’s a claim of domain authority: long exposure to failure modes that unit tests don’t imagine and product timelines don’t reward.
The intent is corrective: stop fetishizing defectlessness, start designing for resilience, recovery, and safe failure. In other words, the job isn’t to avoid mistakes; it’s to survive them without hurting people, data, or systems.
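As a concrete illustration of that mindset, here is a minimal sketch, in C, of what “behaves reasonably under adverse conditions” can mean for something as mundane as saving a file. This is hypothetical and not drawn from Venema’s code; the function name and temp-file convention are assumptions for the example.

```c
/* Illustrative only: a "safe save" that tries to fail without corrupting
 * existing data. It writes to a temporary file, checks every return value,
 * flushes to stable storage, and only then atomically replaces the original.
 * On any failure (disk full, permissions, I/O error) the old file survives. */
#include <stdio.h>
#include <unistd.h>

int save_safely(const char *path, const char *tmp_path,
                const char *data, size_t len)
{
    FILE *fp = fopen(tmp_path, "w");
    if (fp == NULL)
        return -1;                          /* e.g. permission denied */

    if (fwrite(data, 1, len, fp) != len) {  /* e.g. disk full */
        fclose(fp);
        unlink(tmp_path);                   /* leave the old file untouched */
        return -1;
    }
    if (fflush(fp) != 0 || fsync(fileno(fp)) != 0) {
        fclose(fp);
        unlink(tmp_path);
        return -1;
    }
    if (fclose(fp) != 0) {
        unlink(tmp_path);
        return -1;
    }
    if (rename(tmp_path, path) != 0) {      /* atomic replace on POSIX */
        unlink(tmp_path);
        return -1;
    }
    return 0;
}
```

The point of the sketch isn’t the particular calls; it’s the posture. Every step assumes the world may fail mid-operation, and the design guarantees that failure leaves the previous good state intact rather than a half-written file.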
Quote Details
| Topic |
|---|
| Coding & Programming |