"When I write software, I know that it will fail, either due to my own mistake, or due to some other cause"
About this Quote
Software, in Wietse Venema's telling, begins with an admission of guilt. Not moral guilt, but engineering guilt: the certainty that failure is not an edge case but a scheduled appointment. Coming from a scientist known for security work, the line reads less like pessimism than a survival tactic. It refuses the startup myth that code is a clean expression of will; instead it treats software as a living system embedded in messy ecosystems of hardware quirks, user behavior, shifting requirements, hostile actors, and plain entropy.
The craft move here is the double attribution: "my own mistake" sits beside "some other cause". That pairing collapses the comforting boundary between bug and accident. It implies that even when you do everything "right", reality still gets a vote. In security, that "other cause" is doing a lot of work: unexpected inputs, undocumented dependencies, time-of-check/time-of-use races, supply-chain compromise. The quote smuggles in a worldview where control is partial and humility is a prerequisite for competence.
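To make the time-of-check/time-of-use example concrete, here is a minimal C sketch of the race; the helper names `racy_open` and `safer_open` are invented for illustration, not drawn from Venema's code:

```c
/* A classic time-of-check/time-of-use race, sketched for illustration.
 * Between the access() check and the open() call, an attacker can
 * swap the file (e.g. replace it with a symlink), so the check
 * proves nothing about what open() actually opens. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int racy_open(const char *path)
{
    if (access(path, R_OK) != 0) {      /* time of check */
        perror("access");
        return -1;
    }
    /* ...window in which the file can be replaced... */
    return open(path, O_RDONLY);        /* time of use */
}

int safer_open(const char *path)
{
    /* Skip the separate check entirely: open first, then interrogate
     * the descriptor you actually hold (fstat, O_NOFOLLOW, etc.). */
    return open(path, O_RDONLY | O_NOFOLLOW);
}
```

The fix is not a better check but a different shape: act first, then ask questions of the thing you are actually holding.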
There's also an ethical subtext. If you start from "it will fail", you design differently: you log, you test adversarially, you handle errors explicitly, you limit blast radius, you assume users will do surprising things and attackers will do worse. Venema isn't romanticizing failure; he's naming it early so it can be managed late. The intent is to swap ego for rigor, and optimism for responsibility: a quiet manifesto for defensive engineering in a culture that still loves heroic debugging more than unglamorous resilience.
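That design stance can be shown in miniature. The sketch below is one way to write "it will fail" into ordinary C; the config path and function name are hypothetical, chosen only to illustrate checking every result, logging failures with context, and failing closed rather than limping on with bad state:

```c
/* A minimal sketch of "assume it will fail" in practice: every
 * call's result is checked, failures are logged with context, and
 * the program fails closed instead of continuing with bad state.
 * The function and file names are illustrative, not from Venema. */
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static FILE *open_config_or_die(const char *path)
{
    FILE *fp = fopen(path, "r");
    if (fp == NULL) {
        /* Log what failed, where, and why, then stop: a missing
         * config is not a condition to guess around. */
        fprintf(stderr, "fatal: open %s: %s\n", path, strerror(errno));
        exit(EXIT_FAILURE);
    }
    return fp;
}

int main(void)
{
    FILE *fp = open_config_or_die("/etc/example.conf");
    /* ...parse with the same suspicion applied to every line... */
    if (fclose(fp) != 0)
        fprintf(stderr, "warning: close: %s\n", strerror(errno));
    return 0;
}
```

Nothing here is clever, and that is the point: failure is handled as a first-class outcome rather than an afterthought.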
Quote Details
| Topic | Coding & Programming |
|---|---|