Thursday, October 17, 2013

Public Security

Imagine a meteorologist calculates that there is a 50% chance of rain this afternoon. It's his job to report the weather. But instead of forecasting possible rain, he tells the local newspaper to write that it will be sunny. In the afternoon it rains, ruining a parade the city had scheduled and severely damaging some parade floats. The parade could have been postponed to tomorrow, but now the city's budget has been wasted. The meteorologist is asked why he was carrying an umbrella, and he reveals that he knew there was a good chance of rain. "But I didn't want to ruin anyone's day," he says. "After all, the weather was very nice in the morning."

This is an analogy for computer security. Many powerful organizations want to silence people who find holes in their systems. For example, Andrew Auernheimer was convicted for publicizing security flaws he found in AT&T's systems. Clifford Stoll recorded another example: password weaknesses went undocumented because certain government agencies were slow to adopt stronger passwords. These are efforts to promote "security by obscurity" (which is considered a Bad Idea in the field of computer science). Large organizations that hide flaws are clearly acting in self-interest, and seeking to hide one's shortcomings at the expense of others is reprehensible.

There is another side to this problem. Hackers of any type are not justified when they leak information irresponsibly. (See The Washington Post's 2013 coverage of the NSA leaks for a good example of how to handle classified information responsibly.) System administrators should respectfully consider any reports of security holes. Users who find flaws in the systems they use should report them appropriately. And (this one's for everybody): avoid insecure systems, and warn your friends about them.
