Keeping it Close to Home

by Charlie Martin

Let’s say you go visit your bank, and you realize that they have all the cash, neatly stacked, on a table behind the teller cages. When you ask the bank manager about it, you’re told “That’s not a problem, because we trust our employees.”

If you have ever worked in a bank, this would certainly send shivers up your spine. It’s all well and good to trust your employees, but make it too easy and someone will be tempted beyond their strength.

When it comes to computer security, though, the same kind of common sense often doesn’t appear. I had a consulting client some years ago that supplied data to commodities traders. They wanted to build a new, Internet-based service, and they wanted to be sure it was secure. We discussed a lot of advanced cryptographic techniques: how they could reliably identify authentic customers, and how they could cut off access for anyone who wasn’t paid up. It wasn’t a simple problem, and doing a good job of it required careful analysis and some sophisticated mathematics, but eventually we came up with a workable scheme.

One key component of this was a bit of data, a cryptographic key. If you had the key, you could intercept the commodity data. At the very least, you could steal a data stream that cost a paying customer about a hundred dollars a day, but that’s nothing. There were approaches (you’ll understand if I don’t go into detail) that would allow a trader with access to the data, by “front running” or other tricks, to make potentially millions of dollars a year with very little likelihood of ever being detected.

When I started talking about how to protect that data, the company’s first answer was “Oh, that’s only accessible to a system administrator, we trust them.”

Now, don’t get excited about trying to figure out who the client was or how to crack them. They saw my argument once I explained the attack, and in any case, everyone in banking and commodities is more careful now; people have been burned a number of times.

Or, make that “almost everyone.” Fannie Mae was recently targeted by a contractor who had been terminated and managed to slip some destructive code into Fannie Mae’s systems on his way out. It was caught by luck, but it could have shut down Fannie Mae for a week or more and cost the company millions of dollars. The City of San Francisco is in the middle of another case, with a network administrator who was either being a little too vigorous or possibly trying to get a stranglehold on the city’s networks. (It’s unclear exactly which it was because, frankly, the lawyers and law enforcement people involved understand the situation so poorly that some of their accusations are literally nonsense; they simply aren’t possible.) But just the process of restoring confidence in San Francisco’s network will cost in excess of $1 million.

The maddening thing about this is that none of it is a surprise. It’s been known since Judas kissed Jesus that insider attacks do ten or a hundred times more damage than attacks from outside, whether it’s computer system cracks or old-fashioned theft. That’s why your bank doesn’t leave the cash lying on a desk behind the teller cages. Complicated password rules and other manifestations of security theater may very well hurt by giving people a false sense of security, when the real solutions are the same as they’ve always been: know your people; don’t give any one person too much authority; maintain an audit trail; and every so often, actually perform an audit.
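
For the technically inclined, here is one way to picture that last point. This is only a minimal sketch, not the scheme my client built or anything a real shop would deploy as-is: an append-only log where every entry carries a hash of the entry before it, so an audit can tell whether anything was quietly deleted or rewritten after the fact. The file name, field names, and functions here are all illustrative.

    import hashlib
    import json
    import time

    LOG_FILE = "audit.log"  # illustrative file name

    def append_entry(actor, action, prev_hash):
        """Record who did what, chained to the hash of the previous entry."""
        entry = {"time": time.time(), "actor": actor,
                 "action": action, "prev": prev_hash}
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        with open(LOG_FILE, "a") as log:
            log.write(json.dumps({"entry": entry, "hash": digest}) + "\n")
        return digest

    def audit():
        """Re-walk the log; a deleted or edited entry breaks the hash chain."""
        prev = None
        with open(LOG_FILE) as log:
            for line in log:
                record = json.loads(line)
                entry = record["entry"]
                recomputed = hashlib.sha256(
                    json.dumps(entry, sort_keys=True).encode()).hexdigest()
                if entry["prev"] != prev or recomputed != record["hash"]:
                    return False  # the trail has been tampered with
                prev = record["hash"]
        return True

If an administrator erases the line that records his own mischief, the next entry no longer chains to anything and the audit fails. The point isn’t the code; it’s the old discipline of writing things down somewhere the fox can’t reach, and then actually checking.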

In other words, as the Russian proverb goes, “trust, but verify.”
