We cannot be perfect. We can only be smart. We try to follow the rules and be as thorough as we can, and we hope for some luck.
I grew up as a self-proclaimed cautious lad. Like most kids, I got in my share of trouble, but less so than most of my peers. Ten years of doing mostly systems engineering in the federal government, coupled with the safety and security training I received there, served to ingrain my paranoia. Sometimes the paranoia is unjustified, and I feel foolish. Yet good peripheral vision and situational awareness enable me to see things many people do not.
I got on a flight the other day. As I waited in line at the counter to check my bag, the guy in front of me set his wallet down on the counter. Then he bent over, wrote out two luggage tags, and clipped them onto his bags. His wallet sat on the counter for a good 65-70 seconds, and even without my glasses I could see an American Express card and what I am pretty sure was a Citibank Visa. The wallet could have disappeared in seconds.
At the airport, I made it through security with no problem, but as I walked toward my gate, I noticed an open door on the far side of the checkpoint. As I walked past, I glanced in and realized I could see what looked to be a break room. Anyone could have walked in the front door, through the break room, and out the back door, completely bypassing security. I got the strangest look from the TSA agent when I asked him whether that room was meant to be open on both sides, but he did go into the room - presumably to close the door.
The gate areas for planes always make me sad. There were no fewer than three people sleeping, with their carry-on luggage and laptop cases sitting on the floor, or iPods and headphones sitting on the seat next to them. A group of teens got up and walked around the gate area, leaving all of their carry-on bags and electronics scattered around. At one point, they were on the other side of the counter, some 70-80 feet from their stuff; several thousand dollars’ worth of cameras, iPods, iPads, and laptops was completely out of sight. There was a lady changing her baby's diaper with her back to her open purse, which was sitting behind her on her stroller. Perhaps she thought the smell would repel predators like a force field. And honestly, that kid did not smell any better when it got on the plane. I am really glad she was in row eight, while I was way back in 27. I prefer to think I’m realistically cautious, not overly paranoid.
After my flight was delayed, I grabbed some lunch. I made sure I knew where the restaurant’s back exit was, and sat facing the room. That is almost an obsession. I never park next to the sliding door of a van. At a stoplight or stop sign, I always leave enough space between me and the car in front of me so that I can pull out if I need to. I could go on, but the point is, yes, I am a little paranoid.
But, to some extent, a security professional has to be – and software engineers should be, too. If you ever meet a software tester who is not a little bit paranoid, you should probably worry about their qualifications. In one of my first jobs, I was given the task of writing a user guide for a rather simple communications system. The system had very few configuration options and a whole 16 menu picks. But the system would be used by people with no technical background, and sometimes with little to no training. When Mel assigned me the task, he told me, “Do not write it so that you can understand it. Write it so that they cannot possibly misunderstand it.” I ended up with a 54-page user guide that the office used as a training tool on how to write user guides – I had taken my assignment seriously. And users still figured out how to break it – yes, we are talking “cup holder on the front of my PC” type problems.
That is the way business works. Much of the time, that is how IT works. A good IT/IS person will build in some resiliency, and you will end up with a system that works better than it really has to – most of the time. But, for the most part, that is how we build things – to do their job and a little more. We want systems that work well and consistently meet their operational objectives. Systems run until they break. Software runs until it hits a flaw, or is broken. And that is why hackers succeed: because no one is perfect, and our errors catch up with us.
We try to make rules to make ourselves less imperfect.
“Don’t share your password.” Ever. Until you leave your laptop at work and call a co-worker so they can log on and send out that proposal, or check your email for you. Or until you get the call from your IT Security group and you tell them your password so they can update your anti-virus software. I mean, they have no idea why, but your anti-virus software has never updated and is so far out of date that you need a new install – we are, you know, being infected by viruses all over the place, right now. Except that wasn’t really your IT Security group, and now you have given some stranger your password. Why the heck would you do that? Because you didn’t follow the rule.
“Always lock your workstation.” And you do. Until you run down to pick up the report from the printer, and some chatty guy with a chipped coffee cup starts yakking about his fantasy football team, and has picked up your report with his papers, so by the time you sort it out and get back to your computer, Julia has emailed the company’s R&D Development Plan to her “friend” – from your computer.
“Do not open attachments from someone you don’t know.” Of course we never do that. No one ever does that. Except for that one time, at band camp, when the person’s name sounded kind of familiar, and they said your Mom told them to send it to you, so you just had to see the attached video – I mean, it was called “Escalator eats Dad’s pants!!!” – it just had to be funny.
I have successfully done all three of these things while testing client environments, and worse. And, by the way, these things always work. Real attackers create even more devious attacks every day. It might be a buffer overflow in your code – but you do perfect parameter checking, right? It might be an unpatched vulnerability in an application or operating system, though it would help if you were fully patched. It might be an attack through an unblocked port, except that you used a “deny all” rule in your firewall and then only opened the ports you needed, right? And “they” will build other attacks we cannot conceive of, because our brains don’t work the same way.
Ultimately, the problem is that we build things to work, while others look at things differently. They look for ways that systems don’t work, or can be made to not work, or made to work differently. This is the same principle that says developers make poor testers, because they want the code to work. While a tester may want the code to work, they also want to see if they can make it fail, because if they can make it fail, then they have done their job, and helped reduce flaws. So, as long as people write software and build systems, other people will find flaws in the software and break the systems. As long as people are involved, other people will be able to manipulate them.
The bottom line is that we cannot be perfect. We can only be smart. We build policies and procedures, and work on compliance. We train our staff in security awareness. We try to follow the rules and be as thorough as we can, and we hope for some luck. And it really does pay to be a little paranoid.