A look at why some companies struggle to operate securely
By Chris Rasmussen, Systems Administrator
In the first installment of this series, I addressed compatibility issues, but I did not have time to discuss modern culture and the ways it can make achieving security more difficult. For this installment, we will look at how the end user fits into cybersecurity from a cultural perspective, along with some ways we can improve our behavior in order to protect ourselves and our data.
Bottom line: we collectively assume cybersecurity is the responsibility of people with “cybersecurity” in their job titles, and that preventing theft or loss of all the 1s and 0s in the network is handled by protocols and firewalls. That is technically true, but with a caveat: cybersecurity is ALSO the responsibility of everyone who owns, uses, or even stands inside a room that holds a computer.
Really. I don’t care if you don’t own a computer: if you are able to read this article, you have a serious responsibility for cybersecurity, just like you have a responsibility to not accidentally set things on fire. Yes, putting out fires is the job of the fire department, but that doesn’t mean you can use a lighter while pumping gas simply because somebody else is responsible for extinguishing the resulting flames.
To put things in perspective, modern cars have fantastic safety features, from obvious ones like seatbelts and airbags to anti-lock brakes and more. None of these mean we can drive without paying attention.
A password is a seatbelt at best; using it properly does not mean you are automatically fine and your responsibilities regarding safety are complete. It means if you get hit by a digital attack, hopefully it will mitigate the damage. You can be sure, however, that not wearing a seatbelt is a fantastic path to tragedy.
Digital security means not clicking on links while you are busy talking on your cell phone; it means not going, “Hey, the antivirus won’t let me download this! I’ll just turn off the protection and download it anyway,” but instead going, “I should figure out why my antivirus is saying this thing I want is harmful.”
The simple truth is that most major disasters occur not because we did not know how to stop the problem from happening, but because we ignored important warnings or failed to follow our own instructions. The problem, unfortunately, is that so many of the warnings we receive nowadays are incredibly stupid, boring, or outright insulting.
“We are not responsible for damage resulting from…weapons employing atomic fusion.”
– My apartment lease
One lease I had to sign was roughly three pages of actual lease info, and 50 pages of legal warnings, disclaimers, and instructions about bed bugs. And, yes, the terribly important warning that if somebody sets off a nuclear warhead and it irradiates my property, the building owners were not liable.
With clauses like this one, it is no wonder people don’t actually read things before signing them. It only gets worse with electronics, especially software.
“If this product kills the user because of a flaw we knew about ahead of time, we are not at fault unless the law says we are.”
– Paraphrase from multiple software license agreements
Unless you are willing to accept that gross negligence may knowingly result in death by a product you paid for, you are usually better off not reading the ridiculous legalese put there to overwhelm you. So, what do most of us choose to do? We don’t read the fine print. Most of the time, that won’t harm us.
We have that same mentality when going through our inbox and seeing e-mails with a giant blazing red banner that says, “This e-mail came from somebody else: treat it like clicking on it could destroy the entire company.”
“This e-mail came from outside the organization. Do not respond if you do not know the sender.”
– Warning banner on an email received by customer service
After the third e-mail with this warning (or maybe the seventh if we are really law-abiding), we stop paying attention to the warning altogether, which is what makes it so easy for one person to click a link and destroy an entire company.
I recently watched a live demo in which someone opened an e-mail attachment that directed them to a look-alike website that would fool at least 90% of the human race, and the savvy attacker ultimately gained what was essentially total control of an entire system.
It took 30 minutes. Just the length of a single Seinfeld episode to seize control of what would have been an entire company had it not been a demo.
Now, yes, there are a lot of reasons why this happened. Odds are good that if you click on a link you shouldn’t have, the company will not end up with its entire network falling under enemy control.
Odds are also good that if you cross the street without looking both ways, you won’t get killed (depending on the street), but that doesn’t mean it’s a good or safe idea. And a key difference between getting hit by a car and getting hacked is that you usually don’t know you got hacked, so you can’t fix the damage.
“More the hurry, more the obstacles.”
– Welsh Proverb
Another aspect of culture that creates issues for operating securely is our constant state of being in a hurry. Deadlines loom, to-do lists pile up, and oftentimes, we are just trying to get through each day with some semblance of productivity.
The constant hustle and bustle leads us to let our guard down and rush through tasks that we view as trivial or habitual. This is often where some of our greatest vulnerabilities lie – in the times when we are just trying to get one more task checked off the list before closing time, so we dismiss a message with little thought or click on a link without reading the text around it.
“It takes 20 years to build a reputation and few minutes of cyber-incident to ruin it.”
– Stephane Nappo
So what can we do to operate safely in our constant state of overstimulation, desensitization, and hurry-up culture?
Here’s my first tip: slow down and be obnoxiously careful. Be that annoying worrywart who says the digital equivalent of, “Are you sure you want to drive without a seatbelt on?” Read the warnings and follow them every single time to the best of your ability.
I have to admit that there are plenty of times when the warnings are unnecessary or just plain ridiculous, and that means there are times when people will be justified in ignoring them. But remember that operating securely in the cyber world is like crossing a street.
Sometimes it is perfectly safe to cross the street with your eyes closed, but usually it is a horrible idea. The fact that sometimes certain people can pull it off safely under specific circumstances does not mean the average person should make a habit of doing it – or do it even once. Our common sense/danger instincts that work in the physical world utterly fail to translate into the digital world.
So, slow down. Take an extra five, or 20, seconds to read the e-mail. Yes, you will likely get through fewer e-mails if you are being careful, but you will also be far, far, far less likely to hand over access to a system that contains sensitive information about every employee or client. Slowing down may prevent you from checking off all of your to-do items, but it may save you from playing into the hackers’ hands.
“If you don’t finish your training, I’m going to start sending you spoilers for The Inside Man.”
The second tip is even harder to swallow: pay attention to your training. All of it. There is an increasing amount of effort being put into producing good computer and security training (like The Inside Man!). Even if you are stuck with infamously bad training, go through each and every course like your paycheck depends not on passing a multiple-choice quiz, but on applying that training to your daily life. Odds are increasingly good that the paychecks of everyone at your company really will depend on whether you spend that two-hour training actively learning or just clicking buttons while watching TV.