Security blunders that will get you fired

From killing critical business systems to ignoring a critical security event, these colossal slip-ups will land your career in deep water fast.

Credit: Thinkstock

Getting fired from an IT security job is a rare event, but there are certainly ways to ensure or accelerate your own unemployment. Trust your skills, follow corporate directives, and concentrate on the basics, and you’ll have a long career in IT security. Help your employer right-size its defenses in the right places, and you’ll excel. But fall prey to one of the following mistakes, and you’ll be looking for new work -- maybe a new career -- more often than not.

Colossal security mistake No. 1: Killing critical business functionality

If a new security process or device you put in place unexpectedly brings down critical business functionality for longer than a day or so, you’ll be shopping your résumé around before the network is even back up. Business rules.

Lesson learned: Learn what is critical to the business and don’t interrupt it unless leaving it alone would result in even more damage.

Colossal security mistake No. 2: Killing the CEO’s access to anything

I’ve seen CEOs yell at security pros simply because IT required them to set a new password on their computers or enter an additional password to access a high-risk application. CEOs for the most part want to open their laptops, click an icon, and have everything readily accessible, security be damned. Every IT security worker who has worked directly with a CEO has stories.

Lesson learned: Make access as easy as possible for the CEO while maintaining the required amount of security.

Colossal security mistake No. 3: Ignoring a critical security event

Management is all about choosing critical business functionality over security -- until the security event impacts critical business functionality. Then the pendulum swings swiftly, and heads roll for doing business as usual as the company coffers are plundered.

Lesson learned: Define the critical security events that are most likely to indicate malicious activity, and always research them to their ultimate conclusion when they occur. You can’t chase down every potential false positive; know which ones are the deadliest and put in your due diligence.
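That triage logic can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical event feed of event names; the event names and the critical list are invented for the example, not taken from any real monitoring product.

```python
# Hypothetical list of event types deemed most likely to indicate
# malicious activity. In practice this comes from your own threat model.
CRITICAL_EVENTS = {
    "admin_login_from_new_country",
    "mass_file_encryption",
    "outbound_data_spike",
}

def triage(events):
    """Split incoming events into a must-investigate queue and a routine queue."""
    critical, routine = [], []
    for event in events:
        (critical if event in CRITICAL_EVENTS else routine).append(event)
    return critical, routine

feed = ["failed_login", "mass_file_encryption", "port_scan"]
must_investigate, rest = triage(feed)
print(must_investigate)  # ['mass_file_encryption']
```

The point of the split is the article's advice in miniature: the short critical list gets chased to its ultimate conclusion every time, while the routine queue gets sampled as resources allow.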

Colossal security mistake No. 4: Reading confidential data

If the CEO is the king of the company, then the network administrator is the king of the network. I know many network admins who've allowed their godlike access control to tempt them into viewing data they didn’t have permission to see. In military parlance, you need proper clearance and the need-to-know.

Lessons learned: Don’t access data you don’t have valid permission to see, and consider helping data owners and custodians encrypt their confidential data with keys you don’t have access to.
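The military model mentioned above has two independent gates: clearance level and need-to-know. A toy sketch of that check, with all user names, data sets, and levels invented for illustration:

```python
# Invented example data: clearance levels and per-dataset need-to-know lists.
CLEARANCE = {"alice": 3, "bob": 2}
NEED_TO_KNOW = {"payroll": {"alice"}}  # dataset -> users with a documented need

def may_read(user, dataset, required_level):
    """Access requires BOTH sufficient clearance AND explicit need-to-know."""
    cleared = CLEARANCE.get(user, 0) >= required_level
    needed = user in NEED_TO_KNOW.get(dataset, set())
    return cleared and needed

print(may_read("alice", "payroll", 3))  # True: cleared and on the need-to-know list
print(may_read("bob", "payroll", 2))    # False: cleared, but no need-to-know
```

Note that bob fails despite meeting the clearance bar, which is exactly the trap godlike network admins fall into: having the access is not the same as having the right.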

Colossal security mistake No. 5: Invading privacy

Invading a person's privacy is another surefire way to put your job on the line, no matter how small or innocent the incident may seem.

Lesson learned: Privacy has become one of the leading computer security issues today. A few short years ago nearly everyone accepted that admins with access to a particular system might take the occasional look at records they didn’t have a legitimate need to access. Those days are over. Today’s systems track every access, and every employee should know that accessing a single record they don’t have a legitimate need to view is likely to be noticed and acted on.

Colossal security mistake No. 6: Using real data in test systems

When testing or implementing new systems, mounds of trial data must be created or accumulated. One of the simplest ways to do this is to copy a subset of real data to the test system. Millions of application teams have done this for generations. These days, however, using real data in test systems can get you in serious trouble, especially if you forget that the same privacy rules apply.

Lessons learned: Either create bogus data for your test systems or harden test systems that contain real data as you would any production system.
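Creating bogus data is usually less work than it sounds. A minimal sketch of generating fake records with the standard library, assuming an invented record layout; real schemas will differ, and dedicated fake-data libraries exist if you need realism:

```python
import random
import string

def fake_record(i):
    """Build one synthetic record; no field is copied from production."""
    return {
        "id": i,
        "name": "Test User %d" % i,
        "email": "user%d@example.test" % i,  # reserved .test TLD, never routable
        "ssn": "".join(random.choice(string.digits) for _ in range(9)),
    }

test_data = [fake_record(i) for i in range(100)]
print(test_data[0]["email"])  # user0@example.test
```

Because every value is synthesized, the test system carries no privacy obligations, and a leak of it costs you embarrassment rather than your job.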

Colossal security mistake No. 7: Using a corporate password on the Web at large

Hacking groups have been incredibly successful using people’s website passwords to access their corporate data. Routinely, the victim is phished with an email that purportedly links to a popular website (Facebook, Twitter, Instagram, and so on), or the website itself has had its password database stolen. Either way, the bad guy has passwords that he bets people use elsewhere, including with company assets. Time to poke around and see what kind of access that earns.

Lesson learned: Make sure all employees understand the risk of sharing passwords between nonwork websites and security domains.
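Beyond awareness training, some shops screen new corporate passwords against hashes from public breach dumps, so a password already floating around the Web is rejected before it can be reused. A minimal local sketch; the three "breached" passwords here are illustrative, and a real deployment would load millions of hashes from an actual dump or a breach-lookup service:

```python
import hashlib

# Illustrative stand-in for a real breach corpus (stored as hashes, not plaintext).
BREACHED_SHA1 = {
    hashlib.sha1(p.encode()).hexdigest()
    for p in ("password123", "letmein", "qwerty")
}

def is_breached(candidate):
    """Return True if the candidate password appears in the breach set."""
    return hashlib.sha1(candidate.encode()).hexdigest() in BREACHED_SHA1

print(is_breached("letmein"))                        # True
print(is_breached("correct horse battery staple"))   # False
```

Rejecting breached passwords at creation time blunts exactly the attack described above: a stolen website password that no longer matches anything inside the company perimeter.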

Colossal security mistake No. 8: Opening big “ANY ANY” holes

You’d be surprised at how many firewalls are configured to allow all traffic indiscriminately into the network and out. This is even more interesting because almost all firewalls begin with the least permissive, deny-by-default permissions, then somewhere along the way an application doesn’t work. After much troubleshooting, someone suspects the firewall is causing the problem, so they create an “allow ANY ANY” rule. This rule essentially tells the firewall to allow all traffic and to block nothing. Whoever requests or creates this rule usually wants it only for a short while to figure out what role the firewall might play in the problem. At least, that's the initial thinking.

Lesson learned: Don’t ever allow "ANY ANY" rules to be deployed.
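Since "temporary" ANY ANY rules have a way of becoming permanent, it pays to audit rule tables for them. A sketch over a simplified, invented rule format; real firewall exports vary by vendor:

```python
# Invented rule table: each rule is a dict of source, destination, port, action.
rules = [
    {"src": "10.0.0.0/8", "dst": "10.1.2.3", "port": 443, "action": "allow"},
    {"src": "ANY", "dst": "ANY", "port": "ANY", "action": "allow"},  # the hole
    {"src": "ANY", "dst": "ANY", "port": "ANY", "action": "deny"},   # fine: default deny
]

def find_any_any(rule_table):
    """Flag allow rules that match any source and any destination."""
    return [r for r in rule_table
            if r["action"] == "allow"
            and r["src"] == "ANY" and r["dst"] == "ANY"]

for hole in find_any_any(rules):
    print("overly permissive rule:", hole)
```

Note the deny-everything rule is deliberately not flagged: a deny-by-default backstop is the posture the article says firewalls start with, and the audit should only complain when someone inverts it.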

Colossal security mistake No. 9: Not changing passwords

One of the most common mistakes that can put your job on the line is not changing your admin passwords for a very long time. My auditing experience has made this very clear. Almost all companies have multiple unexpired, years-old admin passwords. In fact, it’s the norm.

Every computer security configuration guide recommends changing all passwords on a reasonable, periodic basis, which translates to every 45 to 90 days in practice. Admin and elevated passwords should be stronger and changed more frequently than user passwords. At most companies, admin passwords are long and complex, but almost never changed.

Lessons learned: Periodically change all passwords, especially admin and service accounts. And always change passwords immediately upon separation of employment. Plus, don’t use admin accounts and passwords to power your applications.
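Finding the years-old admin passwords is the easy part if you track when each was last changed. A minimal sketch using the 90-day ceiling from the 45-to-90-day range above; the account names and dates are invented:

```python
from datetime import date, timedelta

MAX_AGE = timedelta(days=90)  # upper end of the 45-to-90-day guidance

# Invented example: account name -> date the password was last changed.
accounts = {
    "svc_backup": date(2014, 1, 15),                    # years stale
    "admin_jane": date.today() - timedelta(days=10),    # recently rotated
}

def stale_accounts(last_changed, today=None):
    """Return accounts whose passwords are older than the policy maximum."""
    today = today or date.today()
    return [name for name, changed in last_changed.items()
            if today - changed > MAX_AGE]

print(stale_accounts(accounts))  # ['svc_backup']
```

Run against a real directory export, the stale list is exactly what an auditor will find, so finding it yourself first is the cheaper option.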

Colossal security mistake No. 10: Treating every vulnerability like “the big one”

There will always be a significant number of vulnerabilities that colleagues and the media tout as the critical hole that will cripple your network and systems. It takes experience and skill to recognize what you really need to be worried about. If you run around panicking at every last “big” vulnerability, you risk being seen as someone who doesn’t know their job, can’t discern the real threats to your business, and shouldn’t be taken seriously, even when your alarm coincides with a vulnerability your company should definitely pay attention to. Granted, crying wolf likely won’t get you fired, but it can certainly cause roadblocks to your long-term upward mobility.

Lesson learned: Correctly prioritize vulnerabilities, and be careful not to undermine your credibility with colleagues by wasting their time with false alarms.
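One simple way to stop crying wolf is to rank vulnerabilities by severity weighted by exposure rather than by headline volume. The scoring scheme below is a deliberately crude illustration (not a real CVSS calculator), with invented vulnerability data:

```python
# Invented example data: severity on a 0-10 scale plus whether the
# affected asset is reachable from the internet.
vulns = [
    {"id": "VULN-1", "severity": 9.8, "internet_facing": True},
    {"id": "VULN-2", "severity": 9.9, "internet_facing": False},
    {"id": "VULN-3", "severity": 4.0, "internet_facing": True},
]

def risk(v):
    """Crude risk score: severity, discounted when the asset isn't exposed."""
    return v["severity"] * (1.0 if v["internet_facing"] else 0.3)

prioritized = sorted(vulns, key=risk, reverse=True)
print([v["id"] for v in prioritized])  # ['VULN-1', 'VULN-3', 'VULN-2']
```

Notice that the scariest raw score (VULN-2) drops to last place once exposure is factored in, which is the judgment call the article says separates credible security pros from alarm-ringers.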