I read an interesting article this morning about computer security and the dumb ideas surrounding it. The author makes some great points, though on others it's tough to see where he is coming from. In an engineering culture where we are taught practices like penetration testing and "install it working, secure it later," some of his statements are hard to agree with.
I'll highlight a few of the points to express my opinion on them:
-"We're Not a Target": The author hits this one square on the head. Worms and malicious code don't care whether your network is interesting or not. I find this is often said to smaller businesses as an "assurance" to make them feel better about their lack of security.
-"We don't need host security, we have a good firewall" - Again the author brings a great point to light. So many people think that because they have a hardware firewall at their point of entry, they are safe. Host security is becoming one of the largest fields right now. Just look at the Geek Squad earning mad cash at Best Buy. Their entire business is built on fixing your computer problems, roughly 90% of which come from viruses, spyware, and malicious individuals.
-"Let's go production with it now and we can secure it later" - In my IT career to date, I can't tell you how often I hear this statement. It just blows my mind that people do this, and they do it all the time. This point is also a great spot for debate, because in the real world you have to balance security against need. In the end, no matter how flawed an item is, if you NEED it, it gets implemented, when instead it should be made more secure first and then implemented.
Here is the fifth dumbest idea in computer security:
""Penetrate and Patch" can be applied to human beings, as well as software, in the form of user education. On the surface of things, the idea of "Educating Users" seems less than dumb: education is always good. On the other hand, like "Penetrate and Patch" there have been numerous interesting studies that indicate that a significant percentage of users will trade their password for a candy bar, and the Anna Kournikova worm showed us that nearly 1/2 of humanity will click on anything purporting to contain nude pictures of barely clothed females. If "Educating Users" is the strategy you plan to embark upon, you should expect to have to "patch" your users every week. That's dumb.
The real question to ask is not "can we educate our users to be better at security?" it is "why do we need to educate our users at all?" In a sense, this is another special case of "Default Permit" - why are users getting executable attachments at all? Why are users expecting to get E-mails from banks where they don't have accounts? Most of the problems that are addressable through user education are self-correcting over time. As a younger generation of workers moves into the workforce, they will come pre-installed with a healthy skepticism about phishing and social engineering.
Dealing with things like attachments and phishing is another case of "Default Permit" - our favorite dumb idea. After all, if you're letting all of your users get attachments in their E-mail you're "Default Permit"ing anything that gets sent to them. A better idea might be to simply quarantine all attachments as they come into the enterprise, delete all the executables outright, and store the few file types you decide are acceptable on a staging server where users can log in with an SSL-enabled browser (requiring a password will quash a lot of worm propagation mechanisms right away) and pull them down. There are freeware tools like MIMEDefang that can be easily harnessed to strip attachments from incoming E-mails, write them to a per-user directory, and replace the attachment in the E-mail message with a URL to the stripped attachment. Why educate your users how to cope with a problem if you can just drive a stake through the problem's heart?
When I was CEO of a small computer security start-up we didn't have a Windows system administrator. All of the employees who wanted to run Windows had to know how to install it and manage it themselves, or they didn't get hired in the first place. My prediction is that in 10 years users that need education will be out of the high-tech workforce entirely, or will be self-training at home in order to stay competitive in the job market. My guess is that this will extend to knowing not to open weird attachments from strangers."
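The attachment-quarantine idea quoted above can be sketched in a few lines of Python using the standard library's email package. To be clear, this is only an illustration of the concept, not MIMEDefang itself (which is a Perl-based milter); the quarantine directory, staging URL, and blocked-extension list below are made-up placeholders.

```python
import os
from email import message_from_string, policy
from email.message import EmailMessage

# Placeholder values for this sketch -- adjust for a real deployment.
QUARANTINE_DIR = "/var/quarantine"         # per-user staging storage
STAGING_URL = "https://files.example.com"  # SSL-enabled staging server
BLOCKED_EXTS = {".exe", ".scr", ".pif", ".bat", ".com"}

def strip_attachments(raw_message: str, user: str,
                      quarantine_dir: str = QUARANTINE_DIR) -> EmailMessage:
    """Delete executable attachments outright; quarantine everything else
    and replace each with a URL pointing at the staging server."""
    msg = message_from_string(raw_message, policy=policy.default)

    body = msg.get_body(preferencelist=("plain",))
    text = body.get_content() if body is not None else ""

    notes = []
    for part in msg.iter_attachments():
        # basename() guards against path tricks in the attachment filename.
        name = os.path.basename(part.get_filename() or "unnamed")
        if os.path.splitext(name)[1].lower() in BLOCKED_EXTS:
            notes.append(f"[attachment {name} was an executable and was deleted]")
            continue
        data = part.get_content()
        if isinstance(data, str):
            data = data.encode()
        user_dir = os.path.join(quarantine_dir, user)
        os.makedirs(user_dir, exist_ok=True)
        with open(os.path.join(user_dir, name), "wb") as fh:
            fh.write(data)
        notes.append(f"[attachment {name} quarantined: {STAGING_URL}/{user}/{name}]")

    # Rebuild a plain-text message: original body plus one note per attachment.
    out = EmailMessage()
    for header, value in msg.items():
        if header.lower() not in ("content-type", "content-transfer-encoding",
                                  "mime-version"):
            out[header] = value
    out.set_content(text + ("\n" + "\n".join(notes) if notes else ""))
    return out
```

In the scheme the author describes, the staging server behind that URL would require a password over SSL, which is exactly what quashes worms that propagate by auto-mailing attachments.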
I really like the idea he has going here: take all the possibility for errors and problems out of the users' hands. Enact software restriction lists and implement policies that prevent users from coming into contact with these things in the first place. The initial overhead for this approach is very high, but it comes down substantially once it's implemented correctly.
Here is the article: http://www.ranum.com/security/computer_s