A good article I read in ComputerWorld. Here is a summary of its 34 reasons:
- The game is rigged in favor of the bad guys: To avoid breaches, the good guys have to succeed 100% of the time. The bad guys only have to succeed once.
- TCP/IP, the underpinning of the Internet, was never designed with security in mind. Ditto Ethernet, the underpinning of almost all local area networks.
- Internet User Guide: There is no User Guide to the Internet that lays out briefly and in simple language the obvious mistakes that should be avoided. Neither hardware manufacturers, nor ISPs, nor operating system vendors have bothered to offer a helping hand to their most clueless users. A pamphlet would be plenty.
If it only covered the most basic things, that would still be a huge step up. Things like the dangers of clicking links in email messages or that when you are prompted to install software there's a good chance it's a scam. Mac users are just learning this last point the hard way. Welcome to the club. Back in February 2010 Microsoft employee and security expert Roger Grimes wrote: "The majority of the risk is due to end-users intentionally executing socially engineered Trojans that show up as fake antivirus software, malicious video codecs, fake patches, and needed software drivers. Yes, good patching and strong passwords also help, but Trojan horse programs that your end-users (or friends or family) get tricked into installing are by far the most popular, successful threat."
- The FROM address of an email message is easy to forge (see prior point) and too few people know this.
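To see how little stands behind that header, here is a minimal Python sketch (the addresses are invented, and no mail is actually sent): the From field is simply a string that the sender's software fills in, and nothing in the mail protocols verifies it.

```python
# Demonstrates that the From header of an email is ordinary text
# chosen by the sender's software; SMTP does not verify it.
# No mail is sent here; we only build the message.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "ceo@yourbank.example"   # entirely under the sender's control
msg["To"] = "victim@example.org"
msg["Subject"] = "Please verify your account"
msg.set_content("Click this link...")

# The forged address appears exactly as written:
print(msg["From"])  # ceo@yourbank.example
```

Email authentication schemes (SPF, DKIM) exist, but they are checked, if at all, by the receiving server, not by the person reading the message.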
- People are gullible.
- SSL, the technology behind secure web pages, is a sham; the trust model behind it is only as strong as the weakest of the many certificate authorities that browsers blindly accept.
- Home WiFi: People use WEP on their home WiFi networks. That Verizon continues to employ WEP for new customers is shocking. It should be illegal. WEP encryption is easily broken, unlike the two newer schemes WPA and WPA2. That said, even WPA and WPA2 can be hacked if the password is weak.
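The weak-password point can be sketched in a few lines of Python. WPA/WPA2 in personal mode derives its pre-shared key from the passphrase and the network name with PBKDF2 (4,096 rounds of HMAC-SHA1, per the 802.11i standard), so an attacker who has captured a handshake can test guesses offline at leisure. This is a simplification (a real attack checks each guess against the captured handshake rather than against the key directly), and the SSID and passphrases below are made up:

```python
# Why weak WPA/WPA2 passphrases fall: the pre-shared key is derived
# deterministically from passphrase + SSID (PBKDF2-HMAC-SHA1, 4096
# rounds, 32 bytes, per IEEE 802.11i), so guesses can be tested
# offline with no further contact with the network.
import hashlib

def wpa2_psk(passphrase: str, ssid: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

# Stand-in for what an attacker learns from a captured handshake:
captured = wpa2_psk("password1", "HomeWiFi")

# Offline dictionary attack: try candidates until one matches.
for guess in ["letmein", "sunshine", "password1"]:
    if wpa2_psk(guess, "HomeWiFi") == captured:
        print("passphrase recovered:", guess)
        break
```

A long random passphrase makes this dictionary approach hopeless; a word from a list does not.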
- Public WiFi: People use unencrypted public WiFi networks without a VPN. You don't spit into the wind, you don't tug on Superman's cape and you shouldn't use unencrypted public WiFi networks without a VPN. It opens up a slew of potential problems.
- Some files/data should never be accessible over the Internet. Yet, they often are.
- The IT field changes very quickly: When faced with a medical problem, we often deal with a doctor with 10 or more years of experience in their specialty. Very few programmers have that much experience in the development environment they use. For example, no one on the planet has 10 years' experience coding Android apps. Inexperience inevitably leads to rookie mistakes.
- Too many corporate executives have no technical savvy. This leaves them susceptible to scams and handicaps their ability to judge the importance and effectiveness of the computer security at their company.
- Small businesses have no computer techies on staff which makes them ripe for online banking fraud. Brian Krebs did a series of articles describing many instances of this.
- Economics dictates that software will be buggy: Developers are paid to write applications that work and, often, that are finished ASAP. Making applications completely bug free may not be the highest priority. For one thing, it delays roll-out. For another, not every developer is up to the task. Steve Gibson discussed this briefly on his Security Now! podcast (episode 302, May 26, 2011). The topic was Donald Knuth, the author of TeX. Gibson called him "an artist of software" and marveled at how bug free TeX turned out to be, despite being a massive system. According to Gibson, Knuth "... wrote it in a language that he knew ... and wrote it very carefully ... Now, is that a commercial practicality? No. I mean, he would have been fired by any employer."
- Software will always be buggy even without economics: Programming is still an art and one best done by the fewest possible people. How many great works of art in a museum were done by a large team working together? Large applications, written by teams of developers, are especially likely to be buggy, either due to communication failures or the inclusion of less skilled developers.
- Popular software: When software gets brutally popular (think Windows, Flash, Adobe Reader and Java) bad guys devote time and effort to finding bugs that can be exploited. Many times on this blog I suggested avoiding software that has a bull's eye painted on its back.
- Bug fixes: The process of installing bug fixes (politely known as patches) to software applications on Windows and Macs is disgraceful, with each application forced to roll its own self-update scheme. It's anarchy. While large corporations can spring for software that installs bug fixes company-wide, smaller organizations and consumers suffer. Thus many, if not most, personal computers are running software that is missing patches to known bugs. I used to recommend Secunia's Online Software Inspector, but it requires Java and I'm hesitant to encourage the use of Java as flaws in old versions are frequently exploited by bad guys.
- Nothing prevents a program from advertising itself as doing one thing, but when it's installed doing something else too.
- Windows does not do a great job of defending itself. For example, Patchguard, UAC, DEP and ASLR have all been defeated, at times, by bad guys.
- Least privilege: Both Windows and Macs have a concept of limited/restricted users and administrative users. Think of it as adult users who can do anything and child users who are restricted from messing up the guts of the system. An important defensive computing tactic is to run with the least privileges necessary. Practically speaking, this means logging on to the computer as a limited/restricted user most of the time and only logging on as an administrator/adult when necessary. But, both Windows and Macs default to using administrative level logons, a big security mistake. At the least, Windows XP users should consider DropMyRights. Windows 7, which I hate with a passion, does a great job of running as a restricted user. An explanation of this belongs in the fictional Internet User Guide.
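As a small illustration, a program can check which kind of account it was launched under. A hypothetical sketch in Python using standard OS facilities (the wording of the messages is mine):

```python
# Minimal sketch: detect whether the current session has administrative
# rights, the first step toward running with least privilege.
# Uses os.geteuid on Unix-like systems (root is user id 0) and the
# shell32 IsUserAnAdmin call on Windows.
import ctypes
import os

def is_admin() -> bool:
    if hasattr(os, "geteuid"):                              # Mac / Linux
        return os.geteuid() == 0
    return bool(ctypes.windll.shell32.IsUserAnAdmin())      # Windows

if is_admin():
    print("Running with administrative rights - more than daily use needs.")
else:
    print("Running as a limited user - the safer default.")
```

The point of least privilege is exactly this asymmetry: malware that runs under a limited account inherits only that account's limits.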
- Windows autorun: Microsoft keeps trimming it back, but it still exists in Windows 7. It should be thoroughly, completely and totally disabled. Microsoft does not offer this as an option. I did back in January 2009. The technique I described then still works and is still necessary in Windows 7.
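For reference, one widely documented way to neuter autorun completely is a registry tweak published by US-CERT (Technical Cyber Security Alert TA09-020A), which maps Autorun.inf files to a nonexistent key so Windows never acts on them. This may or may not be the exact technique from the January 2009 article, so treat it as one documented option, to be saved as a .reg file and imported:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\IniFileMapping\Autorun.inf]
@="@SYS:DoesNotExist"
```

After importing and rebooting, Windows treats every Autorun.inf as if it pointed at an empty registry key, so nothing on a CD or USB stick runs automatically.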
- Motivation: Sometimes, perhaps often, the bad guys are more motivated than the good guys. Maybe it's the potential for a huge payday, a sense of pride, the desire for respect from their peers or a sense of nationalism (it has been suggested that some hacking is state sponsored).
- Bad guys rarely get caught.
- Competent techies: When hiring nerds, it's hard to judge technical competence. Computing is a young and fragmented field. Plus, as noted above, programming is still an art rather than a science.
- The good guys may not be perfect: Some good guys are not well trained for the task at hand. Some are optimists (only a pessimist will think of everything that can possibly go wrong and plan for it). Some are lazy. Some are intellectually challenged (think boss's brother-in-law). As noted in the first point on this list, the good guys only need to fail once for the house of cards to fall. Then too, some good guys are not good guys at all - it only takes one rogue techie to undermine the good work of their honest techie colleagues.
- SQL injection: If anything points up the imperfect nature of developers, it's SQL injection, a way of hacking into websites. SQL injection is totally preventable (I say this having worked with databases for many years). That it succeeds, is a mark of sub-optimal application developers. Perhaps lazy, perhaps ill-trained, perhaps an honest oversight here and there, or maybe just under the gun to finish a project as quickly as possible.
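A minimal Python/SQLite sketch (the table, users and data are invented) shows both the hole and the trivial fix, parameterized queries:

```python
# Why SQL injection is totally preventable: bind user input as a
# parameter instead of pasting it into the SQL text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "' OR '1'='1"

# Vulnerable: the input is concatenated into the query, so the
# attacker's quote characters rewrite it to match every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE password = '" + attacker_input + "'"
).fetchall()
print(vulnerable)   # [('alice',)] - logged in without knowing the password

# Safe: the ? placeholder treats the input as pure data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE password = ?", (attacker_input,)
).fetchall()
print(safe)         # [] - no match, as it should be
```

The fix is one line and has been standard practice for decades, which is exactly why successful injections say more about the developers than about the attackers.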
- The bad guys are constantly getting better, both in the sophistication of their software and their scams. They also adapt to pick on the weakest link in the security chain. For example, as Windows got better at keeping itself updated with bug fixes, the bad guys moved on to attack other popular software (Java, Flash, Adobe Reader).
- Not enough sandboxing: Sandboxing refers to putting a virtual wall around an application to ensure that it does not harm the rest of the system. My favorite Windows sandboxing utility is Sandboxie. It can prevent the permanent installation of malicious software on Windows PCs. It does not prevent malware from getting onto a computer in the first place, and the malware can execute if not caught by standard antivirus software. But Sandboxie can prevent malware from permanently residing on a Windows computer. It does not get nearly the attention it deserves. Highly recommended.
- Antivirus software: Speaking of antivirus software (for Windows), any single product offers flawed protection. Many computers with up to date antivirus software get infected anyway. And, even if an antivirus application detects malware, that doesn't mean its cleanup of the infection will be perfect.
For better detection, occasional scans with a different antivirus program are the way to go. Better still, an occasional scan with software that runs off its own bootable CD is the best approach. Microsoft just released their Standalone System Sweeper and many antivirus vendors, such as Avira and Kaspersky, offer something similar. Yet, how many people do this?
- Google: Bad guys trick Google into listing malicious web pages and images near the top of search results. My defense against this is Web of Trust, a free browser plugin available for Firefox, Chrome and Internet Explorer.
- The C programming language refuses to die, and with it the memory-safety bugs, such as buffer overflows, that it invites.
- Newer operating systems (found on smartphones) can remotely disable applications that are determined to be malicious. This is not possible on the older systems used on personal computers.
- Computer and network security may not get the attention it deserves at companies. This is understandable as it's not an income producing area. Like insurance, it costs money and returns nothing, at least nothing immediate.
It has been all over the news recently that Lockheed-Martin's network was attacked and somewhat breached. What I find interesting about the story is that as a result of the attack, Lockheed-Martin "took swift and deliberate actions" to increase their network security. Really? If there was any company that should have the best possible computer security it's Lockheed-Martin. Yet, even they weren't giving security sufficient priority.
- All security schemes need constant care and feeding. Automated tools only go so far. But monitoring takes more time/effort/expense than many companies are willing to endure.
- Judging by the stats I get, virtually no one reads this blog.