We all make mistakes. But can we make mistakes when it comes to information security? And if we do, should we admit to them? Making mistakes and learning from them is a part of life, even in information security. For this article, we got 25 infosec gurus to admit to security blunders. The errors range from the purely technical to good ol’ fashioned poor judgment. While the slip-ups were all unwanted, they yielded great learning experiences. Read on and you too can learn from your colleagues’ failures.

1: I got no errors, so I’m sure the backup is valid

Fool me once, shame on you. Fool me twice, shame on me. Fool me three times? Ugh! Such was the story of Edward Haletky (@texiwill) of The Virtualization Practice, who was burned by poor virtual machine (VM) data backups. From gigabytes of corrupt data, to backup software failure, to a crashed VM, Haletky fell victim to three bad backups. Haletky never checked the validity of the backups; he simply took the absence of reported errors as validation. But just because the application says everything’s fine doesn’t mean it is, Haletky realized.

“Always verify before you do anything major. Don’t assume everything is working. Just because it looks right and smells right, don’t assume,” said Haletky. After his triple burn, Haletky now uses software with better reporting mechanisms, does restoration testing, and keeps forensics tools always at the ready.

2: Do you really need everyone to wish you “Happy Birthday?”

It’s really nice to receive birthday greetings on Facebook, but for your personal security it’s just not worth it. Bill Brenner (@BillBrenner70), blogger at The Salted Hash on CSO Online, failed to heed the very social networking security advice he’s written about multiple times: keep sensitive personal data, specifically birth dates, off your Facebook profile. While he didn’t have the year on his Facebook profile, Brenner did have the day and month.
With that information readily visible online, Brenner quickly learned from local police that someone had been going around to drug stores in the area trying to get prescription drugs in his name. The drug hacker had his name, day, and month correct, but got all the other information wrong.

3: Maybe I should have tested that password file I just edited

In college, Benjamin Tomhave (@Falconsview), blogger at The Falcon’s View, was a sysadmin who was a little too trusting of the university’s very wonky UNIX operating environment. Tomhave tried to “fix a glitch” in the /etc/passwd file that was creating a series of errant commas for non-entered data. He opened the file in Notepad under Windows for Workgroups, did a quick “replace all,” and saved the file, the system file. Whoops. Tomhave didn’t test the file or create a backup. He unfortunately discovered that Notepad had added a “^M” (carriage return) character to every line of the /etc/passwd file, making it impossible for anyone to log in. Luckily, Tomhave got a little help from tech support and was able to log back into the OS and re-edit the file.

4: Yes, a UFO is an unidentified flying object, but it’s probably an alien

“I’ve made mistakes in being too trigger-happy to declare something a breach before all the facts were in,” said Wendy Nather (@451Wendy), who learned that there’s a big difference between “a Breach,” “an Incident,” “Something Weird Is Going On,” and “We Don’t Know What It Is But It Doesn’t Look Good.”

“Before I got executives all excited, I should have taken more time to investigate and have real proof to show, rather than just some strange entries in logfiles plus a lot of speculation,” Nather said. In many cases these “incidents” were just an application acting weird and/or someone making unscheduled changes, said Nather. “You have to be careful throwing around the ‘B-word,’ otherwise you can undermine your credibility,” warned Nather.

5: I don’t care. I work in information security, not physical security

Pete Lindstrom (@SpireSec) of Spire Security admits he has been fairly lax about the physical security of computer equipment, especially large equipment inside work buildings. That was until he came into work one day to discover several computers missing. The “story” was that a guard had been jumped and tied up. In actuality, the “guard” was in on the job. That incident, and another tale of an escaped mental patient who impersonated a temp employee and lived in a conference room for a week, reminded Lindstrom of the importance of physical security, the need for backups, and never to discount the inside job.

6: Is there an award for having the most passwords?

Allison Miller (@SelenaKyle) has been a longtime fan of online activities, and she’s shown her appreciation by signing up for lots of email accounts, online merchants, blogs, e-payment systems, and financial services. As she added more and more accounts, she had to get more and more creative with her password creation scheme. It was getting out of control. “The buffer in my brain for passwords was starting to overflow,” said Miller, who admitted to committing the cardinal sin of writing down passwords, or at least cryptic notes to remind her of them. Miller ultimately had to repent for her infosec sins. Her accounts were eventually compromised and she had to change passwords en masse. She finally broke down and started using the password management tool 1Password, and recommends Lifehacker’s top five password managers. Read Miller’s entire tale.

7: Let’s get the bad guys…all the bad guys

“Early in my career, and even from time to time when I’m not thinking, I reacted too quickly,” admitted Adam Ely (@AdamEly), CISO of Heroku at Salesforce, who often leapt at a single threat, vulnerability, or an off-hand comment from a coworker.
Over time Ely realized, “A single vulnerability is a small risk in the grand scheme of things and must be weighed against all of the other threats and tasks to be completed. Defining and following a strategy is key to ensuring I’m not distracted by the smaller, daily challenges I face.”

8: I think it’s safe to assume that everyone has the same agenda as me

Josh Corman (@JoshCorman), Director of Security Intelligence at Akamai, started his security career as a malware researcher. At the time he sought out anti-malware vendors to make them aware of egregious security vulnerabilities he saw. It was all for naught. “They didn’t care,” said Corman. “They sold the buyers what they thought they needed.” Corman was determined to get his message out there, so he did an end-around and went straight to the buyers to tell them the story of anti-malware failures. His strategy succeeded, but it put him in an adversarial relationship with vendors. “I tried to appeal to the intellect and honor of vendors to do the right thing, failing to realize economic forces,” said Corman. Corman now realizes that if he wants to be successful with a “change the industry” rampage, he needs to look at the self-interests of all parties involved.

9: I’ll use this password for this account, and this account, and this account…

It’s a rookie move to reuse the same password across multiple accounts. Andrew Hay (@andrewsmhay), an analyst for 451 Research, made that newbie mistake, and it cost him: one of his accounts was compromised. The attacker got into his Gmail account and began sending malicious links to Hay’s five most frequently contacted addresses. Hay immediately changed his password and sent apologetic emails to those affected. “Just being a security professional doesn’t mean that you’re above common mistakes that any user might make,” said Hay.
“The best thing to do is to ensure you’re aware of the mistakes, as you make them, and react accordingly.”

10: We only offer secure access to our system, unless you want to use our test machine

Back when Chris Wysopal (@WeldPond), blogger at the Veracode Blog, worked at the hacker think tank L0pht Heavy Industries, the members shared computing resources via Secure Shell (SSH) encrypted connections, not plaintext Telnet sessions. That held true unless you chose to access the network via a neglected test machine that did have Telnet enabled. When an SSH client wasn’t available, one authorized user took advantage of the test machine to Telnet into the system. In the process, the user crossed an insecure network and his password was sniffed. The attacker made a big deal of “hacking the L0pht,” and it made Wysopal and the other members look pretty foolish. “The lesson here is not to give your users a less secure way to get something done, or they will pick it and be compromised,” said Wysopal.

11: This system was secure when I bought it

Uncommon Sense Security blogger Jack Daniel (@Jack_Daniel) used to manage an outdated networking environment. Part of the network’s problem was underfunding; the other part was Daniel’s admittedly dated tech skills. His complacency drove him into what he lovingly calls “The Pit of Despair,” where he begged for money and scrambled to secure obsolete systems. Daniel jokes that he “solved” his problem by moving into marketing, where it doesn’t matter. In actuality, he keeps his tech skills current by running a small lab in his home.

12: I’ll just be really smart and they’ll listen to me, right?

Early in his infosec career, Mike Murray (@mmurray) put a lot of emphasis on trying to be the smartest guy in the room. In his early 20s, he naively thought he’d be successful on the strength of his ideas alone. His colleagues would see how great his ideas were, they’d agree, and everyone would just do it.
That philosophy never panned out. He was never able to get his security message across. It wasn’t that the others listening were stupid, said Murray. It’s just that certain people with authority were more passionate and knew how to sell their ideas. Even when he knew their ideas were wrong, they were so convincing it was hard to disagree with them. After spinning his wheels for years, Murray knew it would be career-limiting to stick to his “be the smart guy” model, so he started researching and practicing how to present effectively.

13: We’ll make it a security policy and everyone will follow it

Daniel Frye, Associate Vice President of Corporate Security at CedarCrestone (@CedarCrestone), implemented his first information security policy ten years ago. Part of the process was removing 600 default and insecure vendor IDs for which the username and password were the same. Frye assumed all the teams and their clients were managing systems according to “policy.” That turned out not to be the case, as Frye saw thousands of default IDs still on the system. The high number was due to account replication happening every time an application development stage launched. Frye learned two very good morals:

“You have to have recurring and ongoing monitoring of all of your security policies and processes. Without monitoring, you have absolutely no indication of how well you’re securing your environment.”

“You have to not only educate your staff, but your clients as well. There is no limit to security education despite organizational boundaries.”

14: Hurry up, we need to fix this problem right now!

A number of years ago, Security Watch blogger Brian Honan (@BrianHonan) was working with a client trying to isolate an IP address that was sending errant traffic. To stop the influx, they asked the outsourced IT service provider to block the IP address at the firewall. Honan mistakenly assumed the engineer knew what he was doing. He didn’t.
Minutes later, the branch office called to complain they couldn’t connect to the network. The engineer had unknowingly caused a denial of service (DoS) against the client. “While working on a live incident can be challenging and frantic, you should spend the time to slow down, think, and double-check everything,” said Honan. “The last thing you need when dealing with an incident is to cause more issues through basic mistakes.”

15: Yeah sure, the USB key is secure

A very desperate coworker barged into military professional Rob Ton’s office and begged him to print out a document from his thumb drive. Ton, then working as an infosec rep, and his company had strict rules against accepting unknown USB keys. The coworker pleaded, saying he needed the document for a meeting, and swore that he had scanned the key for viruses with the latest definitions. After much persuasion Ton relented and stuck the key into his machine, only to discover there was a virus on it. The blunder required him to quarantine his computer, file a report, and look like a fool to his manager. The truth was that the coworker did know there was a virus on the USB key, but was so desperate that he lied to Ton. Looking back, Ton learned many lessons, including:

People will lie to get what they want.

Having to report yourself makes you feel like an idiot.

When you have a sensible personal rule – stick to it.

16: I’ll just put this firewall up and that will solve all our problems

Prior to the explosion of the web, Marcus Ranum (@mjranum), now CSO of Tenable Network Security, thought information security problems were purely technical problems to be solved. While it was easy to secure services such as FTP, Telnet, USENET, and email/SMTP, it wasn’t so easy to secure the web, as non-security people always required exceptions: ports that needed to be open on the firewall.
Ranum’s big eye-opening moment came in the mid-90s, when a boardroom of big-shot executives were negotiating access rights with the security guys. It didn’t matter what compromise they made, realized Ranum; the hackers, who weren’t even there, were going to find another way in regardless of what was “agreed upon.” “The firewall isn’t security,” thought Ranum. “The firewall is this universally bypassed thing that just slows things down a bit.” When things go wrong, we want to attribute the decisions to the other party being stupid. But it’s not that they’re stupid; they just have a different agenda than us. They’ll still blame us if it all goes wrong, said Ranum.

17: This is going to be the best marketing gimmick ever!

When starting his cloud-based website scanning business, Golem Technologies, Charlie Belmer offered a free, limited test scan so users could get a feel for the service. The offer was a little too enticing, as people came in droves to test scan Belmer’s own site. It took only 50 concurrent scans against his own site to bring it to its knees. He had effectively DoS’ed himself. That overzealous publicity stunt required him to enforce a one-scan-at-a-time limit per site. “It taught me to add additional controls, because even non-malicious users can misuse something to cause a security problem,” Belmer said.

18: I’ll just dictate security and it’ll work

Just starting out in her security career, Amy Rinkle (@ARinkle) thought the way to create solid security was to engage only her supervisor and the head of the organization about needs and processes. “Resources were limited, and so was my time, and it was far easier to design processes and safeguards than it was to explain and tailor them to the culture of the organization,” admitted Rinkle.
Even though Rinkle had the support of her supervisor, who said she would make sure people jumped on the security bandwagon, she realized that she should have collaborated with the whole staff. “I should have spent more time lobbying my cause with coworkers and other staff – because had they understood from the beginning exactly what was required of them and why it was so important, I wouldn’t have been putting out fires after the new system had been put into place,” Rinkle said.

19: People are usually very thorough when filling out survey forms

Have you ever staked your network security on what someone filled out in a network topology and connectivity survey form? Andrew Storms (@St0rmz), Director of Security Operations for nCircle, did, after his company acquired another company. Filling out a survey form was normal business procedure after an acquisition. On paper, the network connections all seemed fine. Things changed when they actually saw the network. An on-site visit revealed an undisclosed T1 connection that exposed all of their partner’s source code. “I’m sure that T1 made a lot of business sense, but it made no security sense whatsoever,” said Storms. “The lesson I learned from this was how important it is to build security into business processes.”

20: All vulnerabilities take priority over the business

“I’ve been guilty in the past of focusing on technical minutiae that the business absolutely couldn’t care less about,” admitted Gene Kim (@realgenekim), founder of Tripwire. “I once believed that all vulnerabilities should be fixed, even if it’s at the expense of important business project work. Or that patches should be applied without thorough testing.
Or that emergency capital be used to remediate every compliance failure.” Kim was quickly seen as one of “the shrill, hysterical people who seem to get in the way of getting ‘real work’ done.” He realized he was in the wrong when they told him, “We’re not in the business of security, but in the business of staying in business.” Many of these “security vs. business” issues and more are characterized in Kim’s upcoming book, “When IT Fails: The Novel.”

21: Eventually, when I have time, I’ll encrypt that hard drive

Back in 2005, Larry Ponemon (@ponemon), Chairman and Founder of the Ponemon Institute, purchased a brand-new Dell computer. He loaded it with tons of confidential information and backed it up, but didn’t encrypt the data. He was traveling so much he simply didn’t have time for the encryption. After a long day of traveling, Ponemon was trying to catch a plane with his new Dell in tow. Completely exhausted, his mind in a million other places, he dumped his computer into the plastic tray and ambled through security. As he walked away, people started shouting “Thief!” Ponemon stopped and turned around. Who were they talking to? Ponemon had accidentally picked up someone else’s computer. He apologized profusely, but in the interim the TSA had taken his computer away. It was gone. A million new thoughts raced through his head. How was he going to explain this? He was the classic traveling dunce. His mind was in another place. All he was worried about was catching a plane and getting home. It was exactly what he had written about time and again, and now he was the subject of his own research. Eventually Ponemon got his computer back, but he did miss his flight, and he immediately encrypted his data. “It made me appreciate that it can be anyone. It doesn’t have to be a dunce.
It could just be anyone rushing and being absent-minded,” said Ponemon. “It was a lesson in humility.”

22: No one is going to screw with my unattended computer in the office

Many years ago, Andrew Jacquith (@arj), CTO of Perimeter E-Security, was working at @stake, a security consultancy he helped found. One of his clients, a CISO at a defense industrial base company, would walk around to unattended computers and check whether the password-protected screensaver was engaged. If it wasn’t, he would gleefully open the victim’s email and send out messages to selected colleagues or, worse, his or her boss. Jacquith fell victim to the CISO’s wrath. “I don’t remember what he typed, but I remember it was highly creative, totally nonsensical, and deeply vulgar,” said Jacquith, who now uses a password-locked screensaver.

23: Let’s just build one version of our company’s infrastructure

Back in the day, Greg MacPherson, CISSP, managed a team building the backend infrastructure of a startup. One of his team members ran a UNIX cron job that exposed a massive hole in the Sun Solaris file system, resulting in corrupted drive partitions. His team spent 19 hours rebuilding the production infrastructure from bare disk drives. The only thing that saved the business was that customer data lived on hardware RAID shelves that didn’t get corrupted. After MacPherson restored everything, his boss rewarded him with a pink slip. MacPherson learned a valuable lesson: “Anticipate failure…Have master disks with applications installed and licensed, and invest in an expensive RAID or some other bulletproof storage to maintain the integrity of your data.”

24: Wow, a cool new untested security product!

Dwayne Melancon (@ThatDwayne), CTO of Tripwire, is like most geeks: he likes products that are new and shiny. Unfortunately, his love of new tech surpassed his better judgment.
Once, while running IT for a tech company, he let an outsourcing provider sell him on a brand-new “active Intrusion Prevention System” (IPS) to protect the company website from defacement. Two weeks later, the IPS blocked an attempted patch to the production web server. It then prevented anybody, at his company or at the provider, from fixing the problem. The web server was completely non-functional, and the only guy who had the password to turn off the IPS was on vacation. Melancon learned a few valuable lessons:

Cool new technology isn’t always the best answer.

Don’t rely on “the guy” to get you out of trouble. You need to develop your bench and cross-train.

Test, test, test before you go live in production.

25: Ha! Sure glad that wasn’t me.

A decade ago, Martin McKeay (@McKeay), host of The Network Security Podcast, was a system administrator at a small company. One afternoon he was looking out the window and saw a large crow land on a power transformer and go poof in a puff of smoke as its feet completed a circuit between the contacts. McKeay laughed for a few seconds, until the power to the building went out and he heard the UPS straining to keep his servers up and running. McKeay learned a valuable lesson: “Don’t laugh at the misfortune of others, they may soon be your own.”

Conclusion: You can make mistakes in infosec and still be very successful

What all these stories show is that even the smartest people in security make stupid mistakes, and each flub turned out to be a valuable learning experience. We hope we’ve passed those teachings on to you. If you found this valuable, please return the favor and share your own security blunder story and what you learned from it.

Source: 25 Infosec Gurus Admit to their Mistakes…and What They Learned from Them [Tripwire.com]