Emulating the methods used to transform production quality could clean up the Internet — and might even pay for itself. (An excerpt from an article in Strategy+Business by Tim Laseter and Eric Johnson.)
Pundits proclaim the miraculous power of the Internet. It ushered in a “New Economy” and created a “flat world.” We even refer to our progeny as members of the “Net generation.” More than 5 billion devices are now connected to the Internet, accessing or serving up 500 billion gigabytes of information and transmitting 2 trillion e-mails per day. The decentralized structure of the Internet has ushered in a new level of worldwide connectivity, enabling product development teams to collaborate across the globe, banks to reach people in the developing world, and middle-aged divorcees to find their high school sweethearts.
But this increasing connectivity has a dark side. Although spam recently dropped to its lowest levels in years, it still accounts for fully 75 percent of global e-mail traffic, or 1.5 trillion messages per day. Every minute produces 42 new strains of malware — short for malicious software — including viruses, worms, and Trojans. An average of 8,600 new websites with malicious code are created each day, and 50 percent of results for the top 100 daily search terms lead to malicious sites. Until last year, a single botnet controlled 4.5 million computers, using them for nefarious purposes while disguising its presence by minimizing its impact on each machine's performance and by eliminating other malware attempting to attack its network. It was, in effect, malware with its own antivirus software.
How then can we improve the safety and reliability of the Internet, an increasingly critical, shared global resource? As business leaders, managers and individuals, we place our trust in the technical wizards in the back room who run the servers and write the code. We install antivirus software as directed and update other programs when told (at least when we have time to restart our computers). But the results suggest this isn’t enough.
The best way to drive the kind of improvement in information security that would really clean up the Internet, we believe, is for corporate leaders and computer security professionals to reflect on the lessons of the manufacturing quality movement of the late 20th century. The methods employed by quality professionals — Six Sigma is an example — raised the visibility of the “cost of quality” and triggered a fundamental change in the philosophy of error prevention. Similarly, information security needs to be raised to the boardroom level, and the computer experts need to come out of the back rooms to engage all users to address the challenge. By doing this, we could collectively reduce malware to a level that does not put Internet-enabled advances at risk.
Quality and Information Security
If today’s managers adopted the approach the quality movement took toward product flaws, they could revolutionize how we tackle online security problems. The goals of information security are simple but daunting: ensure the confidentiality, integrity, and availability of information. Unpacking those three words reveals that we want information limited to its owners and to those they grant access, we don’t want it changed without the owner’s permission, and we want to be able to access it anytime we choose.
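The integrity leg of that triad, for instance, comes down to detecting whether information has been changed without the owner's permission. A minimal sketch of one common approach, comparing cryptographic digests (real systems use signed hashes or message authentication codes rather than bare digests; the sample data below is purely illustrative):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest that changes if the data changes."""
    return hashlib.sha256(data).hexdigest()

# Record a digest when the information is stored.
original = b"Quarterly results: revenue up 4%"
stored_digest = fingerprint(original)

# Later, recompute and compare to detect unauthorized modification.
tampered = b"Quarterly results: revenue up 40%"
print(fingerprint(original) == stored_digest)   # True: data intact
print(fingerprint(tampered) == stored_digest)   # False: data was altered
```

The same comparison, run on every read, turns "we don't want the information changed" from a policy statement into a verifiable check.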
To move from the back room to the boardroom, information security specialists should employ a common, rigorous framework for quantifying the bottom-line impact of security breaches. Too frequently, senior executives are aware only of the cost side of the equation. They see growing investments in software tools designed to catch problems but rarely see hard quantification of the benefits of these controls. Even more unusual is a quantification of the negative effects of excessive controls. Tight policies limiting use of new devices or unapproved applications offer greater security, but they also stifle innovation — something hard to quantify.
After quantifying the full costs of information security, companies need to focus on the root causes. Although “denial of service” attacks and botnets grab the headlines, the root causes of many security breaches trace back to insiders with access to sensitive information, some malicious and others merely careless.
The increased visibility for management as well as users should be accompanied by higher expectations. Just as the quality movement led managers and employees to expect Six Sigma performance — and led consumers to expect that products would work right the first time and every time — so too must our society admit that current levels of cybersecurity are unacceptable. Do we really believe that an e-mail network consisting of three-quarters spam is acceptable? If we fully understood the untold costs of both lost productivity and needless investment in storage and bandwidth, we would also come to the conclusion that security is free. Once we — as a society composed of individuals, institutions, businesses and government — accept the challenge, then we can truly move down the road toward an open, but safe, global community.
Full Article: