Creating a Culture of Enterprise Cybersecurity—Considering the Human Component


By Allen W. Batteau, Wayne State University

 

As networks extend our ability to communicate widely, they expose businesses and governments to hackers, competitors, disgruntled co-workers and other predators.

 

In America, the land of the techno-fix, discussions of cybersecurity overwhelmingly focus on hardware and software: encryption algorithms, firewalls, electromagnetic emissions, biometrics, retinal scans and other devices to shield systems. As important as these are, they are only one-third of the security equation. A disproportionate focus on security technology, as Bruce Schneier argues in Beyond Fear, can actually leave institutions more vulnerable to security gaps (Schneier 2003).

 

The other two-thirds of the security equation are people and process. Any security manager who is not attending to people, process and technology is inviting a major, and perhaps disastrous, incident.

 

Let’s consider the people, at the group level, where shared values and commitments—culture, in short—are a strong, if poorly understood, determinant of organizational performance.

 

Corporate Culture as a Business Strategy

“Corporate culture” quickly made the transition from an academic theory to a consultants’ nostrum, pausing only in a few places to take root as a serious management strategy. Where it has taken root, it has produced a committed workforce and a coherent business strategy. In other companies, in the absence of leadership vision, local cultures have sprung up like weeds, creating numerous patches of resistance to corporate objectives, optimized for their own purposes, and draining off resources toward non-strategic ends.

 

Although the association between culture and performance is intuitively obvious, numerous studies have attempted, so far with disappointing results, to find correlations between measurements of corporate culture and performance. The disappointment is the result of a too-narrow understanding of both culture and performance, one that focuses only on what can be quantified. The quality of relationships is often more important than the quantitative measurement of attitudes. When a corporation’s culture embraces antagonism between employees and management, from either side, it degrades performance, or more accurately, diverts performance in less than optimal directions.

 

A corporate culture is always contingent and emergent, and can never truly be managed. Corporations bring together numerous groups—their workforce, their professionals, their managers, their suppliers and their customers—in more or less permanent relationships. When the relationships are more permanent, shared understandings and trust can grow up among them, which then reinforce the permanence of the relationship. If any one group attempts to assert itself in this mix to too great an extent, it engenders resistance. At their best, these shared understandings create high levels of trust among members of a company. Communication becomes efficient and deep: parties have a sufficient depth of shared experience and understanding that a few words can evoke complex thoughts. It is this corporate landscape—this refinement of trust—that supports a strong security culture.

 

Safety Culture’s Lessons for Security

Commercial aviation is, by most measures, the safest form of transportation available. This is due, in part, to a strong safety culture, shared by flight crews and executives alike, that includes thorough training, adherence to procedures, and an uncompromising commitment to safety standards.

 

A group of political scientists, engineers and management theorists at the University of California, Berkeley has studied organizations that they characterize as “high reliability organizations” (Roberts 1990; Roberts and Rousseau 1989), a concept closely linked to safety culture (Pidgeon 1991). These include nuclear power plants, naval aircraft carriers and air traffic control centers. Among the features shared by these organizations are a constant training mode, accountability pushed to the lowest level, reliance on open and robust communications, and shared perceptions of hazards. The environments in which these observations were made, however, were all spatially restricted and insulated from competitive pressures, and are probably not replicable in the competitive world of business.

 

Like safety, security is also a matter of nuance. A good safety culture does not mean that crews are excessively cautious, but rather that they are prudent and well-informed. With years of experience, pilots develop a strong sense of their flying abilities, how far they can push on in adverse conditions and when they should divert to an alternative destination. In similar fashion, security culture is not the same as paranoia. Security managers have to make judgments on the tradeoffs among reach, openness and ease of use of information. What forms of identification and authentication do we require before we let which users into which levels of our system? We always do trust some people, once they have been properly identified and authenticated. So what is the basis of that trust? Because they are employees? Most network attacks come from inside the company. Do we trust people because they have promised never, ever to download questionable material from the World Wide Web? Do we trust someone because he is our cousin? Our best friend from school? These are questions a security policy must answer; equally important, anyone who has access to an enterprise network must be familiar with the policy. Management support for the policy is just as critical.
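
To make these judgments concrete, here is a minimal sketch, in Python, of the kind of policy table such questions produce. The factor names, tier names and numeric thresholds are hypothetical assumptions for illustration, not recommendations; the point is that where each line is drawn is a management judgment, not a technical one.

    # A minimal, hypothetical access-policy table: which authentication
    # strength admits which users to which level of the system.
    # All names and numbers here are illustrative assumptions.

    # Authentication factors, scored by rough strength.
    FACTOR_STRENGTH = {
        "password": 1,
        "hardware_token": 2,
        "biometric": 2,
    }

    # Minimum combined strength required for each access level.
    TIER_THRESHOLD = {
        "public": 0,      # e.g., the company newsletter
        "internal": 1,    # ordinary business documents
        "sensitive": 3,   # financials, personnel records
        "critical": 4,    # trade secrets, source code
    }

    def may_access(presented_factors, tier):
        """Return True if the presented factors meet the tier's threshold."""
        strength = sum(FACTOR_STRENGTH.get(f, 0) for f in presented_factors)
        return strength >= TIER_THRESHOLD[tier]

    # A password alone opens internal material, but not the sensitive tier.
    assert may_access(["password"], "internal")
    assert not may_access(["password"], "sensitive")
    assert may_access(["password", "hardware_token"], "sensitive")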

 

Software can be hacked. Hardware can be bypassed. Policies can be weakly enforced. Trust can be abused. In other words, there is no single, foolproof security measure. The only silver bullet for security is to weld the doors shut, disconnect the electric power, fire all employees and go out of business. Short of these drastic measures, effective security always includes defense in depth and people in the loop.

 

  • Defense in depth is the opposite of the techno-fix. It means having multiple (and consistent) policies, procedures and resources reinforcing security, so that if one is breached, another backs it up; a minimal sketch of this layering follows this list. It requires an in-depth understanding of the company’s business (more so than any consultant is likely to have), and a requisite imagination to anticipate a variety of threats; it is the opposite of “trust no one.”
  • “People in the loop,” of course, begs the question of “which people.” Bruce Schneier makes it clear that people are often the last and most resourceful line of defense: it was the passengers, not the U.S. Air Force, who saved the White House from the hijackers on United Airlines Flight 93 on September 11, 2001. For more ordinary situations, perhaps the greatest task of security culture is deciding which people to trust and why. Categories of identity (my cousin, my fraternity brother, my boss, those guys in accounting) are no substitute for training and shared experience, reinforced by long-standing and multi-stranded relationships.
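
As the minimal sketch promised above, and with hypothetical layer names and rules, defense in depth can be pictured in Python as a chain of independent checks, each of which can stop a request the others miss; the strength lies in the redundancy, not in any single layer.

    # A minimal, hypothetical sketch of defense in depth: a request must
    # pass every layer, so a breach of one layer is backed up by the rest.
    # The layer names and rules are illustrative assumptions.

    def network_firewall(request):
        # Layer 1: accept traffic only from known networks.
        return request.get("source_network") in {"corporate_lan", "vpn"}

    def authentication(request):
        # Layer 2: the user must present a valid credential.
        return request.get("credential_valid", False)

    def authorization(request):
        # Layer 3: the user must be entitled to the requested resource.
        return request.get("resource") in request.get("entitlements", [])

    def human_review(request):
        # Layer 4: anomalous requests are flagged for a person to examine.
        return not request.get("anomalous", False)

    LAYERS = [network_firewall, authentication, authorization, human_review]

    def permit(request):
        """Grant access only if every independent layer agrees."""
        return all(layer(request) for layer in LAYERS)

    # Even a stolen valid credential fails here: the request comes from an
    # unknown network and is anomalous, so two other layers still block it.
    assert not permit({"source_network": "unknown", "credential_valid": True,
                       "resource": "payroll", "entitlements": ["payroll"],
                       "anomalous": True})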

 

The key element is that cybersecurity discussions must take place within a context of trust and respect. Again, figuring out whom to trust, and communicating respect, are major challenges. As enterprises evolve from top-down command-and-control hierarchies into networks of collaboration (Heckscher and Adler 2006), answering this question will graduate from a maintenance issue into a core part of business strategy.

 

Relationships in an Online World?

As difficult as figuring out whom to trust is in our everyday, face-to-face lives, its complexity grows in online or computer-mediated communication. When business was conducted within a close circle of nearby associates, this was easy. More distant commercial relationships were often handled through trusted intermediaries, such as banks, in which both parties placed their trust (and their funds). These fiduciary institutions were trusted in part because of the social prestige of their owners and officers, and in part because they operated in a highly regulated environment.

 

In today’s business environment, these relationships are being rapidly reconfigured. Relationships are numerous and distant; leading institutions have sometimes shown themselves unworthy of trust; mortgage lenders have evolved from pillar to predator in many communities. In this brave new world, we trust sensitive information and funds to people we have never met; defining a circle of trusted associates is nearly impossible. For businesses that rely on online transactions, developing a reputation for information security, as PayPal® has done, is a core part of their business strategy.

 

Managing Humans, Not Cogs in a Machine

Companies are not machines, and their employees, suppliers and customers are not mechanical components; this is what makes cultural “management” so challenging. Nevertheless, many managers aspire to the mechanistic ideal where everything hums along “like clockwork.” When humans are treated as cogs in a machine, they return the favor by exhibiting all the learning capability of spur gears. Entire industries have ground to a halt because they tried to suppress the learning capability of their members. More prosaically, numerous industries perform sub-optimally because they fail to take advantage of the intelligence and resourcefulness of the people within them.

 

Most of the literature on “corporate culture” is either an impressionistic sensitizing of managers to the facts of corporate culture, which is often progress, or an inventory of culturally iconic practices and artifacts, which can be somewhat useful in that it opens a storehouse of knowledge of different cultural forms.

 

A third type, which is more pernicious, is a substantial body of consultant literature that suggests that an attentive manager can change his culture if he simply manipulates his personnel policies, or his reward system, or his corporate communication, or the corporate dress code, or the office furniture, in some appropriate manner. This literature points to companies that go through “cultural changes,” including (as an extreme example) one company that “changed” its “culture” three times in ten years. This view of culture as packaged and commodified sends messages of manipulation throughout an organization.

 

America’s contemporary security culture has several blind spots: a fascination with techno-fixes rather than with balancing trust and mistrust, a desire for immediate results, and a short attention span. An effective security culture is one that tempers these with people-centeredness, patience and respect.

 

Learning How to Trust in Complex Environments

Generalized trust is indicative of a strong social bond that is not reducible to legal formulations or corporate procedures. It has a flexibility and an adaptability that transactional trust lacks. It can be spontaneous, as happens with two individuals coming together in an extreme situation, or it can be built up over generations, as in the example of family alliances in business and government. It can be reinforced by demonstrations of irrational beyond-the-call-of-duty commitment, or it can be liquidated by corporate downsizings.

 

In between generalized trust and transactional trust is strategic trust, a nuanced and articulate understanding of whom to trust with which information in which situation. A culture of cybersecurity is a complex amalgamation of generalized and transactional and strategic trust relationships. This culture cannot be designed, in the sense that an engineer designs a complex piece of machinery, but it can be cultivated, in the sense that a gardener cultivates a flower garden.

 

A resilient security culture is one that creates opportunities for this learning within clearly defined boundaries. Such learning opportunities range from training exercises to after-action analyses of security breaches. Redundant personnel, or “multiple eyes,” mean that errors are more likely to be trapped before they escalate into accidents. In the open environment of an air traffic control tower or an aircraft carrier flight deck, there are many eyes watching what the controller or the deckhand is doing, and all have the authority to intervene if an error is spotted. Perversely, the paranoid, secretive aspect of many security environments assures that a careless or determined leaker or hacker can carry on for weeks before being detected.
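
As a back-of-the-envelope illustration, with made-up numbers and the simplifying assumption that reviewers err independently (real crews only approximate this), redundancy compounds quickly: if each pair of eyes misses an error one time in ten, three independent pairs miss it only about one time in a thousand.

    # Illustrative only: the chance an error slips past k independent
    # reviewers, each of whom misses it with probability p_miss. Real
    # reviewers are never fully independent, so this is an optimistic bound.
    def escape_probability(p_miss, k_reviewers):
        return p_miss ** k_reviewers

    print(escape_probability(0.1, 1))  # 0.1    -- one pair of eyes
    print(escape_probability(0.1, 3))  # ~0.001 -- three pairs of eyes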

 

A second aspect of high reliability organizations that would form a foundation of a culture of enterprise cybersecurity is clearly agreed-upon goals, from top to bottom. On the flight deck, a shared commitment to safety means that all personnel, from the captain down to the deckhands, participate in making sure that operations are safe.

 

By contrast, if one aspect of the organizational culture is a culture of alienation, or if there is cynical disillusionment regarding the organization’s integrity, then the only questions in a security breach are “how soon?” and “how massive?” The greatest breach of enterprise cybersecurity in recent years, resulting in the WikiLeaks disclosures of military and diplomatic activities in Iraq, allegedly came from a low-level computer operator, PFC Bradley Manning, at a time when there was widespread cynicism and disillusionment, among military personnel and civilians alike, regarding the American mission in Iraq.

 

Staying True to the Mission

A resilient security culture, like the safety culture in high reliability organizations, is one face of organizational integrity, in which a strong culture of inclusion—“we’re all in this together”—coexists with a clearly defined and accepted mission and a commitment of resources appropriate to strategic objectives. Anything less—untrained and embittered personnel, misrepresentation of objectives, executive overcompensation, blindness to environmental threats—assures that compromised security will eventually compromise an enterprise’s mission and possibly its existence.

 

Acknowledgement:

Research for this article was supported by a grant from the National Science Foundation (award 0428216). The conclusions represent the opinions of the author, and not those of the National Science Foundation.

 

References Cited

Batteau, Allen
2000 Negations and Ambiguities in the Cultures of Organization. American Anthropologist 102(4), 726-740.

 

Pidgeon, Nick F.
1991 Safety Culture and Risk Management in Organizations. Journal of Cross-Cultural Psychology 22(1), 129-140.

 

National Commission on Terrorist Attacks upon the United States
2004 The 9/11 Commission Report. New York: W. W. Norton.

 

Roberts, Karlene
1990 New Challenges in Organization Research: High Reliability Organizations. Industrial Crisis Quarterly 3, 111-125.

 

Roberts, Karlene, and Denise M. Rousseau
1989 Research in Nearly Failure-Free, High-Reliability Organizations: Having the Bubble. IEEE Transactions on Engineering Management 36(2), 132-139.

 

Rosen, Michael
1985 Breakfast at Spiro’s: Dramaturgy and Dominance. Journal of Management 11(2), 31-48.

 

Schneier, Bruce
2003 Beyond Fear: Thinking Sensibly about Security in an Uncertain World. New York: Copernicus Books.

 

Van Maanen, John
1991 The Smile Factory: Work at Disneyland. In Frost, Peter J., Larry F. Moore, Meryl Reis Louis, Craig Lundberg, and Joanne Martin, eds., Reframing Organizational Culture, pages 58-76. Newbury Park, California: Sage Publications.