A Nudge towards Security
by Austin Bowman
Consumers are their own worst enemies when it comes to cybersecurity, but security experts and policymakers are not responding usefully to consumer failures in this area. After each major cybersecurity event, WannaCry being a recent example, security experts and policymakers lament the poor security practices of consumers and push for more education and awareness training, thinking that if only consumers were educated, cyberspace would practically secure itself. But this solution puts the onus on consumers to understand and engage with complicated issues that change frequently, and it completely overlooks the power security experts and policymakers already have to increase consumers' security. Policymakers and security experts should use this power because it is well known that consumers routinely make irrational decisions. These decisions are, in fact, so common that they are predictable. Rather than scolding consumers to be better, companies, agencies, and every organization involved in information security should be nudging consumers towards making the right (i.e., secure) choice.
Cognitive biases and irrational thinking occur for a variety of reasons, but they are incredibly pervasive in situations where consumers face indistinguishable options and receive little feedback. Clear, comparative information helps consumers make better choices. That's why the "RECAP" idea (Record, Evaluate, and Compare Alternative Prices), explained by Richard Thaler and Cass Sunstein in their 2008 book "Nudge," is so effective. Frequent feedback, or better yet, direct comparison to peers, leads consumers to make more informed (i.e., better) decisions. Default choices guide behavior as well: simply changing the default option can have dramatic consequences and produce positive outcomes.
Cybersecurity professionals, application developers, and policymakers should all be looking for ways to apply these insights to securing cyberspace and protecting consumers. True, education is a powerful tool that has been shown to work in the past. Education, along with children nagging their parents, led to the mass adoption of seatbelts, not heavy-handed government intervention. But education takes time, and unlike seatbelts, cyber hygiene is difficult to see. Consumers should be educated on proper cyber hygiene, but security will not come until such behavior becomes routine. Companies and agencies can speed up the process by nudging consumers in the right direction.
Take authentication, for example. Current best practice calls for two-factor authentication, but consumers rarely seek out the feature, and companies bury it behind multiple security menus. Google and Facebook both offer two-factor authentication, but because of the "power of defaults," the majority of consumers never activate it. Companies could easily require consumers to activate two-factor authentication when they sign up. Most sites already ask for a phone number, so why not automatically enroll consumers and provide the added security by default? Consumers could still opt out later if they desired, but research shows they are unlikely to do so.
Updates and patches could also benefit from the power of defaults. Rather than making patches "opt-in," automatic patching could become "opt-out." Unstable patches have been known to cause problems for large organizations, but for the average consumer the cost-benefit calculus strongly favors automatic patching. Microsoft, Apple, and other large firms could require consumers to "opt out" of automatic patching rather than rely on them to update on their own, something many fail to do regularly. Google already opts Android customers in to beta testing new features and apps; why not opt them in to security measures as well? Microsoft could also enable "Show file extensions" by default, making it easier to spot potentially malicious files, instead of hiding the setting behind layers of menus.
Companies can nudge consumers towards better cyber hygiene using consistent feedback and the desire to conform. A consumer's behavior is often shaped by what they perceive to be the actions of others, and sometimes even an emoji can make a difference. Companies and agencies could send consumers monthly emails comparing their online-security behavior to that of everyone else. Consumers practicing good cyber hygiene (e.g., frequently changing passwords, using two-factor authentication, etc.) would see that they are doing better than their peers; they may even be rewarded with a smiley face. Others with poor hygiene would see a frown and be provided with suggestions for improving their performance. Ridiculous as it may sound, these minor interventions have proven incredibly effective in other areas, so why not apply the same research to cybersecurity?
Companies and governmental agencies should also be in favor of nudges for one reason: increased self-protection. Poor cyber hygiene leads to easily preventable security incidents, and organizations end up paying the price. A single lapse in individual security can create massive headaches not only for the individual but also for the organization and the consumers it serves. By nudging consumers towards secure options, service providers would be better able to protect themselves and their clients from needless risks.
Nudging behavior towards more prosocial outcomes may rub some individuals the wrong way. Such measures could be viewed as paternalistic or manipulative, and in a way they are. But companies and organizations are already manipulating the outcome of a consumer's decision by choosing "opt-in" defaults and withholding feedback. This needlessly perpetuates insecurity for both the consumer and everyone else within the same network. The open-ended nature of a nudge gives consumers a choice. If a consumer would rather not have two-factor authentication, they can simply opt out, but research shows they likely won't, and that's a good thing.
Consumers, companies, and policymakers should focus on a small nudge in favor of security. Education would likely bring about the behavior change necessary to better secure cyberspace, but that change will take time, perhaps even a generation or two. Governmental intervention would likely be uncalibrated and inflexible given the fast-paced nature of cyberspace. By contrast, nudges would be immediate and calibrated, and participation would remain entirely up to the consumer. If a simple nudge creates a safer and more secure cyberspace, we should respond, "Nudge away."
Austin Bowman is currently a Master’s candidate at The Fletcher School of Law and Diplomacy at Tufts University, where he is concentrating on International Security Studies and International Political Economy. His interests include cybersecurity, information warfare, European and NATO policy, terrorism and counterterrorism, illicit finance, and U.S.-Russia relations.