8.7 Privacy Peril, Beacon, and the TOS Debacle: What Facebook’s Failures Can Teach Managers about Technology Planning and Deployment
Learning Objectives

- Understand the difference between opt-in and opt-out approaches.
- Recognize how user issues and procedural implementation can derail even well-intentioned information systems efforts.
- Recognize the risks associated with being a pioneer in new media efforts, and understand how missteps led to Facebook and its partners being embarrassed (and in some cases sued) as a result of system design and deployment issues.
Conventional advertising may grow into a great business for Facebook, but the firm was clearly sitting on something that was unconventional compared to prior generations of Web services. Could the energy and viral nature of social networks be harnessed to offer truly useful consumer information to its users? Word of mouth is considered the most persuasive (and valuable) form of marketing (V. Kumar, J. Andrew Petersen, and Robert Leone, “How Valuable Is Word of Mouth?” Harvard Business Review 85, no. 10 [October 2007]: 139–46), and Facebook was a giant word-of-mouth machine. What if the firm worked with vendors and grabbed consumer activity at the point of purchase, pushing it into the news feed and posting it to a user’s profile? If you rented a video, bought a cool product, or dropped something in your wish list, your buddies could get a heads-up, and they might ask you about it. The person being asked feels like an expert, the person with the question gets a frank opinion, and the vendor providing the data just might get another sale. It looked like a home run.
This effort, named Beacon, was announced in November 2007. Some forty e-commerce sites signed up, including Blockbuster, Fandango, eBay, Travelocity, Zappos, and the New York Times. Zuckerberg was so confident of the effort that he stood before a group of Madison Avenue ad executives and declared that Beacon would represent a “once-in-a-hundred-years” fundamental change in the way media works.
As with the introduction of news feeds, user reaction was swift and brutal. The commercial activity of Facebook users began showing up in feeds without their consent. The biggest problem with Beacon was that it was “opt-out” instead of “opt-in”: Facebook (and its partners) assumed users would agree to share data in their feeds unless they explicitly declined. A pop-up box did appear briefly on most sites supporting Beacon, but it disappeared after a few seconds (E. Nakashima, “Feeling Betrayed, Facebook Users Force Site to Honor Their Privacy,” Washington Post, November 30, 2007). Many users, blind to these sorts of alerts, either clicked through or ignored the warnings. And well…there are some purchases you might not want to broadcast to the world.
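The opt-out versus opt-in distinction can be made concrete in a few lines. The sketch below is purely illustrative (hypothetical code, not Facebook's implementation): the only difference between the two policies is what a *missing* preference means when a user never answered, ignored, or clicked through the consent pop-up.

```python
# Illustrative sketch of opt-out vs. opt-in defaults. Hypothetical code,
# not Facebook's implementation. The policies differ only in what a
# missing preference (the user never answered the pop-up) is taken to mean.

def may_share_purchase(user_prefs: dict, policy: str) -> bool:
    """Return True if a purchase may be posted to the user's feed."""
    if policy == "opt-out":
        # Beacon's original design: silence counts as consent.
        return user_prefs.get("share_purchases", True)
    elif policy == "opt-in":
        # The design Beacon switched to: only an explicit "yes" shares data.
        return user_prefs.get("share_purchases", False)
    raise ValueError(f"unknown policy: {policy}")

# A user who clicked through or ignored the pop-up has no recorded choice:
undecided_user = {}
print(may_share_purchase(undecided_user, "opt-out"))  # True: purchase broadcast
print(may_share_purchase(undecided_user, "opt-in"))   # False: stays private
```

Under opt-out, every undecided user silently becomes a data source, which is exactly how purchases ended up in feeds without consent; under opt-in, inaction is the safe default.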
“Facebook Ruins Christmas for Everyone!” screamed one headline from MSNBC.com. Another from U.S. News and World Report read “How Facebook Stole Christmas.” The Washington Post ran the story of Sean Lane, a twenty-eight-year-old tech support worker from Waltham, Massachusetts, who got a message from his wife just two hours after he bought a ring on Overstock.com. “Who is this ring for?” she wanted to know. Facebook had not only posted a feed that her husband had bought the ring, but also that he got it for a 51 percent discount! Overstock quickly announced that it was halting participation in Beacon until Facebook changed its practice to opt in (E. Nakashima, “Feeling Betrayed, Facebook Users Force Site to Honor Their Privacy,” Washington Post, November 30, 2007).
MoveOn.org started a Facebook group and online petition protesting Beacon. The Center for Digital Democracy and the U.S. Public Interest Research Group asked the Federal Trade Commission to investigate Facebook’s advertising programs. And a Dallas woman sued Blockbuster for violating the Video Privacy Protection Act (a 1988 U.S. law prohibiting the unauthorized disclosure of video rental records).
To Facebook’s credit, the firm acted swiftly. Beacon was switched to an opt-in system, where user consent must be given before partner data is sent to the feed. Zuckerberg would later say regarding Beacon: “We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologize for it” (C. McCarthy, “Facebook’s Zuckerberg: ‘We Simply Did a Bad Job’ Handling Beacon,” CNET, December 5, 2007). Beacon was eventually shut down, and $9.5 million was donated to various privacy groups as part of the firm’s legal settlement (J. Brodkin, “Facebook Shuts Down Beacon Program, Donates $9.5 Million to Settle Lawsuit,” NetworkWorld, December 8, 2009). Despite the Beacon fiasco, new users continued to flock to the site, and loyal users stuck with Zuck. Perhaps a bigger problem was that many of those forty A-list e-commerce sites that took a gamble with Facebook now had their names associated with a privacy screw-up that made headlines worldwide. Not a good thing for one’s career. A manager so burned isn’t likely to sign up first for the next round of experimentation.
From the Prada example in Chapter 3 "Zara: Fast Fashion from Savvy Systems" we learned that savvy managers look beyond technology and consider complete information systems—not just the hardware and software of technology but also the interactions among the data, people, and procedures that make up (and are impacted by) information systems. Beacon’s failure is a cautionary tale of what can go wrong when firms fail to broadly consider the impact and implications of an information system on all those it can touch. Technology’s reach is often farther, wider, and more significant than we originally expect.
Predators and Privacy
While spoiling Christmas is bad, sexual predators are far worse, and in October 2007, Facebook became an investigation target. Officials from the New York State Attorney General’s office had posed as teenagers on Facebook and received sexual advances. Complaints to the service from investigators posing as parents were also not immediately addressed. These were troubling developments for a firm that prided itself on trust and authenticity.
In a 2008 agreement with forty-nine states, Facebook offered a series of aggressive steps. Facebook agreed to respond to complaints about inappropriate content within twenty-four hours and to allow an independent examiner to monitor how it handles complaints. The firm imposed age-locking restrictions on profiles, reviewing any attempt by someone under the age of eighteen to change their date of birth. Profiles of minors were no longer searchable. The site agreed to automatically send a warning message when a child is at risk of revealing personal information to an unknown adult. And links to explicit material, the most offensive Facebook groups, and any material related to cyberbullying were banned.
Reputation Damage, Increased Scrutiny, and Recovery—Learning from the Facebook TOS Debacle
Facebook also suffered damage to its reputation, brand, and credibility, further reinforcing perceptions that the company acts brazenly, without considering user needs, and plays fast and loose with privacy and user notification. Facebook worked through the feeds outrage, eventually convincing users of the benefits of feeds. But Beacon was a fiasco. And now users, the media, and watchdogs were on the alert.
When the firm modified its terms of service (TOS) policy in spring 2009, the uproar was immediate. As a cover story in New York magazine summed it up, Facebook’s new TOS appeared to state, “We can do anything we want with your content, forever,” even if a user deletes their account and leaves the service (V. Grigoriadis, “Do You Own Facebook? Or Does Facebook Own You?” New York, April 5, 2009). Yet another privacy backlash!
Activists organized; the press crafted juicy, attention-grabbing headlines; and the firm was forced once again to backtrack. But here’s where others can learn from Facebook’s missteps and response. The firm was contrite and reached out to explain and engage users. The old TOS were reinstated, and the firm posted a proposed new version that gave the firm broad latitude in leveraging user content without claiming ownership. And the firm renounced the right to use this content if a user closed their Facebook account. This new TOS was offered in a way that solicited user comments, and it was submitted to a community vote, considered binding if 30 percent of Facebook users participated. Zuckerberg’s move appeared to have turned Facebook into a democracy and helped empower users to determine the firm’s next step.
Despite the uproar, only about 1 percent of Facebook users eventually voted on the measure, but the 74 percent to 26 percent ruling in favor of the change gave Facebook some cover to move forward (J. Smith, “Facebook TOS Voting Concludes, Users Vote for New Revised Documents,” Inside Facebook, April 23, 2009). This event also demonstrates that a tempest can be generated by a relatively small number of passionate users. Firms ignore the vocal and influential at their own peril!
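The participation rule at the heart of the vote is simple to state precisely. The sketch below is illustrative only; the user counts are made up, not Facebook's actual 2009 figures. It shows why a roughly 1 percent turnout falls far short of the 30 percent threshold needed for a binding result.

```python
# Hypothetical sketch of the TOS vote's binding rule: the outcome counted
# as binding only if at least 30 percent of users participated. The user
# counts below are invented for illustration, not Facebook's real numbers.

def vote_is_binding(votes_cast: int, total_users: int,
                    threshold: float = 0.30) -> bool:
    """True if turnout meets or exceeds the participation threshold."""
    return votes_cast / total_users >= threshold

# Roughly 1 percent turnout, as in the actual vote:
print(vote_is_binding(votes_cast=1_000_000, total_users=100_000_000))   # False

# Turnout that would have made the result binding:
print(vote_is_binding(votes_cast=30_000_000, total_users=100_000_000))  # True
```

Because turnout missed the threshold, the 74-to-26 result gave Facebook political cover rather than a formally binding mandate.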
In Facebook’s defense, the broad TOS was probably more a form of legal protection than any nefarious attempt to exploit all user posts ad infinitum. The U.S. legal environment does require that explicit terms be defined and communicated to users, even if these are tough for laypeople to understand. But a “trust us” attitude toward user data doesn’t work, particularly for a firm considered to have committed ham-handed gaffes in the past. Managers must learn from the freewheeling Facebook community. In the era of social media, your actions are now subject to immediate and sustained review. Violate the public trust, and expect the equivalent of a high-powered investigative microscope examining your every move and a very public airing of the findings.
For Facebook, that microscope will be in place for at least the next two decades. In a late 2011 deal with the U.S. Federal Trade Commission, Facebook settled a series of governmental inquiries related to issues such as the ones outlined above—events that Zuckerberg admits added up to “a bunch of mistakes” made by the firm. Facebook agreed to undergo twenty years of regular third-party privacy audits, and to a host of additional restrictions that include getting users’ consent before making privacy changes, and making content from deleted profiles unavailable after thirty days. If Facebook fails to comply with these terms, it will face fines of $16,000 per violation per day (L. Gannes, “Facebook Settles with the FTC for 20 Years of Privacy Audits,” AllThingsD, November 29, 2011).
Key Takeaways

- Word of mouth is the most powerful method for promoting products and services, and Beacon was conceived as a giant word-of-mouth machine with win-win benefits for firms, recommenders, recommendation recipients, and Facebook.
- Beacon failed because it was an opt-out system that was not thoroughly tested beforehand and because user behavior, expectations, and system procedures were not completely taken into account.
- Partners associated with the rapidly rolled out, poorly conceived, and untested effort were embarrassed. Several faced legal action.
- Facebook also reinforced negative perceptions regarding the firm’s attitudes toward users, notification, and privacy. This only served to focus a continued spotlight on the firm’s efforts, and users became even less forgiving.
- Activists and the media were merciless in criticizing the firm’s terms of service changes. Facebook’s democratizing efforts demonstrate lessons other organizations can learn from, regarding user scrutiny, public reaction, and stakeholder engagement.
- A combination of firm policies, computerized and human monitoring, aggressive reporting and follow-up, and engagement with authorities can reduce online predator risks. Firms that fail to fully engage this threat put users and communities at risk and may suffer irreparable damage to their brands and reputations.
Questions and Exercises
- What was Beacon? Why was it initially thought to be a good idea? What were the benefits to firm partners, recommenders, recommendation recipients, and Facebook? Who were Beacon’s partners, and what did they seek to gain through the effort?
- Describe “the biggest problem with Beacon.” Would you use Beacon? Why or why not?
- How might Facebook and its partners have avoided the problems with Beacon? Could the effort be restructured while still delivering on its initial promise? Why or why not?
- Beacon shows the risk in being a pioneer—are there risks in being too cautious and not pioneering with innovative, ground-floor marketing efforts? What kinds of benefits might a firm miss out on? Is there a disadvantage in being late to the party with these efforts as well? Why or why not?
- Why do you think Facebook changed its terms of service? Did these changes concern you? Were users right to rebel? What could Facebook have done to avoid the problem? Did Facebook do a good job in follow-up? How would you advise Facebook to apply lessons learned from the TOS controversy?
- Investigate the current policies regarding underage users on Facebook. Do you think the firm adequately protects its users? Why or why not?
- What age is appropriate for users to begin using social networks? Which services are appropriate at which ages? Are there social networks targeted at very young children? Do you think that these are safe places? Why or why not?