Amster Rothstein and Ebenstein, LLP - Intellectual Property Law

Is Facebook Killing Privacy Softly? The Impact of Facebook’s Default Privacy Settings on Online Privacy

- The NYU IP and Entertainment Law Ledger, April 21, 2011
Author(s): Michael J. Kasdan

“IMPORTANT!! Tomorrow, Facebook will change its privacy settings to allow Mark Zuckerberg to come into your house while you sleep and eat your brains with a sharpened spoon. To stop this from happening go to Account > Home Invasion Settings > Cannibalism > Brains, and uncheck the “Tasty” box. Please copy and repost.”
- Satirical Status Post from Friend’s Facebook Status on February 15, 2011.


Introduction
Since launching its now ubiquitous social networking website out of the Harvard dorm room of Mark Zuckerberg in early 2004, Facebook has rapidly become one of the most dominant websites on the planet. And “rapid” doesn’t quite do it justice. It has been estimated that over 40% of the U.S. population has a Facebook account.1 Facebook now boasts over 600 million active user accounts2 and was recently estimated to be adding user accounts at the unbelievable clip of well over half a million new users per day.3

The very nature of a social networking site like Facebook is to provide its users with a platform through which they can share massive amounts of personal information. Facebook has created a platform where users can post personal data, such as their contact information, birthdays, favorite movies, books, music and news articles, share scads of written comments and notes, post pictures and videos of themselves and others, associate themselves with various products, services, and groups, and post information about where they are and what they are doing.

Over its six-year existence, Facebook’s privacy policy - the set of rules that dictate which information is shared and with whom - has undergone significant revisions that have had the effect of collectively encouraging, and in some cases requiring, users to share more personal information with bigger groups of people and companies. Facebook’s original privacy policy provided that no personal information would be shared with any other user who did not belong to a group specified in the user’s privacy settings. The principle behind this policy was one of user control over personal information. By contrast, under today’s Facebook privacy policy, owners of numerous websites and applications may access broad categories of user information, and the default settings are such that many categories of user information will be widely accessible, unless users carefully review and modify them.

This article explores the background, impact, and legal and policy challenges posed by Facebook’s evolving privacy policy.

Background
Facebook (www.facebook.com) is a social-networking website that is privately owned and operated by Facebook, Inc. Facebook is free to use. Once registered, users of the Facebook.com website may create a personal profile and can then create their “social network” by inviting other users to be their “friends.” Users can upload photos and albums and update their “status” to inform their friends of their whereabouts, actions, and thoughts. Users and their friends may communicate with each other through private and/or public messages (i.e., privately, through email, or publicly, by writing or posting a comment on another user’s “wall”), view and comment on each other’s status updates and postings, and share and comment on each other’s pictures, videos, and other Internet content.4

Facebook users can also associate with and recommend (i.e., “like”) brands, products, services, web pages, and articles posted all over the Internet by clicking a “like” button on Facebook or on those web pages. When the user’s friends view that same web page, they can see which of their friends have “liked” it. A user’s “likes” are also posted in that user’s “newsfeed,” which is a running list of comments, pictures, status updates, etc. of that user and his friends that is visible to friends. These “likes” are, of course, also recorded by Facebook’s business partners that are associated with brands, products, and services. Most recently, Facebook has added a “check-in” or “places” feature. Using this feature, Facebook users can indicate that they are currently at a restaurant, store, bar, or other real-world location. This information is posted onto their profile and is also recorded by Facebook for use by its business partners, which may include, for example, the restaurant or bar at which the user has checked in.

In addition, Facebook has partnered with certain third-party websites, such as Yelp, to provide Facebook “personalization features” for its users. Specifically, if a user has a Facebook account and goes to the Yelp website, a site that collects user reviews about businesses such as restaurants and bars, that user will be able to see which of his Facebook friends have reviewed a particular business, which friends have “liked” a particular business, and review his Facebook friends’ Yelp reviews and “likes.”5

Facebook users can also access third-party applications (“apps”) on the Facebook site. These apps include trivia quizzes, games, and other interactive content. Many of these applications gather personal information about the user and his Facebook friends.

There clearly are tremendous benefits to the social networking experience on Facebook. The broad disclosure by users and their friends of all sorts of personal details about themselves is central to Facebook’s functionality. It is in large part what makes Facebook interesting, interactive, and fun for its users. It is equally (if not more) important to Facebook as a business,6 and is the key to its ability to monetize Facebook.com. Indeed, much of the perceived value of Facebook as a business lies in its ability to gather personalized information about its massive user base and to leverage that user base. The costs of these same activities, in terms of the sacrifice of one’s own personal privacy, may be harder to spot at first, but they are also significant.7

Facebook’s Privacy Policy - A Brief History
Facebook’s privacy policy has undergone a significant shift over its relatively short existence. Its original policy limited the distribution of user information to a group of that user’s choice (thus creating a private space for user communication). By contrast, its current policy makes much user information public by default and requires other information to be public. This public information is accessible by Facebook and its business partners and advertisers. The shift in Facebook’s default privacy settings over time is perhaps most strikingly illustrated by an infographic created by Matt McKeon, a developer at the Visual Communication Lab at IBM Research.8 In that graphic, blue shading indicates the extent to which the viewing of various categories of information is limited to a user’s friends, friends of friends, all Facebook users, or the entire Internet; heavier shading towards the outer part of the circle indicates that the information is more widely accessible.


Facebook has been criticized by certain privacy advocates and industry watch groups for its revisions to its privacy policies. For example, after Facebook rolled out its revised privacy settings in late 2009, the Electronic Frontier Foundation (“EFF”) concluded that the changes reduced the amount of control users have over their personal data while pushing them to publicly share more of their personal information than before.9 As the EFF put it, viewing Facebook’s successive privacy policies from 2005-2010 “tell[s] a clear story. Facebook originally earned its core base of users by offering simple and powerful controls over their personal information. As Facebook grew larger and became more important . . . [it] slowly but surely helped itself - and its advertising and business partners - to more and more of its users’ information, while limiting the user’s options to control their own information.”10

Under Facebook’s current privacy policy, certain personal information, such as a user’s name, profile pictures, current city, gender, networks, and pages that user is a “fan” of (now, pages that user “likes”), is deemed “publicly available information.” This user information is now accessible by Facebook applications added by any of that user’s Facebook friends, even if that user does not use those applications. In March 2011, Facebook announced that it would be moving forward with a plan to give third-party developers and external websites the ability to access Facebook users’ home addresses and cell phone numbers.11 Facebook users may not restrict access to this information to a more controlled group or prevent application developers from accessing it.12

In addition, when a Facebook user “likes” a product or service or “checks in” to a place, such as Starbucks, Facebook displays that information both in the user’s news feed and as part of a paid advertisement for Starbucks. This functionality is called “Sponsored Stories,” and Facebook users cannot opt out of the use of their information in Sponsored Stories once they “like” or “check in” to a business or service.13 As a result of these changes, Facebook users now share a great deal of personal information with the third-party companies that partner with Facebook to develop applications and advertisements.14

Finally, Facebook’s “privacy transition tool,” which guides users through the configuration of their privacy settings, “recommends” (i.e., preselects by default) that each user’s settings for sharing information posted to Facebook, including status messages and wall posts, be set to share with “everyone” on the Internet. The prior default setting for such information had been limited to each user’s “Networks and Friends” on Facebook. As discussed in the following section of this Article, default settings are often outcome-determinative: it is human nature to accept, and not change, the suggested defaults. In this way, Facebook’s “privacy transition tool” results in more users sharing their information with more people than before.15

This erosion of privacy should come as no great surprise. Social networks like Facebook benefit from loose privacy rules: “the more incentives [Facebook] create[s] for people to share data, the more valuable the network . . . because [Facebook] ha[s] data you can resell or study for marketing trends.”16 Controlling, storing, using, and providing access to or analytics concerning vast stockpiles of user data is tremendously lucrative. Because Facebook makes money through targeted advertising and the like, loosening the privacy settings of its service is to its financial benefit.17 To this end, Mark Zuckerberg and Facebook have taken the position that “Facebook has always been about friends and community and that therefore the default has been skewed towards sharing information rather than restricting it.”18 This position also aligns with Facebook’s profit motive, monetization end-game, and growing valuation.19

Default Settings Matter a Great Deal
What is important to keep in mind in the ongoing debate about Facebook’s privacy settings is the significant power of default settings in affecting user behavior and outcomes. When defending its increasingly “public” default privacy settings, Facebook often focuses on the fact that it gives its users the ability to change these privacy settings to control information (though not all information) more tightly, if they so choose. But the reality is that defaults are often determinative. Most users surely clicked through the new default settings without realizing it. And while users could, theoretically, change these more public “recommended” settings by navigating through the detailed privacy settings, doing so takes more effort.20

Defaults have a particularly strong influence in software. System or device defaults are rarely altered by users. And commentators have observed that “psychological studies have shown that the tiny bit of extra effort needed to alter a default is enough to dissuade most people from bothering, so they stick to the default despite their untapped freedom.”21 With the rise of ubiquitous network software systems like Facebook, the outcome-determinative nature of defaults has the ability to fundamentally influence social concerns, such as privacy.22

Indeed, the evolution of Facebook’s privacy settings demonstrates the company’s understanding of the importance of default settings. On the one hand, Facebook does provide a good deal of granular control to its users in terms of privacy settings. But on the other hand, as studies in human-computer interaction and behavioral economics show, users tend to favor the status quo or default settings.23 In the case of Facebook, these are the privacy recommendations and default settings that the company provides. Furthermore, Facebook’s programs that pass information to its third-party business partner sites, such as Yelp, require users to “opt out,” which means that Facebook will freely disseminate user information unless the user affirmatively objects. Therefore, even though Facebook offers detailed privacy options, by pre-selecting the defaults for those options and requiring users to affirmatively opt out, Facebook is effectively “dictating what kind of privacy [users] will or will not have.”24

The Risks
So what? Aside from throwing around important-sounding words like “privacy issues,” what is the big deal?

Recently, industry watch groups like the EFF and Consumer Reports, as well as the U.S. government, have articulated a host of real-world concerns. For example, posting personal information (including birthdates, street addresses, and whether you are home or away) can expose a user to crime of either the cyber- or real-world variety.25 In addition, the privacy settings of users and of users’ Facebook friends can expose users to harassment, malware, spyware, identity theft, viruses, and scams. For example, a recent article estimated that of the 18 million Facebook users who used “apps” from Facebook’s business partners and advertisers, roughly 1.8 million (or 10%) had their computers infected by these applications. Many of these applications access a large swath of personal information, often without the user realizing it.26

Aside from the above crime risks, there are serious “social” and “commercial” risks as well. Sharing the likes and dislikes of users and their friends, as well as the places to which they go and the products they recommend, could lead to a world where companies, advertising agencies, and others who seek to influence behavior are able to track each individual user to such an extent that they can compile an incredibly granular set of personal details about each person, including what time he gets up, where he goes, what he buys, what he reads, what his political views are, and so on.27 For many, that may be an uncomfortable place to be.

Addressing the Issue of Online Privacy - Personal Choice, Regulation and Enforcement, or Both?
Broadly speaking, there are two general approaches to addressing the implications of online privacy settings, such as Facebook’s. It is unlikely that either one of these approaches alone will adequately address the privacy concerns raised above.

The first approach is to rely on the market and on users themselves to drive changes to privacy settings when required. This approach relies on users to recognize that their privacy settings are important and to take the time and responsibility to set them.28 This laissez-faire approach depends on individuals paying more attention to the default settings they are agreeing to and demanding change in areas of paramount importance. Users who care about privacy should certainly take greater care in setting their privacy options. However, there are limits to relying on users alone. When settings and choices are not apparent to users, or defaults are repeatedly set in such a way that the vast majority of users are unlikely to understand the consequences of their selections or be able to demand change, it seems that more may be required.29

The second approach is to rely on government regulation and enforcement to ensure that there are clearly laid-out privacy options.30 In this regard, the U.S. government has recently begun to raise questions about Facebook’s privacy policy. For example, when Facebook announced plans to enable its partners to access users’ addresses and phone numbers, Congressmen Edward Markey (D-Mass.) and Joe Barton (R-Texas), the Co-Chairmen of the House Bipartisan Privacy Caucus, sent a letter to Facebook CEO Mark Zuckerberg seeking answers about the company’s plans.31 Similarly, in May 2010, the Article 29 Data Protection Working Party, a coalition of European data protection officials, sent a letter to Facebook criticizing the changes it made to its privacy policy and default privacy settings.32 The Working Party argued that significant changes to a privacy policy and to settings governing the sharing of user information should require the active consent of users rather than mere notice of the changes.

The Federal Trade Commission (FTC) has likewise become more active in investigating online privacy violations. Section 5 of the FTC Act grants the FTC the power to pursue claims against entities that engage in unfair or deceptive acts or practices in interstate commerce with respect to consumers.33 In the past, the FTC has taken action against websites for violating their own privacy policies, treating such violations as a deceptive trade practice. The FTC has also used its Section 5 powers to pursue claims against online companies in connection with spyware, adware, and the like.34

Most significantly, last month the FTC settled its Section 5 investigation into the privacy practices of Google in relation to Google Buzz, a social networking tool in Gmail that Google introduced last year. As part of the settlement, Google agreed to implement a comprehensive privacy program, to undergo privacy audits for a period of 20 years, and to obtain user consent before changing the way that any Google product shares personal information.35

As relevant to Facebook, privacy interest groups led by the Electronic Privacy Information Center (“EPIC”) have recently filed multiple Complaints with the FTC accusing Facebook of Section 5 violations relating to the privacy interests of Internet users.36 EPIC’s first FTC Complaint against Facebook focuses on Facebook’s practices relating to the sharing of user information with third-party app developers. In particular, it alleges that the mandatory disclosure of certain user information to the public, including to third-party app developers, is an unfair practice. The Complaint also alleges that Facebook’s policies regarding third-party app developers are misleading and deceptive, providing for more information sharing and less user control of that information without a clear way for users to opt out.37 EPIC’s second Complaint against Facebook focuses on newer changes to Facebook, including the “like” feature and the “instant personalization” feature, both of which, it is alleged, cause the sharing of user information in ways that are deceptive to the user.38

EPIC’s Facebook Complaints may provide the FTC with the vehicle to take on Facebook, should it perceive the need to do so. At the very least, the FTC’s recent landmark settlement with Google signals that the FTC is ready and willing to use its Section 5 powers to remedy privacy violations in connection with social networking, where it deems appropriate.

The FTC has also provided guidance by issuing a Privacy Report entitled “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers,” which seeks to provide a framework for consumers, businesses, and policymakers to address online privacy issues.39 The FTC Report concludes that industry efforts to address privacy through self-regulation have been too slow and have failed to provide adequate and meaningful protection to consumers. Among the recommendations in the FTC’s proposed framework is that consumers be presented with a clear and easy-to-understand choice about the collection and sharing of their data at the time, and in the context, in which they are making decisions. The FTC framework also addresses the tracking, collection, and sharing of user data with advertisers, recommending the adoption of a universal mechanism for implementing a user’s choice to opt out of such practices.40 In response, Facebook has argued against excessive regulation, indicating that Internet companies should be self-regulated so as not to stifle innovation.41

It is unclear what ultimate impact the FTC Report and the proposals of other commentators will have on the industry or on policymakers, if any. These developments at least have the effect of bringing about public debate over many core privacy issues implicated by social networks and other online companies.

In this regard, some of the policies championed by the FTC and others are making their way before Congress in newly proposed privacy bills. Specifically, Representative Jackie Speier (D-California) introduced H.R. 654, a bill that would direct the FTC to put forth standards providing an online mechanism for consumers to opt out of the collection and use of their personal information online and that would require online advertisers and third-party website operators to disclose their practices with respect to data collection and use.42 These regulatory and legislative efforts may provide some baseline requirements for privacy policies and provide users with a privacy bill of rights. As with the FTC Report, it is not yet clear what shape such privacy legislation will take, or the extent to which legislators will seek to address such privacy issues through legislation.43

Conclusion
In the age of instantaneous sharing of information on Facebook, it is fair to ask whether privacy is dead or dying, and whether online social networks like Facebook are killing it. Despite what may be seen as an unstoppable cultural imperative to socialize, connect, share, communicate, and post information about oneself at a dizzying pace, it is important not to lose sight of the risks of handing over control of our personal information. As we status-update our way through the information age, users and regulators alike must continue to closely monitor the companies that receive access to the information we share. At the same time, we must carefully weigh the benefits of increased interconnectivity against the costs of reduced privacy.




1 See Roy Wells, 41.6% of the U.S. Population Has a Facebook Account, socialmediatoday (Aug. 8, 2010), http://socialmediatoday.com/index.php?q=roywells1/158020/416-us-population-has-facebook-account.

2 See Nicholas Carlson, Goldman to clients: Facebook has 600 million users, Business Insider (Jan. 5, 2011), http://www.msnbc.msn.com/id/40929239/ns/technology_and_science-tech_and_gadgets/.

3 See Justin Smith, Facebook Now Growing by Over 700,000 Users a Day, and New Engagement Stats, Inside Facebook (July 2, 2009), http://www.insidefacebook.com/2009/07/02/facebook-now-growing-by-over-700000-users-a-day-updated-engagement-stats/.

4 For the BBC’s truly hilarious take on the Facebook paradigm, see Facebook in Real Life, http://www.youtube.com/watch?v=BYNYLq_KvW4 (last visited March 22, 2011).

5 See Yelp Partners With Facebook For A Personal Experience, Yelp Web Log (Apr. 21, 2010), http://officialblog.yelp.com/2010/04.

6 Facebook is presently a privately held company. Recently, a consortium including Goldman Sachs invested $500 million “in a transaction that values [Facebook] at $50 billion.” Susanne Craig & Andrew Ross Sorkin, Goldman Offering Clients a Chance to Invest in Facebook, DealBook (Mar. 29, 2011), http://dealbook.nytimes.com/2011/01/02/goldman-invests-in-facebook-at-50-billion-valuation/.

7 It is significant to note that there are clear generational differences at work as to how a particular person will assess these types of trade-offs. For example, I use Google’s Gmail service because it is slick and functional. My father, however, will not, because he is greatly bothered by the fact that Google pushes context-based advertising at its Gmail users based upon the content of a user’s emails. Likewise, on Facebook, the younger generation is more apt to publicly share private details through status updates or to publicly share embarrassing pictures of their Saturday night escapades. In online social networks like Facebook, there is a quid pro quo in which privacy is gladly exchanged in favor of social interaction. To these users, the social and community benefits of sharing such information far outweigh the more subtle-to-perceive downside of diminished privacy. See Schneier, Google and Facebook’s Privacy Illusion, http://www.forbes.com/2010/04/05/google-facebook-twitter-technology-security-10-privacy (April 6, 2010).

8 See http://www.allfacebook.com/infographic-the-history-of-facebooks-default-privacy-settings-2010-05 (May 9, 2010) & http://www.goso.blog/2010/06/facebook-default-privacy-settings-over-time.

9 Kevin Bankston, Facebook’s New Privacy Changes: The Good, The Bad, and The Ugly, Electronic Frontier Foundation (Dec. 9, 2009), http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly.

10 For a complete timeline of the changes to Facebook’s privacy policy from 2005 to present, see Facebook’s Eroding Privacy Policy: A Timeline, Electronic Frontier Foundation, http://www.eff.org/deeplinks/2010/04/facebook-timeline/ (April 28, 2010).

11 See Bianca Bosker, Facebook To Share Users’ Home Addresses, Phone Numbers With External Sites, HuffPost Technology (Feb. 28, 2011), http://www.huffingtonpost.com/2011/02/28/facebook-home-addresses-phone-numbers_n_829459.html.

12 “When you connect with an application or website it will have access to General Information about you. The term General Information includes you and your friends’ names, profile pictures, gender, user IDs, connections, and any content shared using the Everyone privacy settings . . . . The default privacy setting for certain types of information you post on Facebook is set to ‘everyone.’” See Facebook’s Eroding Privacy Policy: A Timeline, Electronic Frontier Foundation, http://www.eff.org/deeplinks/2010/04/facebook-timeline/ (April 28, 2010)(quoting Facebook’s Privacy Policy).

13 Clint Boulton, Facebook Invites Privacy Concerns with Sponsored Story Ads, eWeek (Jan. 26, 2011), http://www.eweek.com/...

14 Bankston, supra note 9.

15 Id.

16 Privacy: The Slow Tipping Point, Carnegie Mellon University (2007 Interview Transcript, Podcast Interview of Alessandro Acquisti).

17 Bruce Schneier, Google and Facebook’s Privacy Illusion, Forbes (Apr. 6, 2010) (quoting Mark Zuckerberg), http://www.forbes.com/2010/04/05/google-facebook-twitter-technology-security-10-privacy.

18 Memmott, Zuckerberg: Sharing Is What Facebook Is About, http://www.npr.org/alltechconsidered/2010/05/27/127210855/facebook-zuckerberg- (May 27, 2010).

19 See Craig, supra note 6.

20 Forbes Magazine notes that companies like Facebook are driven by market forces “to kill privacy” by controlling defaults, limiting privacy options, and making it difficult to change such settings. This results in making it “hard . . . to opt out.” Schneier, supra note 17.

21 See Pat Coyle, Triumph of the default in sports social networks, Technium Blog (Aug. 18, 2010) http://www.coylemedia.com/2010/08/18/power-of-the-default-in-sports-social-networks.

22 Jay Kesan and Rajiv C. Shah, Establishing Software Defaults: Perspectives from Law, Computer Science, and Behavioral Economics, Notre Dame Law Review, Vol. 82, pp. 583-634, 2006 (available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=906816).

23 Examples that illustrate the power of defaults are found not only in the technology field but across many other fields. One oft-cited study of defaults is the work of Madrian and Shea, who studied the impact of defaults on money saving tendencies by changing the defaults of 401(k) retirement plans. Specifically, they changed the default enrollment rule so that new employees had to choose to opt out of contributing to the 401(k) plan rather than to opt into it. The results were striking. Changing this one simple default rule brought participation in the 401(k) plan from less than 40% to over 85%. Furthermore, those who participated made few subsequent changes to their default plan. This study indicates that defaults can strongly influence real-life decision-making, and that people generally defer to defaults in their decision-making. Whether the cause of this behavior is momentum, laziness, procrastination, passivity, a tendency to follow the guidance or advice of experts, or some other phenomenon, the effect of defaults is very powerful and very real. See Simon Kemp, Psychology & Economics in Regulation, Institute of Policy Studies (Feb. 19, 2010); Sendhil Mullainathan, Psychology and Development Economics (June 2004) (unpublished manuscript) (on file with Harvard University Department of Economics); see also James M. Poterba, Behavioral Economics and Public Policy: Reflections on the Past and Lessons for the Future, in Policymaking Insights from Behavioral Economics (Christopher L. Foote et al. eds., 2007); Kesan, supra note 22.

24 Acquisti, supra note 16. For humorous satirical commentary on this phenomenon, see Entire Facebook Staff Laughs As Man Tightens Privacy Settings, The Onion (May 26, 2010), http://www.theonion.com/articles/entire-facebook-staff-laughs-as-man-tightens-priva,17508/.

25 See Jeff Fox, Why Facebook Users Need Protection, HuffPost Technology (May 4, 2010), http://www.huffingtonpost.com/jeff-fox/why-facebook-users-need-p_b_562945.html; see also Protecting Your Computer from Online Threats, Consumer Reports (June 2010), http://www.consumerreports.org/cro/magazine-archive/2010/june/electronics-computers/so...

26 Id.

27 See e.g., JR Raphael, Facebook Privacy: Secrets Unveiled, PC World (May 16, 2010), http://www.pcworld.com/article/196410/facebook_privacy_secrets_unveiled.html.

28 “It’s very simple: Facebook is a business and their goal is to make money. They make money through advertising and selling virtual goods. The more of your personal information they can mine, the more likely their advertising will result in revenue for Facebook and to their clients. . . . What about privacy settings? You need to set them, it’s your responsibility and no one else’s. Facebook wants you to share as much as possible since it helps them monetize your account. Consequently the default settings tend to be “opt out” rather than “opt in,” knowing that most people review their privacy settings. . . . You are responsible for what information you post about yourself, the Facebook friends you link to, the privacy settings and the applications you use.” Howard Steven Friedman, You Are Responsible for Your Own (Facebook) Privacy, HuffPost Technology (Mar. 3, 2011) (emphasis added), http://www.huffingtonpost.com/howard-steven-friedman/you-are-responsible-for-y_b_830652.html.

29 Another aspect of a laissez faire approach to dealing with online privacy would be to rely on the market to provide competing social networking systems that address privacy differently than Facebook. In other words, if clearer and tighter privacy controls are something that consumers want and value, a market competitor to Facebook should offer a competing alternative. Cf. Hiroki Tabuchi, Facebook Wins Relatively Few Friends in Japan, New York Times (January 9, 2011), http://www.nytimes.com/2011/01/10/technology/10facebook.html (noting Facebook’s relative lack of success in Japan, whose Internet users are “fiercely private.” In Japan, Facebook’s competitors, which “let members mask their identities, in distinct contrast to the real-name, oversharing hypothetical user on which Facebook’s business model is based,” have been far more successful). However, because the value of a social network is largely based on the fact that all of one’s friends are members, the sheer size and momentum of Facebook in the U.S. market may well prevent viable competitors from easily emerging.

30 Schneier, supra note 17 (stating that “[i]f we believe privacy is a social good, something necessary for democracy, liberty, and human dignity, then we can’t rely on market forces to maintain it” and calling for broad legislation that would protect personal privacy by giving people control over their personal data).

31 See Thomas Claburn, Facebook Faces Congressional Privacy Interrogation, Information Week (Feb. 5, 2011), http://www.informationweek.com/news/internet/social_network/showArticle.jhtml?articleID=229201226. Facebook responded to this inquiry in a letter dated February 23, 2011 (available at http://markey.house.gov/docs/facebook_response_markey_barton_letter_2.2011.pdf), in which it highlighted that Facebook users can set their privacy options at various different levels and that users must give permission to applications seeking to access their personal information. Information concerning the Congressmen’s response to Facebook can be found at Markey, Barton Respond to Facebook (Feb. 28, 2011), http://markey.house.gov/index.php?option=content.

32 Article 29 Data Protection Working Party, Press Release, May 12, 2010, available at http://ec.europa.eu/justice/policies/privacy/news/docs/pr_12_05_10_en.pdf

33 Act of March 21, 1938, ch. 49, § 3, 52 Stat. 111 (codified at 15 U.S.C. § 45(a)(1)(1994)).

34 See e.g., Federal Trade Commission, Privacy Initiatives, http://business.ftc.gov/legal-resources/29/36 (Last visited February 18, 2011).

35 See C. Miller and T. Vega, Google Unveils New Social Tool as It Settles Privacy Case, New York Times (March 20, 2011); Google Agrees to Implement Comprehensive Privacy Program to Protect Consumer Data, http://www.ftc.gov/opa/2011/03/google.shtm (March 30, 2011). A copy of the consent order is available at http://www.ftc.gov/os/caselist/1023136/110330googlebuzzagreeorder.pdf (Last Visited, April 6, 2011).

36 The EPIC Complaints are available at http://epic.org/privacy/inrefacebook/EPIC-FacebookComplaint.pdf (“EPIC I”) and http://epic.org/privacy/facebook/EPIC_FTC_FB_Complaint.pdf (“EPIC II”). For additional general background regarding the EPIC Complaints, see http://epic.org/privacy/inrefacebook and http://epic.org/privacy/facebook/in_re_facebook_ii.html.

37 EPIC I, supra note 36.

38 EPIC II, supra note 36, at ¶¶ 65-94.

39 See generally FTC Staff Releases Privacy Report, Offers Framework for Consumers, Businesses and Policymakers, Federal Trade Commission (Dec. 1, 2010), http://www.ftc.gov/opa/2010/12/privacyreport.shtm; FTC Staff, FTC Staff Report: Protecting Consumer Privacy in an Era of Rapid Change (2010), http://www.ftc.gov/os/2010/12/101201privacyreport.pdf

40 Similarly, the EFF recently proposed a “Bill of Privacy Rights for Social Network Users.” The Proposed Bill of Privacy Rights includes (i) “the right to informed decision-making” about who sees their personal data, (ii) “the right to control” the use and disclosure of their data, including requiring a default opt-in permission by users, so that user data is not shared unless a user makes an informed decision to share it, and (iii) “the right to leave” a social network, at which point the user data is permanently deleted from the social network’s databases and those of its partners. See Kurt Opsahl, A Bill of Privacy Rights for Social Network Users, Electronic Frontier Foundation (May 19, 2010), http://www.eff.org/deeplinks/2010/05/bill-privacy-rights-social-network-users; see also Dani Manor, Proposed New Bill of Rights for Facebook Users, Electronic Frontier Foundation (May 21, 2010), http://www.allfacebook.com/eff-proposes-new-bill-of-rights-for-facebook-users-2010-05.

41 See e.g., Katie Kindelan, What You Should Know About Facebook’s Response to the FTC, Social Times (Feb. 25, 2011), http://www.socialtimes.com/2011/02/what-you-should-know-about-facebooks-response-to-the-ftc/; see also Bianca Bosker, Facebook Responds to FTC’s Privacy Plans, HuffPost Technology (Feb. 23, 2011), http://www.huffingtonpost.com/2011/02/23/facebook-responds-to-ftcs_n_827260.html; Leigh Goessl, Facebook Response to FTC Privacy Investigation, Helium (Feb. 27, 2011), http://www.helium.com/items/2103119-facebook-response-to-ftc-privacy-investigation.

42 See H.R. 654, 112th Cong. (2011); see also Bert Knabe, Two Privacy Bills Introduced by Representative Jackie Speier, Lubbock Avalanche-Journal (Feb. 14, 2011), http://lubbockonline.com/interact/blog-post/bert-knabe/2011-02-14/two-privacy-bills-introduced-representative-jackie-speier-d.

43 Cf. Farhad Manjoo, No More Privacy Paranoia, Slate (April 7, 2011), http://www.slate.com/id/2290719/pagenum/all/#p2 (noting that regulators must carefully balance the costs of privacy protection with its benefits).


* Michael J. Kasdan was a Partner at Amster, Rothstein & Ebenstein LLP and is a 2001 graduate of NYU School of Law. He is a Facebook user. The views and opinions expressed in this article are his own. Mr. Kasdan also authored Student Speech in Online Social Networking Sites: Where to Draw the Line (November 22, 2010).



