LEGAL CORNER: Update on Privacy Issues, the Fair Housing Act and a Lawsuit Against Facebook
John Dolgetta, Esq. | April 2018
On March 27, 2018, the National Fair Housing Alliance, the Fair Housing Justice Center Inc., Housing Opportunities Project for Excellence, Inc. and the Fair Housing Council of Greater San Antonio filed a lawsuit against Facebook, Inc. in the United States District Court, Southern District of New York (see https://bit.ly/2HprYkb), alleging that Facebook’s advertising platform violates the Fair Housing Act (FHA).
The lawsuit charges that Facebook allows users and advertisers to specifically exclude or block individuals who are members of legally protected classes under both federal and state law from receiving advertisements in connection with the sale and rental of real property and dwellings.
Private Information and User Data—‘The Holy Grail’
In March 2016, we addressed the importance of data security and privacy policies in an article entitled “Beware of Cyber Threats: The Importance of Implementing Data Security and Privacy Policies” (see https://bit.ly/2JwP6h7). That article focused on the specific threats that arise when a client’s private information is breached and the risks associated with such a breach. In recent weeks, privacy concerns have again been at the center of attention in light of the Facebook and Cambridge Analytica scandal. Mark Zuckerberg was called to testify before Congress to address the issues surrounding the use and dissemination of private data by Facebook.
One of the issues discussed during the first day of testimony (held on Tuesday, April 10, 2018) dealt with the use of Facebook’s advertising platform by users, specifically real estate professionals, to engage in illegal discrimination in violation of the FHA and New York State fair housing and anti-discrimination laws.
Facebook, like Google and Twitter, amasses incredible amounts of private data and user information, which is integral to the operation of its business. While many view these companies as online technology companies, they have actually become some of the largest media companies in the world. The private information collected is the “holy grail” for these technology/media giants, and their success and viability depend on it. The primary use of this information is targeted advertising. Facebook made over $40 billion in revenue from advertising in 2017, which comprised nearly 98% of its total revenue.
The gathering of private information of individuals is widespread. In this fast-paced technological environment, real estate agents, attorneys, brokerage firms, Realtor associations and multiple listing services must not only implement policies to safeguard private information, but must also be cognizant of the fact that companies like Facebook, Twitter and Google are utilizing a consumer’s private information in order to allow for targeted advertising. These targeted ads permit users to engage in the illegal discriminatory practices alleged in the complaint against Facebook. Unfortunately, to date, there has been little regulation that would restrict or control the use and dissemination of said private information.
Facebook and Violations of the Fair Housing Act
This year marks the 50th anniversary of the Fair Housing Act and, unfortunately, the discrimination it sought to protect against is still occurring. According to the FHA, it is unlawful “[t]o make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin, or an intention to make any such preference, limitation, or discrimination.” (See 42 U.S.C. § 3604(c) at https://bit.ly/2v25L94). It is clear from the language above that not only those who create the advertisements are subject to liability, but those who “cause” the ads “to be made, printed, or published” are liable as well. The FHA further provides, in pertinent part:
“Discriminatory notices, statements and advertisements include, but are not limited to: (1) Using words, phrases, photographs, illustrations, symbols or forms which convey that dwellings are available or not available to a particular group of persons…(3) Selecting media or locations for advertising the sale or rental of dwellings which deny particular segments of the housing market information about housing opportunities…. (4) Refusing to publish advertising for the sale or rental of dwellings or requiring different charges or terms for such advertising….” (See 24 C.F.R. Section 100.75 at https://bit.ly/2v25L94).
Throughout the hearing, Congressional leaders likened Facebook to a publisher rather than a technology company, and to that end, it was generally the view that Facebook was ultimately responsible for policing the content published on its platform and for the advertising that is approved for publication and disseminated to its two-billion-member community. The law is well settled that the “publisher” is prohibited from “publishing” content that is unlawful, and that “…these prohibitions apply to both the person who drafted or placed the ad as well as the publisher of the ad because the negative effect of discriminatory advertising would be magnified if widely circulated by newspapers and other mass media.” (See United States v. Hunter, 459 F.2d 205, 215 (4th Cir. 1972)).
The Plaintiffs explain that Facebook users who create housing advertisements are permitted to select, on the advertising application, certain descriptive criteria they would like “excluded” from or “included” in the target audience. For example, advertisers are permitted to select pre-populated demographic fields and other descriptive fields, such as “parents of toddlers (01-02)”, “ethnicity” (although the “ethnicity” category was removed in late 2017), “disabled” and many more. In addition, advertisers are also permitted to insert customized keywords that they would like “excluded” from or “included” in a particular targeted housing advertisement. The advertisement is then submitted to Facebook for approval, and once approved, it is published on the user’s personal page or boosted to potentially millions of Facebook users.
Once the advertisement is posted, Facebook then utilizes algorithms to target specific Facebook users who align with the specifically chosen parameters and descriptive terms or categories. Facebook, through its algorithms, uses private information garnered from an individual’s Facebook account, including the user’s “likes,” friends, photographs, postings, “shared” content, etc., in order to restrict certain Facebook users from viewing or having access to a particular housing advertisement. Ultimately, this targeted form of advertising is by its very nature discriminatory if legally protected classes are being blocked or restricted.
The individual users on Facebook would not even be aware that they are being discriminated against because they would never see the advertisement or the underlying descriptive parameters that direct the ads to only those who have been “targeted” to receive them. In essence, a user’s own private data is being used against the user and, more importantly, being used for purposes that may clearly violate the law.
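The exclusion mechanism described above can be illustrated with a minimal, hypothetical sketch. The attribute names and filtering logic here are assumptions chosen for illustration only, not Facebook’s actual code; the point is simply that an excluded user is silently dropped from the audience and never learns the ad existed:

```python
# Hypothetical sketch of exclusion-based ad targeting (illustrative only;
# attribute names and matching logic are assumptions, not Facebook's
# actual implementation).

def eligible_audience(users, excluded_attrs):
    """Return only users whose profile shares no attribute with the ad's
    exclusion list; excluded users never see the ad at all."""
    return [u for u in users if not (u["attrs"] & excluded_attrs)]

users = [
    {"name": "A", "attrs": {"parent_of_toddlers"}},
    {"name": "B", "attrs": {"likes_hiking"}},
]

# An advertiser excludes "parents of toddlers" from a housing ad:
audience = eligible_audience(users, {"parent_of_toddlers"})
# Only user B is ever shown the ad; user A receives no notice of it.
```

Because the filtering happens before the ad is delivered, the excluded user has no visible signal that any discrimination occurred, which is precisely the harm the complaint alleges.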
Real Estate Licensees Cannot Hide Behind Social Media Platforms
It is clear that Facebook’s advertising platform, and other similar media and technology company platforms, are an incredibly effective way for individuals and real estate professionals to advertise and generate business. However, Facebook and other companies should not be permitted to engage in discriminatory practices and should not allow or aid their users in engaging in unlawful behavior. Although the creator of the ultimate advertisement has the option not to participate in discriminatory practices, by allegedly allowing individuals, through its selection and advertisement creation process, to pick specific categories that are clearly “protected classes,” Facebook is creating the impression that housing segregation based on race, gender, ethnicity and other protected classes is legal or acceptable, thus facilitating future discrimination by others using its advertising platform.
It is also extremely important for principal brokers and managers to review and approve all advertising before it goes live. They must be aware of the issues surrounding the use of social media platforms and warn licensees to refrain from engaging in discriminatory behavior. Agents may not even realize that the parameters they are selecting violate the law. Brokers must inform licensees of the pitfalls and potential for liability when creating and placing ads on social media sites. Brokers are responsible for the supervision of their agents, and liability will ultimately extend to the principal brokers if they fail to adequately supervise their agents.
What Government Agency is in Charge of Protecting Consumers’ Privacy?
The Federal Trade Commission is charged with protecting consumers and enhancing competition. The FTC’s legal authority comes from Section 5 of the Federal Trade Commission Act, which prohibits “unfair or deceptive practices” in the marketplace. One of the functions of the FTC is to protect a consumer’s privacy and promote data security. The FTC issues an annual “Privacy and Security Update,” wherein it outlines the enforcement actions and lawsuits it has initiated involving privacy and data breach issues. (See https://bit.ly/2mQTE7w). In 2017, the FTC brought enforcement actions involving more than 130 spam and spyware cases, and initiated more than 50 lawsuits involving general privacy matters. Since 2002, the FTC has also brought more than 60 cases against companies involving data breaches.
Is Anyone Looking Out For the Consumer?
The massive dissemination of the private user information of more than 87 million Facebook users, without their knowledge or permission, makes clear that there is a need for regulation.
Don’t Tread Lightly
In this age of social media advertising, the use of social media platforms for networking, e-mails, text messaging, transmission and storage of electronic data over wireless networks, storage of private information in the cloud, Wi-Fi, remote access and much more, the control we have over our private data and information is slowly diminishing. The loss of control can result in unwanted and unforeseen consequences, as is clearly evident with the potential for discrimination on these social media platforms and the use of private information as occurred in the Cambridge Analytica scandal. While legislation is clearly needed, the amount of regulation needs to be carefully considered. If control over private information is turned back over to the individual, both the individual and the companies can thrive.
Editor’s Note: The foregoing article is for informational purposes only and does not confer an attorney-client relationship.