Meta is set to overhaul its Facebook advertising algorithm within this year. Allegations have been raised that the algorithm restricts which housing advertisements users can see based on race, gender, and zip code. The U.S. Justice Department upheld the charges and fined Meta $115,054.
The New York Times (NYT) reported on the 21st (local time) that Meta has decided to completely replace the system in order to reduce discrimination caused by Facebook’s advertising algorithm. According to an announcement released the same day by the U.S. Department of Justice, Meta was accused of using data about race, religion, disability, occupation, family status, and national origin to manipulate access to housing ads.
The Justice Department focused on ‘Lookalike Audience’ and ‘Special Ad Audience’, ad-targeting tools developed by Meta. These tools find Facebook users whose data resembles that of seed groups selected by advertisers.
The Justice Department said that Meta used these tools to restrict which users could see housing ads based on race, religion, and gender.
How Facebook’s advertising system will change
The U.S. Department of Justice determined that Facebook’s advertising algorithm violated the Fair Housing Act. Although everyone has the right to see housing ads, Facebook’s algorithm blocked some users from seeing them. As a result, Meta was fined $115,054.
According to the New York Times, the Justice Department has also called on Meta to stop using the targeted advertising tools (Lookalike Audience, Special Ad Audience) by the end of this year, at least for housing advertisements. It also demanded the development of a new advertising system, to address the situation in which access to Facebook ads is controlled by algorithms based on users’ race, gender, and other characteristics.
Once the new system is in place, the Justice Department will select a third-party reviewer to conduct continuous oversight, checking whether Facebook’s advertising algorithm still, as before, shows ads only to certain users. If Meta fails to meet the Department’s requirements, or the new system is found deficient, the United States will file a lawsuit against Meta in federal court.
The bias in Facebook’s advertising algorithm was first revealed in a 2016 report by ProPublica, a U.S. non-profit news outlet. That article was the first to show that marketers could exclude certain racial groups for advertising purposes.
Then, in 2019, Ben Carson, then Secretary of Housing and Urban Development, publicly criticized Facebook for an advertising system that viciously discriminated against people based on race, religion, or disability. That same year, the Department of Housing and Urban Development (HUD) charged Facebook with violating the Fair Housing Act and aggravating housing discrimination, criticizing Facebook’s housing advertising system for failing to reach a diverse range of users.