Meta has launched its AI-based Variance Reduction System (VRS) in the US to reduce bias in the distribution of advertisements, after working in partnership with the Department of Justice for a year.
The company announced its plan to create the VRS for equitable distribution of ads in June last year, as part of a settlement with the DOJ, which was representing the US Department of Housing and Urban Development (HUD).
The development of the VRS was part of the settlement of a complaint filed in August 2018, which accused Meta’s social network platforms of violating the Fair Housing Act through their targeting options and delivery processes for housing advertisements. The company’s ads were said to favor certain demographic groups over others through indirect profiling.
The company said it will extend the use of the VRS to US employment and credit ads in the coming year.
“Additionally, we discontinued the use of Special Ad Audiences, an additional commitment in the settlement,” said Roy L. Austin Jr., vice president of civil rights and deputy general counsel at Meta, in a blog post. Special Ad Audiences was a feature that let advertisers use audience selection restrictions for ad sets related to housing, employment, and credit.
In 2016, an investigation by ProPublica revealed that advertisers were able to create Facebook ads that excluded people based on ethnic affinities.
The VRS and the elimination of Special Ad Audiences may help Meta rebuild trust with users, advertisers and regulators, but there are caveats, according to Forrester Principal Analyst Brandon Purcell.
"While this is a step in the right direction, advertisers need to realize that by using it, they are implicitly agreeing to Meta’s definition of fairness," Purcell said, in a statement emailed to Computerworld. "Mathematically speaking, there are 21 different ways of representing 'fairness.' Here, Meta seems to be optimizing equal 'accuracy' across groups — to give a simple example, men and women who are equally eligible should have the same likelihood of seeing the ad. The question is who determines 'eligibility'? This approach ignores all the historical inequities which are codified in the data that these systems rely on."
How VRS works
“The VRS uses new machine learning technology in ad delivery so that the actual audience that sees an ad more closely reflects the eligible target audience for that ad,” Meta said.
The ad-buying process begins with the ad creator defining several aspects of their housing campaign. Unlike before, ad creators in the new system cannot use targeting features such as age, gender, and postal code when defining the audience for their ads. Once approved, the ad joins the millions of ads already running on Meta’s platforms, the company explained in a video.
Once the ad is seen by a certain number of users, the VRS will measure the aggregate age, gender, and estimated race or ethnicity distribution of those who have seen the ad and compare that with the broader eligible audience who could have seen the ad.
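Meta has not published the metric it uses for this comparison, but the aggregate step it describes can be sketched as measuring the distance between two demographic distributions. In the sketch below, the groups, counts, and choice of total variation distance are all assumptions for illustration.

```python
# Sketch of the aggregate comparison step: given the demographic mix of
# users who saw the ad and of the eligible audience, measure how far
# apart the two distributions are. The groups, counts, and the choice of
# total variation distance are assumptions, not Meta's published metric.

def normalize(counts: dict) -> dict:
    """Turn raw counts into a probability distribution over groups."""
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

def total_variation(p: dict, q: dict) -> float:
    """Half the L1 distance between two distributions on the same groups."""
    return 0.5 * sum(abs(p[g] - q[g]) for g in p)

eligible = normalize({"18-34": 5_000, "35-54": 3_000, "55+": 2_000})
observed = normalize({"18-34": 2_900, "35-54": 1_200, "55+": 400})

print(f"skew vs. eligible audience: {total_variation(observed, eligible):.3f}")
```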
To determine the estimated race or ethnicity distribution, the VRS relies on a Bayesian Improved Surname Geocoding (BISG) method with added privacy enhancements. BISG is a method developed by the RAND Corporation that combines surnames with geocoded census data to estimate the probable race or ethnicity of individuals.
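BISG itself is a published Bayesian calculation, so its core posterior update can be sketched directly. The probability tables below are invented, and the sketch omits the privacy enhancements Meta layers on top.

```python
# Minimal BISG sketch: combine a race/ethnicity prior conditioned on
# surname with the geographic distribution of each group, then
# renormalize (Bayes' rule). The probability tables are invented; real
# BISG uses Census surname files and geocoded Census block-group data,
# and Meta adds privacy protections on top.

RACES = ["white", "black", "hispanic", "asian"]

# P(race | surname), from a made-up surname table
P_RACE_GIVEN_SURNAME = {
    "garcia": {"white": 0.05, "black": 0.01, "hispanic": 0.92, "asian": 0.02},
}

# P(geo | race): the share of each group's national population living in
# this geography, also made-up numbers.
P_GEO_GIVEN_RACE = {
    "tract_123": {"white": 0.0001, "black": 0.0004,
                  "hispanic": 0.0009, "asian": 0.0002},
}

def bisg_posterior(surname: str, geo: str) -> dict:
    """P(race | surname, geo) proportional to P(race | surname) * P(geo | race)."""
    prior = P_RACE_GIVEN_SURNAME[surname]
    likelihood = P_GEO_GIVEN_RACE[geo]
    unnormalized = {r: prior[r] * likelihood[r] for r in RACES}
    z = sum(unnormalized.values())
    return {r: v / z for r, v in unnormalized.items()}

print(bisg_posterior("garcia", "tract_123"))  # hispanic dominates, ~0.98
```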
“This method is built with added privacy enhancements including differential privacy, a technique that can help protect against re-identification of individuals within aggregated datasets,” Meta said.
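Meta has not published its exact mechanism, but differential privacy is commonly implemented by adding calibrated noise to aggregate counts before they are used downstream. A generic Laplace-mechanism sketch, with illustrative parameter values:

```python
import numpy as np

# Generic Laplace-mechanism sketch: adding noise with scale
# sensitivity/epsilon bounds how much any one person's data can shift a
# released aggregate. Parameter values are illustrative; Meta has not
# published the details of its mechanism.

rng = np.random.default_rng(seed=42)

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise of scale sensitivity/epsilon."""
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

impressions = {"18-34": 2_900, "35-54": 1_200, "55+": 400}
noisy = {g: private_count(c, epsilon=1.0) for g, c in impressions.items()}
print(noisy)  # each count is perturbed by a small random amount
```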
The next time the ad is displayed, the VRS will use the latest aggregate demographic measurements to distribute the ad to an audience that more closely reflects the eligible target audience. As more people view the ad, the VRS will remeasure and update accordingly, adjusting the pacing of ads to help ensure equitable distribution.
“Before the VRS ever makes adjustments in our ad system, it is trained through a form of machine learning called reinforcement learning; here the VRS learns how to use pacing effectively to reduce demographic differences,” Meta said.
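Putting those pieces together, the loop below sketches the measure-and-adjust cycle in simplified form. Meta says the real adjustment policy is trained with reinforcement learning; this stand-in uses a plain proportional correction instead, and every name and number is an assumption.

```python
# Simplified measure-and-adjust pacing loop. Meta says the real policy is
# trained with reinforcement learning; this stand-in uses a plain
# proportional correction, and every name and number is an assumption.

def update_pacing(pacing: dict, observed: dict, eligible: dict,
                  step: float = 0.5) -> dict:
    """Nudge each group's pacing multiplier toward its eligible share."""
    updated = {}
    for group, multiplier in pacing.items():
        gap = eligible[group] - observed[group]  # >0 means under-delivered
        updated[group] = max(0.1, multiplier * (1.0 + step * gap))
    return updated

pacing   = {"18-34": 1.0, "35-54": 1.0, "55+": 1.0}
eligible = {"18-34": 0.50, "35-54": 0.30, "55+": 0.20}  # who could see it
observed = {"18-34": 0.64, "35-54": 0.27, "55+": 0.09}  # who saw it so far

pacing = update_pacing(pacing, observed, eligible)
print(pacing)  # "55+" is boosted, "18-34" is throttled
```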
Meanwhile, privacy-related legal issues persist for Meta, which was fined $414 million by the Irish Data Protection Commission on January 4 for using personal data to personalize ads on its Facebook and Instagram services.
(This story has been updated with comments from Forrester analyst Brandon Purcell.)