Meta takes a step in the right direction with its cross-check program, but it is far from a significant one.
Meta has responded to the recommendations of its Oversight Board regarding its controversial cross-check program, which shields VIP users from the company’s automated content moderation systems. In its response, Meta agrees to adopt many of the board’s suggestions, but stops short of changes that would make it more transparent which accounts are enrolled in the program.
Meta takes a step in the right direction with its cross-check program
The Menlo Park company’s response comes after the board publicly criticized the program, accusing it, in part, of putting “company interests” ahead of people’s rights. While the company described the program as a “second layer of analysis” meant to help it avoid mistakes, the Oversight Board noted that cross-check cases are often so backlogged that harmful content stays up far longer than it should.
In all, Meta has committed to adopting, at least in part, 26 of the board’s 32 recommendations. These cover how cross-check cases are handled internally, as well as a promise to share more information about the program with the Oversight Board. The company also says it will work to reduce its backlog of pending cases.
But it stops short on the most important points.
Perhaps most importantly, Meta has balked at the board’s recommendation to publicly identify the politicians, actors, business executives and other public figures who enjoy cross-check protection. The company said that publicly identifying accounts enrolled in the program “may have many unforeseen consequences that render the program useless and unmaintainable” and argued that it could open the program up to abuse.
The American giant also declined to implement anything that would let people know whether they benefit from cross-check. Meta rejected a recommendation to require users in the program to “explicitly agree” to abide by the platform’s terms of service. And it said it is “evaluating the feasibility” of a recommendation to let users opt out of cross-check, which would also mean notifying them that they are part of the program. “We will be working with human rights and civil rights groups to evaluate our options for managing this aspect to increase user autonomy from cross-check,” the company wrote.
While Meta’s response shows that the company is willing to make changes to the program, it is also clear that it remains deeply reluctant to reveal the most important details. That echoes earlier criticism from the Oversight Board, which last year accused the company of not being “fully open” about cross-check.