Meta action on explicit deepfakes under review by Oversight Board

Meta’s Oversight Board will review two cases about how Facebook and Instagram handled content containing artificial intelligence (AI)-generated nude images of two famous women, the board announced Tuesday. The board is soliciting public comments about concerns around AI deepfake pornography as part of its review of the cases.

One case concerns an AI-generated nude image made to look like an American public figure, which Facebook removed automatically after a previous post of the image was identified as violating Meta’s bullying and harassment policies. The other case concerns an AI-generated nude image made to resemble a public figure from India, which Instagram did not initially remove after it was reported. The image was later taken down after the board selected the case and Meta determined the content had been left up “in error,” according to the board. The board is not naming the individuals involved to prevent further harm or risk of gender-based harassment, a spokesperson for the Oversight Board said.

The board, which operates independently from Meta and is funded by a grant provided by the company, can issue binding decisions about content, but its policy recommendations are non-binding and Meta has final say over which it chooses to implement.

The board is seeking public comments on strategies Meta could use to address deepfake porn, as well as on the challenges of relying on automated systems that can close appeals within 48 hours if no review has taken place. The case in India, where a user reported the explicit deepfake, was…