X (formerly Twitter) resolves complaints urging the platform to review copyright-infringing content more quickly than those concerning non-consensual intimate images and videos (NCII), according to a recent study.
The social network lets users report possible breaches of its Rules and Terms of Service through different mechanisms, covering content in specific posts, lists, and direct messages, among others. Likewise, users can report child sexual exploitation material or unauthorized use of copyrighted material.
A period of up to 30 days
In all cases, the platform says it confirms receipt of correctly submitted complaints within 24 hours. However, it warns that, although complaints are usually resolved “in a few days”, resolution times vary and can take up to 30 days. This depends on factors outside the social network’s control, such as whether a user needs to submit additional information or decides to request a review of the measures taken.
A group of researchers has found that X prioritizes reports of copyright infringement – that is, those submitted under the Digital Millennium Copyright Act (DMCA) – over those demanding the removal of intimate content published without the consent of the people depicted.
The latter may include material captured with hidden cameras; content showing full or partial nudity or sexual acts; videos and images taken in an intimate context; or content that superimposes or otherwise digitally manipulates one person’s face onto the unclothed body of another, known as a ‘deepfake’.
‘Revenge Porn’
A group of experts has carried out an investigation into the publication of intimate images without the consent of those depicted – or threats to publish them – which is also called ‘revenge porn’, arguing that specific legislation is urgently needed to remove NCII online, both on this and other platforms.
Against the backdrop of this growing problem, they studied how X handles reports of non-consensual intimate images and videos (NCII).
To reach this conclusion, they created ten different accounts on X and uploaded a total of 50 nude images generated by Artificial Intelligence (AI). Half were reported through the ‘copyright infringement’ mechanism, while the other half were reported for containing non-consensual nudity.
The researchers note in their report that, to test X’s capabilities, they used five unique photographs, each representing an AI-generated persona. This ensured that the study did not depend on a single image to represent all NCII cases. Each of the five photographs was duplicated ten times, yielding five reports per photograph under each of the two reporting mechanisms.
Three weeks to delete a nude
Thus, the copyright condition resulted in the complete removal of the reported images within 25 hours, while X took far longer to resolve the complaints about non-consensual intimate images: no images were removed until more than three weeks after the platform had been notified. This means they remained visible throughout the review period.
And not only that: the posts reported for sexual content averaged 9.08 views after those three weeks, compared with an average of 7.36 views for the posts reported for copyright infringement. Although the difference is small – owing to the accounts’ recent creation – it shows that the NCII posts circulated more widely.
Finally, the experts say their report highlights the need for stronger regulations and protocols to protect victims of the dissemination of non-consensual content, and that it contributes to a “broader” understanding of platforms’ responsibilities and of how laws influence their behavior.