Instagram recommendation algorithms drive pedophile networks, research reveals

    The researchers discovered that the social network allows users to search for hashtags related to child sexual abuse, which lead to profiles that seek to sell pedophile content or even arrange meetings with minors.

    Instagram’s* recommendation algorithms help connect and promote a wide network of profiles that sell child sexual abuse material, according to a joint investigation by The Wall Street Journal and researchers from Stanford University and the University of Massachusetts Amherst (USA).

    The researchers discovered that the social network allowed users to search for hashtags related to child sexual abuse, including explicit terms such as #pedowhore or #preteensex, which led to profiles offering pedophile content, including videos of minors having sex with animals or with other children.


    When searching for those tags, the platform warned users that “these results may contain images of child sexual abuse” and stated that the production and consumption of such material causes “extreme harm” to minors, while offering the options to “get resources” on the topic or “see results anyway”.


    The experts created test profiles that followed just one of the accounts using the aforementioned hashtags, and the algorithms immediately began recommending sellers and buyers of child sexual content, as well as accounts linked to off-platform sales sites. In their research, the researchers found 405 vendors.


    Similar activity was discovered on other social networks. On Twitter, 128 profiles were found offering to sell child sexual abuse material. According to the investigation, however, that platform did not recommend these accounts to the same extent as Instagram and removed them much faster.

    “The most important platform for these networks of buyers and sellers seems to be Instagram,” the experts pointed out.

    In a comment to The Wall Street Journal, a Meta spokesperson said that thousands of hashtags that sexualize minors have been blocked, and that restrictions have been put in place so that the algorithms do not recommend searches for terms related to sexual abuse. “Child exploitation is a horrible crime,” the spokesperson said, adding that the company is “continually investigating ways to actively defend against this behavior” and that in the last two years it has dismantled 27 pedophile networks.


    * Social network belonging to the Meta company, classified in Russia as an extremist organization.

    Source: RT
