YouTube has not detected child sexual abuse material (CSAM) on its platform based on “multiple thorough investigations”, nor has it received examples or evidence of any such content on the Google-owned video platform from regulators, a company spokesperson said in a statement on October 16.
This statement comes after the Ministry of Electronics and Information Technology (MeitY) issued notices on October 6 to YouTube and other platforms such as X (formerly Twitter) and Telegram, directing them to remove CSAM from their platforms.
These notices from MeitY stated that non-compliance would be deemed a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021, and could result in the loss of the safe harbour protection that social media platforms enjoy under Section 79 of the IT Act, according to a press release from the government’s Press Information Bureau.
Minister of State for Electronics and Information Technology Rajeev Chandrasekhar said, “The government is determined to build a safe and trusted internet under the IT rules. The IT rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms. If they do not act swiftly, their safe harbour under section 79 of the IT Act would be withdrawn and consequences under the Indian law will follow.”
YouTube said it has submitted its formal response to Indian regulators, but did not disclose any further details.
“We have a long history of successfully fighting child exploitation on YouTube. No form of content that endangers minors is allowed on YouTube, and we will continue to heavily invest in the teams and technologies that detect, remove and deter the spread of this content. We are committed to work with all collaborators in the industry-wide fight to stop the spread of CSAM,” the spokesperson said in the statement.
YouTube said it removed over 94,000 channels and more than 2.5 million videos for violations of its child safety policy in Q2 2023.
In India, the video-sharing platform said it surfaces a warning at the top of search results for specific queries related to CSAM; the warning states that child sexual abuse imagery is illegal and links to the National Cyber Crime Reporting Portal.
The company said that users must be at least 13 years old to use YouTube, or have the service enabled for them by a parent or legal guardian; accounts belonging to children under 13 without parental supervision are terminated when discovered. Additionally, YouTube disables comments, restricts live features and limits recommendations on videos that could expose minors to predatory attention.
YouTube said it also offers technology such as CSAI Match, an API that helps identify re-uploads of previously identified child sexual abuse material in videos, along with its expertise, to smaller partners and NGOs combating child sexual exploitation on their platforms. The company also encourages users and NGOs to flag content through its Priority Flagger initiative.
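CSAI Match’s internals and API are not public, so the following is only a minimal, hypothetical Python sketch of the general technique such matching systems rely on: fingerprinting segments of an upload and checking them against a database of fingerprints of previously identified material. Every name here is invented for illustration, and real systems use robust perceptual fingerprints that survive re-encoding, cropping and trimming rather than the exact cryptographic hashes used below.

```python
import hashlib


def fingerprint(segment: bytes) -> str:
    """Stand-in fingerprint for one segment of video data.

    An exact SHA-256 hash is used only to keep the sketch
    self-contained and runnable; production systems use perceptual
    fingerprints that tolerate transformations of the video.
    """
    return hashlib.sha256(segment).hexdigest()


def scan_upload(video_bytes: bytes,
                known_fingerprints: set[str],
                segment_size: int = 1 << 20) -> list[int]:
    """Return indices of segments whose fingerprints match known material."""
    matches = []
    for index, start in enumerate(range(0, len(video_bytes), segment_size)):
        if fingerprint(video_bytes[start:start + segment_size]) in known_fingerprints:
            matches.append(index)
    return matches


if __name__ == "__main__":
    # Hypothetical database seeded from previously identified material.
    known_segment = b"\xab" * (1 << 20)
    database = {fingerprint(known_segment)}

    # A dummy "upload" that re-uses the known segment as its middle third.
    upload = (b"\x00" * (1 << 20)) + known_segment + (b"\x01" * (1 << 20))

    flagged = scan_upload(upload, database)
    if flagged:
        print(f"Segments {flagged} match known material; route for human review.")
    else:
        print("No matches against the known-fingerprint database.")
```

In deployed systems, a match of this kind would typically be routed to human review and mandatory reporting channels rather than acted on automatically.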