Google’s monthly transparency report said it received 26,087 complaints from users in November and removed 61,114 pieces of content based on those complaints. The report also shows that Google removed a further 3,75,468 pieces of content during the month as a result of automated detection.
In October, Google received 24,569 complaints from users and removed 48,594 pieces of content based on those complaints, while 3,84,509 pieces of content were removed as a result of automated detection. These disclosures are required under India’s IT Rules, which came into force in May.
The complaints relate to third-party content that is believed to violate local laws on Google’s significant social media intermediary (SSMI) platforms.
“Some requests may allege infringement of intellectual property rights, while others claim violation of local laws prohibiting types of content on grounds such as defamation. When we receive complaints regarding content on our platforms, we assess them carefully,” it added.
The content was removed under various categories, including copyright, trademark, court orders, and graphic sexual content.
According to Google, a single complaint can specify multiple items, which may relate to the same or to different pieces of content, and each unique URL in a complaint is counted as a separate “item”. For example, a complaint listing three distinct URLs counts as three items.
The “removal actions” figure for user complaints is the number of items where content was removed or restricted as a result of a specific complaint during the one-month reporting period. The “removal actions” figure for automated detection is the number of instances where Google removed content or prevented a bad actor from accessing its services as a result of automated detection processes.
In addition to acting on user reports, Google said it invests substantially in combating harmful content online and uses technology to detect and remove it from its platforms.
“This includes using automated detection processes for some of our products to prevent the dissemination of harmful content such as child sexual abuse material and violent extremist content.
“We balance privacy and user protection to quickly remove content that violates our Community Guidelines and content policies; restrict content (e.g., age-restrict content that may not be appropriate for all audiences); or leave the content live when it doesn’t violate our guidelines or policies,” it added.
Under the IT Rules, platforms with more than 5 million users must publish monthly compliance reports detailing the number of complaints received and the action taken on them.