The Ministry of Electronics and Information Technology has prepared a report on the key findings related to India in the internal Facebook documents disclosed by whistleblower Frances Haugen, including alleged discrepancies in algorithmic recommendations that lead new users in the country to "misinformation and hate speech."
Sources said: "If needed, we will call their executives to explain how their algorithms work and the action they have taken so far to counter misinformation and hate speech. For now, we will have to study (the revelations made by Haugen)."
The report is likely to be finalized this week. It will also detail how Facebook failed to check the spread of misinformation and hate speech on its platform in India because it lacked the right tools to flag or monitor content in Hindi and Bengali.
The findings of a Facebook researcher in Kerala, whose self-created user account encountered several instances of hate speech and misinformation through the platform's algorithmic recommendations, are also likely to be included in the report, sources said.
In her complaint to the US Securities and Exchange Commission (SEC), Haugen said that despite being aware that "RSS users, groups, and pages promote fear-mongering, anti-Muslim narratives", Facebook could not take action or flag this content, given its "lack of Hindi and Bengali classifiers".
Apart from Haugen's revelations regarding Facebook's alleged inaction on hate speech and misinformation spreading in India, The New York Times reported that, based on the algorithmic recommendations made to a test user account it had created, the company had undertaken "deeper, more rigorous analysis" of its recommendation systems in India.
“This exploratory effort of one hypothetical test account inspired deeper, more rigorous analysis of our recommendation systems, and contributed to product changes to improve them. Product changes from subsequent, more rigorous research included things like the removal of borderline content and civic and political Groups from our recommendation systems,” a Facebook spokesperson had said.
The New York Times further reported that the Facebook researcher's report "was one of the dozens of studies and memos written by Facebook employees grappling with the effects of the platform on India". The report added: "They provide stark evidence of one of the most serious criticisms levied by human rights activists and politicians against the world-spanning company: It moves into a country without fully understanding its potential effects on local culture and politics and fails to deploy the resources to act on issues once they occur."