WhatsApp's New Feature
WhatsApp recently added a new feature that is causing controversy. The feature uses artificial intelligence to generate stickers, but when users type the word "Palestine" or similar terms, the stickers that appear often depict children carrying weapons. When users type "Israeli boy," by contrast, the stickers show children playing soccer and reading.
Guardian's Investigation
The Guardian ran its own tests to verify the claims. It found that searches for terms like "Palestinian" or "Palestine" produced stickers featuring weapons, while searches for "Israeli boy" returned only innocent images of children. The newspaper published screenshots documenting the results.
Meta's Response
Meta, the company that owns WhatsApp, said the bias was not intentional. It explained that its artificial intelligence sometimes makes mistakes and produces inappropriate results, and it promised to address the problem and improve the feature.
Concerns and Calls for Investigation
The images have angered many people, who say that depicting Palestinian children with guns is racist and unfair. An Australian senator has even called for an investigation into the stickers and wants Meta to be held accountable.
Meta's Troubles with Palestinian Content
This is not the first time Meta has faced criticism over how it handles content related to Palestinians. The company has previously been accused of suppressing posts that support Palestinians and of treating them unfairly, and a study found that its actions during a conflict in 2021 violated the rights of Palestinian users.
WhatsApp's new feature has stirred considerable controversy. AI tools should treat everyone fairly and not produce biased images. Will Meta fix this problem?
Source: The Guardian