https://arab.news/gndkb
In a world where artificial intelligence is on the lips of everyone, each for their own reasons, I have often wondered if the benefits it provides outweigh the risks.
Among the concerns we often hear, some valid and some gimmicky, are whether a super-intelligent AI will wipe out humanity, steal our jobs or enable killer robots that could annihilate one species and not another.
My concern has always been to ask: at what cost? And how AI will affect the future of humankind, including the norms and values on which our social and economic models have rested for centuries.
Above all, how much are the key ingredients of machine learning and AI costing the tech companies? Harvesting and labeling data is labor-intensive and expensive. Then there is the environmental footprint of the mega-processors, whose power consumption and cooling needs sit uneasily with efforts to reduce the emissions that cause global warming, an area that could form the discussion of another article all on its own.
The social impact of AI is yet to be determined. But since AI rests on data and large language models, it needs extensive human input during the training phase. Big tech companies have resorted to service providers that hire independent workers, often from low-income countries, who frequently work in less-than-perfect conditions and earn very little. This has led some people to warn about the creation of “digital sweatshops” and the exploitation of workers in the developing world.
Some people warn about the creation of ‘digital sweatshops’ and the exploitation of workers in the developing world
Mohamed Chebaro
Last year, Time magazine reported on how Kenyan workers who were contracted to monitor text data for ChatGPT for any “toxicity” were paid less than $2 per hour and were not compensated for being exposed to explicit and traumatic content.
Untold Magazine revealed last month that such workers were even recruited from refugee camps. Syrian refugees, some as young as 21, have been recruited from camps in Bulgaria. After brief initial training in “digital skills” and some English language tuition, they work part-time for data labeling companies doing what is known as “microwork” or “click work,” which employers describe as trivial and straightforward. But the work they engage in tells a different story.
These refugees spend their days labeling images of people based on race, gender and age, as well as carrying out what is known as the “semantic segmentation” of satellite-sourced images, a critical task for computer vision that involves the meticulous separation and labeling of every pixel in an image.
The report claimed that this form of data work holds particular importance in generating training data for AI, especially for computer vision systems embedded in devices such as cameras, drones and even weapons.
The tasks at hand called on workers to separate the trees from the bushes, cars from people, roads from buildings, and so on. But experts claimed that such work, belittled by employers as small, low-skilled and requiring no expertise, in reality demands contextual knowledge, including an understanding of what vegetation and vehicles look like in specific regions. This has made some suspicious that the resulting AI could be used for weapons and military applications.
According to the World Bank, there are between 154 million and 435 million data workers worldwide, the majority situated in the Global South. Their work tends to be on a freelance basis and they earn a few cents per piece or task, often commissioned by employment platforms or companies online. They have no employment protections whatsoever.
Their labor contributes to the development of algorithms that could discriminate and cause harm
Mohamed Chebaro
In the case of the Syrian refugees in Bulgaria, such suspicions are not unfounded, as the number of autonomous drones and similar technologies has grown dramatically over the past few years. Tech companies and militaries have been racing to integrate AI into their reconnaissance, target identification and decision-making processes, which many bill as the future tools of warfare, despite their clearly limited use in the Ukraine-Russia war and, more widely, in the Israeli war on Gaza.
Cheap labor is often the name of the game in gig or temporary freelance work, but in the context of generative AI and machine-learning models, the harm is multiplied. It is bad not only for the poorly paid workers, but also because their labor contributes to the development of algorithms that could discriminate and cause harm in a way that blurs the lines of accountability and transparency over incorrect or biased information, as the data often comes from third- or even fourth-party service providers.
One such risk has already materialized: AI-powered surveillance and facial recognition tools have been shown to discriminate on the basis of race and gender.
The ethical concerns and moral dilemmas facing the workers should also be addressed, as some of their work might adversely affect their own communities. Many data labelers are kept in the dark about who the end users are and how their work will be used.
If used properly, AI has the potential to be transformative and to make a real difference to everyday life. But AI companies’ lack of transparency, such as the veil of secrecy they draw over their programs’ development in the name of safeguarding their competitive edge, coupled with weak or nonexistent official scrutiny, encourages a culture of impunity, even if we accept the premise that all companies working in this sector have the best interests of humanity as their priority.
I am yet to be convinced, especially when business ventures are focused primarily on profit and power. Rarely are they set up for the universal benefit of humankind — not least their poorly compensated workers.
- Mohamed Chebaro is a British-Lebanese journalist with more than 25 years’ experience covering war, terrorism, defense, current affairs and diplomacy. He is also a media consultant and trainer.