Data poisoning tool ‘Nightshade’ might impact AI 

The tool modifies images so that their inclusion contaminates the data sets used to train AI


According to Cointelegraph, researchers from the University of Chicago have developed a tool that allows artists to “poison” their digital art. The tool is intended to stop developers from training artificial intelligence (AI) systems on their work without permission.

Sources revealed that the tool modifies images so that their inclusion contaminates the data sets used to train AI with incorrect information. The tool is called “Nightshade,” after the family of plants known for their poisonous berries.

According to a report by MIT Technology Review, Nightshade subtly changes the pixels of a digital image in a way that tricks an AI system into misinterpreting it. For example, the poisoned data could convince a model that an image of a cat is a dog, and vice versa, Cointelegraph added.
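Nightshade’s actual optimization method is not described in the article. Purely as an illustration of the general idea, the sketch below (a hypothetical `poison_image` helper, not Nightshade’s code) adds a small, bounded perturbation to each pixel of an image, so the change is nearly invisible to a person but alters the data a model would train on:

```python
import numpy as np

# Illustrative sketch only: Nightshade's real perturbations are optimized
# to mislead specific models. Here we just bound a pseudo-random change
# to +/- epsilon intensity levels per channel.
def poison_image(image: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Return a copy of an 8-bit image with a small bounded perturbation."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

image = np.full((4, 4, 3), 128, dtype=np.uint8)  # toy gray image
poisoned = poison_image(image)
# No pixel moves by more than epsilon levels, so the edit is hard to see.
max_change = np.abs(poisoned.astype(int) - image.astype(int)).max()
print(max_change)
```

A real poisoning attack would choose the perturbation adversarially rather than randomly, which is what makes it effective against training pipelines.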

Furthermore, “The researchers don’t yet know of robust defences against these attacks, the implication being that even robust models such as OpenAI’s ChatGPT could be at risk,” Vitaly Shmatikov, a professor at Cornell University, concluded.

(With insights from Cointelegraph)



This article was first uploaded on October 25, 2023, at 9:40 am.