Faked nude images of more than 100,000 women have been created from social media pictures and shared online, according to a new report.

Clothes are digitally removed from pictures of women by artificial intelligence (AI), and the results are spread on the messaging app Telegram. Some of those targeted "appeared to be underage", said the report by intelligence company Sensity.

Deepfakes are computer-generated, often realistic images and videos based on a real template. One of their uses has been to create faked pornographic video clips of celebrities, but Sensity's chief executive, Giorgio Patrini, said the shift to using photos of private individuals is relatively new.
Update, June 27.

A programmer created an application that uses neural networks to remove clothing from images of women, making them look realistically nude. The software, called DeepNude, takes a photo of a clothed person and creates a new, naked image of that same person.