Expert Insights With Balaji Kapsikar


The world of deepfake technology has undergone a remarkable transformation, moving beyond its initial role as a tool for face-swapping in explicit content or enhancing gaming experiences. From influencing political narratives to shaping hyper-realistic simulations, deepfakes have become a powerhouse for content creation. Yet amid this progress, one facet has not faded with time: the dark side of deepfake technology.

Technology and cyber risk expert Balaji Kapsikar, who brings 13 years of diverse experience, shared his insights on the evolving nature of deepfake technology and its emerging trends during a conversation with TCE at CyberCon 2023.

As the Head of Technology & Cyber Risk at Funding Societies Singapore, Kapsikar shed light on the intricate facets of deepfake technology and its implications for our interconnected society.

What Is a Deepfake, and How Accessible Is It?

Deepfake technology, a portmanteau of “deep learning” and “fake,” uses advanced generative AI techniques to seamlessly replace one person’s face with another in images and video. Driven by artificial intelligence, the technique has evolved rapidly, raising concerns about its accessibility and potential for misuse.

In a conversation with TCE, Kapsikar emphasized the increasing accessibility of deepfake tools, driven by the integration of artificial intelligence and machine learning. These tools are now readily available, and users have become far more familiar with them since the introduction of technologies like ChatGPT.

The proliferation of AI-generated tools, including images and deepfake videos, has provided scammers with a powerful arsenal to exploit unsuspecting individuals, leading to concerns about misinformation, identity theft, and privacy breaches.

“Artificial intelligence and machine learning are at the core of deepfake technology, and since we are now living in a technological era, deepfake technology is becoming easily accessible to any user. That happened recently, after the introduction of ChatGPT. People are getting more familiar with many AI-generated tools, like AI-generated images or AI-generated deepfake videos. Scammers especially are using this particular technology to scam your close ones,” said Balaji Kapsikar.

The Dark Side of Deepfake Technology

Kapsikar pointed out the darker dimensions of deepfake technology, noting its rise as a tool for scams and deceptive practices. Exploiting the capabilities of AI, malicious actors can manipulate images of individuals, altering ages and creating deepfake videos for nefarious purposes.

“Deepfake is now getting more popular as a kind of scamming tool as well. It can easily be misused. With AI, even if you have only a photograph of a child, you can increase the age using AI, and from that adult-age image you can create a deepfake video. Some of these things may happen, and especially in the case of teenage girls, those deepfake videos can be used in a sextortion kind of attack against them,” added Kapsikar.

The dark side of deepfake technology extends beyond personal privacy concerns, encompassing the potential for malicious applications and deceptive practices. As accessibility to these tools grows, so does the risk of deepfakes being weaponized to manipulate public opinion, fabricate false narratives, and damage reputations by convincingly altering audio and video content.

A growing threat emerges as cybercriminals leverage deepfake technology for fraudulent activities and social engineering. The ethical implications are profound, necessitating the implementation of robust countermeasures to mitigate the harmful consequences and safeguard individuals and society from the malevolent aspects of deepfake advancements.

Media Disclaimer: This report is based on internal and external research obtained through various means. The information provided is for reference purposes only, and users bear full responsibility for their reliance on it. The Cyber Express assumes no liability for the accuracy or consequences of using this information.