DigitalNewsServices

Deepfakes to make biometrics unreliable

Bengaluru: By 2026, attacks using AI-generated deepfakes on face biometrics will mean that 30% of enterprises will no longer consider such identity verification and authentication solutions to be reliable in isolation, according to research and advisory firm Gartner.
“Over the past decade, several inflection points in the field of AI have enabled the creation of synthetic images. These artificially generated images of real people’s faces, known as deepfakes, can be used by malicious actors to undermine biometric authentication or render it inefficient,” said Akif Khan, VP Analyst at Gartner. “As a result, organizations may begin to question the reliability of identity verification and authentication solutions, as they will not be able to tell whether the face of the person being verified is a live person or a deepfake.”
The spread of deepfakes has emerged as a major threat. Identity verification and authentication processes that use face biometrics today rely on presentation attack detection (PAD) to assess user liveness. PAD leverages software and hardware to combat biometric fraud. “Current standards and testing processes for defining and assessing PAD mechanisms do not cover digital injection attacks using the AI-generated deepfakes that can be created today,” Khan said.
Gartner's research states that presentation attacks are the most common attack vector, but injection attacks increased by 200% in 2023. Preventing such attacks will require a combination of PAD, injection attack detection (IAD), and image inspection.
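The layered defense described above can be illustrated with a minimal sketch. The function and score names below are hypothetical, not part of any Gartner framework or vendor API; the sketch simply assumes each layer (PAD, IAD, image inspection) produces a confidence score in [0, 1], and that all layers must pass, so a strong liveness score alone cannot compensate for a failed injection check.

```python
from dataclasses import dataclass


@dataclass
class VerificationSignals:
    """Hypothetical scores in [0, 1] from three independent defense layers."""
    pad_score: float    # presentation attack detection (liveness of the user)
    iad_score: float    # injection attack detection (integrity of the camera feed)
    image_score: float  # image inspection (deepfake artifact analysis)


def is_verification_trustworthy(signals: VerificationSignals,
                                threshold: float = 0.8) -> bool:
    """Accept only when every layer clears the threshold: one strong
    layer cannot compensate for a failed one."""
    return all(score >= threshold
               for score in (signals.pad_score,
                             signals.iad_score,
                             signals.image_score))


# A live user passing all three checks is accepted...
print(is_verification_trustworthy(VerificationSignals(0.95, 0.90, 0.88)))  # True
# ...while an injected deepfake failing IAD is rejected despite good liveness.
print(is_verification_trustworthy(VerificationSignals(0.92, 0.10, 0.85)))  # False
```

In practice each score would come from a separate detection product, and the combination policy (strict AND above, versus a weighted risk score) is itself a design decision.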
To help organizations protect themselves against AI-generated deepfakes beyond face biometrics, chief information security officers and risk management leaders should choose vendors that can demonstrate capabilities and a plan that go beyond current standards, and that are monitoring, classifying, and quantifying these new types of attacks.

