In a concerning wave of recent cyberattacks, hospitals and healthcare institutions are being warned about sophisticated social engineering schemes targeting their IT help desks. The alert comes from the U.S. Department of Health and Human Services (HHS) and the American Hospital Association (AHA), which have highlighted attackers' use of stolen identities and advanced impersonation tactics to compromise hospital systems.
The Scheme Explained
Attackers call IT help desks posing as hospital employees, typically from financial departments. They supply stolen employee details—such as corporate IDs and social security numbers—to pass the help desk's identity checks. Then, claiming their smartphone is broken, they persuade help desk personnel to enroll a new device for multi-factor authentication (MFA), one that is actually under the attackers' control. This gives them unauthorized access to sensitive areas of the hospital's digital environment.
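To make the defensive implication concrete, here is a minimal sketch in Python of the kind of help desk policy check that blunts this scheme. The function and field names are hypothetical, not any vendor's API; the point is simply that knowledge-based answers alone never unlock MFA re-enrollment without out-of-band confirmation.

```python
from dataclasses import dataclass

@dataclass
class MfaResetRequest:
    """A help desk request to enroll a new MFA device for an employee."""
    employee_id: str
    identity_questions_passed: bool       # caller answered ID/SSN questions correctly
    callback_to_number_on_record: bool    # help desk reached the employee on the HR-listed phone
    manager_approved: bool                # employee's manager independently confirmed the request

def may_enroll_new_mfa_device(req: MfaResetRequest) -> bool:
    """Knowledge-based answers (IDs, SSNs) are treated as untrusted on their own,
    since attackers in these schemes already hold that stolen data.
    Enrollment requires out-of-band confirmation on top of them."""
    if not req.identity_questions_passed:
        return False
    # The decisive checks: reach the real employee on a number already on file,
    # and obtain an approval the caller cannot control.
    return req.callback_to_number_on_record and req.manager_approved

# Example: an attacker with stolen IDs and a "broken phone" story gets no new device.
attacker_call = MfaResetRequest("E12345", True, False, False)
assert may_enroll_new_mfa_device(attacker_call) is False
```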
Impact and Consequences
Once inside the system, these cybercriminals can manipulate financial transactions and reroute funds to accounts they control, often eventually moving the money overseas. Such breaches not only cause financial losses but also compromise patient data and disrupt hospital operations, posing a severe risk to patient care and institutional integrity.
The HIPAA Journal notes that ransomware attacks on health systems have escalated, with the average cost of a data breach reaching unprecedented levels, severely impacting hospital finances and operations.
Moreover, the AHA has been actively working to bolster defenses against these threats by issuing cybersecurity advisories and providing resources to help healthcare institutions safeguard their systems. These efforts are crucial, given that health systems are high-value targets for cybercriminals who leverage both ransomware and social engineering tactics.
Adding to the complexity, the healthcare sector must regularly contend with newly disclosed vulnerabilities. Recent advisories from the Health Sector Cybersecurity Coordination Center (HC3) have flagged flaws in widely used technologies from Microsoft, Google, and Apple that require prompt patching and mitigation.
Going One Step Further
Criminals are no longer limited to AI voice cloning for phone calls; they are also cloning faces to sell the impersonation. The method uses AI to create convincing replicas of a target's voice and facial expressions.
Typically, attackers gather the necessary audio and video samples of the target by scraping online content or through deceptive practices such as fake job interviews over video calls. Once they have enough material, the AI is trained to mimic the subject’s speech patterns, tone, and facial movements.
I recently sat down with a client who had this exact thing happen to them. A board member was lured onto a 20-minute Zoom call by a fake high-profile headhunter, supposedly to discuss opportunities at a competing organization. In reality, the call was simply a way to capture enough video and audio to clone the board member.
These AI clones can be used in a variety of fraudulent schemes, most notably tricking companies or banks into transferring money on the belief that the request comes from a trusted source. This type of attack poses a severe threat not only to individual privacy but also to organizational security, with the potential for significant financial losses and reputational damage.
We are now in uncharted territory when it comes to impersonation attacks. This emerging threat underscores the need for heightened cybersecurity awareness and more stringent verification processes within organizations to counter the misuse of AI in impersonation scams.
The LastPass Deepfake Incident
This attack vector was recently attempted against LastPass, where hackers used deepfake audio to impersonate the company's CEO in a voice phishing attempt. The goal was to trick an employee into granting access to secure networks, though the attempt ultimately failed. The incident is part of a troubling trend of cybercriminals employing advanced technologies to create highly convincing scams.
Luckily, the attempt failed because the attackers used WhatsApp to communicate, which is not a typical business or corporate communications channel and therefore raised suspicion.
For more on how cyber threats are adapting and suggestions on safeguarding digital platforms, refer to the detailed report on BleepingComputer.
A Successful Heist
One striking example of a successful deepfake attack involved cybercriminals using AI-generated video and voice cloning to impersonate an executive at a Hong Kong company. The attackers ran a video call in which the AI reproduced the executive's face and voice convincingly enough to direct a financial transfer. The employees, deceived by the realistic appearance and sound of their supposed superior, believed the instructions to be legitimate and transferred $25 million to the attackers.
The case resulted in significant financial loss and highlighted the urgent need for organizations to strengthen identity verification, especially for requests involving sensitive actions or information.
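One practical control that follows from cases like this is to make high-value transfer requests fail closed unless they are confirmed over a second, independent channel. The sketch below is illustrative only, with hypothetical names and an arbitrary threshold, not a reference to any real banking or ERP API.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative threshold; a real policy would tune this to the organization's risk appetite.
CALLBACK_THRESHOLD_USD = 10_000

@dataclass
class TransferRequest:
    amount_usd: float
    requested_via: str              # e.g. "video_call", "email", "whatsapp"
    confirmed_by_callback: bool     # requester reached on an independently known number
    second_approver: Optional[str]  # someone other than the requester signed off

def should_execute(req: TransferRequest) -> bool:
    """A video call or email is never treated as sufficient authorization on its own,
    because both can be convincingly faked with voice and face cloning."""
    if req.amount_usd < CALLBACK_THRESHOLD_USD:
        return True
    return req.confirmed_by_callback and req.second_approver is not None

# The Hong Kong-style scenario: a convincing video call, but no callback
# and no independent second approval, so the transfer is blocked.
deepfake_request = TransferRequest(25_000_000, "video_call", False, None)
assert should_execute(deepfake_request) is False
```

The exact mechanism matters less than the principle: the confirmation path must be one the person on the call cannot control.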
Training Resources:
For individuals looking for hands-on training that covers all of the above topics, Covert Access Team (covertaccessteam.com) provides courses focused on physical penetration testing, lockpicking, bypass techniques, social engineering, and other essential skills.
Covert Access Training - 5-day hands-on course designed to train individuals and groups to become Covert Entry Specialists.
Physical Audit Training - 2 days of intensive physical security training focused on enhancing facility defenses and bolstering security measures against attackers.
Elicitation Toolbox Course - 2-day course that focuses on elicitation and social engineering as critical aspects of Black Teaming.
Counter Elicitation Course - 2-day course that teaches how to identify and protect against elicitation tactics aimed at extracting confidential information.
Cyber Bootcamp for Black Teams - 2-day course designed specifically for physical penetration testers who need vital cyber skills to add to their toolbox.
Private Instruction - Focused learning and training based on your needs.