Welcome to this week’s edition of the Threat Source newsletter.

We took a week off for summer vacation but are back in the thick of security things now.

My first exposure to deepfake videos was when Jordan Peele worked with BuzzFeed News to produce this video of former President Barack Obama appearing to use a few not-safe-for-work words. Since then, headlines have been popping up everywhere about the dangers of deepfake videos, with people even going so far as to make deepfake pornographic videos, which are incredibly problematic in so many ways.

And now, we must start worrying about deepfake voices, which are somehow even scarier to me. At least with deepfake videos, it is (so far) fairly easy to spot a fake if you spend a lot of time on the internet, and social media companies have improved their flagging systems for this type of content.

But with fake AI voices — built off machine learning and archives of the person in question speaking — there is nothing to see. And the technology is already pretty convincing.

An AI voice company worked on the recent “Top Gun: Maverick” movie to recreate the voice of actor Val Kilmer, who lost his natural voice due to throat cancer and uses an electric voice box in his everyday life. I’ve not seen this movie, but I would imagine the average moviegoer had no idea this was the case.

Something I have seen is “Obi-Wan Kenobi” on Disney+, which also used the assistance of AI voice cloning technology to support the dialogue of a yet-to-be-named character in the show (my guess is Darth Vader/Anakin Skywalker because I just don’t think Hayden Christensen has the Vader voice in him).

And in the most terrifying potential application of this technology, Amazon demonstrated that its Alexa voice assistant could read part of a book in a boy’s deceased grandmother’s voice. Hard stop there. I miss my late grandparents as much as anything and would give anything for another phone call with them. That doesn’t mean I’d settle for a robotic recreation of their voices.

How long before deepfake voice scams become a thing? I know that if someone were able to duplicate my voice, my grandmother would be easily convinced that I was in danger or needed money for something. Or how long before some scammer makes it sound like George Clooney is finally calling to ask you out on that date you’ve always wanted, so long as you send him a $500 Amazon gift card first?

It’s amazing how far AI technology has come. And I can’t say I’m necessarily mad or concerned about its ability to make Darth Vader sound like Darth Vader. But tech companies can’t help themselves, so it’s only a matter of time before this technique gets out of hand and into cyber attackers’ control.

The one big thing

Ransomware actors love to operate on the dark web so they can stay anonymous and undetected. It allows them to trade stolen credentials, share tactics with one another and leak targets’ data. Recently, our researchers uncovered several ways to unmask these actors and expose their tactics. The resulting de-anonymization taught us a great deal about several groups’ campaigns and malware, including the relatively new DarkAngels.

Why do I care? 

This exercise showed us how ransomware operators work behind the scenes. They take several precautions to obscure their true identities and the hosting location of their web server infrastructure, and most use hosting providers outside their country of origin (such as Sweden, Germany and Singapore) to host their ransomware operations sites. This is all valuable information to use in the fight against ransomware.

So now what? 

If you’re a security researcher yourself, there are several techniques outlined in our recent post that could be of use to you. We hope to arm the security community with as much information as possible so that others may help us in the global fight against ransomware and threat actors.  
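
As a purely illustrative example of the kind of check researchers run against this sort of infrastructure (and not necessarily one of the techniques described in the post), an operator who reuses an artifact such as a favicon on both a dark web site and a clearnet server can give away the real hosting location. Below is a minimal Python sketch, assuming a local Tor SOCKS proxy on port 9050 and a hypothetical .onion address, that computes the Shodan-style favicon hash so it can be compared against internet-wide scan data:

```python
# Illustrative sketch only: fetch a hidden service's favicon over Tor and
# compute the hash format used by internet-wide scan databases such as Shodan.
# Requires a running Tor SOCKS proxy plus the packages: requests[socks], mmh3.
import base64

import mmh3
import requests

ONION_URL = "http://exampleonionaddress.onion/favicon.ico"  # hypothetical address
TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h lets Tor resolve .onion names
    "https": "socks5h://127.0.0.1:9050",
}

resp = requests.get(ONION_URL, proxies=TOR_PROXY, timeout=60)
resp.raise_for_status()

# Shodan indexes favicons as the MurmurHash3 of the base64-encoded file bytes.
favicon_hash = mmh3.hash(base64.encodebytes(resp.content))

# Searching scan data for the same value (e.g. "http.favicon.hash:<value>")
# can surface clearnet servers serving an identical favicon.
print(f"favicon hash: {favicon_hash}")
```

The same idea extends to other reusable fingerprints, such as TLS certificates or distinctive page content, that a careless operator leaves identical across hidden and public-facing hosts.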

Other news of note

The recent Supreme Court decision overturning nationally protected abortion rights in the U.S. has wide-ranging consequences that reach even the tech industry. Major social media companies are being asked to consider whether they’d cooperate with government investigations into women suspected of having abortions in states where it’s now illegal by passing along personal information. Many women are now skeptical of period-tracking apps, too, some of which have privacy policies openly stating they would share user data with law enforcement. And the state-by-state approach to abortion policy will only make the topic more complicated to untangle. (Yahoo! News, Axios, Vice Motherboard)

A new mobile spyware is targeting Android and iOS users across Europe and Asia. The recently discovered Hermit comes from Italian vendor RCS Labs and can steal data as well as record and make phone calls. The spyware disguises itself as legitimate apps to trick users into downloading it, though researchers say these apps don’t appear on Google’s or Apple’s app stores. Hermit shows that attackers and governments alike are still using spyware like the NSO Group’s widespread Pegasus tool. Although spyware is not strictly illegal in many countries, it is often used by governments and state-sponsored groups to target vulnerable users, including high-profile activists, politicians and journalists. (Wired, ThreatPost)

A multi-stage remote access trojan is targeting several small and home office routers, with activity potentially going back as far as 2020. The newly named ZuoRAT exploits known vulnerabilities in some Cisco, Netgear, Asus and DrayTek routers, then infects other devices on the network and downloads additional malware via DNS and HTTP hijacking. Security researchers say they’ve discovered at least 80 victims so far. (Dark Reading, Ars Technica)

Can’t get enough Talos?

Upcoming events where you can find Talos

A New HOPE (July 22 - 24, 2022)
New York City

BlackHat U.S. (Aug. 6 - 11, 2022)
Las Vegas, Nevada

DEF CON U.S. (Aug. 11 - 14, 2022)
Las Vegas, Nevada

Most prevalent malware files from Talos telemetry over the past week

SHA 256: e4973db44081591e9bff5117946defbef6041397e56164f485cf8ec57b1d8934
MD5: 93fefc3e88ffb78abb36365fa5cf857c
Typical Filename: Wextract
Claimed Product: Internet Explorer
Detection Name: PUA.Win.Trojan.Generic::85.lp.ret.sbx.tg

SHA 256: e12b6641d7e7e4da97a0ff8e1a0d4840c882569d47b8fab8fb187ac2b475636c
MD5: a087b2e6ec57b08c0d0750c60f96a74c
Typical Filename: AAct.exe
Claimed Product: N/A
Detection Name: PUA.Win.Tool.Kmsauto::1201

SHA 256: c67b03c0a91eaefffd2f2c79b5c26a2648b8d3c19a22cadf35453455ff08ead0
MD5: 8c69830a50fb85d8a794fa46643493b2
Typical Filename: AAct.exe
Claimed Product: N/A
Detection Name: PUA.Win.Dropper.Generic::1201

SHA 256: 8664e2f59077c58ac12e747da09d2810fd5ca611f56c0c900578bf750cab56b7
MD5: 0e4c49327e3be816022a233f844a5731
Typical Filename: aact.exe
Claimed Product: AAct x86
Detection Name: PUA.Win.Tool.Kmsauto::in03.talos

SHA 256: 125e12c8045689bb2a5dcad6fa2644847156dec8b533ee8a3653b432f8fd5645
MD5: 2c8ea737a232fd03ab80db672d50a17a
Typical Filename: LwssPlayer.scr
Claimed Product: 梦想之巅幻灯播放器
Detection Name: Auto.125E12.241442.in02