5 IT security lessons CTOs need to know in the age of deepfakes

This article was previously published for the Forbes Tech Council and was written by our CEO, Emilien Coquard.

On October 3, 2023, a TikTok ad appeared with the biggest YouTuber in the world, MrBeast, offering iPhone 15s for just $2.

It sounded too good to be true. And it was. Jimmy Donaldson (better known as MrBeast) revealed he wasn’t running a giveaway. Across the internet, an unknown actor had created a deepfake video of MrBeast to scam his many followers.    

As a CTO, you probably don’t have to worry about falling prey to a TikTok scam, but the technology behind it has serious IT security implications for your organisation and offshore development centres.

Here are five lessons you need to know and take action on today to ensure your company is safe.

Lesson 1: We can no longer trust our eyes or ears

Deepfakes are media, typically videos, that have been manipulated to look and sound like someone else. 

While this technology isn’t new, recent advances in AI have made deepfakes far more convincing and harder to detect. Remember that TikTok scam? TikTok’s content moderation uses both human reviewers and AI vetting, yet the ad still got through, making it the most publicized case of deepfake technology fooling moderation tools.

The technology behind deepfakes is spreading rapidly too. 

With more bad actors accessing these tools, we can no longer trust our eyes or our ears.  

Lesson 2: Human error is the main cause of security leaks … and will get worse 

Between 2015 and 2017, a scammer defrauded wealthy French business owners to the tune of €80m. He didn’t write any clever code or even use deepfake technology. 

His only tool? 

A silicone mask of the French Defense Minister, Jean-Yves Le Drian.

This was the latest of Gilbert Chikli’s scams, documented in the Netflix series The Masked Scammer. All of these crimes relied on impersonating someone else to get access to sensitive data or money. 

And they’re rife across the world. 

An estimated 88% of all IT security breaches come from employee errors. Phishing scams have long played on these mistakes, and a Stanford University study found that 41% of phishing victims had been targeted by emails claiming to come from senior executives.

Just last year, a member of the People and Culture team at my company reported an attempted CEO fraud attack. He had received a WhatsApp message claiming to be from me and asking for several hundred dollars’ worth of untraceable vouchers.

Fortunately, thanks to his training and our safety procedures, he knew better and reported the issue. Less prepared employees at other companies won’t be so lucky.

Just imagine how much more likely they are to agree to such a request if it comes from a video of the CEO and not just a WhatsApp text. 


Lesson 3: Employee training programs are vital

To prevent human errors, we need to educate our employees. 

Deepfake detection solutions can only help so far. If an employee gets a message outside of the usual channels, it will bypass any detection service. Plus, as the TikTok case showed, there’s a constant arms race between the tools that create deepfakes and the tools that detect them.

Employee training has long been at the core of any IT security program, and it’s even more vital now.

Many of the principles behind avoiding phishing attacks help prevent deepfake attacks, but we still need to highlight the specific challenges. 

Useful measures include adopting procedures for sensitive requests (transfers, purchases, sending files and links) to help detect impersonators. These procedures can cover which channels to use, how to format the request, and specific spoken passwords or phrases that confirm authenticity.
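To make this concrete, here is a minimal sketch of how such a procedure might be enforced in an internal tool. It is purely illustrative: the action names, channels, and confirmation step are hypothetical examples, not a prescription.

```python
# Illustrative sketch only: a simple gate that forces out-of-band
# confirmation before acting on high-risk requests. All action and
# channel names below are hypothetical.

HIGH_RISK_ACTIONS = {"wire_transfer", "voucher_purchase", "share_credentials"}
APPROVED_CHANNELS = {"ticketing_system", "signed_email"}

def should_execute(action: str, channel: str, confirmed_out_of_band: bool) -> bool:
    """Act on high-risk requests only if they arrive via an approved channel
    AND have been confirmed on a second, independent channel (e.g. a phone
    call to a number already on file, never one supplied in the request)."""
    if action not in HIGH_RISK_ACTIONS:
        return True   # low-risk requests follow the normal workflow
    if channel not in APPROVED_CHANNELS:
        return False  # e.g. a WhatsApp message or surprise video call: refuse and report
    return confirmed_out_of_band

# Example: a WhatsApp message "from the CEO" asking for vouchers is refused.
print(should_execute("voucher_purchase", "whatsapp", confirmed_out_of_band=False))  # False
```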

You shouldn’t rely on video calls alone either. If you do, you open yourself up to the same sort of error as BlueBenx, a Brazilian crypto exchange that fell victim to a deepfake scam over Zoom impersonating Patrick Hillmann, Binance’s former chief communications officer.

Lesson 4: Limit access and use strict security

You can’t give away what you don’t have. 

That’s why sandboxing data and access helps limit security breaches. If an attacker targets a low-level employee, they won’t be able to access more valuable information. 

While this is a good principle for all data security, it’s particularly relevant to deepfake attacks. A scammer who tricks an employee into handing over access still won’t have the power to do real harm.

With additional security practices such as multi-factor authentication, regularly updating login details, and managed device security, any successful attack will have a hard time expanding its breach.  
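As a rough illustration of the sandboxing principle, here is a hypothetical least-privilege check: each role is granted only the resources it needs, and everything else is denied by default. The roles and resources are invented for the example.

```python
# Illustrative sketch of least-privilege access: roles map to the minimum
# set of resources they need. All role and resource names are hypothetical.

ROLE_PERMISSIONS = {
    "support_agent": {"customer_contact_info"},
    "developer": {"staging_database", "source_code"},
    "finance": {"invoices", "payment_approvals"},
}

def can_access(role: str, resource: str) -> bool:
    """Deny by default: a role can only reach resources explicitly granted to it."""
    return resource in ROLE_PERMISSIONS.get(role, set())

# A scammer who compromises a support agent's account can't reach payments.
print(can_access("support_agent", "payment_approvals"))  # False
```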

Lesson 5: Have a plan ready to respond

Even with the best plan in place, human errors still happen. 

When they do, you need to be able to respond fast to mitigate the damage, protect your customers, and preserve your company’s reputation. 

If you followed the last step, this will be easier. With a sandboxed security approach, it’s easier to locate breaches and limit their impact. 

But beyond remedying the breach, you need to be ready to communicate without causing panic. It’s far better to provide a rapid update with positive news rather than a late, unclear apology. 

A final note: Don’t forget your partners

The best security processes and practices are useless if you have a partner who is a security vulnerability. 

A press release might sound better when a third party is the source of a security breach, but customers rarely care. To them, you and your partner are the same. 

Make sure you vet your partners and ensure they follow best practices for IT and data security. For example, if they hold ISO 27001 certification, you can be sure they’re following best practices for handling your data and your customers’ data.


Are you deepfake ready? 

I’ve yet to see or hear of a deepfake scam in our company or any of the dedicated offshore teams we build for our partners, but it’s probably only a matter of time before I do. 

That’s why we’ve implemented training programs and reviewed authentication processes across all our teams. As the old saying goes, failing to prepare is preparing to fail. 

Make sure your company and partners are deepfake resistant too.