When a user turns on iCloud's automatic backup feature, the MetaMask wallet's vault (the password-encrypted seed phrase) is stored online, creating a vulnerability that hackers can exploit.
On April 17, an NFT investor with the handle "@revive_dom" had a total of 132.86 ETH and over 252,400 USDT stolen from his MetaMask wallet due to an iCloud security issue. The stolen assets were worth approximately $655,000 in total. Prominent NFT investor Serpent warned users about the scam.
According to Serpent, @revive_dom received a series of text messages, supposedly triggered by suspicious account activity, asking him to reset his Apple ID password. At the same time, he received numerous phone calls from people posing as Apple employees. Believing the calls were genuine, @revive_dom gave the impersonators the verification code for resetting his iCloud password.
Shortly after, the attackers stole all the assets in the victim’s MetaMask wallet.
The official MetaMask Twitter channel warned users about the iCloud attack after Serpent's thread was published. Specifically, the issue arises when users turn on automatic backups on Apple devices: the wallet's encrypted vault, which protects the seed phrase, is then stored on the platform.
MetaMask warns that if a user's iCloud account password is weak or insecure, the backup can be compromised. Attackers who trick users into handing over their login credentials can then extract the wallet's secret recovery phrase and steal its funds.
According to MetaMask, to turn off automatic backups of app data on Apple devices, users need to go to Settings > [your name] > iCloud > iCloud Backup and toggle the feature off.
NFT investor Serpent also shared advice drawn from the incident: users should store cryptocurrency in cold wallets and never reveal personal information to anyone.
“It’s important to note that caller information can be easily spoofed. A big company like Apple would never call a user,” Serpent added.
Apple quietly withdraws controversial plan
Apple has quietly removed information from the child safety section of its website about its plan to scan users' iCloud Photos libraries for child sexual abuse material (CSAM).
The Cupertino giant announced its CSAM-detection plans in August and expected the feature to go live by the end of the year. However, after facing criticism from security experts, human rights groups, and even its own employees, Apple halted development of the feature.
In early September, the company announced that it would “spend more time” gathering feedback and making improvements.
It is unclear whether Apple will resume CSAM scanning in the future. A company spokesperson confirmed to The Verge that its position has not changed since the September announcement.