Combating crimes such as possession of child pornography is a complex task, often limited by the privacy, data protection and cybersecurity measures that shield tech device users, although Apple seems determined to take a proactive approach against these terrible crimes.
Jane Horvath, the company’s chief privacy officer, claims that, after an update to its user privacy policies, Apple began scanning photos and videos stored in users’ cloud accounts to check for content related to child sexual abuse.
During a press conference, Horvath stated that Apple employs sophisticated image scanning technology to detect illegal content, and that the company disables the accounts analyzed in the event of finding illegal material; the method employed by Apple is still unknown, but more details could be revealed in the future.
Its privacy and data protection policies have created a number of issues for Apple, especially with law enforcement agencies, which accuse the company of interfering with criminal investigations by refusing to unlock tech devices belonging to suspects or by encrypting its messaging services.
However, Apple seems willing to cooperate more extensively with authorities in combating child sex crimes, although the company anticipates that communications encryption will not be eliminated: “Removing encryption will not solve this problem; we will use the best available technology to help authorities detect child pornographic material,” Horvath said during her presentation at the Consumer Electronics Show (CES) in Las Vegas, where the latest innovations in technology and cybersecurity are showcased.
On the other hand, an Apple spokesperson referred to a disclaimer posted on the company’s official website, mentioning that: “We are committed to protecting children throughout the Apple ecosystem, we will continue to support innovation in combating these crimes. To support this commitment, Apple employs image matching technology with a special focus on child exploitation. Just like spam filters on our email service, these systems use electronic signatures to identify illegal material.”
As mentioned above, accounts that store illegal content will be in breach of Apple’s new terms, so they will be disabled.
Although Apple did not go into details about how this technology works, cybersecurity specialists mention that the method is likely based on a filtering system known as PhotoDNA, which compares images against a previously established database of known material. This method is used by other companies, such as Google and Facebook.
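To illustrate the general idea behind hash-based image matching (Apple’s actual method is unknown, and PhotoDNA itself is proprietary), the sketch below implements a simple “difference hash” and compares it against a set of known hashes by Hamming distance. All function names and parameters here are hypothetical, chosen only for illustration:

```python
# Illustrative sketch of hash-based image matching, the general
# technique behind systems like PhotoDNA. NOT Apple's actual method.

def dhash(pixels):
    """Difference hash: emit one bit per horizontally adjacent pixel pair.

    `pixels` is a 2D list of grayscale values with rows of equal length;
    real systems first resize the image to a small fixed grid (e.g. 9x8)
    so the hash is robust to scaling and compression.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return int("".join(map(str, bits)), 2)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_material(image_hash, known_hashes, threshold=2):
    """Flag the image if its hash is within `threshold` bits of any known hash."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)

# Example: a tiny 3x3 "image" hashed and checked against a database.
sample = [[10, 20, 30],
          [30, 20, 10],
          [5, 50, 5]]
h = dhash(sample)
print(matches_known_material(h, {h}))  # exact match in the database
```

Because matching uses a distance threshold rather than exact equality, slightly altered copies of a known image (recompressed, resized, lightly edited) can still be flagged, which is the key property such filtering systems rely on.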
Regarding iPhone encryption, Horvath defended the company’s stance after the cybersecurity community anticipated that the FBI would start a new controversy over equipment developed by Apple. A few weeks ago, the federal agency expressed its rejection of this policy because Apple refused to unlock the iPhone of a man involved in a triple homicide in Florida, USA. The International Institute of Cyber Security (IICS) recommends that users wishing to learn more about changes in Apple’s policies visit the company’s official platforms.