Even as governments around the world applauded Apple’s decision to proactively search for illegal images of child sexual abuse, WhatsApp has slammed the company’s new child safety features as a “very alarming… surveillance system.”
The standoff sets the stage for a fight between other technology companies and the governments that want them to adopt comparable systems.
Apple’s new technology, which set “a benchmark for other tech companies,” was “welcomed” by an Indian government official on Friday, according to the Financial Times, while one EU official claimed the tech giant had devised a “very elegant solution.”
Senator Richard Blumenthal of the United States termed Apple’s new system “innovative and courageous.”
Sajid Javid, the UK’s health secretary and former home secretary, tweeted, “Time for others — including Facebook — to follow their example.”
Apple’s Silicon Valley rivals, on the other hand, are said to be “incandescent” about its system for scanning images on US users’ iPhones before they are uploaded to iCloud, which will be released as part of the next version of iOS.
Will Cathcart, the head of WhatsApp, said: “This approach introduces something very worrying into the world. This is an Apple-designed and operated surveillance system that could be used to scan private content for anything they or a government choose to control. It’s alarming to see them take action without consulting specialists.”

“We will not adopt it at WhatsApp,” he added.
The positive response from lawmakers will only deepen security and privacy advocates’ concerns that Apple has set a dangerous precedent that could be exploited by authoritarian regimes or overzealous law enforcement.
Pressure is already mounting on Facebook-owned WhatsApp, as well as Telegram, Signal, and Google’s Android operating system, to follow Apple’s lead.
In a blog post, India McKinney and Erica Portnoy of the Electronic Frontier Foundation remarked, “To say we are unhappy with Apple’s plans is an understatement.”
“Apple’s end-to-end encryption compromise may placate government agencies in the United States and overseas, but it is a surprising U-turn for customers who have depended on the company’s privacy and security leadership.”
In recent months, political pressure has grown on technology companies throughout the world to grant governments access to encrypted communications, such as conversations, images, and videos.
In India, Prime Minister Narendra Modi’s government recently introduced rules requiring technology platforms such as WhatsApp to trace the originator of unlawful messages, effectively breaking end-to-end encryption.
WhatsApp is currently fighting the Indian government in court in an attempt to block the new rules.
Last October, in an open letter signed by the “Five Eyes” countries as well as Japan and India, officials including UK home secretary Priti Patel and then US attorney-general William Barr said they “urge industry to address our serious concerns where encryption is applied in a way that completely precludes any legal access to content.”
They cited child abuse as one of the reasons they believe tech companies should develop alternative ways to give authorities access to device content, pointing to a “growing consensus among states and international institutions that action must be taken.”
Apple’s pledge to confine its scanning to child abuse imagery has been met with skepticism from critics.
“I hate getting all slippery-slope, but when I look over the slope, governments all over the world are slathering it in oil, and Apple just pushed its customers over the edge,” said Sarah Jamie Lewis, a cryptography researcher and executive director of the Canadian NGO Open Privacy.
While no such legislation exists in the United States, Apple’s move coincides with the UK and EU preparing new legislation — the Online Safety Bill and Digital Services Act — that would place greater responsibility on tech companies to prevent the spread of child pornography and other forms of harmful content.
Apple’s decision to pursue its own system rather than participate in a cross-industry negotiation with regulators around the world has enraged its Silicon Valley neighbors, especially after they banded together in support of the company’s 2016 legal battle with the FBI over access to a terrorist suspect’s iPhone.
During an online video discussion with Stanford University academics on Thursday, Matthew Green, a security professor at Johns Hopkins University, said: “Some of the reaction I’ve heard from other competitors to Apple is incandescent.”
During the same talk, Alex Stamos, the former Facebook security chief who now directs the Stanford Internet Observatory, said Apple “doesn’t care at all” that everyone else is trying to strike a delicate international balance, adding: “Clearly, there would be instant pressure on WhatsApp.”
In an internal memo released on Thursday, an Apple executive addressed the uproar caused by its actions.
In a memo obtained by the Apple news site 9to5Mac, Sebastien Marineau wrote: “We’ve had many favorable replies today.”
“We recognize that some people are confused, and that some are concerned about the ramifications, but we will continue to clarify and detail the features so that everyone understands what we’ve built.”
Neither Facebook nor Google has publicly responded to Apple’s announcement.
Some have previously chastised Apple for not doing more to prevent harmful content from spreading, particularly on iMessage.
Because the iPhone’s messaging app is end-to-end encrypted, the company cannot access any photographs or videos sent between its users.
Messages between two senior Apple engineers, which were produced as evidence in the iPhone maker’s recent legal dispute with Epic Games, suggest that some employees thought the business should do more.
Eric Friedman, head of Apple’s Fraud Engineering Algorithms and Risk unit, said in the email, which was initially revealed by the Tech Transparency Project, that “we are the greatest platform for spreading child porn” when compared to Facebook.
Friedman added that the company had “decided not to know in enough locations where we really can’t say” how much child sex abuse material might be circulating.