Apple appeals copyright ruling against security research firm while touting researchers

Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.

August 18, 2021 09:27 am | Updated 12:47 pm IST

The Apple Inc logo is seen at the entrance to the Apple store in Brussels, Belgium.


Apple Inc on Tuesday appealed a copyright case it lost against security startup Corellium, which helps researchers examine programmes like Apple's planned new method for detecting child sex abuse images.


A federal judge last year rejected Apple's copyright claims against Corellium, which makes a simulated iPhone that researchers use to examine how the tightly restricted devices function.


Security experts are among Corellium's core customers, and the flaws they uncovered have been reported to Apple for cash bounties and used elsewhere, including by the FBI in cracking the phone of a mass shooter who killed 14 people in San Bernardino, California.

Apple makes its software hard to examine, and the specialised research phones it offers to pre-selected experts come with a host of restrictions. The company declined to comment.

The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.

Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.


"Enough is enough," said Corellium Chief Executive Amanda Gorton. "Apple can't pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal."

Under Apple's plan announced earlier this month, software will automatically check photos slated for upload from phones or computers to iCloud online storage to see if they match digital identifiers of known child abuse images. If enough matches are found, Apple employees will look to make sure the images are illegal, then cancel the account and refer the user to law enforcement.
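The scheme described above can be illustrated with a deliberately simplified sketch. This is not Apple's implementation: the real system reportedly uses a perceptual "NeuralHash" and cryptographic threshold techniques, whereas this sketch uses plain SHA-256 digests of file bytes and a simple counter, and every name in it is hypothetical.

```python
import hashlib

# Hypothetical database of digests of known prohibited images.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Hypothetical threshold: how many matches are required before
# any human review is triggered.
REVIEW_THRESHOLD = 2


def count_matches(photos: list) -> int:
    """Count how many uploaded photos match a known digest."""
    return sum(
        1
        for photo_bytes in photos
        if hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES
    )


def should_flag_for_review(photos: list) -> bool:
    """Flag an account only once the match count reaches the threshold."""
    return count_matches(photos) >= REVIEW_THRESHOLD
```

The threshold is the key design choice the article alludes to: a single match reveals nothing and triggers nothing; only an account accumulating enough matches is surfaced for human review.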

"We'll prevent abuse of these child safety mechanisms by relying on people bypassing our copy protection mechanisms,' is a pretty internally incoherent argument," tweeted David Thiel of the Stanford Internet Observatory.

Because Apple has marketed itself as devoted to user privacy and other companies only scan content after it is stored online or shared, digital rights groups have objected to the plan.


One of their main arguments has been that governments theoretically could force Apple to scan for prohibited political material as well, or to target a single user.

In defending the program, Apple executives said researchers could verify the list of banned images and examine what data was sent to the company in order to keep it honest about what it was seeking and from whom.

One executive said that such reviews made the approach better for privacy overall than would have been possible had the scanning occurred in Apple's own storage, where it keeps the code secret.

