If you own an Apple device, that device may already be scanning the images on it. An update that went live just last week can determine whether images on devices such as the iPhone are illegal. Privacy activists are up in arms over this development, concerned that the program could be repurposed by authoritarians for censorship.
Apple has implemented the feature to detect whether images on a phone victimize children. While the endeavor sounds noble, there is room for suspicion that this cause in particular was selected because few people would disagree with it. The same could be said of counter-terrorism, the justification behind the Patriot Act, which continues to violate the privacy of Americans.
There aren’t many people out there who need to be told that Cheese Pizza is illegal. I knew it was illegal to possess or share, but I had no idea it was such a pervasive issue that it needed to be fought on the device level.
The software works by checking the hash values of images on the device. A hash value is a short fingerprint derived from a file’s contents that can be used to identify an image; the software compares the hash values of a device’s images against a blacklist of hashes for known illegal images.
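As a rough illustration of that matching step — a minimal sketch, not Apple’s actual system (Apple uses a perceptual hash called NeuralHash, while this toy uses an ordinary cryptographic hash, and the blacklist entry here is just the SHA-256 of a placeholder string):

```python
import hashlib

# Hypothetical blacklist of SHA-256 digests of known illegal images.
# (The single entry here is merely the hash of the bytes b"test",
# standing in for a real database entry.)
BLACKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Check an image's hash against the blacklist."""
    return file_hash(data) in BLACKLIST
```

The device never needs to “look at” the picture in any human sense; it only asks whether a fingerprint appears in a list someone else compiled — which is exactly why the contents of that list matter so much.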
This is going to be misused. Hard.
When Cheese Pizza is brought up, it’s customary to virtue signal, and this will be mine: as far as I know, the most provocative image on my computer is just a drawing that doesn’t even depict any real people. I don’t even know whether any of the oceans of anime images on my computer clearly show any tube steaks or roast beef sandwiches. But I doubt that the algorithm is going to be looking for anime doodles, anyway.
My concern is twofold: whether there would be false positives (which might not happen often), and whether the algorithm could be reverse-engineered and repurposed by authoritarians to go after free expression that should be protected.
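To see how false positives can arise, here’s a toy perceptual hash — emphatically not Apple’s algorithm, just the classic “average hash” idea on made-up pixel values. Similar-looking images produce similar bit patterns, and a distance threshold decides what counts as a match; set the threshold too loose and innocent look-alikes get flagged.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean. (Real systems like pHash or
    NeuralHash are far more sophisticated.)"""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Two slightly different 4x4 "images" (flattened grayscale values,
# invented for this example).
img_a = [10, 200, 30, 220, 15, 210, 25, 215, 12, 205, 28, 225, 11, 198, 33, 219]
img_b = [12, 198, 31, 222, 14, 209, 27, 214, 10, 207, 26, 223, 13, 200, 30, 218]

ha, hb = average_hash(img_a), average_hash(img_b)
print(hamming(ha, hb))  # small distance, so the two count as a "match"
```

The robustness that lets the system catch a re-saved or resized copy of a known image is the same property that can make two unrelated images land within matching distance of each other.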
For one thing, the hash values of images can be changed. One way of doing this is simply to edit the image, though that wouldn’t likely be effective against a perceptual hash designed to categorize images by appearance. The bigger problem is that it’s possible to manipulate an image so that its hash matches a blacklisted one, then share it with other people. An image that seems harmless might carry a hash value that trips a flag, resulting in an investigation.
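To see why a plain edit defeats a naive hash check — and why a perceptual hash gets used instead — note that flipping even a single bit of a file completely changes a cryptographic digest (the “image” bytes below are just a placeholder):

```python
import hashlib

# Placeholder standing in for the raw bytes of some image file.
original = b"...raw bytes of some image file..."

edited = bytearray(original)
edited[0] ^= 1  # flip one bit: an edit no human would ever notice

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(edited)).hexdigest()

print(h1 == h2)  # False: one flipped bit yields an entirely different digest
```

A perceptual hash is built to survive exactly this kind of edit, which cuts both ways: it catches trivially modified copies of blacklisted images, but it also opens the door to crafting innocuous-looking images whose hashes collide with blacklisted ones.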
The algorithm is almost certain to be repurposed by more authoritarian regimes. It’s one thing if Cheese Pizza is illegal in one part of the world, but there are places where homosexuality is illegal, and a repurposed iteration of this software could detect LGBT+ material.
Then there’s China. The place is pretty much a country of over a billion slaves. Over there, it’s illegal to say anything against the Chinese Communist Party (CCP). The country already has a social credit system that automatically assigns a numerical value to people depending on their activities, with those scoring low enough being publicly shamed by having their face displayed on electronic billboards. It’s a system that works using cameras and facial recognition, which isn’t perfect. A person who is misidentified could be accused of a crime they didn’t commit.
If Apple’s new software could be used to seek out images on a Chinese blacklist, it could identify anyone who has a chance of being critical of the CCP.
I don’t go looking for illegal stuff, but that doesn’t mean I’m not concerned. Over a year ago, a car on the other side of the state I live in ran a red light, and I was issued a fine because a traffic camera misidentified the car’s license plate as my own. False positives happen, and we live in a world where indifferent state employees can cause problems for someone they’ve never even seen. The technology isn’t perfect, and that creates potential for problems that otherwise might not exist.
Right now, there is a lot of power in the hands of just a few people who manage the technology. As it is, you might not even own the device you use to read these words, and you may only have a license to use the software that runs on it, which can probably be revoked at any time.
It might be Cheese Pizza today, but tomorrow, it could be your politics. Silicon Valley is dominated by leftists as it is, and it’s not hard to find extreme elements of leftism that view anyone right of center as criminal. Suppose there’s a Pepe the Frog meme that was relevant to the participants of the Capitol siege on January 6th, and you have that image somewhere on your iPhone, unaware of its significance. There’s no telling how you might have slipped up.
Did Apple just introduce a form of backdoor that allows external actors to determine whether a file is on your device? I don’t know, but as I see increasing authoritarianism employed just to fight an easily-survivable virus, it’s easy to feel a little concerned.
You can now be psycho-analyzed by your phone. That’s not really a new development; psychological information about you has long been sold to ad companies for the purpose of serving you targeted advertisements. It’s also been a thing for a while that activist groups run slander campaigns to paint resistance figures as pedophiles, because that’s currently the go-to insult for destroying a person as cheaply as possible, in as few words as possible, and it’s the one insult that bypasses the hearer’s better judgement and causes them to assume the worst about the accused.
I suppose a person could try to fight back by switching to a phone running Android, or some kind of Freedom Phone. But it could be that companies have been doing this sort of thing for a while, and it just hasn’t been advertised.
I could imagine that pedophiles might be sweating bullets about now. But considering the potential for abuse in the software itself, it’s easy to see why they’re not the only ones who should be concerned.