Read and share online: https://www.fsf.org/news/a-wake-up-call-for-iphone-users-its-time-to-go

Dear Free Software Supporter,

In the last few weeks, Apple announced that it will begin actively monitoring the photos and videos stored on the iPhones of its users in the United States. Apple describes its surveillance system as a way to monitor for Child Sexual Abuse Material (CSAM), but whatever it claims to be searching for, what it really amounts to is a way to use proprietary software to constantly search and spy on its users' devices. Technological ethics groups around the world have highlighted the grave implications and dangerous precedent these practices set for a user's privacy and right to control their own device.

In short, Apple has stated it will roll out two types of surveillance to all iPhone models receiving a forthcoming update: one which compares hashes of photos stored on the device against a database of known CSAM hashes, and one which (optionally) alerts parents to sexually explicit material sent from their child's iPhone. Appealing to such an emotionally intense and important issue is a dangerous and shrewd way for Apple to seize more control over iPhone users. Despite the claims in its TV commercials, Apple is an enemy of user freedom and meaningful privacy.

We wouldn't agree to having someone come and scour through our house on a whim. We especially wouldn't agree to it if it were someone with a track record of working against our best interests, and of exploiting us for their own profit. This is precisely the situation in which millions of Apple users in the US find themselves. In the coming weeks, users will be allowing Apple to comb through a portable computer containing private messages, photos, videos, banking information, and business records, all without anyone being able to verify what it is looking for, or with whom it is sharing that information.

Such an action flies in the face of Apple's stated claim to care about user privacy, one which it has taken care to widely advertise. It also raises important questions: what makes Apple a righteous and objective observer of the situation, and why should we trust it? We know that it will not hesitate to capitalize on any data it gets its hands on, and it rarely budges when asked for more transparency. So long as its surveillance system is proprietary, no one can verify its claims about what it is searching for, how it goes about looking for it, or to whom the results are delivered. What we can verify is that a group of Princeton researchers who developed what seems to be very similar software have urged Apple not to use it. Aside from this, all we have are the promises of a corporation that long ago showed it could not be trusted.

Apple has sometimes appeared to defend users' data from law enforcement and three-letter government agencies, and at other times has all too eagerly caved. But this is only one concern. What's crucial to remember, because it's so often glossed over, is that Apple has done and will do nothing to protect Apple users from Apple, and this is precisely the problem that users are now encountering.
Apple is completely unwilling to concede any part of the autocratic control it has over its platforms for the benefit of its users. This is fully intentional, and it should give us pause. Once this precedent has been set, what will stop Apple from scanning user devices for other types of photos and other materials, or stop it from providing that information to other types of law enforcement? The nonfree program that's scanning iPhones for CSAM now could be configured to scan for another type of material, and users would be none the wiser. There is absolutely nothing to stop Apple from leveraging this system to perform other types of scans, such as for pro-democracy materials on the phone of a user in a repressive state. If Apple is willing to cave to authorities over something as small as the Taiwanese flag emoji on iPhones in mainland China, can we really expect it to stand up for its users when it is asked for data on someone deemed a political threat?

Apple's users should have the right to fully control what programs run on their devices, and what data those programs send. Just last year, we detailed the ways in which Apple prevents its users from running apps it has not approved, or, to put it more clearly, prevents users from running programs Apple doesn't like. In that article, we highlighted the ethical problems with a notary service of this type, showed why such a service should be opt-in only, and argued that users should be permitted to select a reputable third party other than Apple to do this notarizing for them using free "as in freedom" software.

While we implore Apple to listen to the thousands of people asking it to change direction here, we also implore all iPhone users to take this as a wake-up call, and to demand not just an end to this specific abuse of proprietary software control, but an end to the entire underlying power dynamic. Things never had to be this way. While we condemn Apple's plan and urge it to reverse it, its ability to exert autocratic influence over the millions of Apple devices around the world stems chiefly from the fact that these devices run a nonfree operating system: iOS. If Apple's global community of users had control over iOS, such an antifeature would likely never have been accepted into the code in the first place. While Apple's advertising likes to claim that its products put privacy first, a close analysis of the matter confirms that true user privacy is impossible without free software.

The ethical aspects of such a rushed and sweeping plan are difficult to capture all at once, and it's equally difficult to predict how Apple's implementation of its plan will really play out. We continue to follow the implementation of this technology closely, and while we can't give an exhaustive analysis, we can at least leave off with a quote that we think sums up the rights that Apple's users are entitled to, and the demands they should make: "We all deserve control over our digital lives. [..] It's time to stand up for the right to privacy -- yours, mine, all of ours. This problem is solvable -- it isn't too big, too challenging, or too late." Thanks, Tim Cook. We couldn't agree more, and we hope you'll tell Tim you agree with him on this one, too.

Greg Farough
Campaigns Manager