A wise person once said, "If something on the internet is free, that means you are the product." With every new privacy scandal or data breach that breaks the news, this statement proves more and more correct.
Recently, companies like Facebook and Amazon have taken heat from privacy advocates, who have accused the tech giants of selling user data to advertisers and companies without permission. As bad as these instances were, a recent scandal brewing with a popular cloud-based photo storage app might have them beat.
According to reports, this company's free application had a darker side. Without informing any of its users, the app harnessed millions of stored photos to train face-recognition AI. That data was then sold to private companies and developers, with the hope of eventually landing law enforcement and military contracts. If you've been using this app to store photos, you may want to delete it from your phone before your face gets added to the database.
The hidden side of Ever photo storage
Ever is one of the most popular cloud-storage apps on Apple's App Store and Google Play Store. It regularly achieved #1 status on both stores in the productivity category, with fans loving its unlimited storage capacity and easy-to-use organization tools. At its peak, the app stored billions of photos in its cloud servers, and saw thousands of new users signing up each day.
New information uncovered by NBC revealed a disturbing connection between Ever's storage app and another program created by the same developer: Ever AI.
This app is a facial recognition tool that boasts of its enormous reference database, but as investigators uncovered, that database includes user photos harvested through Ever's cloud storage. These photos are fed into the AI's algorithm for training, allowing it to get even better at telling people apart.
Essentially, if you used Ever to store your photos, your photos were used to improve an AI's ability to recognize human faces. Worse yet, the company is still doing it, and most users still don't know it's happening.
How is Ever AI able to use my photos?
The company doesn't attempt to hide the fact that it's in the business of AI research, but the lack of transparency to users has privacy advocates concerned. They say that even if Ever doesn't directly share data with third parties (something the company claims not to do), the fact that users aren't fully informed about how their content is being used is a privacy nightmare waiting to happen.
The revelations about Ever AI have led to a drop in new and active users for the service, but sadly, the genie is already out of the bottle. Ever isn't alone in the data-harvesting business, after all. Companies like 23andMe already sell user DNA data to law enforcement and criminal databases, a market Ever says it has yet to enter.
Given enough time, a capable face-recognition AI could see high demand from military and police forces looking to expand their reach with technology.
Regardless of how this data is used, it's certainly worth questioning the ethics of harvesting it in the first place. If users aren't completely and obviously informed about how their data is being used, might we see more violations of privacy in the future? Right now, the answer looks fairly grim.
In light of this knowledge, I would highly advise anyone signed up for Ever to stop using the service. If you're looking for an alternative cloud-storage app, Dropbox is a useful tool that doesn't use your face to train robots. Plus, you can store other files and documents with it!
Major wireless carriers sued for selling location data used by bounty hunters
Just a few months ago, a shocking report detailed how the major wireless carriers in the U.S. would sell your real-time location data to brokers, who would then turn around and re-sell it to others, including bounty hunters. Now, a massive class-action lawsuit has been filed on behalf of more than 300 million wireless customers. You might be impacted by the suit.