As deep learning makes leaps and bounds in development and understanding, the prospect of a Skynet or Black Mirror-style AI-assisted entity seems ever more inevitable, and it terrifies us.
Thanks to the internet, social media sites, our smartphones, and our computers, the amount of data being recorded about human behavior is unprecedented.
Now the question is this – can an algorithm-based AI tap into this repository of human knowledge and behavior patterns, and ultimately shape human behavior itself?
That’s the thought-provoking but scary proposition this leaked Google video presents us with. It’s a glimpse of the power this database, combined with advances in machine learning, could hold over us in the future.
The Selfish Ledger
The 9-minute Google video in question is titled “The Selfish Ledger” (an obvious play on Richard Dawkins’s 1976 book about gene-based evolution, “The Selfish Gene”).
It was created in 2016 by Nick Foster, the head designer at Google X, Google’s research and development division. It was meant to be distributed internally within Google, but after The Verge recently obtained it, it is now posted online for everyone to see.
Written as an ambitious proof-of-concept exercise, it applies early theories of evolution, specifically those of Lamarck, to the ever-growing data sets that technology allows.
The Ledger as a psychological tool
According to the video, these profiles (called “ledgers” by Foster) could be used, not just for tracking and recording individual behavior, but also for nudging us toward a preset goal.
On an individual scale, the video presents various ways a person can specify targets and have the ledger itself help them achieve their goals.
In theory, you can specify “environmental health” as one of your goals and the ledger will increasingly offer you options that will move you closer to that goal. For example, Google may suggest locally grown produce and organic vegetables when you shop online.
Here’s another interesting possibility. As the ledger becomes more goal-oriented, it may see gaps in your data that prevent it from fully understanding you. To fill these data gaps, it could then design 3-D-printed data-collecting gadgets that would be aesthetically and functionally appealing to you.
The Ledger as a societal force
Additionally, the video suggests that, on a larger scale, this accumulated data could be passed on to succeeding generations and used to predict or even modify collective behavior.
As these multi-generational data sets continue to grow and be analyzed, Foster suggests that it may be possible to use them to “develop a species-level understanding of complex issues such as depression, health, and poverty.”
Should we be concerned?
To be fair, this video is purely speculative and is in line with Google X’s goal to invent and launch radical “moonshot” technologies. Think of it as a laboratory where wild ideas are born.
A Google spokesperson officially stated that the video was meant to be thought-provoking and does not apply to any Google products in development.
“We understand if this is disturbing — it is designed to be. This is a thought-experiment by the Design team from years ago that uses a technique known as ‘speculative design’ to explore uncomfortable ideas and concepts in order to provoke discussion and debate. It’s not related to any current or future products,” the spokesperson told The Verge.
But still, speculative or not, the video demonstrates the potential for this unprecedented database of human behavior to shape not just our consumerist tendencies but entire societies.
The technology to execute this already exists, and I wouldn’t be surprised if “The Selfish Ledger,” or at least some of its key concepts, is already in use today.
In other news, here’s how technology will tell you who’s who during the Royal Wedding
Here’s another creepy use of technology. Apparently, Amazon’s Rekognition software, along with GrayMeta and UI Centric, will be able to identify each guest of the Royal Wedding via facial recognition. Click here to read the full scoop.