The Tech in Westworld's Episode 3 Shocker Is Already Here and Much Worse Than You Think

The latest episode of Westworld had a huge reveal about the real world outside the parks, one that embodies the show’s newest motto: ‘free will is not free.’ The whole thing may seem far-fetched, like the human-looking robots, but it’s actually terrifying when you realize that predictive algorithms are already here… and have a much bigger hold on you than you think.


The third episode of Westworld season three, “The Absence of Field,” had Dolores (Evan Rachel Wood) finally confirm what we always suspected: Rehoboam, the predictive algorithm run by Incite, Inc., has everything under control. This massive AI charts every person’s life from birth to death, using predictive modeling to determine people’s entire futures. It’s the reason Caleb (Aaron Paul) keeps getting rejected for jobs: the system has already decided he will die by suicide, so he isn’t worth hiring or investing in.

Obviously, we don’t have a giant AI data ball that knows when we’ll die – at least not yet – but predictive algorithms are a much scarier and more present reality than killer robots. To explain why, I brought in Gizmodo tech reporter Shoshana Wodinsky, who covers everything about data collection and privacy. Watch the video above for our in-depth discussion, covering everything from how predictive algorithms are made to all the twisted ways they’re used (hint: that ‘city mapping’ scene from episode one is already happening). We’ve also included some highlights below.


Beth Elderkin: What are predictive algorithms?

Shoshana Wodinsky: A predictive algorithm takes the sum of your past behavior – for example, the things you bought or the apps you downloaded – and uses it to make a ‘prediction’ about how likely you are to take a similar action undertakes in the future. A good example of this is the predictive algorithms developed by tech companies like Netflix – if you watch a lot of cartoons (which I do) it is a predictive algorithm that measures how you would be more interested in something like that Planet Earth instead of something like that Fuller House.

Elderkin: How is our data collected?

Wodinsky: It all comes from our devices. Phones, tablets, computers, TVs, digital billboards, anything “smart” or connected to the internet. All of these very real things that many of us use every day are constantly compiling information about you based on all kinds of behavior and sending it to thousands of third parties – not just the Facebooks and Googles of the world. The searches you perform online, the shopping carts you abandon, the keywords in your emails or texts: they’re all part of the data profile these companies compile, not to mention the real-world location data collected when you walk past something like a digital billboard or into most big chain stores. Even the apps you download, forget, and never use again feed into that profile.

Elderkin: What are predictive algorithms for?

Wodinsky: Well, 99 percent of the time, the purpose of these kinds of predictive algorithms is to sell you shit – which is why they can be so invasive. When it comes to marketing something like the new Animal Crossing, for example, the marketer on the other side of that transaction wants to know exactly what type of person clicks on those ads, so they can keep targeting you with more products in the future. That doesn’t just mean the raw purchase history of a given Animal Crossing lover; it could very well include other details, like your age, ethnicity, gender identity, income, relationship status – the works.

In some cases, predictive algorithms can also be used to track the average shelf life of products so you can be retargeted in the future. So if you have a track record of playing handheld games for about 200 hours before putting them down for good, those same predictive algorithms can target you with a new Switch game right around the time you’re likely playing your last rounds of village goodness.
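The timing logic behind that kind of retargeting can be surprisingly crude. A hypothetical sketch (the numbers and function are invented for illustration; real ad platforms use far richer models):

```python
def hours_until_retarget(past_playtimes, hours_played_so_far):
    """Estimate how many play-hours remain before a player likely drops
    a game, by comparing their current playtime against the average
    lifespan of games they've played before."""
    avg_lifespan = sum(past_playtimes) / len(past_playtimes)
    # Never return a negative window; if they've exceeded their average,
    # the retargeting ad should fire now
    return max(avg_lifespan - hours_played_so_far, 0)

# A player who historically drops games around the 200-hour mark and has
# logged 150 hours so far is about 50 play-hours from an ad for the next game
remaining = hours_until_retarget([180, 220, 200], 150)
```

The point isn’t the arithmetic; it’s that “when will this person be receptive again?” is itself a prediction being made about you, continuously, from your own behavior.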

Elderkin: How are predictive algorithms used differently?

Wodinsky: Take social scores, like the ones we saw with Caleb and his friends in the first episode. China is testing a “social credit system,” where citizens are ranked according to their public and online behavior. As in Westworld, a bad score can limit your ability to get a job, but it can also block your access to trains and even throttle your internet. That’s happening in the United States too, albeit on a smaller scale. Some marketers will target people based on “desirability scores” or “vitality scores,” which can take into account everything from a person’s education to their income, criminal history, and shopping history to assess whether or not they’re a potential buyer for a particular brand. It’s a little different from China, or Westworld, but not that far off.

Elderkin: How bad is it going to get? Will we all end up like the people in Westworld, with our whole lives mapped out for us until we die?

Wodinsky: We’re not there yet, but as someone who covers data privacy on a daily basis, I can tell you that’s where the tech giants want to go. There are more than 7,000 companies in this space, all trying to make money off every second of your day – even before you wake up. That’s the dream for them.


For more, make sure you’re following us on Instagram at @io9dotcom.
