
How Technology Hijacks People’s Minds

---------- Forwarded message ----------
From: "ECOTERRA Intl." <[email protected]>
Date: Sun, 29 May 2016 16:03:05 +0300
Subject: [NATURAL_DEFENCE] How Technology Hijacks People’s Minds
To: MAILHUB <[email protected]>

      How Technology Hijacks People’s Minds -- from a Magician and
      Google’s Design Ethicist

By Tristan Harris <https://medium.com/@tristanharris> (*) -- May 2016
/Estimated reading time: 12 minutes, but read it for your own peace of mind./

I’m an expert on how technology hijacks our psychological
vulnerabilities. That’s why I spent the last three years as Google’s
Design Ethicist caring about how to design things in a way that defends
a billion people’s minds from getting hijacked.

When using technology, we often focus /optimistically/ on all the things
it does for us. But I want to show you where it might do the opposite.

*/Where does technology exploit our minds’ weaknesses?/*

I learned to think this way when I was a magician. Magicians start by
looking for /blind spots, edges, vulnerabilities and limits/ of
people’s perception, so they can influence what people do without them
even realizing it. Once you know how to push people’s buttons, you can
play them like a piano.

That’s me performing sleight of hand magic at my mother’s birthday party

And this is exactly what product designers do to your mind. They play
your psychological vulnerabilities (consciously and unconsciously)
against you in the race to grab your attention.

I want to show you how they do it.

        Hijack #1: If You Control the Menu, You Control the Choices

Western Culture is built around ideals of individual choice and freedom.
Millions of us fiercely defend our right to make “free” choices, while
we ignore how we’re manipulated upstream by limited menus we didn’t choose.

This is exactly what magicians do. They give people the illusion of free
choice while architecting the menu so that they win, no matter what you
choose. I can’t emphasize enough how deep this insight is.

When people are given a menu of choices, they rarely ask:

  * “what’s not on the menu?”
  * “why am I being given /these options/ and not others?”
  * “do I know the menu provider’s goals?”
  * “is this menu /empowering/ for my original need, or are the choices
    actually a distraction?” (e.g. an overwhelming array of toothpastes)

How /empowering is this menu of choices/ for the need, “I ran out of
toothpaste”?
For example, imagine you’re out with friends on a Tuesday night and want
to keep the conversation going. You open Yelp to find nearby
recommendations and see a list of bars. The group turns into a huddle of
faces staring down at their phones /comparing bars/. They scrutinize the
photos of each, comparing cocktail drinks. Is this menu still relevant
to the original desire of the group?

It’s not that bars aren’t a good choice, it’s that Yelp substituted the
group’s original question (“where can we go to keep talking?”) with a
different question (“what’s a bar with good photos of cocktails?”) all
by shaping the menu.

Moreover, the group falls for the illusion that Yelp’s menu represents a
/complete set of choices/ for where to go. While looking down at their
phones, they don’t see the park across the street with a band playing
live music. They miss the pop-up gallery on the other side of the street
serving crepes and coffee. Neither of those show up on Yelp’s menu.

Yelp subtly reframes the group’s need “where can we go to keep talking?”
in terms of photos of cocktails served.

The more choices technology gives us in nearly every domain of our lives
(information, events, places to go, friends, dating, jobs) -- /the more
we assume that our phone is always the most empowering and useful menu
to pick from/. Is it?

*The “most empowering” menu is different from the menu that has the
most choices.* But when we blindly surrender to the menus we’re
given, it’s easy to lose track of the difference:

  * “Who’s free tonight to hang out?” becomes a menu of /most recent
    people who texted us/ (who we could ping).
  * “What’s happening in the world?” becomes a menu of news feed stories.
  * “Who’s single to go on a date?” becomes a menu of faces to swipe on
    Tinder (instead of local events with friends, or urban adventures).
  * “I have to respond to this email.” becomes a menu of /keys to type a
    response/ (instead of empowering ways to communicate with a person).

All user interfaces are menus. What if your email client gave you
/empowering choices of ways to respond, instead of “what message do you
want to type back?”/ (Design by Tristan Harris)

When we wake up in the morning and turn our phone over to see a list of
notifications -- it frames the experience of “waking up in the morning”
around a menu of “all the things I’ve missed since yesterday.”

A list of notifications when we wake up in the morning -- how /empowering
is this menu of choices when we wake up? Does it reflect what we care
about?/ (credit to Joe Edelman)

By shaping the menus we pick from, technology hijacks the way we
perceive our choices and replaces them with new ones. But the closer we
pay attention to the options we’re given, the more we’ll notice when
they don’t actually align with our true needs.

        Hijack #2: Put a Slot Machine In a Billion Pockets

If you’re an app, how do you keep people hooked? Turn yourself into a
slot machine.

The average person checks their phone 150 times a day. Why do we do
this? Are we making /150 conscious choices/?

How often do you check your email per day?

One major reason why is the #1 psychological ingredient in slot
machines: *intermittent variable rewards*.

If you want to maximize addictiveness, all tech designers need to do is
link a user’s action (like pulling a lever) with a /variable reward/.
You pull a lever and immediately receive either an enticing reward (a
match, a prize!) or nothing. Addictiveness is maximized when the rate of
reward is most variable.
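
The mechanic is simple enough to sketch in a few lines of code. The
following toy simulation (not any real product’s code; the function names
and the 30% payout rate are invented for illustration) shows how a
variable-ratio schedule links an action to an unpredictable reward:

```python
import random

def pull(reward_probability=0.3):
    """One 'lever pull': an action that sometimes pays off.

    This is a variable-ratio schedule: each action has some fixed chance
    of a payout, so the user can never predict which pull will reward them.
    """
    if random.random() < reward_probability:
        return "reward"    # a match, a like, a new message
    return "nothing"       # an empty inbox, no notifications

def check_phone(times, reward_probability=0.3):
    """Simulate repeatedly checking a phone; count how many checks pay off."""
    return sum(pull(reward_probability) == "reward" for _ in range(times))

hits = check_phone(150)    # roughly the average person's daily checks
print(f"{hits} of 150 checks produced a reward")
```

Because the payout is unpredictable, every check carries the possibility
of a reward -- which is exactly what keeps the next check coming.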

Does this effect really work on people? Yes. *Slot machines make more
money in the United States than baseball, movies, and theme parks
combined.* Relative to other kinds of gambling, people get
“problematically involved” with slot machines *3-4x faster*, according
to NYU professor Natasha Dow Schüll, author of /Addiction by Design/.

*But here’s the unfortunate truth -- several billion people have a slot
machine in their pocket:*

  * When we pull our phone out of our pocket, we’re /playing a slot
    machine/ to see what notifications we got.
  * When we pull to refresh our email, we’re /playing a slot machine/ to
    see what new email we got.
  * When we swipe down our finger to scroll the Instagram feed, we’re
    /playing a slot machine/ to see what photo comes next.
  * When we swipe faces left/right on dating apps like Tinder, we’re
    /playing a slot machine/ to see if we got a match.
  * When we tap the # of red notifications, we’re /playing a slot
    machine/ to see what’s underneath.

Apps and websites sprinkle intermittent variable rewards all over their
products because it’s good for business.

But in other cases, slot machines emerge by accident. For example, there
is no malicious corporation behind /all of email/ who consciously chose
to make it a slot machine. No one profits when millions check their
email and nothing’s there. Neither did Apple and Google’s designers
/want/ phones to work like slot machines. It emerged by accident.

But now companies like Apple and Google have a responsibility to reduce
these effects by /converting intermittent variable rewards into less
addictive, more predictable ones/ with better design. For example, they
could empower people to set predictable times during the day or week for
when they want to check “slot machine” apps, and correspondingly adjust
when new messages are delivered to align with those times.
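
That design suggestion can be made concrete. Below is a minimal,
hypothetical sketch (the class and method names are my own invention, not
any real API): messages are held in a queue and released only at check-in
times the user chose in advance.

```python
from datetime import datetime, time, timedelta

class NotificationBatcher:
    """Hold incoming messages; release them only at user-chosen times.

    The user picks predictable check-in slots (say 9:00, 13:00, 18:00).
    Anything arriving in between waits for the next slot, turning an
    intermittent variable reward into a predictable one.
    """
    def __init__(self, slots):
        self.slots = sorted(slots)   # datetime.time objects
        self.pending = []

    def receive(self, message):
        """A new message arrives: hold it instead of interrupting now."""
        self.pending.append(message)

    def next_delivery(self, now):
        """Return the next scheduled slot at or after `now`."""
        for slot in self.slots:
            candidate = datetime.combine(now.date(), slot)
            if candidate >= now:
                return candidate
        # All of today's slots have passed: first slot tomorrow.
        return datetime.combine(now.date() + timedelta(days=1), self.slots[0])

    def release(self):
        """Hand over everything held so far (called at a scheduled slot)."""
        batch, self.pending = self.pending, []
        return batch

batcher = NotificationBatcher([time(9, 0), time(13, 0), time(18, 0)])
batcher.receive("new email")
batcher.receive("3 likes")
now = datetime(2016, 5, 29, 10, 30)
print(batcher.next_delivery(now))   # 2016-05-29 13:00:00
print(batcher.release())            # ['new email', '3 likes']
```

The point of the sketch is who holds the lever: delivery timing becomes a
setting the user controls, not a variable the app optimizes.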

        Hijack #3: Fear of Missing Something Important (FOMSI)

Another way apps and websites hijack people’s minds is by inducing a “1%
chance you could be missing something important.”

If I convince you that I’m a channel for important information,
messages, friendships, or potential sexual opportunities -- it will be
hard for you to turn me off, unsubscribe, or remove your account --
because (aha, I win) you might miss something important:

  * This keeps us subscribed to newsletters even after they haven’t
    delivered recent benefits (“what if I miss a future announcement?”)
  * This keeps us “friended” to people with whom we haven’t spoken in
    ages (“what if I miss something important from them?”)
  * This keeps us swiping faces on dating apps, even when we haven’t
    even met up with anyone in a while (“what if I miss that /one hot
    match/ who likes me?”)
  * This keeps us using social media (“what if I miss that important
    news story or fall behind what my friends are talking about?”)

But if we zoom into that fear, we’ll discover that it’s unbounded: /we’ll
always miss something important at any point when we stop using something/.

  * There are magic moments on Facebook we’ll miss by not using it for
    the 6th hour (e.g. an old friend who’s visiting town /right now/).
  * There are magic moments we’ll miss on Tinder (e.g. our dream
    romantic partner) by not swiping our 700th match.
  * There are emergency phone calls we’ll miss if we’re not connected
    24/7.

/But living moment to moment with the fear of missing something isn’t
how we’re built to live./

And it’s amazing how quickly, once we let go of that fear, we wake up
from the illusion. When we unplug for more than a day, unsubscribe from
those notifications, or go to Camp Grounded <http://campgrounded.org> --
the concerns we thought we’d have don’t actually happen.

/We don’t miss what we don’t see./

The thought, “what if I miss something important?” is generated /in
advance/ of unplugging, unsubscribing, or turning off -- not after.
Imagine if tech companies recognized that, and helped us proactively
tune our relationships with friends and businesses in terms of what we
define as “time well spent <http://timewellspent.io>” for our lives,
instead of in terms of what we might miss.

        Hijack #4: Social Approval

Easily one of the most persuasive things a human being can receive.

We’re all vulnerable to *social approval*. The need to belong, to be
approved or appreciated by our peers is among the highest human
motivations. But now our social approval is in the hands of tech companies.

When I get tagged by my friend Marc, I imagine him making a /conscious
choice/ to tag me. But I don’t see how a company like Facebook
orchestrated his doing that in the first place.

Facebook, Instagram or SnapChat can manipulate how often people get
tagged in photos by automatically suggesting all the faces people should
tag (e.g. by showing a box with a 1-click confirmation, “Tag Tristan in
this photo?”).

So when Marc tags me, /he’s actually responding to Facebook’s
suggestion/, not making an independent choice. But through design
choices like this, /Facebook controls the multiplier for how often
millions of people experience their social approval on the line/.

Facebook uses automatic suggestions like this to get people to tag more
people, creating more social externalities and interruptions.

The same happens when we change our main profile photo -- Facebook knows
that’s a moment when we’re /vulnerable to social approval/: /“what do my
friends think of my new pic?”/ Facebook can rank this higher in the news
feed, so it sticks around for longer and more friends will like or
comment on it. Each time they like or comment on it, we’ll get pulled
right back.

Everyone innately responds to social approval, but some demographics
(teenagers) are more vulnerable to it than others. That’s why it’s so
important to recognize how powerful designers are when they exploit this
vulnerability.

        Hijack #5: Social Reciprocity (Tit-for-tat)

  * You do me a favor, now I owe you one next time.
  * You say “thank you” -- I have to say “you’re welcome.”
  * You send me an email -- it’s rude not to get back to you.
  * You follow me -- it’s rude not to follow you back (especially for
    teenagers).

We are /vulnerable to needing to reciprocate others’ gestures/. But as
with Social Approval, tech companies now manipulate how often we
experience it.

In some cases, it’s by accident. /Email, texting and messaging apps are
social reciprocity factories/. But in other cases, companies exploit
this vulnerability on purpose.

LinkedIn is the most obvious offender. LinkedIn wants as many people
creating social obligations for each other as possible, because each
time they reciprocate (by accepting a connection, responding to a
message, or endorsing someone back for a skill) they have to come back
to linkedin.com where they can get people to spend more time.

Like Facebook, LinkedIn exploits an asymmetry in perception. When you
receive an invitation from someone to connect, you imagine that person
making a /conscious choice/ to invite you, when in reality, they likely
unconsciously responded to LinkedIn’s list of suggested contacts. In
other words, LinkedIn turns your /unconscious impulses/ (to “add” a
person) into new social obligations that millions of people feel
obligated to repay. All while they profit from the time people spend
doing it.

Imagine millions of people getting interrupted like this throughout
their day, running around like chickens with their heads cut off,
reciprocating each other -- all designed by companies who profit from it.

Welcome to social media.

After accepting an endorsement, LinkedIn takes advantage of your bias to
reciprocate by offering *four* additional people for you to endorse in
return.
Imagine if technology companies had a responsibility to minimize social
reciprocity. Or if there were an “FDA for Tech” that monitored when
technology companies abused these biases?

        Hijack #6: Bottomless Bowls, Infinite Feeds, and Autoplay

YouTube autoplays the next video after a countdown

Another way to hijack people is to keep them consuming things, even when
they arenâ??t hungry anymore.

How? Easy. /Take an experience that was bounded and finite, and turn it
into a bottomless flow that keeps going/.

Cornell professor Brian Wansink demonstrated this in his study showing
you can trick people into continuing to eat soup by giving them a bottomless
bowl <http://foodpsychology.cornell.edu/discoveries/bottomless-bowls>
that automatically refills as they eat. With bottomless bowls, people
eat 73% more calories than those with normal bowls and underestimate how
many calories they ate by 140 calories.

Tech companies exploit the same principle. News feeds are purposely
designed to auto-refill with reasons to keep you scrolling, and
purposely eliminate any reason for you to pause, reconsider or leave.
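
In code, the difference between a bounded and a bottomless experience is
small but decisive: a finite feed runs out and hands you a natural
stopping point, while an auto-refilling one never does. A schematic
sketch (the feed contents and the `fetch_page` backend are hypothetical
stand-ins, not any real platform’s code):

```python
import itertools

def bounded_feed(stories):
    """A finite feed: it ends, giving the reader a natural stopping point."""
    for story in stories:
        yield story
    # ...and then it is over.

def bottomless_feed(fetch_more):
    """An auto-refilling feed: whenever the current page is exhausted,
    quietly fetch another, so there is never a cue to pause or leave."""
    while True:
        for story in fetch_more():
            yield story

# Stand-in for a ranking backend (hypothetical).
def fetch_page():
    return ["story", "sponsored post", "story"]

first_ten = list(itertools.islice(bottomless_feed(fetch_page), 10))
print(len(first_ten))   # 10 -- and it would as happily be 10,000
```

The only stopping condition in the bottomless version is the one the
reader supplies themselves.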

It’s also why video and social media sites like Netflix, YouTube or
Facebook /autoplay/ the next video after a countdown instead of waiting
for you to make a conscious choice (in case you won’t). A huge portion
of traffic on these websites is driven by autoplaying the next thing.

Facebook autoplays the next video after a countdown

Tech companies often claim that “we’re just making it easier for users
to see the video /they want/ to watch” when they are actually serving
their business interests. And you can’t blame them, because increasing
“time spent” is the currency they compete for.

Instead, imagine if technology companies empowered you to /consciously
bound your experience/ to align with what would be “time well spent
<http://timewellspent.io>” for you. Not just bounding the /quantity/ of
time you spend, but the /qualities/ of what would be “time well spent.”

        Hijack #7: Instant Interruption vs. “Respectful” Delivery

Companies know that messages /that interrupt people immediately are more
persuasive at getting people to respond/ than messages delivered
asynchronously (like email or any deferred inbox).

Given the choice, Facebook Messenger (or WhatsApp, WeChat or SnapChat
for that matter) would /prefer to design their messaging system to
interrupt recipients immediately (and show a chat box)/ instead of
helping users respect each other’s attention.

In other words, *interruption is good for business*.

It’s also in their interest to heighten the feeling of urgency and
social reciprocity. For example, Facebook automatically /tells the
sender when you “saw” their message, instead of letting you avoid
disclosing whether you read it/ (“now that you know I’ve seen the
message, I feel even more obligated to respond.”)

By contrast, Apple more respectfully lets users toggle “Read Receipts”
on or off.

The problem is, maximizing interruptions in the name of business creates
a tragedy of the commons, ruining global attention spans and causing
billions of unnecessary interruptions each day. This is a huge problem
we need to fix with shared design standards (potentially, as part of
Time Well Spent <http://timewellspent.io>).

        Hijack #8: Bundling Your Reasons with Their Reasons

Another way apps hijack you is by taking /your reasons/ for visiting the
app (to perform a task) and /making them inseparable from the app’s
business reasons/ (maximizing how much we consume once we’re there).

For example, in the physical world of grocery stores, the #1 and #2
most popular reasons to visit are pharmacy refills and buying milk. But
grocery stores want to maximize how much people buy, so they put the
pharmacy and the milk at the back of the store.

/In other words, they make the thing customers want (milk, pharmacy)
inseparable from what the business wants./ If stores were /truly
organized to support people/, they would put the most popular items in
the front <http://www.economist.com/node/12792420>.

Tech companies design their websites the same way. For example, when
you want to look up a Facebook event happening tonight (your reason) the
Facebook app doesn’t allow you to access it without first landing on the
news feed (their reasons), and that’s on purpose. /Facebook wants to
convert every reason you have for using Facebook into their reason,
which is to maximize the time you spend consuming things/.

In an ideal world, apps would always give you a /direct way/ to get what
you want /separately/ from what they want.

Imagine a digital â??bill of rightsâ?? outlining design standards that
forced the products used by billions of people to support empowering
ways for them to navigate toward their goals.

        Hijack #9: Inconvenient Choices

We’re told that it’s enough for businesses to “make choices available.”

  * “If you don’t like it you can always use a different product.”
  * “If you don’t like it, you can always unsubscribe.”
  * “If you’re addicted to our app, you can always uninstall it from
    your phone.”

Businesses naturally /want to make the choices they want you to make
easier, and the choices they don’t want you to make harder/. Magicians
do the same thing. You make it easier for a spectator to pick the thing
you want them to pick, and harder to pick the thing you don’t.

For example, NYTimes.com lets you “make a free choice” to cancel your
digital subscription. But instead of just doing it when you hit “Cancel
Subscription,” they /send you an email with information on how to cancel
your account by calling a phone number/ that’s only open at certain times.

NYTimes claims it’s giving a free choice to cancel your account

Instead of viewing the world in terms of /availability of choices/, we
should view the world in terms of /friction required to enact choices/.
Imagine a world where choices were labeled with how difficult they were
to fulfill (like coefficients of friction) and there were an FDA for Tech
that labeled these difficulties and set standards for how easy
navigation should be.

        Hijack #10: Forecasting Errors, “Foot in the Door” Strategies

Facebook promises an easy choice to “See Photo.” Would we still click if
it gave the true price tag?

Lastly, apps can exploit people’s inability to forecast consequences of
a click.

People don’t intuitively forecast the /true cost of a click/ when it’s
presented to them. Sales people use “foot in the door” techniques by
asking for a small innocuous request to begin with (“just one click to
see which tweet got retweeted”) and escalate from there (“why don’t you
stay awhile?”). Virtually all engagement websites use this trick.

Imagine if web browsers and smartphones, the gateways through which
people make these choices, were truly watching out for people and helped
them forecast the consequences of clicks (based on real data about what
it actually costs most people).

That’s why I add “Estimated reading time” to the top of my posts. When
you put the “true cost” of a choice in front of people, you’re treating
your users or audience with dignity and respect. In a Time Well Spent
<http://timewellspent.io> internet, choices could be framed in terms of
projected cost and benefit, so people are empowered to make informed
choices by default, not by doing extra work.

TripAdvisor uses a “foot in the door” technique by asking for a single
click review (“How many stars?”) while hiding the three-page survey of
questions behind the click.

        Summary And How We Can Fix This

Are you upset that technology hijacks your agency? I am too. I’ve listed
a few techniques but there are literally thousands. Imagine whole
bookshelves, seminars, workshops and trainings that teach aspiring tech
entrepreneurs techniques like this. They exist.

The ultimate freedom is a free mind, and we need technology that’s on
our team to help us live, feel, think and act freely.

We need our smartphones, notification screens and web browsers to be
exoskeletons for our minds and interpersonal relationships that put our
values, not our impulses, first. People’s time is valuable
<http://timewellspent.io>. And we should protect it with the same rigor
as privacy and other digital rights.

(*) Tristan Harris <https://medium.com/@tristanharris> -- Ex-Design
Ethicist & Product Philosopher @ Google, former CEO of Apture (acquired
by Google), dabbler in Behavioural Economics, Design and Persuasion.
Tristan Harris was Product Philosopher at Google until 2016, where he
studied how technology affects a billion people’s attention, well-being
and behaviour. For more resources on Time Well Spent, see
http://timewellspent.io.


*N.B.:
Robert Redford (79, “The Horse Whisperer”) has neither a smartphone nor
a computer -- for good reasons. The American actor and film director was
already declared dead once on social media at the onset of this year,
but is very well, alive, an activist and kicking.*

*Pro-active work to protect nature and human rights requires YOU to
work with us.*

*Even if you have no possibility to be at the front lines with us,
please know we need independent funding.
PLEASE consider contributing to ECOTERRA’s work and trust fund.*





Our full footer, being an essential, legal part of this dissemination,
can be found at
- please read it at least once a month to see important security updates.


*ECOTERRA Intl. nodes:*
Canaries - Cairns - Cairo - Calgary - Cape Town - Cassel - Cebu - Cork -
Curitiba - London - Los Angeles - Nairobi - Roma - Paris - Reykjavik -
Stuttgart - Wien - Vanuatu
24 h EMERGENCY RESPONSE PHONE LINE: +254-714-747-090
Become a pro-active member of *fPcN-interCultural* (write to Friends
of Peoples close to Nature via [email protected])
or of our Marine Group: http://www.ecop.info/