Hello World (book)

planted: 30/06/2020 · last tended: 18/07/2022

A book by Hannah Fry.

blurb: "A look inside the algorithms that are shaping our lives and the dilemmas they bring with them."

I thought this was good. My full review here: Review: Hello World - How to Be Human in the Age of the Machine.

1. Notes

1.1. Intro

Knowing that his preferred clientele would travel to the beach in their private cars, while people from poor black neighbourhoods would get there by bus, he deliberately tried to limit access by building hundreds of low-lying bridges along the highway. Too low for the 12-foot buses to pass under.

GPS was invented to launch nuclear missiles and now helps deliver pizzas.

It’s about asking if an algorithm is having a net benefit on society.

the power of an algorithm isn’t limited to what is contained within its lines of code. Understanding our own flaws and weaknesses – as well as those of the machine – is the key to remaining in control.

For the time being, worrying about evil AI is a bit like worrying about overcrowding on Mars.

After only a few minutes of looking at the search engine’s biased results, when asked who they would vote for, participants were a staggering 12 per cent more likely to pick the candidate Kadoodle had favoured.

All around us, algorithms provide a kind of convenient source of authority. An easy way to delegate responsibility; a short cut that we take without thinking.

about our human willingness to take algorithms at face value without wondering what’s going on behind the scenes.

Stanislav Petrov was a Russian military officer in charge of monitoring the nuclear early warning system protecting Soviet airspace. His job was to alert his superiors immediately if the computer indicated any sign of an American attack.

having a person with the power of veto in a position to review the suggestions of an algorithm before a decision is made is the only sensible way to avoid mistakes.

1.2. Data

There’s just one issue with that logic: we’re not always aware of the longer-term implications of that trade. It’s rarely obvious what our data can do, or, when fed into a clever algorithm, just how valuable it can be. Nor, in turn, how cheaply we were bought.

Palantir is just one example of a new breed of companies whose business is our data. And alongside the analysts, there are also the data brokers: companies who buy and collect people’s personal information and then resell it or share it for profit. Acxiom, Corelogic, Datalogix, eBureau – a swathe of huge companies you’ve probably never directly interacted with, that are none the less continually monitoring and analysing your behaviour.

This digital shadow of a pregnancy continued to circulate alone, without the mother or the baby. ‘Nobody who built that system thought of that consequence,’ she explained.

Their approach was to identify small groups of people who they believed to be persuadable and target them directly, rather than send out blanket advertising.

The experimenters suppressed any friends’ posts that contained positive words, and then did the same with those containing negative words, and watched to see how the unsuspecting subjects would react in each case. Users who saw less negative content in their feeds went on to post more positive stuff themselves. Meanwhile, those who had positive posts hidden from their timeline went on to use more negative words themselves.

Sesame Credit, a citizen scoring system used by the Chinese government.

1.3. Justice

Nicholas Robinson was sentenced to six months in prison; Johnson escaped jail entirely.

On the basis of identical evidence in identical cases, a defendant could expect to walk away scot-free or be sent straight to jail, depending entirely on which judge they were lucky (or unlucky) enough to get.

whenever judges have the freedom to assess cases for themselves, there will be massive inconsistencies.

the best-performing contemporary algorithms use a technique known as random forests.

Random forests have proved themselves to be incredibly useful in a whole host of real-world applications. They’re used by Netflix to help predict what you’d like to watch based on past preferences.
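For a concrete sense of the technique: a minimal sketch of a random forest used as a risk-score model, assuming scikit-learn. The features, training data and numbers here are entirely hypothetical — this illustrates the technique the book names, not COMPAS or any real system.

```python
# Toy random-forest risk model. All data below is made up for illustration.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: [age, number of prior offences]
X_train = [[19, 2], [45, 0], [23, 5], [37, 1], [52, 0], [21, 3]]
y_train = [1, 0, 1, 0, 0, 1]  # 1 = reoffended within two years, 0 = did not

# A "forest" of many decision trees, each trained on a random slice of the
# data and features; their individual votes are combined into one prediction.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Predicted probability of reoffending for a new, hypothetical defendant.
print(model.predict_proba([[30, 1]])[0][1])
```

The averaging over many slightly different trees is what makes the method robust: any single decision tree overfits its training data, but the ensemble smooths those errors out.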

sparked a heated debate, and not without cause: it’s one thing calculating whether to let someone out early, quite another to calculate how long they should be locked away in the first place.

Unfortunately for Zilly, Wisconsin judges were using a proprietary risk-assessment algorithm called COMPAS.

The algorithm’s false positives were disproportionately black.
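To pin down what that sentence means: a false positive here is someone the algorithm labels high-risk who does not go on to reoffend. A toy per-group calculation (with made-up counts, not the real COMPAS figures) might look like:

```python
# Hypothetical confusion-matrix counts per group, purely to illustrate what a
# "disproportionate false positive rate" means. Not real COMPAS data.
counts = {
    "group A": {"false_positives": 45, "true_negatives": 55},
    "group B": {"false_positives": 23, "true_negatives": 77},
}

for group, c in counts.items():
    fpr = c["false_positives"] / (c["false_positives"] + c["true_negatives"])
    print(f"{group}: false positive rate = {fpr:.0%}")
```

A gap between those two rates means one group is far more likely to be wrongly flagged as high-risk, even if the algorithm's overall accuracy looks the same for both.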

Chapters on power, data, justice, medicine, cars, crime, art.

Symbiosis seems best. E.g. extra safety mechanisms seem better than fully driverless cars. AI can detect tumours better than a human (faster, at least) but is bad as a GP. Algorithms can augment a police investigation or make it more efficient, but human intuition is still needed.
