Related Notes:
- [[Book Notes - Human Compatible- AI and the Problem of Control]]
- #data_science
- #machine_learning
- #book
- #Design_Thinking
- [[202102042048 Book Notes - Change by Design]]
- [[Design Thinking]]
- [[Data Science Ethics - Coursera]]
----
# My takeaways and summary
1. Must have a clear intention and objective when designing models.
2. Systems/models must have a self-checking feedback loop to constantly test their validity and reliability.
3. Watch out for any perpetuation of problems (self-fulfilling loops) or unintended effects. ^771c9a
4. What can be measured may not be what is valid or important (proxies). [[Medicine and the McNamara Fallacy]]
----
# Chapter 1. Bomb Parts: What is a model?
- Models need a feedback loop to check and improve on themselves, i.e. to validate against reality.
- A model should not perpetuate its own mistakes or bake in confirmation bias.
- Models should be transparent, so that their code can be checked.
- They shouldn't be opaque "black boxes" against which people cannot seek redress.
- Each design carries its designer's built-in values, assumptions, and beliefs about what is important (or not important).
- Models are simplifications of real life.
- Intention and effect: each model serves a goal. What is that goal? What are the intended and unintended effects?
- Models can be transferred across industries, e.g. from shopping to finance.
- If the focus is on short-term reward ("if it brings in more money, that is good"), the system keeps rewarding that behaviour.
# Chapter 2. Shell Shocked: My Journey of Disillusionment
- Math is used to intimidate consumers.
- There is an assumption that people in the financial sector take risks into account to protect people.
- In reality it's business as usual: bring in more money.
- It is dehumanising; people become data trails on a screen.
- Math used to predict market movements can be used to predict clicks on a shopping website.
- The author's disillusionment with math as used in the financial markets.
# Chapter 3: Arms Race: Going to College
- Intangible constructs cannot be measured directly, so proxies are used.
- Proxies are based on assumptions.
- Proxies can be gamed. The US News ranking measures 15 areas (see the sketch after this list).
- The standard is set by the US News ranking chart.
- This creates a self-fulfilling feedback loop.
- When the model scaled up into the national university ranking, unintended consequences followed.
- Gaming becomes the norm. Cheating becomes the norm. An arms race.
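A toy sketch of the proxy-gaming dynamic above. This is my own illustration, not anything from the book: the school names, weights, and "game harder when you slip in the ranking" response are all invented. It only shows how a ranking built on gameable proxies ends up rewarding gaming effort instead of underlying quality.

```python
# Minimal sketch (invented numbers): a ranking that only sees gameable proxies.
import random

random.seed(0)

schools = []
for i in range(10):
    true_quality = random.uniform(0, 100)  # what we actually care about (unmeasurable)
    schools.append({"name": f"School {i}", "quality": true_quality, "gaming": 0.0})

def proxy_score(school):
    # The ranking only sees proxies; gaming effort inflates them directly.
    noise = random.uniform(-5, 5)
    return school["quality"] + 2.0 * school["gaming"] + noise

for year in range(5):
    ranked = sorted(schools, key=proxy_score, reverse=True)
    # Schools that slipped in the ranking respond by gaming the proxies harder,
    # not by improving true quality -- the arms race.
    for position, school in enumerate(ranked):
        if position >= 5:
            school["gaming"] += 5.0

for school in sorted(schools, key=proxy_score, reverse=True):
    print(f'{school["name"]}: quality={school["quality"]:.0f}, gaming={school["gaming"]:.0f}')
```

After a few rounds, schools with heavy gaming can outrank schools with higher true quality, which is the arms-race point of the chapter.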
# Chapter 4: Propaganda Machine: Online Advertising
- Advertisers know more about us than we do ourselves, because they hold vast collections of our online data. They target our vulnerabilities.
- Related to [[Daring Greatly How the Courage to Be Vulnerable Transforms the Way We Live, Love, Parent, and Lead]]: they exploit the "scarcity"/lack mentality, insecurity, and low self-esteem.
# Chapter 5: Civilian Casualties: Justice in the age of big data
- Data and algorithms are used for risk analysis: crime prediction, hotspot prediction, deployment of police forces.
- They look at historical data to optimise resources.
- PredPol software targets the geography and time of each crime as it occurs.
- The problem of reinforcing feedback loops: when police patrol, they reduce Part 1 crime (serious crime), but they also pick up more Part 2 crime (petty crimes like glue sniffing, which are common in low-SES neighbourhoods). As more and more dots are reported for this kind of crime, the software sends more police into the area (a toy simulation of this loop follows this list).
- Victimless crimes.
- More arrests justify the need for even more police.
- It ends up targeting minorities and poor people; rich people don't live in those areas, so ever more crime gets recorded in poor neighbourhoods.
- ==Think of how data creates feedback loops and reinforces a skewed view==
- It seems they focus on what they can do: catching crimes committed by people from poor neighbourhoods. White-collar crime is hard to stop. All of it justified by the use of "data".
- The approach is moving towards stopping "potential lawbreakers", an idea borrowed from the war on terrorism: looking at hotspots and patterns.
- The system assumes that "birds of a feather flock together".
- Interesting to read more about the "Broken Windows" theory - [Broken Windows Theory of Criminology (simplypsychology.org)](https://simplypsychology.org/broken-windows-theory.html)
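The toy simulation referenced above. This is my own sketch, not PredPol's actual algorithm: the neighbourhood names, numbers, and "patrol the biggest hotspot" rule are invented purely to show how historical dots plus patrol-driven recording lock the model onto one area.

```python
# Minimal sketch (invented data, not PredPol): a reinforcing patrol-allocation loop.
# Both neighbourhoods have the same true rate of petty crime, but only the
# patrolled neighbourhood gets its crime *recorded* as dots on the map.
TRUE_PETTY_CRIME = 100  # true incidents per period, identical in both areas

# Historical records start with a slight, arbitrary skew.
recorded_dots = {"Neighbourhood A": 55, "Neighbourhood B": 45}

for period in range(1, 6):
    # The "predictive" step: patrol wherever the map shows the most dots.
    hotspot = max(recorded_dots, key=recorded_dots.get)
    # Only the patrolled area has its petty crime observed and recorded.
    recorded_dots[hotspot] += TRUE_PETTY_CRIME
    print(f"Period {period}: patrol {hotspot}, dots = {recorded_dots}")

# Neighbourhood A's dot count runs away while B's identical true crime stays
# invisible -- the data "confirms" that A is the problem area.
```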
# Chapter 6: Ineligible to Serve: Getting a job
- Companies use software to judge applicants, and the software includes personality/psychological tests that are effectively medical tests, so it discriminates based on personality/profile.
- They settle on proxies and historical data to predict the future.
- Personality tests do not predict job performance.
- The tests are a cheap way to exclude people.
- Not valid or reliable.
- The companies usually don't update and revise their models. ==Systems need constant feedback to improve==
- Nowadays companies use machines/software to look through stacks of applications before a human sees them; the machine ranks them without explanation.
- "Our livelihoods increasingly depend on our ability to make our case to machines"
- Like Google SEO.
- ==If the machine can identify risk based on certain criteria and proxies, then we could direct resources to reduce that risk by addressing the criteria/proxies that triggered the "alerts" in the first place (see the sketch after this list).==
- The danger of this becoming phrenology: an unscientific way of diagnosing, like "feeling bumps on the head to diagnose someone as anxious or alcoholic".
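A small sketch of the alternative use described in the highlighted point above. Everything here is hypothetical (the flags, the support mapping, the applicants); it only contrasts using risk signals to exclude people with using the same signals to allocate help.

```python
# Hypothetical sketch: the same "risk flags" a screening model uses to exclude
# people could instead be used to direct support to them.
applicants = [
    {"name": "A", "flags": ["long commute", "employment gap"]},
    {"name": "B", "flags": []},
    {"name": "C", "flags": ["no degree"]},
]

SUPPORT = {  # hypothetical mapping from a flag to a helpful intervention
    "long commute": "flexible hours / transport allowance",
    "employment gap": "structured onboarding and mentoring",
    "no degree": "on-the-job training programme",
}

def exclusionary_use(applicants):
    # What the WMD does: silently drop anyone with flags.
    return [a["name"] for a in applicants if not a["flags"]]

def supportive_use(applicants):
    # The alternative: surface the flags and attach resources to them.
    return {a["name"]: [SUPPORT[f] for f in a["flags"]] for a in applicants}

print("Shortlist (exclusionary):", exclusionary_use(applicants))
print("Support plan (alternative):", supportive_use(applicants))
```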
# Chapter 7: Sweating Bullets: on the job
- The just-in-time methodology reduces waste and increases the bottom line of businesses.
- Now people become the parts, and scheduling is the new "just in time", treating people as resources - usually low-wage workers.
- Every minute must be kept busy; everything is about cost.
- The scheduling software creates a chaotic life for workers, leaving them unable to do anything else to upgrade themselves. The system keeps them down.
- ==Scientists need false negatives to learn and improve. There must be a way for a model to test itself==
- The tests and WMDs used to generate scores for teachers are themselves not validated and not scientific. The proxies used are also not valid/reliable, yet the scores they give are treated as final, and teachers cannot seek redress against them. That's the danger.
# Chapter 8: Collateral Damage: Landing Credit
- Credit scores are used as a proxy for other things, like employment and dating.
- A spiralling, self-defeating feedback loop.
- Wealth correlates with race.
- Data brokers scrape data from the internet and government sources and sell it to others. People are the product. What if the data are wrong? People's lives are affected.
- Wrong identities - but people usually don't know.
- Online profiles are built automatically and may be faulty.
- E-scores project the past into the future.
- Machines cannot make adjustments for fairness.
# Chapter 9: No Safe Zone: Getting Insurance
- Insurance industry is about predicting risks using past data.
- The industry places us into groups that we do not see, and judges us according to the group/population risk.
- We are sorted into tribes/profiles/groups based on our digital data trails/footprints.
- Insurers increasingly use personal data/profiles to calculate individual risk based on lifestyle, but we cannot see the model.
- Other proxies like credit score/ e-Score may affect cost of insurance.
- Thus poor people may have to pay more for insurance.
- i.e. based on who you are and who you mingle with, the system predicts how risky you are. Lower-SES groups score as higher risk, so they pay more (a toy illustration follows this list).
- Privacy is a luxury only the wealthy can afford. If you don't share your data, the service/product costs more.
- The story of an employer running a "wellness programme" for employees: if an employee doesn't comply with the health activities, he has to pay more for health care.
- What if companies produce a health score to sort people into groups?
- ==Is BMI even scientific?==
- It is intrusive for a company to set standards on how a person should look.
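The toy illustration referenced in the list above. All multipliers are invented for illustration and are not real insurer figures; it only shows the mechanism by which a credit-score proxy (a stand-in for wealth) can outweigh the actual driving record in a premium.

```python
# Toy illustration (invented numbers): pricing partly on a credit-score tier
# instead of on driving record alone.
BASE_PREMIUM = 1000.0

CREDIT_TIER_MULTIPLIER = {"excellent": 0.8, "average": 1.0, "poor": 1.5}
DRIVING_MULTIPLIER = {"clean": 1.0, "at_fault_accident": 1.3}

def premium(credit_tier, driving_record):
    return BASE_PREMIUM * CREDIT_TIER_MULTIPLIER[credit_tier] * DRIVING_MULTIPLIER[driving_record]

# Low-income driver: clean record, but a poor credit score (a proxy for wealth).
print("Clean record, poor credit:", premium("poor", "clean"))                              # 1500.0
# Wealthier driver: at-fault accident, but excellent credit.
print("At-fault accident, excellent credit:", premium("excellent", "at_fault_accident"))   # 1040.0
```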
# Chapter 10: The Targeted Citizen: Civic Life
- Facebook's algorithm shapes behaviour.
- It influences what users see.
- Moods are contagious even online.
- Search results also affect opinions. Potential for misuse.
- To read: "The Selling of the President" by American author Joe McGinniss, published in 1969, about Richard Nixon's 1968 presidential campaign and how his advertising team remade his public image to appeal to voters.
- Voters are profiled like consumers; everyone gets a customised message.
- Smart targeting. Microtargeting.
# Conclusion
- WMDs create a different reality for different people.
- Auditing the models
- Regulation
- Research
- Transparency