# Think Again

## Metadata

* Author: Adam Grant
* ASIN: B08H177WQP
* ISBN: 0753553899
* Reference: https://www.amazon.com/dp/B08H177WQP
* [Kindle link](kindle://book?action=open&asin=B08H177WQP)

## Highlights

At the turn of the last century, the great hope for the internet was that it would expose us to different views. But as the web welcomed a few billion fresh voices and vantage points into the conversation, it also became a weapon of misinformation and disinformation. By the 2016 elections, as the problem of political polarization became more extreme and more visible, the solution seemed obvious to me. We needed to burst filter bubbles in our news feeds and shatter echo chambers in our networks. If we could just show people the other side of an issue, they would open their minds and become more informed. Peter’s research challenges that assumption. — location: [2295](kindle://book?action=open&asin=B08H177WQP&location=2295) ^ref-65252

---

Psychologists have a name for this: binary bias. It’s a basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories. To paraphrase the humorist Robert Benchley, there are two kinds of people: those who divide the world into two kinds of people, and those who don’t. — location: [2305](kindle://book?action=open&asin=B08H177WQP&location=2305) ^ref-21059

---

From time to time I’ve run into idea cults—groups that stir up a batch of oversimplified intellectual Kool-Aid and recruit followers to serve it widely. They preach the merits of their pet concept and prosecute anyone who calls for nuance or complexity. — location: [2473](kindle://book?action=open&asin=B08H177WQP&location=2473) ^ref-6793

---

If you find yourself saying ____ is always good or ____ is never bad, you may be a member of an idea cult. — location: [2481](kindle://book?action=open&asin=B08H177WQP&location=2481) ^ref-22357

---

In the moral philosophy of John Rawls, the veil of ignorance asks us to judge the justice of a society by whether we’d join it without knowing our place in it. I think the scientist’s veil of ignorance is to ask whether we’d accept the results of a study based on the methods involved, without knowing what the conclusion will be. — location: [2486](kindle://book?action=open&asin=B08H177WQP&location=2486) ^ref-64499

---

Evidence shows that if false scientific beliefs aren’t addressed in elementary school, they become harder to change later. “Learning counterintuitive scientific ideas [is] akin to becoming a fluent speaker of a second language,” psychologist Deborah Kelemen writes. It’s “a task that becomes increasingly difficult the longer it is delayed, and one that is almost never achieved with only piecemeal instruction and infrequent practice.” That’s what kids really need: frequent practice at unlearning, especially when it comes to the mechanisms of how cause and effect work. — location: [2617](kindle://book?action=open&asin=B08H177WQP&location=2617) ^ref-9211

---

This is part of a broader movement to teach kids to think like fact-checkers: the guidelines include (1) “interrogate information instead of simply consuming it,” (2) “reject rank and popularity as a proxy for reliability,” and (3) “understand that the sender of information is often not its source.” — location: [2631](kindle://book?action=open&asin=B08H177WQP&location=2631) ^ref-14304

---

Tradition (n.) Peer pressure from dead people. — location: [2871](kindle://book?action=open&asin=B08H177WQP&location=2871) ^ref-2981

---

It appeared that psychological safety could breed complacency. When trust runs deep in a team, people might not feel the need to question their colleagues or double-check their own work. But Edmondson soon recognized a major limitation of the data: the errors were all self-reported. To get an unbiased measure of mistakes, she sent a covert observer into the units. When she analyzed those data, the results flipped: psychologically safe teams reported more errors, but they actually made fewer errors. By freely admitting their mistakes, they were then able to learn what had caused them and eliminate them moving forward. In psychologically unsafe teams, people hid their mishaps to avoid penalties, which made it difficult for anyone to diagnose the root causes and prevent future problems. They kept repeating the same mistakes. — location: [2879](kindle://book?action=open&asin=B08H177WQP&location=2879) ^ref-38720

---

Focusing on results might be good for short-term performance, but it can be an obstacle to long-term learning. Sure enough, social scientists find that when people are held accountable only for whether the outcome was a success or failure, they are more likely to continue with ill-fated courses of action. Exclusively praising and rewarding results is dangerous because it breeds overconfidence in poor strategies, incentivizing people to keep doing things the way they’ve always done them. It isn’t until a high-stakes decision goes horribly wrong that people pause to reexamine their practices. — location: [3015](kindle://book?action=open&asin=B08H177WQP&location=3015) ^ref-34206

---

For years, NASA had failed to create that separation. Ellen Ochoa recalls that traditionally “the same managers who were responsible for cost and schedule were the ones who also had the authority to waive technical requirements. It’s easy to talk yourself into something on a launch day.” — location: [3056](kindle://book?action=open&asin=B08H177WQP&location=3056) ^ref-64668

---

My favorite test of meaningful work is to ask: if this job didn’t exist, how much worse off would people be? It’s near midlife that this question often begins to loom large. At around this time, in both work and life, we feel we have more to give (and less to lose), and we’re especially keen to share our knowledge and skills with the next generation. — location: [3304](kindle://book?action=open&asin=B08H177WQP&location=3304) ^ref-1018

---

Yes, overthinking is a problem, but underthinking is a bigger problem. I feel for people who get stuck in analysis paralysis. I worry about people who don’t do the analysis in the first place. I’d rather see you embrace the discomfort of doubt than live with the regret of foolish conviction. For me, the difference between reflection and rumination is whether you’re still learning. If you’re pondering a familiar problem without gaining fresh insights, it’s time to seek new information or reach out to your challenge network. — location: [3409](kindle://book?action=open&asin=B08H177WQP&location=3409) ^ref-9793

---