Daniel Kahneman, the Princeton University psychology professor whose work laid the foundations for behavioral economics, passed away on Wednesday at age 90. His work — much of it a collaboration with his friend and fellow psychologist Amos Tversky — challenged traditional economic theory, which assumes that people make rational decisions in their own self-interest. Instead, he argued, people rely on mental shortcuts, are swayed by emotions, and otherwise make choices that frequently fail to yield the best economic outcomes.

Kahneman was awarded the 2002 Nobel Memorial Prize in Economic Sciences “for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty,” according to the Nobel citation. Tversky, who died in 1996, would likely have been a joint recipient of the prize had he still been alive.

His 2011 book Thinking, Fast and Slow popularized the duo’s research. The book is premised on the idea that people are guided by two modes of thought: System 1, an automatic process driven by intuition and emotional reactions, and System 2, a slower and more deliberate process in which the mind operates analytically and corrects the errors System 1 makes. Kahneman suggests that System 1 is the default setting most of the time, with the mind relying on rules of thumb, cognitive biases, and other shortcuts to speed up judgment.

Case in point #1: The framing effect. Experiments conducted by the two psychologists explored the framing effect, a cognitive bias whereby the way an option is worded or presented — positively or negatively — nudges people toward one choice over another, even when the outcomes are identical. Their research demonstrated, for example, that people would be more likely to undertake a 20-minute trip to save USD 5 on the price of a USD 15 calculator than to make the same trip to save the same USD 5 on a USD 125 calculator.
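The arithmetic behind that result is easy to sketch: the absolute saving is identical in both cases, but the relative saving is very different, and that difference in framing is what appears to drive the choice. The snippet below is a hypothetical illustration, not code from the original study:

```python
def relative_saving(discount, price):
    """Return a discount as a fraction of the sticker price."""
    return discount / price

# Same USD 5 discount, two different frames
cheap = relative_saving(5, 15)    # USD 5 off a USD 15 calculator
pricey = relative_saving(5, 125)  # USD 5 off a USD 125 calculator

print(f"{cheap:.0%} vs {pricey:.0%}")  # prints "33% vs 4%"
```

A strictly rational actor would weigh the same USD 5 against the same 20-minute trip in both cases; the percentages should be irrelevant, yet they dominate the decision.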

Case in point #2: The conjunction fallacy. Also known as the Linda problem, this reasoning error leads us to believe that a combination of two or more attributes is more probable than either attribute alone — an assumption that violates the laws of probability. The most widely cited example features a fictitious Linda, a bright 31-year-old philosophy major who is “deeply concerned with issues of discrimination and social justice.” When asked which scenario more likely describes Linda, the majority of participants chose “bank teller . . . active in the feminist movement” over simply “bank teller” — even though the latter encompasses the former.
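The probability law at stake is the conjunction rule: the chance of two things both being true can never exceed the chance of either one alone. A toy simulation makes this concrete (the population and the probabilities here are invented purely for illustration):

```python
import random

# Build a made-up population where each person independently has
# a 50% chance of being a bank teller and of being a feminist.
random.seed(0)
population = [
    {"bank_teller": random.random() < 0.5,
     "feminist": random.random() < 0.5}
    for _ in range(10_000)
]

n = len(population)
p_teller = sum(p["bank_teller"] for p in population) / n
p_both = sum(p["bank_teller"] and p["feminist"] for p in population) / n

# The conjunction rule: "teller AND feminist" can never be more
# probable than "teller" alone, whatever the underlying numbers.
assert p_both <= p_teller
```

However the figures are tweaked, every Linda who is a feminist bank teller is also, by definition, a bank teller — which is exactly why the majority answer in the experiment cannot be right.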