Improve decision-making with nudges and free lunches
In this series, NUS News profiles the University’s Presidential Young Professors who are at the forefront of their research fields, turning creative ideas into important innovations that make the world better.
We all like to believe that we make good decisions, and for a long time, most economists believed that we did. In textbook economic models, it was assumed that people were super-intelligent rational actors who made perfectly optimal choices.
However, in 2008, Assistant Professor David Peter Daniels read a book that would shape his future career: Nudge, written by Richard H. Thaler, the 2017 Nobel laureate in Economics, and Harvard Law School professor Cass R. Sunstein. The book questioned whether people are all that good at making choices, pointing out that human decisions are often distorted by systematic biases. Moreover, the context in which a choice is presented to us has a big impact on the decisions we end up making. This means that policymakers, by tweaking the context surrounding a choice, have a powerful ability to use “nudges” to steer people towards better decisions.
Still a student at Harvard University when he read Nudge, a young David Daniels knew he wanted to research how biases influence human behaviour. His body of work now covers a wide range of the irrational and imperfect things that people do in decisions, negotiations, and organisations. Hearing the cheerful Californian talk about his research is like watching a magician perform a trick. He outlines the assumptions people hold about certain behaviours – what they anticipate others will do – but, like a magician pulling a rabbit from a top hat, Asst Prof Daniels, a Presidential Young Professor at NUS Business, reveals how much of his research actually contradicts what we would expect.
Making choices in surprising ways
Asst Prof Daniels focuses on three broad research areas related to choices: influence and negotiation (how good are people at influencing other people?), diversity assessments and policies (what do investors think about gender diversity in companies?), and prosocial behaviours (do major disasters cause people to volunteer less?).
As an example of how his research is often surprising, he talks about prosocial behaviour. “We know that small harms often attract help. If Person A trips and falls while Person B doesn’t, we have a stronger impulse to do something nice for Person A,” he said. “And for a long time, it was assumed that if people feel the impulse to help after small harms, they would feel even more of an impulse to help after big harms.”
“But we find the opposite: after major disasters like a huge hurricane, prosocial behaviour actually goes down,” Asst Prof Daniels said.
His work is full of unexpected twists and turns, as well as an attempt to understand why they happen using an interdisciplinary perspective that draws on psychology, economics, and management research.
His research on diversity has equally fascinating results. Businesses often wish to increase workplace diversity, since research suggests that diversity is correlated with improved company performance. But how do people actually evaluate diversity, and do they evaluate it well?
In fact, our evaluations of diversity are impacted by a lot of unexpected things. If Team A has more racial diversity than Team B, people rate Team A as also having more gender diversity than Team B – even if both teams have identical numbers of men and women! In other words, information about one kind of diversity “spills over” to distort our evaluations of other kinds of diversity.
By exploring how biases influence our thinking, Asst Prof Daniels hopes to understand how we can become better decision makers.
“A common saying in economics is that there is no such thing as a free lunch,” Asst Prof Daniels commented. “But when you bring psychology and behavioural science into the picture, you often do find free lunches. For example, when people are not currently making the best decision possible, there exist solutions that can make some people better off without making anyone worse off.”
This brings Asst Prof Daniels to his final area of research: influence and negotiation. He has been studying the behaviour of “choice architects” – influencers who present choices to others. Think: a Wall Street broker presenting their client with a list of various stocks they could buy; a real-estate agent bringing a young couple to see different flats; or a mid-level manager who discusses multiple project ideas with their team. All these influencers must decide how to present options to others – hopefully in a way that helps nudge the people around them in beneficial directions.
“For a while now, we've known that it really matters whether you describe choices to other people using a positive frame or a negative frame. When you describe choices by highlighting what people stand to gain from doing something – when you use a positive frame – then people act very conservatively and cautiously,” Asst Prof Daniels said. “However, if you talk about what people stand to lose from not doing something – when you use a negative frame – then people start flipping coins and taking risks.”
This is exactly the kind of bias that a choice architect or influencer should be using to influence others. If you want to influence others to act risk-averse, you should use a positive frame. If you want to influence others to act risk-seeking, you should use a negative frame. Sounds easy enough. But in a study of business managers, Asst Prof Daniels found that managers’ actual behaviour didn’t always follow this pattern. When managers wanted others to act risk-averse, they did use a positive frame. So far, so good. But when managers wanted others to act risk-seeking, they still used a positive frame. This was suboptimal; in fact, it meant that managers’ influence strategies were actually backfiring on them!
“There’s a free lunch right there,” he points out. Sometimes in organisations, the smart play is for people to take calculated risks. In such cases, leaders need to resist their instinct to use a positive frame when presenting choices to followers and instead use a negative frame. If they did – if they used a better nudge – then everyone would be made better off.
Using big data to learn more about ourselves
Asst Prof Daniels has now been at NUS for about two years, after holding a post at the Hong Kong University of Science and Technology. His work blends classic experimental approaches with newer methods that use big data and machine learning. “Classic experiments, where you invite people to participate in a research study, are great because they let you randomise conditions and establish causality,” he said. “But ‘new school’ approaches allow you to analyse huge datasets involving naturally occurring phenomena that you just can’t recreate in a lab. This opens the door to answering many exciting research questions, like how prosocial behaviour changes after an earthquake.”
He believes machine learning and big data will become bigger and bigger parts of research as computers continue to improve: “Big data and machine learning are here to stay – they can really help us learn about ourselves through the unprecedented availability of so much information, and the ability to analyse it using cutting-edge machine learning tools.”
The NUS Presidential Young Professorship (PYP) scheme supports talented young academics with excellent research track records in advancing their cutting-edge research. More information about the PYP scheme is available here.