You may have heard the phrase, “people have nonlinear utility in money.” Today we’re going to learn what that means.

Understanding your utility valuation of money requires understanding the concept of a “certain equivalent” which we covered in a separate video. Please revisit that video if you need a refresher.

Let’s assume that you’re thinking about a relatively large decision, such as a purchase on the order of $1000.

The way to determine your utility in money is to ask yourself a series of modified certain-equivalent questions like the following, with a fixed probability but a variable reward. In other words, fill in the blank in this prompt, for a few different values of X:

What amount of money would you take in exchange for the opportunity to flip a fair coin, with $1000 on a win and $0 on a loss?

If your utility were linear in money, then you would say $500, your expected value (because 50% x $1000 + 50% x $0 = $500), but that is unlikely to be true for all possible values of the winning purse. Think about it this way: unless you are very rich, you will probably prefer a *certain* $800,000 in your hand over a 50% *chance* to win $2,000,000, with an equal chance of winning nothing. You’d happily leave that potential extra $1,200,000 on the table, because the certain amount is already likely enough to change your life.

Thus, rational actors tend to have diminishing marginal utility in money, resulting in utility curves that look something like this, where the red dots indicate the person’s utility relative to money, and the dotted line indicates the imaginary line of complete linearity:
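As a quick aside, the $2,000,000 example can be checked numerically. Purely for illustration, assume a square-root utility -- a common concave stand-in, not something this lesson commits to. Under that assumption, the certain equivalent of the coin flip comes out far below the expected value:

```python
import math

def expected_value(p, win, lose=0.0):
    """Expected monetary value of a binary gamble."""
    return p * win + (1 - p) * lose

def certain_equivalent_sqrt(p, win, lose=0.0):
    """Certain equivalent under an illustrative sqrt utility: the cash
    amount whose utility equals the gamble's expected utility."""
    expected_utility = p * math.sqrt(win) + (1 - p) * math.sqrt(lose)
    return expected_utility ** 2  # squaring inverts the sqrt utility

ev = expected_value(0.5, 2_000_000)           # -> 1,000,000
ce = certain_equivalent_sqrt(0.5, 2_000_000)  # -> roughly 500,000
```

Under this assumed curve, the 50-50 shot at $2,000,000 is worth only about $500,000 for certain -- so a guaranteed $800,000 beats the gamble comfortably.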

For this sample person, there is a crossover point at $1000. Below $1000, they are actually willing to take relatively aggressive bets: they’d be eager to take a 50-50 shot at winning $5 rather than accept a certain $2.50. Above $1000, however, they would gladly take less than half of the expected value in hand rather than take any risk -- for example, they might gladly accept a mere $600 in hand rather than take a 50% chance at winning $1500.
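This crossover behavior can be sketched in code. The utility function below is entirely made up -- convex below $1000 and concave above it, with invented coefficients -- and the certain equivalent is found by numerically inverting the utility with bisection:

```python
import math

def certain_equivalent(utility, p, win, lose=0.0, tol=1e-9):
    """Find the cash amount whose utility equals the gamble's expected
    utility, by bisection (assumes utility is increasing)."""
    target = p * utility(win) + (1 - p) * utility(lose)
    lo, hi = min(lose, win), max(lose, win)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if utility(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def sample_utility(x):
    """Invented utility curve: convex (risk-seeking) below $1000,
    concave (risk-averse) above it."""
    if x <= 1000:
        return x * x / 1000                  # convex branch
    return 1000 + 200 * math.log(x / 1000)   # concave branch

small = certain_equivalent(sample_utility, 0.5, 5)     # above the $2.50 EV
large = certain_equivalent(sample_utility, 0.5, 1500)  # below the $750 EV
```

With this sample curve, the 50-50 shot at $5 is worth more than $2.50 for certain (risk-seeking), while the 50-50 shot at $1500 is worth less than $750 (risk-averse) -- the same qualitative flip described above, even though the exact dollar figures differ from the narration.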

Everybody will have a different curve expressing their certain equivalent with respect to money, and your curve will change over the course of your life as your income, net worth, and overall risk aversion shift.

What’s the purpose of going through this exercise? Because the shape of your certain-equivalent curve with respect to these fixed-probability bets is also the shape of your utility curve with respect to money!

(Note that utility has no intrinsic scale or units; utility is always a relative quantity, and we have not yet established a reference frame.)

In an upcoming exercise we will show that you can solve previously fraught issues -- such as finding the “price of safety” -- by converting both money and safety into utility. This allows us to use the tool of Expected Value, which we learned previously, to evaluate options in terms of utility, which is fundamental, rather than money, which you do not value consistently for different dollar amounts.

For this week’s optional exercise, fill out this chart with your own certain equivalents, and make a plot to get a sense of your own utility-in-money curve. Remember, x is the value that you have a 50% chance of winning, so if your utility were linear with respect to money, then your certain equivalent would be half of x in each case.
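If you’d like to sanity-check your chart before plotting it, a small script like the following compares each entry against the risk-neutral benchmark of x/2. The sample entries here are placeholders, not suggested answers:

```python
# Placeholder entries: {amount you could win: your certain equivalent}.
# Replace these with your own answers from the chart.
my_certain_equivalents = {5: 3.50, 100: 60, 1000: 480, 10_000: 3500}

def risk_attitude(x, ce):
    """Compare a certain equivalent against the risk-neutral
    benchmark of x/2 for a 50-50 bet on winning x."""
    if ce > x / 2:
        return "risk-seeking"
    if ce < x / 2:
        return "risk-averse"
    return "risk-neutral"

for x, ce in my_certain_equivalents.items():
    print(f"win ${x}: CE ${ce} vs EV ${x / 2} -> {risk_attitude(x, ce)}")
```

A run of "risk-seeking" labels at small stakes turning into "risk-averse" at large stakes would indicate a crossover point like the one in the sample curve.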

You do not need to share your results with the class, because everyone’s curve will likely reflect their own private financial situation.