
Can machine-based advice help you make better financial decisions?

Key Takeaways

  • A randomized field trial found that an algorithm-based decision support tool using personalized information helped Medicare beneficiaries make better choices when selecting Medicare prescription drug plans.
  • The tool has the potential to save Medicare beneficiaries as much as $680 million annually if rolled out broadly.
  • Plan switching, cost reduction, and consumer satisfaction were significantly greater when the tool offered individually tailored expert recommendations about which plans were best.
  • Comparison shoppers already inclined to switch plans were more likely to use the tool, while beneficiaries who would have benefited most from it were least likely to respond to invitations to participate in the trial.

As financial products become more sophisticated and consumer choices multiply, people at all economic levels face bewildering decisions about what options are best for them. In what kind of funds should I invest my retirement money? Should I choose a fixed or adjustable mortgage? Which credit card makes the most sense for me? Among the most complicated choices consumers must make is selecting a health insurance plan. The rules governing coverage are varied and complex, and costs and benefits are difficult to calculate. People often make poor decisions and wind up choosing insurance that costs more and doesn't provide the best service.

Research has found that efforts to help consumers make financial decisions through education or provision of information have limited impact. But could algorithm-based online tools that assist or even replace human decision making do a better job? Algorithm-based advice is already in wide use in areas such as retirement planning and is increasingly available in the health insurance market. For example, the Centers for Medicare and Medicaid Services gives beneficiaries the option of using online software to guide them in plan choice.

But surprisingly little research has been done on how algorithm-based assistance affects consumer choices.

To help fill that gap, we and several colleagues[1] investigated whether algorithmic decision software improves Medicare beneficiary selection of Part D prescription drug plans (i, ii, iii, iv).

We found that a custom-designed decision support tool had several positive effects, including increased plan switching, more time spent choosing a plan, cost reduction, and greater satisfaction with the decision-making process.

This policy brief explores the benefits and some of the challenges that tools like these offer.

Algorithmic decision-making tool for Medicare Part D

Enrolling in Medicare Part D requires beneficiaries to choose among private plans offered where they live. Part D is available either as a stand-alone prescription drug plan or as a feature bundled into a Medicare Advantage medical and drug plan, and recipients pay a premium for the benefit. Our study looked at choices of stand-alone plans, examining whether a carefully designed algorithmic online tool can improve choices, help beneficiaries save money, and boost consumer satisfaction. We also wanted to identify the characteristics of people most or least likely to take advantage of such a resource.

We carried out a field trial during the 2017 Medicare Part D open enrollment period in November and December 2016 in cooperation with the Palo Alto Medical Foundation (PAMF), a large Northern California physician group.

Our decision tool, called CHOICE, featured a user-friendly design, automatically imported beneficiary prescription drug information from the PAMF database, projected total cost, and assigned a quality rating to each plan based on consumer evaluations. We also contracted with a third party, which constructed a personalized "expert score" for each insurance plan based on projected cost and quality ratings.
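
To make the scoring idea concrete, here is a minimal sketch of how a personalized expert score could combine a plan's projected cost with its quality rating. The weighting, normalization, and names (Plan, expert_score, cost_weight) are illustrative assumptions, not the scoring rule the third party actually used.

```python
# Hypothetical sketch of a personalized "expert score" (not the scoring rule
# used in the CHOICE study). Weights and normalization are assumptions.

from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    projected_annual_cost: float  # premiums plus expected out-of-pocket drug costs, USD
    quality_rating: float         # consumer evaluation on a 1-5 scale

def expert_score(plan: Plan, all_plans: list["Plan"], cost_weight: float = 0.7) -> float:
    """Combine normalized cost and quality into one score in [0, 1]; higher is better."""
    costs = [p.projected_annual_cost for p in all_plans]
    lo, hi = min(costs), max(costs)
    # Cheaper plans score closer to 1 after min-max normalization across plans.
    cost_part = 1.0 if hi == lo else (hi - plan.projected_annual_cost) / (hi - lo)
    quality_part = (plan.quality_rating - 1) / 4  # map 1-5 stars onto 0-1
    return cost_weight * cost_part + (1 - cost_weight) * quality_part

plans = [Plan("A", 1450.0, 3.5), Plan("B", 1120.0, 4.0), Plan("C", 1680.0, 4.5)]
# Rank plans by score; the top-ranked plans would be the ones flagged as recommended.
ranked = sorted(plans, key=lambda p: expert_score(p, plans), reverse=True)
```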

A group of 29,451 PAMF patients between the ages of 66 and 85 who were eligible for stand-alone Medicare drug coverage was invited to take part; 928 ultimately completed the study. Most of those participants live in the heart of Silicon Valley, one of the most affluent, educated, and technologically proficient areas of the country. Racial and ethnic minorities, women, and people who live in areas with lower incomes and education levels were less likely to respond to our invitation. In these respects, participants were not representative of the Medicare beneficiary population, but they were probably the kinds of Medicare enrollees most likely to use algorithmic decision support.

Enrollees were randomly divided into three groups:

  • An information-only group that received personalized cost estimates for premiums and drugs, plus descriptions of plan features, from the tool.
  • An information and expert group that, in addition to the above information, was provided an expert score for each available plan based on estimated spending and quality ratings. The plans with the highest expert scores were highlighted as "Plans recommended for you."
  • A control group that was not offered the plan selection tool but instead received reminders about the timing of open enrollment and directions for finding publicly available resources.
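
For illustration only, the sketch below shows one simple way invitees could be randomized into three arms like these. Equal allocation, the arm labels, and the fixed seed are assumptions, not a description of the trial's actual randomization protocol.

```python
# Minimal sketch of assigning invitees to three study arms at random.
# Equal allocation and the fixed seed are illustrative assumptions.

import random

ARMS = ["control", "information_only", "information_plus_expert"]

def assign_arms(beneficiary_ids: list[str], seed: int = 2016) -> dict[str, str]:
    """Shuffle the invitees, then deal them into arms in round-robin order."""
    rng = random.Random(seed)
    ids = list(beneficiary_ids)
    rng.shuffle(ids)
    return {bid: ARMS[i % len(ARMS)] for i, bid in enumerate(ids)}

assignments = assign_arms([f"patient_{n}" for n in range(29_451)])
```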

We looked at four primary outcomes to assess the effects of using CHOICE: whether study participants switched Part D plans for 2017; changes in beneficiaries' expected monthly costs; how satisfied people were with the selection process; and how much conflict they experienced in making a decision, which reflected factors such as their confidence that they made the right choice and their understanding of risks and benefits. We also considered the amount of time people reported spending on making a choice and whether they enrolled in one of the three expert-recommended plans.

Our main finding was that use of our algorithmic decision support tool helped improve the plan choice process in several ways.

People who used the tool were more likely to actively shop and switch plans compared with members of the control group. They also took more time making a decision, saved more money, and reported less conflict and more satisfaction with the decision-making process.

These effects were significantly more pronounced among study participants in the information plus expert recommendation group. People in that group switched plans at a 38 percent rate compared with a 28 percent rate for members of the control group. By contrast, switching in the information-only group was not significantly greater than in the control group.

We estimate that people in the information plus expert recommendation group saved an average of about $71 per month in premiums and out-of-pocket costs, while those in the information-only group saved about $18.

If we applied those larger savings numbers to the nearly 25 million people enrolled in Medicare Part D, and assumed a rate of participation equivalent to that in our experiment, total annual savings would be on the order of $680 million. This is particularly notable given that the tool itself cost less than $1.8 million to develop.
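
As a rough check on this back-of-the-envelope figure, the sketch below works through one plausible version of the extrapolation, annualizing the $71 monthly saving and applying the trial's own participation rate to roughly 25 million enrollees. The exact inputs the study used may differ slightly.

```python
# Back-of-the-envelope reproduction of the extrapolation. Annualizing over 12
# months and reusing the trial's participation rate are assumptions made for
# illustration; the study's exact inputs may differ.

invited = 29_451              # PAMF patients invited to the trial
completed = 928               # participants who completed the study
monthly_saving = 71           # average monthly saving, information + expert arm (USD)
part_d_enrollment = 25_000_000

participation_rate = completed / invited     # roughly 3.2 percent
annual_saving = monthly_saving * 12          # roughly $852 per participant
total = part_d_enrollment * participation_rate * annual_saving
print(f"{total / 1e6:.0f} million USD per year")  # about 671 million with these inputs
```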

It's important to stress that a relatively small subset of those we approached ultimately chose to take part in the trial, and they differed in important ways from those who did not participate. Many of them had more experience with information technology, as measured by their use of their physician's electronic medical record.

We were also able to use machine learning methods to predict that those who chose not to participate in the trial would likely have been even more responsive, on average, to the algorithmic recommendations.
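
This brief does not describe that method in detail, but one common approach is to model responsiveness as a function of baseline characteristics observed for both participants and non-participants, and then score the non-participants with the fitted model. The scikit-learn sketch below is purely illustrative of that idea; the covariates, model choice, and synthetic data are assumptions, not the study's analysis.

```python
# Illustrative sketch only (not the study's actual analysis): fit a model of
# responsiveness on participants' baseline covariates, then score non-participants.
# All data below are synthetic placeholders.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical baseline covariates (e.g., age, prior patient-portal use,
# neighborhood income and education indices) for the 928 participants.
X_participants = rng.normal(size=(928, 5))
# Hypothetical measure of each participant's response to the expert recommendation,
# e.g., estimated monthly savings attributable to the treatment.
y_response = rng.normal(loc=70, scale=30, size=928)

model = GradientBoostingRegressor(random_state=0).fit(X_participants, y_response)

# Score the invitees who did not participate (29,451 - 928 = 28,523) using the
# same covariates, and compare average predicted responsiveness across the groups.
X_nonparticipants = rng.normal(size=(28_523, 5))
print(model.predict(X_nonparticipants).mean())
```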

How policy can encourage better decision making

Our study has three important implications for policymakers.

First, our research indicates that a well-designed algorithmic decision support tool can help people make better financial choices. While our tool incorporated many features intended to simplify the user experience, we found that individually customized information was most effective when accompanied by an expert recommendation.

Second, when algorithmic recommendations are bundled with a web-based tool, they are unlikely to reach the types of consumers who might benefit the most. Our trial disproportionately attracted consumers who tended to be more affluent and better educated and who used online resources more than average.

However, we found that those who did not enroll in the trial were precisely the people likely to respond most to algorithmic advice.

We believe public policy initiatives involving more intensive intervention could potentially widen use of these resources. Many public benefit or insurance programs, including Social Security and Medicare, require beneficiaries to make complex choices. Currently, Medicare offers a decision support tool on its website, but it is voluntary, difficult for many older adults to use, and unlikely to reach many of the people who need it most.

A better approach may be to find ways of engaging people less inclined to use decision support software, perhaps by reaching out to them with personalized information and expert recommendations in ways that don鈥檛 require them to access that information online.

The results of our study also highlight areas of caution. We found that people not only learned more about product features but also changed the way they valued those products in response to an "expert recommendation." This underscores the importance of making sure consumers can evaluate the criteria underlying those recommendations, to protect themselves against fraud and manipulation.

Research reported in this presentation was funded through a Patient-Centered Outcomes Research Institute (PCORI) Award (CDR-1306-03598). The statements in this presentation are solely the responsibility of the authors and do not necessarily represent the views of the Patient-Centered Outcomes Research Institute (PCORI), its Board of Governors or Methodology Committee.


[1] Ming Tai-Seale, Cheryl Stults, Amy Meehan, Roman Klimke, Ting Pun, Albert Solomon Chan, Alison Baskin, Sayeh Fattahi

References

i  Stults, Cheryl D., Alison Baskin, Ming Tai-Seale, and M. Kate Bundorf, "Patient Experiences in Selecting a Medicare Part D Prescription Drug Plan," Journal of Patient Experience, 2018, 5 (2), 147–152.

ii  Stults, Cheryl D., Sayeh Fattahi, Amy Meehan, M. Kate Bundorf, Albert S. Chan, Ting Pun, and Ming Tai-Seale, "Comparative Usability Study of a Newly Created Patient-Centered Tool and Medicare.gov Plan Finder to Help Medicare Beneficiaries Choose Prescription Drug Plans," Journal of Patient Experience, 2018, 6 (1), 81–86.

iii  Bundorf, M. Kate, Maria Polyakova, Cheryl Stults, Amy Meehan, Roman Klimke, Ting Pun, Albert Solomon Chan, and Ming Tai-Seale, "Machine-Based Expert Recommendations and Insurance Choices Among Medicare Part D Enrollees," Health Affairs, 2019, 38 (3), 482–490.

iv  Bundorf, M. Kate, Maria Polyakova, and Ming Tai-Seale, "How Do Humans Interact with Algorithms? Experimental Evidence from Health Insurance," NBER Working Paper 25976, 2019.

Author(s)
Kate Bundorf
Maria Polyakova
Publication Date
July 2019