Oct 16, 2024
Can ChatGPT give good financial advice?
For better or for worse, generative AI has become a common tool in businesses and in schools — and the source of endless debate over whether it is indeed better or worse. AI chatbots like ChatGPT can write emails, restaurant reviews, computer code, and student papers at an eighth- or twelfth-grade level. They can produce summaries of articles and translate between different languages and formats. They can solve mathematical equations and compose music. They can answer research or trivia questions. So why wouldn’t they also be able to offer practical financial advice?
According to a recent survey, 54 percent of Americans have gone to ChatGPT with questions about various financial products and services, while 47 percent have consulted the app for investment advice; in the high-income bracket, the number of people who have consulted ChatGPT on money matters jumps above 80 percent.
Sterling Raskie, a senior lecturer in finance at Gies College of Business at the University of Illinois Urbana-Champaign and a certified financial planner, isn’t exactly surprised that ChatGPT users have been consulting the app instead of talking with someone like him.
“It could be, in all seriousness, to get some help,” he said. “Perhaps they don't know where to go. They don't know who to ask. Maybe they're — and I say this respectfully — embarrassed to ask anybody else.”
But is ChatGPT’s advice any good? That’s what Raskie and his colleagues Minh Tam (Tammy) Schlosky and Serkan Karadas, both assistant professors of finance in the College of Business and Management at the University of Illinois Springfield, set out to discover in a new study recently published in the Journal of Risk and Financial Management.
In the paper, “ChatGPT, Help! I Am in Financial Trouble,” Schlosky and Karadas came up with 21 scenarios where people might need financial advice. These included questions about investments, mortgages, debt consolidation, gambling, how to negotiate heavy medical expenses, and what to do with an unexpected cash windfall. They fed these scenarios into ChatGPT. For each of the cases, ChatGPT came up with a list of six or seven concrete steps for its imaginary clients to follow. Then the researchers evaluated the advice, Schlosky and Karadas from their perspective as academics, and Raskie from his as a wealth management professional.
At first glance, ChatGPT’s advice appeared reasonable and practical. It suggested that the people in precarious financial situations establish emergency funds and advised those with serious gambling debts to seek counseling. In the case of the 20-year-old college student who convinced his 90-year-old grandfather to start selling naked call options, ChatGPT said that the venture was “highly risky and not suitable for someone in [the grandfather’s] position,” even if he did make $5 million.
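For readers unfamiliar with the strategy, the danger ChatGPT flagged follows directly from the payoff math. A minimal sketch (using hypothetical numbers, not figures from the study) shows why an uncovered call writer's losses are theoretically unlimited:

```python
# Hypothetical illustration of a naked (uncovered) call writer's payoff.
# The seller collects a premium up front but must deliver shares at the
# strike price if the option is exercised -- without owning them.

def naked_call_pnl(stock_price: float, strike: float, premium: float) -> float:
    """Profit/loss per share for the writer of an uncovered call at expiry."""
    return premium - max(stock_price - strike, 0.0)

strike, premium = 100.0, 5.0  # hypothetical strike and premium received
for price in (90, 100, 110, 150, 300):
    print(f"stock at ${price}: P&L = ${naked_call_pnl(price, strike, premium):+,.2f}")

# Gains are capped at the $5 premium; losses grow without bound as the
# stock rises, which is why the position is considered highly risky.
```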
But as the researchers considered the responses more deeply, they found ChatGPT’s advice to be less helpful than it initially seemed.
“It was almost random how ChatGPT came out with this advice versus a lot of wealth managers who follow a specific process,” said Raskie, who also serves as director of Gies’ Finance Academy. “They'll have priorities for what the individual needs to do. And ChatGPT was just like, ‘Well, here. Do this.’ But there was no order of priority or urgency between any one of the recommendations, really.”
A good human financial advisor, Raskie said, understands that there’s not one uniform standard for people to follow. Instead, they’ll discuss the situation with the client and then work out a plan of attack based on the client’s individual needs.
ChatGPT also made several basic errors. When the researchers asked for advice about saving for college, the app failed to suggest establishing a 529 tax-advantaged savings account, and when they asked it to make calculations about retirement savings, it made mistakes in the math.
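The study doesn't reproduce the flawed arithmetic itself, but the kind of calculation involved is easy to check by hand or with a few lines of code. As an illustration (the figures below are hypothetical, not drawn from the paper), the standard future-value-of-an-annuity formula shows what a correct retirement projection looks like:

```python
# Hypothetical example: future value of steady monthly retirement
# contributions with monthly compounding. None of these figures come
# from the study; they only illustrate the kind of arithmetic the
# researchers found ChatGPT getting wrong.

def future_value(monthly_contribution: float, annual_rate: float, years: int) -> float:
    """Future value of an ordinary annuity: C * ((1 + r)^n - 1) / r."""
    r = annual_rate / 12   # periodic (monthly) rate
    n = years * 12         # number of monthly contributions
    return monthly_contribution * ((1 + r) ** n - 1) / r

# Saving $500/month for 30 years at an assumed 7% annual return:
print(f"${future_value(500, 0.07, 30):,.2f}")  # about $610,000 under these assumptions
```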
Other errors had the potential to be more serious. In the case of the grandfather-grandson option sellers, Raskie was especially concerned about the riskiness of the situation for both parties. Because the grandfather didn’t understand the investment very well, the grandson was running the account and could have been held legally liable for any losses. ChatGPT did not take any of this into consideration.
Most importantly, Raskie said, ChatGPT, as one might expect from a machine, lacks empathy for its “clients.”
“It didn't seem like ChatGPT was empathizing with the client, the human touch, if you will,” he said.
Even after Raskie and Schlosky repeated parts of the experiment with a more advanced version of ChatGPT, they found that empathy was still lacking. That matters when people are sharing something as personal as their finances.
For that reason, Raskie feels confident that ChatGPT will not be replacing human financial advisors anytime soon.
He acknowledges that the app can help people get their thoughts in order, prepare a succinct summary of their situation, and lay out a few options to discuss with a human advisor. It may even give them a few points to argue about, the way patients challenge their doctors with information they find on WebMD. But WebMD can’t prescribe medicine or perform surgery. In the financial sphere, he thinks ChatGPT will be much the same.
“If somebody wants a true professional legal fiduciary opinion, I still think they're going to use a human versus ChatGPT,” he said. “But, again, that's my personal opinion, and maybe it's self-serving out of self-preservation. I'm not a big worrier anyway. So, no, I don't feel threatened. And I think if you’re a true professional in the financial services industry or a wealth manager, you don’t have anything to worry about, either.”