
I have a full-time job and run a small business. Using ChatGPT to do my taxes would save me $6,000 in fees. But can I trust it?

ChatGPT and other generative AI chatbots are used for almost everything these days, from writing emails to planning vacations to preparing your taxes. That doesn’t mean it’s always a good idea.

Imagine this hypothetical scenario: Lorin, a 42-year-old market researcher who also runs his own small business, hates doing his taxes and doesn’t want to pay a tax professional. So he decides to ask ChatGPT for help with his tax returns.

He wouldn’t be the first to do so. A study by Invoice Home found that more than two in five Americans (43%) would trust AI to file their taxes rather than hire a tax professional [1]. While Gen Z was most open to the idea (49%), older generations were also willing to trust AI with their taxes (25% of Baby Boomers and 18% of the Silent Generation).

But should they? In Lorin’s case, using ChatGPT would save him a significant amount of money. Here’s what Lorin may want to consider before submitting his return.

Let’s face it, the vast majority of people don’t like doing their taxes. Some people downright fear it. So it’s no wonder that more and more people are turning to ChatGPT or other AI chatbots for help.

AI could be useful for some tax preparation tasks, such as generating a list of documents and forms you need to complete. If you have a side hustle or second job, AI could help you locate the forms needed to cover multiple income streams from multiple employers.

It could also help you identify common deductions or tax credits (from there, you can determine whether it makes more sense to itemize your deductions or take the updated standard deduction). And it may be able to point out inconsistencies or missing information for further investigation on your part.

Please keep in mind that the information you see may not be current or accurate. For example, recently announced credits may be missing. Relying entirely on AI without verifying the information yourself with a good source (or with a tax professional) could lead to inaccuracies or even misrepresentations.

“ChatGPT can explain what an ETF is, but it doesn’t know your debt-to-equity ratio, state tax bracket, filing status, deductions, retirement goals, or risk appetite. Because its training data may not match the current fiscal year and latest rate hikes, its guidance might well be out of date when you press Enter,” technology journalist Nelson Aguilar wrote in an article for CNET [2].

Additionally, it can provide answers that are “biased, outdated, or just plain incorrect, while still looking like a PhD.”

OpenAI, the company behind ChatGPT, describes AI hallucinations as “plausible but false statements generated by language models.”

This happens because the AI does not “understand” the information; rather, it generates answers based on patterns in its training data, and language models are generally designed to make guesses rather than admit uncertainty. Additionally, some of this training data may be biased, inaccurate or incorrect. But when AI provides an answer, it appears authoritative, even if that information is fabricated.

That’s where another phenomenon comes in: “sycophancy,” which is the “tendency of AI models to adjust their responses to align with users’ views” and “can cause ChatGPT and its ilk to prioritize flattery over accuracy,” according to Axios [3].


CPA Practice Advisor cites research showing that AI often suffers from a “simplexity” problem, meaning “it misinterprets complex concepts in an attempt to make them simple.” The researchers found that “this problem is present even in the IRS Interactive Tax Assistant, finding that it can characterize tax laws in a way that is too favorable to the taxpayer” [4].

In a study published in the Journal of Emerging Technologies in Accounting, common tax questions from the 2023 and 2024 tax seasons were entered into the ChatGPT 3.5 and 4 models [5]. The researchers found that ChatGPT’s overall percentage of correct answers ranged between 39 and 47 percent. It also “provides less precise answers to tax questions that are more common, have more complex answers, require an evaluation of taxpayer fact patterns, and relate to tax information determined after the ChatGPT knowledge cutoff.”

Another major risk concerns your privacy. An AI could use your personal data to train models; it could accidentally share sensitive information in outputs or expose it via data breaches. It is recommended that you never share financial, health, or other sensitive personally identifiable information with a chatbot.

“The chatbot simply cannot replace a CPA who can spot a hidden deduction worth a few hundred dollars or point out an error that could cost you thousands of dollars,” Aguilar wrote. “When real money, filing deadlines, and IRS penalties are at stake, call a professional, not an AI. Also be aware that anything you share with an AI chatbot will likely be part of its training data, and that includes your income, Social Security number, and bank routing information.”

If you have a simple return, an AI can help automate data entry and identify legitimate deductions. But it can also hallucinate data.

An AI bot isn’t necessarily good at understanding your personal tax situation and taking context into account, especially with complex returns. And it could misinterpret complex tax laws.

Let’s say, for example, that you sometimes work from home. The AI might tell you that you qualify for a home office deduction when you don’t. Or it could tell you that your side-business losses are fully deductible when they are not. If AI misclassifies deductions, you are the one responsible if the IRS flags it.

The prompts you use can also make a difference. For example, if Lorin nudges the AI toward reducing his taxes, “sycophancy” could come into play, with the AI finding deductions even if they don’t apply to his situation.

Ultimately, you are responsible for the accuracy of your tax return. And inaccuracies can be costly, especially if you end up underreporting your income or overstating your deductions.

If AI has provided you with fabricated information, then you could be unwittingly committing tax fraud – a serious offense that could result in financial penalties or criminal prosecution. Inaccurate or incomplete information could also increase your risk of being audited by the IRS.

This is not to say that you can never use AI in the tax preparation process. Even tax software companies are integrating AI chatbot assistance into their products (although their functionality tends to be limited).

But you should always check your numbers and verify the veracity of the information. Basically, don’t take the AI at its word.

We rely only on verified sources and credible third-party reports. For more details, see our editorial ethics and guidelines.

Invoice Home (1); CNET (2); Axios (3); CPA Practice Advisor (4); Journal of Emerging Technologies in Accounting (5).

This article provides information only and should not be considered advice. It is provided without warranty of any kind.
