Challenges of Using ChatGPT in the Accounting Profession

Ethical Issues and Risks to Consider

It seems that virtually everyone is talking about ChatGPT. Educators fear that students will use it to write term papers. Those in the workplace worry about the security and privacy of their data. AI ethics managers have raised concerns about privacy, manipulation, bias, the difficulty of understanding how the technology works, inequality, and labor displacement. Accountability and transparency are two ethical values that should guide use of the chatbot.

I have previously blogged about the ethical issues in using ChatGPT. In this post, I examine the use of ChatGPT in the accounting profession, a field in which attracting students has become quite a challenge. The profession has searched, so far to no avail, for ways to entice more students to study accounting and enter the field. Can ChatGPT be the answer to the supply shortage?

What Can it Do?

As an AI language model, ChatGPT depends on the data it is fed to make inferences and return accurate information. Trained on a wide range of internet data, ChatGPT can help users answer questions, write articles, program code, and engage in in-depth conversations on a substantial range of topics.

AI programs are designed to improve efficiency for performing basic tasks, including researching and writing. Some users are even asking ChatGPT to take on more complex forms of these tasks, including drafting emails to fire clients, crafting job descriptions, and writing company mission statements.

Writing for the Illinois CPA Society, Elizabeth Pittelkow, the VP of Finance for GigaOm, points out: “It is important to note that while ChatGPT can provide helpful suggestions, it is not as good at decision-making or personalizing scripts based on personality or organizational culture.” She adds: “An effective way to use ChatGPT and similar AI programs is to ensure a human or group of humans are reviewing the data, testing it, and implementing the results in a way that makes sense for the organization using it.” One example is job descriptions written by an AI program. It is essential to build in internal controls by having a human “ensure the details make sense with what the organization does and does not do.”[1]

Ethical Uses and Risks

Pittelkow decided to try ChatGPT and asked the bot if it could tell her more about the ethics of AI. In response, it did not hesitate to point out that the field of ethics in AI is concerned with the moral implications of the development and use of the technology, pointing to a range of ethical topics on bias and fairness, privacy, responsibility and accountability, job displacement, and algorithmic transparency.

She also asked how ChatGPT can be used ethically. The bot suggested that being respectful, avoiding spreading misinformation, protecting personal information, and using ChatGPT responsibly are all practices to follow. Avoiding bias in entering and analyzing data is of particular concern; the data entering the system is only useful if impartiality can be assured.

Generative AI systems like ChatGPT can give inaccurate or misleading results, both because of prompts that are too vague and because of poor data sources. The technology's limitations mean it can stumble even on relatively simple queries.

Other ethical risks include a lack of transparency, erosion of privacy, poor accountability, and workforce displacement and transitions. The existence of such risks affects whether AI systems should be trusted. To build trust through transparency, organizations should clearly explain what data they collect, how it is used, and how the results affect customers.

Data security and privacy are important issues to consider in deciding whether to use ChatGPT, especially in accounting. As an AI system, ChatGPT has access to vast amounts of data, including sensitive financial information. There is a risk that this data could be compromised. It is important that essential security measures are in place to protect this data from unauthorized access.

The need for oversight cannot be overstated. Oversight should involve accountants, regulators, policymakers, and the public. The accounting profession must be proactive in addressing these concerns and take steps to ensure that ChatGPT is applied responsibly and ethically in data analysis and decision making.

ChatGPT in Accounting

ChatGPT uses large amounts of data to generate human-like responses to text-based prompts. In accounting, it can be used to automate numerous tasks such as bookkeeping, financial statement analysis, and even fraud detection. It can also develop spreadsheets based on table data.
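To illustrate the pattern behind this kind of automation, the sketch below shows how tabular ledger data might be packaged into a plain-text prompt for a chatbot. The helper name and the sample ledger are hypothetical, and no real API call is made; tools that automate spreadsheet work typically assemble a prompt like this and then send it to the model's API for completion.

```python
import csv
import io


def build_spreadsheet_prompt(csv_text, task):
    """Package tabular data and an instruction into one plain-text prompt.

    Hypothetical helper for illustration only: this is the general pattern
    used when asking a chat model to analyze or extend a table, not any
    official API.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, body = rows[0], rows[1:]
    # Tab-separated lines keep the table readable inside a text prompt.
    table = "\n".join(["\t".join(header)] + ["\t".join(r) for r in body])
    return f"{task}\n\nTable data:\n{table}"


# Sample (invented) two-entry ledger.
ledger = "Account,Debit,Credit\nCash,5000,0\nRevenue,0,5000"
prompt = build_spreadsheet_prompt(
    ledger, "Summarize this trial balance and flag any imbalance."
)
print(prompt)
```

In practice, the returned string would be sent to the chat model, and a human reviewer would check the model's analysis before it reaches a financial report.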

Heather Satterley writes in the Woodard Report that if the data used to train ChatGPT is biased, it may lead to biased results when used in accounting tasks. For example, if the data used to train ChatGPT only includes financial data from certain industries or demographics, the resulting analysis may not be applicable to other industries or demographics. This could lead to inaccurate financial reporting, questionable financial analyses, and biased decision making.[2]

Satterley identifies additional ethical issues of using ChatGPT in accounting as follows.

Accountability

A key question is who is responsible for making decisions based on the data provided. Is it the accountant who programmed ChatGPT or the AI system itself? If there are errors in the decision-making process, who is responsible for correcting those errors? She points out that these questions must be addressed to ensure accountability and transparency in the use of ChatGPT in accounting.

Data privacy and security

A pressing issue is data privacy and security. Given its access to vast amounts of data including sensitive financial information, there is a risk that this data could be compromised, either through hacking or other means. It is important that proper security measures are in place to protect this data from unauthorized access.
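One practical precaution follows directly from this point: sensitive identifiers can be masked before any text leaves the firm's systems for an external AI service. The sketch below is a minimal, hypothetical illustration using simple pattern matching for SSN- and card-number-like strings; a real firm would rely on a vetted data-loss-prevention tool rather than hand-rolled patterns.

```python
import re

# Hypothetical redaction pass (illustration only): mask strings that look
# like U.S. SSNs or 16-digit card numbers before text is sent to an
# external service.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}


def redact(text):
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text


note = "Client SSN 123-45-6789 paid with card 4111 1111 1111 1111."
print(redact(note))
# -> Client SSN [SSN REDACTED] paid with card [CARD REDACTED].
```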

Workforce Displacement

The use of ChatGPT in accounting raises ethical concerns regarding job displacement. Firms should provide support and retraining whenever possible.

While workforce displacement is an important issue, ChatGPT could benefit the accounting profession, which has experienced a decline in accounting program enrollment for several years. By automating routine tasks, it could bring the demand for accounting graduates more in line with the available supply.

ChatGPT and the CPA Exam

Research on whether ChatGPT could pass the CPA Exam was disappointing, at least at first. An early version, 3.5, averaged a score of just 53.1 out of 100 and was unable to pass any section of the exam. The latest version, 4.0, averaged a score of 85.1 and passed all four sections. In its best section, auditing and attestation (AUD), the chatbot scored 87.5.

The latest version, with its advanced reasoning capabilities, does much better in handling assessments and decision making. This is not surprising, as newer versions are trained on more data with improved techniques, so scores are likely to keep improving over time.

Research Studies

As would be expected, very little research has been conducted on whether ChatGPT's responses to questions and case studies are better than, about the same as, or worse than student responses. One important study, to be published in Issues in Accounting Education, used data from 14 countries and 186 institutions to compare ChatGPT and student performance on 28,085 questions from accounting assessments and textbook test banks. As of January 2023, ChatGPT provided correct answers for 56.5 percent of questions and partially correct answers for an additional 9.4 percent. When point values are considered, students significantly outperform ChatGPT, averaging 76.7 percent on assessments compared to 47.5 percent for ChatGPT if no partial credit is given. ChatGPT performs better than the student average on 15.8 percent of assessments when partial credit is included.[3]

Conclusions

By using ChatGPT to automate routine tasks, accountants can spend more time on higher-level work such as analysis and strategy. This could lead to better decision making and more efficient use of resources. Additionally, ChatGPT can assist in fraud detection, potentially saving companies significant amounts of money. However, ChatGPT has a long way to go before it becomes a widely used tool in accounting. No doubt, new iterations of the bot will lead to better results.

The ethical dilemma of ChatGPT in accounting highlights the need for responsible and ethical use of AI. While there are potential benefits to the use of ChatGPT in accounting, it is important to address the ethical concerns raised by its use. This includes developing ethical guidelines and standards, involving all stakeholders in the conversation, and being proactive in addressing the potential impact of AI on the workforce. By doing so, we can ensure that the use of ChatGPT in accounting benefits society as a whole and does not harm the accounting profession or the wider economy. The ethical use of ChatGPT means that it must serve society's needs and meet the public interest obligation of the accounting profession.

References

[1] https://www.icpas.org/information/copy-desk/insight/article/spring-2023/how-can-cpas-ethically-interact-with-chatgpt.

[2] https://report.woodard.com/articles/adventures-with-chatgpt-fpwr.

[3] bit.ly/3qoQPPv.

Blog posted by Steven Mintz, PhD, on June 6, 2023. Find out more about his professional activities on his website (https://www.stevenmintzethics.com/).  

 
