If you’re a CFO or finance leader who frequents networking events or trade shows, you’ve likely attended a keynote by Glenn Hopper in recent years. Hopper, a former CFO who now serves as head of AI research and development at Eventus Advisory Group, an instructor of executive leadership at Duke University and host of the FP&A Today podcast, has become an AI whisperer of sorts for CFOs tasked with understanding the business and risk management upsides and downsides of implementing these tools.
After publishing his most recent book, AI Mastery for Finance Professionals, toward the end of last year, Hopper spoke with CFO.com about some of the ideas in the book and how they shape the outlook for AI in finance and its most coveted subset, generative AI.
Glenn Hopper

Head of AI research and development, Eventus Advisory Group
Notable previous positions:
- CFO, Sandline Global
- CFO, CICAYDA
- CFO, GR Ventures
- CFO, HCT Investments
This interview has been edited for brevity and clarity.
ADAM ZAKI: A point you make throughout the book is the importance of developing an AI implementation timeline. Given all the factors that can play into the approach and execution of supplementing or replacing a finance function with technology, how would you advise going about it?
GLENN HOPPER: There are two important paths for CFOs to identify here. There’s classical machine learning, which we’ve been using for a few decades now and which most CFOs already use in some capacity. This type of AI is deterministic, readily available and its results are replicable.
Generative AI, which is not deterministic, is much different. It’s harder to explain how it works, it can hallucinate, and it has issues that people are still working out how to address. Generative AI can do some cool parlor tricks right now, and if you use it in the right ways, it can be helpful to people in all sorts of different roles. But for CFOs, if you don’t have data maturity, if your data isn’t in order and you don’t have proper data governance and a data management system, then this is not a technology you can use yet.
Before generative AI, CFOs would have to be able to write Python or SQL queries to access these datasets. Now, there will be tools that act as layers on technology already in place. Think of your favorite dashboard, and instead of clicking through and searching for data, you ask the system questions about the data directly.
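To make that contrast concrete, here is a minimal, hypothetical sketch (not from Hopper’s book): the same revenue question answered first with a hand-written SQL query and then through a stubbed natural-language layer. In practice that layer would be an LLM-backed tool sitting on top of your existing dashboard or warehouse, not the hard-coded lookup shown here.

```python
import sqlite3

# A tiny in-memory "warehouse" with invented quarterly revenue figures.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (quarter TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO revenue VALUES (?, ?)",
    [("2024-Q1", 1.2e6), ("2024-Q2", 1.4e6), ("2024-Q3", 1.1e6)],
)

# The pre-generative-AI workflow: someone on the team writes the query by hand.
best = conn.execute(
    "SELECT quarter, amount FROM revenue ORDER BY amount DESC LIMIT 1"
).fetchone()
print("Hand-written SQL:", best)

# The natural-language layer: the user asks a plain-English question and a model
# translates it into the query above behind the scenes. Stubbed here with a
# fixed lookup; a real tool would call an LLM against your actual data.
def ask(question: str):
    canned = {
        "which quarter had the highest revenue?":
            "SELECT quarter, amount FROM revenue ORDER BY amount DESC LIMIT 1",
    }
    return conn.execute(canned[question.strip().lower()]).fetchone()

print("Natural-language layer:", ask("Which quarter had the highest revenue?"))
```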
This is why I wrote the book. If you’re going to rely on AI for business decisions, you need more than just a superficial understanding. It’s not a magic box where you flip a switch, ask a question, and get an oracle-like answer. You don’t have to be a developer or an engineer, but it’s important to have a basic understanding of how it works. That knowledge helps you ask the right questions and use it effectively.
The timeline is about incremental wins. One of my favorite quotes comes from Clifford Stoll, and anyone who has heard me speak has probably heard me say it: “Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom.” That’s the pyramid these things climb. It’s about understanding the technology, not shooting for the moon and trying to automate everything, and picking the projects that can get buy-in and make an impact.
That’s why I dive deep into these topics in the book. As a finance leader, it’s about choosing the right projects where you can demonstrate impact, gaining buy-in, and helping your team understand when to trust AI, when human input is essential and when tasks can be fully automated.
You talk about AI bias in the book. You write, “Users have to proactively test AI models for fairness.” How can this be done?
HOPPER: Most CFOs are probably relying on off-the-shelf models from the big companies. It’s well known that those models are only as good as the data they’re trained on. Take the controversy over Gemini generating questionable images: that was injected bias. It started from a good place but developed into an interesting phenomenon.
Those who built the model that produced those images were trying to avoid a situation where, if you asked the AI to create an image of, let’s say, a doctor, it would consistently show you a person of the same race and gender. So they injected bias to drive diversity in image output, with good intentions, and it backfired.
It’s also important for CFOs to manage how their people perceive this, so employees don’t interpret AI implementation of any type as purely an efficiency play. The value of these tools is not in reducing the workforce and being able to fire people.
For those using their own data to create AI models, it’s about making sure your data is in order and bias-free. If you give AI a limited dataset of company performance, and that dataset includes a really bad quarter, the model will overfit to that quarter and make errors based on it. Data is the foundational piece to all of this. If your data is weak and you try to train an AI model on it, you’re doomed from the start.
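As a toy illustration of that point, with invented numbers: even a naive average-based forecast gets pulled well off course when a short history contains a single bad quarter.

```python
# Toy illustration (invented numbers): one bad quarter in a short history drags
# a naive forecast far off course, which is the risk of training on limited
# company-performance data.
from statistics import mean

steady_history   = [1.0, 1.1, 1.2, 1.3]  # quarterly revenue, $M
with_bad_quarter = [1.0, 1.1, 0.4, 1.3]  # same history, but Q3 was a bad quarter

def naive_forecast(history):
    """Forecast next quarter as the simple average of the history so far."""
    return mean(history)

print(f"Steady history forecast: {naive_forecast(steady_history):.2f}")    # 1.15
print(f"With one bad quarter:    {naive_forecast(with_bad_quarter):.2f}")  # 0.95
```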
You talk about level five AI in the book and how it could one day run entire organizations. Though the technology is still in development, do you believe it is possible for a generative AI tool to hold an executive or decision-making position within a company?
HOPPER: [OpenAI CEO] Sam Altman has already said they know the path to what they are calling Artificial General Intelligence, and that level five designation, which refers to AI capable of running an organization, is OpenAI’s definition. Right now, most tools are at level one: generating text from datasets, with the possibility of hallucinations.
Now there is development at level two, where instead of just generating text, the AI can stop and think about it. This reduces hallucinations and gives more insightful answers. Sometimes it will think for ten seconds, sometimes for ten minutes. Some even envision an AI that could go think for something like ten days and do PhD-level research.
If OpenAI gets to level five, which would mean a tool that can run an organization, that is something that will replace human jobs. I don’t want to be an alarmist on this, but when Sam Altman says this can happen in months, not years, it’s something CFOs need to be aware of. The rapid pace of this technology’s development is a significant concern.
Increasingly, CFOs come from areas adjacent to accounting, like FP&A, consulting and investment banking. How will this play into how future finance leaders approach data compared with how they have in the past?
HOPPER: I think Sarbanes-Oxley and the history we’ve had in corporate finance have had a lot to do with this. But again, it goes back to the importance of having your data in order. You have to have clean books, the controls must be in place, your data must be secure.
I think the demand is for CFOs to be much more strategic, but in my experience, it almost takes two types of brains to be a CFO. One has to take that GAAP-centric approach, while the other has to be able to make predictions and forecasts, and do so with creativity. That’s the part of the role that’s growing in demand.
I think for companies that can afford it, having a chief accounting officer there is really helpful because a CFO can’t do everything. But once the books are in order, CFOs have to be able to use their data to be more strategic and forward-looking.