By Virginia Hulme
Editor’s Note: This article concludes a two-part series exploring professionalism in the actuarial profession. To read the first installment—focused on the role of Academy members in upholding professionalism—see the November/December 2025 issue of Contingencies.
As emerging technologies transform actuarial work, actuaries should be prepared to face challenges in transparency, control, and accountability—while continuing to uphold the principles of professionalism.
Emerging technologies, including artificial intelligence (AI) and other types of complex models, are bringing new challenges to actuarial professionalism. The main issue with these models from a professionalism perspective is that the actuary does not always know how they work, what data they were trained on, or what biases they may have. Yet professionalism requirements have not changed.
“One of the biggest challenges I see is navigating the gray areas where traditional actuarial frameworks intersect with emerging fields—especially AI, machine learning, and advanced analytics,” says Yukki Yeung, a former member of the Committee on Professional Responsibility (COPR). “Emerging technologies are expanding what actuaries can do—but they’re also expanding the ethical and professional questions we need to consider,” she says. “These tools are powerful, but they often operate as black boxes. When actuaries rely on AI or complex data models, we may be accountable for outputs we didn’t fully build or control, or for models that evolve over time in ways we didn’t anticipate. That challenges our traditional ideas of documentation, replicability, and control of work product.”
AI tools are already playing a big role in writing reports, which has implications for documentation and disclosure. “We need to keep in mind that these tools are not a replacement for our professional judgment and oversight,” says Michelle Iarkowski, a member of the Casualty Committee of the Actuarial Standards Board (ASB) and chairperson of the Committee on Property and Liability Financial Reporting. “For example, while an AI tool might see the words ‘significant’ or ‘material’ as having many synonyms in the English language, those words have very specific meaning in actuarial disclosures.”
Joyce Hwu, a member of the task force revising ASOP No. 30, Treatment of Profit and Contingency Provisions and the Cost of Capital in Property/Casualty Insurance Ratemaking, says, “As AI becomes increasingly embedded in our day-to-day work, there’s no doubt some amount of human intervention is still needed to check any decisions made or work produced. But how much of that output is truly considered the human’s [contribution]? Once that work is ingested by AI, will other actuaries be able to trace its origins? Or said another way: Is AI now considered another ‘party’? This will have implications for actuarial communications, disclosures, even Precept 8, Control of Work Product.”
AI and other tools can pose challenges in areas beyond documentation and disclosures. Tim Geddes, a member of the Actuarial Board for Counseling and Discipline (ABCD), points out that receiving data in a potentially near-final form will present a challenge to actuaries trying to assess reasonableness. “Finding ways to interact with the technologies as you could with human staff to determine reasonableness, for example, will be a significant professionalism challenge,” he says.
Standards to the Fore
Even though these technologies make ensuring transparency, fairness, and accountability more difficult, “our standards of practice continue to require the same level of professionalism in our work,” ASB Chairperson Laura Hanson says.
The good news is that there are resources available for actuaries trying to uphold professionalism while using these new technologies. First, remember the Code of Professional Conduct, particularly Precept 1: integrity, competence, skill, and care are required when you provide actuarial services. Second, while there is no one actuarial standard of practice (ASOP) on artificial intelligence or other new technologies, several existing ASOPs address many of the issues that arise when working with them. ASOP No. 56, Modeling, ASOP No. 23, Data Quality, ASOP No. 12, Risk Classification (for All Practice Areas), and ASOP No. 41, Actuarial Communications, make “a pretty good set of standards when you use AI,” says Darrell Knapp, immediate past president and former ASB chairperson. “I consistently refer actuaries to ASOP 56 and the need to have a basic understanding of the model you are using,” he says, and goes on to caution, “One important facet of emerging technologies is that it may make it much easier for an actuary who takes a few shortcuts on professionalism to really mess up.”
Another good resource is a recent discussion paper, Actuarial Professionalism Considerations for Generative AI by the COPR. “We wanted to provide a starting point for actuaries to think critically about how long-standing principles—like integrity, competence, and communication—apply in this new landscape,” says Yeung, who was involved in writing the paper. “Technology may change, but our responsibility to serve the public interest does not.”
Tricia Matson, Academy president and former ASB chairperson, suggests ways the profession can keep up with changing technology. First, actuaries must make sure standards keep up, she says. She recommends making sure the individuals involved are familiar with emerging technologies, for example, by adding to their teams ASB or ABCD representatives with knowledge of such technologies and incorporating that knowledge into new standards. Second, she continues, actuaries can leverage strong professionalism practices to their advantage. Because the actuarial profession is among those leading this change and has a strong professionalism ethos, actuaries are well positioned to ensure professionalism remains a key consideration throughout that change, she says. Last, new technologies can help incorporate professionalism into actuarial work. As an example, she says, “I asked ChatGPT which ASOPs would apply if I were using AI in pricing, and it suggested I look at ASOPs 1, 12, 23, 41, and 56! And although I didn’t ask explicitly, it also recommended Precept 1 of the Code of Conduct. Not a bad answer for me to start with and validate appropriately!”

Other Challenges
Emerging technology may be top of mind as a challenge to actuaries, but there are others. Geddes and Knapp both raise the issue of competition from non-actuaries performing actuarial work. “Although they can produce an answer that may sometimes be right, they do not have the standards of practice, code of conduct, and qualifications standards behind their work,” says Knapp.
Geddes sees competition not just from other professions, but from technology itself, as computers take on “much of the raw analysis actuaries have historically completed.”
“This competition can pressure us financially, by offering a supposedly cheaper and faster alternative, and developmentally, by removing a lot of the work many of us did to hone our craft,” he says.
“We must find a way to cultivate learning in the next generation of actuaries, who will need to understand these competitors’ abilities to simplify some work while still providing the high-level, true analysis skills that have made actuaries indispensable.”
Another recent challenge is the new environment facing the insurance industry, says Matson. Major changes are taking place in both public and private insurance due to changes at the federal level and increasing claim costs across most types of insurance, which, in turn, are driven by factors such as medical trends and climate change. These changes are adding financial strain to financial security systems, many of which are already under pressure, she says. Actuaries—the stewards of sound pricing and reserving—may find it challenging to focus on actuarial soundness in the face of these external pressures. But, she points out, “Our professionalism system is a great support for that!”
Demand for solutions to social issues is another new challenge, says Hwu, who often works with non-actuarial trade associations. “Actuarial concepts often come up in discussions about proposed regulation or legislation,” she says. “As the pace of technological advancement continues to accelerate and policymakers (and thereby the public) demand solutions to many growing social issues, there is more pressure than ever on actuaries to find these solutions. We, of course, have the expertise to meet the challenge, but how we do it while maintaining public trust becomes paramount.”
Yeung sees another risk as demands for efficiency, automation, and business outcomes increase: Professionalism may come to be regarded as a procedural requirement rather than a set of guiding principles. “We need to stay grounded in the principles that define our profession—especially in a world that’s moving fast and getting more complex by the day,” she says.
The Stewardship Role
It is well known that actuaries play key roles in helping to maintain financial security systems. Their analytical skills and their ability to adapt to changing circumstances, including evolving technology, have served the profession well over time. As this article discusses at length, the profession benefits greatly from the Code of Professional Conduct and the standards of practice. Other professions may not have the same benefits.
Add to this the domain knowledge that actuaries have within their respective practice areas, and one can’t help but envision a leadership role for actuaries as AI technologies continue to roll out and evolve. In short, actuaries can lead from a position of strength with emphasis on professionalism, governance, innovation, education, and balance.
No doubt, the hype around AI is significant. However, there is also little doubt that its application will continue to expand within financial security systems. Accordingly, the National Association of Insurance Commissioners (NAIC) has spent significant time and effort seeking to understand the impact of AI on insurance consumers, and its work is ongoing. To date, it has offered guidance on AI systems that are fair and ethical, compliant, transparent, and secure (Principles on Artificial Intelligence (AI)). Further, it has indicated the need to consider governance, risk management and controls, and reliance on third-party models and data (Use of Artificial Intelligence Systems by Insurers).
When one adds all this up, it is clear that much work remains to be done, and actuaries have a significant stewardship role to play as we move forward.
—Rich Gibson, Former Senior Casualty Fellow
Implications for Professionalism
Actuarial professionalism exists for a reason—to protect the public and to ensure that organizations that provide insurance, pensions, and other benefits are solvent when people need them.
The rapid pace of change in the environment in which actuaries work presents a new set of challenges for professionalism. “Navigating AI can be challenging for actuaries today, as the rapid pace of AI development brings many questions and uncertainties,” says Judy Liu, COPR member. “As actuaries, we have to stay informed and adaptable, and work together to regulate and promote ethical AI use. This involves continuous learning, collaboration, and proactive measures to ensure that AI is used responsibly and ethically.”
The use of emerging technologies “raises new questions about bias, fairness, and explainability—especially when models are used in underwriting, pricing, or claim decisions that affect real people,” says Yeung. “It also raises serious questions about transparency, accountability, and professional responsibility. We’re being asked to interpret results, make judgments, and sign off on things that might go far beyond traditional actuarial models. That’s not inherently bad—but it does require a deeper understanding of our ethical obligations and the boundaries of our expertise.”
“Professionalism today isn’t just about knowing the calculations or citing the right ASOP—it’s about understanding the broader implications of the tools we use and being willing to ask tough questions about risk, equity, and trust,” she says.
VIRGINIA HULME is the Academy’s assistant director of professionalism.