A Governance Work Plan for Responsible AI Adoption in Not-for-Profits
Published: February 10, 2026
As artificial intelligence (AI) becomes more accessible, not-for-profit organisations across Australia are experimenting with tools that can improve efficiency, strengthen community engagement, and even support impact evaluation.
But amid the opportunities, not-for-profit (NFP) boards and executive teams are rightly asking: how do we make sure these technologies are used ethically, transparently, and in service of our mission?
The answer lies in building a fit-for-purpose AI governance work plan: a roadmap that helps your organisation adopt AI in a responsible, human-centred way. This article lays the groundwork for what an AI governance work plan might look like, including the core principles, key elements, and early actions you can take to protect your organisation, staff, and community.
Why Now?
AI is already in many tools used daily across NFPs, often without teams realising it. Platforms like Microsoft Office, Canva, and Mailchimp now embed AI to enhance productivity. Meanwhile, more advanced tools such as ChatGPT, Otter.ai, and AI writing assistants are being tested by NFP teams seeking to “do more with less”.
Used wisely, AI can support your purpose. But without oversight, it can also introduce reputational, ethical and operational risks, especially for organisations working with vulnerable populations, First Nations communities, or marginalised groups.
An AI Governance Work Plan in Context
Rather than starting with technical complexity, NFPs can take a strategic and values-led approach. An AI governance work plan should:
- Reflect your organisation’s purpose
- Be proportionate to your risk appetite and capacity
- Embed accountability and ethics throughout
- Provide clear guidance on where human intervention is essential
- Encourage innovation, learning, and transparency.
The Five Building Blocks
A practical governance work plan should start with five core components:
1. An AI Governance Charter
A concise, board-approved statement of intent outlining how your organisation will engage with AI tools.
What to include:
- Purpose and intent – What problems are you using AI to solve?
- Ethical principles – These include dignity, fairness, privacy, and inclusion.
- Use boundaries – Where AI can and can’t be used (e.g. not for eligibility decisions).
- Human oversight – When human review is mandatory.
- Data governance – How consent, privacy and ownership are handled.
- Accountability – Who reviews tools, performance and risks.
Tip: Treat the Charter like a living document. Review annually or as part of your digital or strategic planning cycle.
2. AI Tool Register
A transparent list of current and planned AI-enabled tools in use by your organisation. Suggested columns:
- Tool name and provider
- Primary function (e.g. donor engagement, content creation)
- Department/team using it
- Risk rating (low, medium, high)
- Whether personal or sensitive data is used
- Human review required (Y/N)
- Date reviewed/approved
Tip: Start simple. Even a shared spreadsheet is a good foundation.
3. Human-in-the-Loop Guidelines
Clearly defined points in your operations where AI must defer to humans. This safeguards decision-making and builds trust. Examples of when human intervention is essential:
- Accepting or rejecting service requests
- Drafting public communications about sensitive topics
- Making funding allocation decisions
- Identifying clients in crisis or high-risk categories
- Creating or editing reports for regulators
Tip: Use process maps or workflows to identify where staff “step in” during AI-enabled tasks.
4. Embedding Governance into Existing Structures
Build AI oversight into what already works in your organisation. You don’t have to start from scratch. Suggested integration points:
- Board and Risk Committee – Add AI updates to the agenda at least quarterly.
- Internal audit – Assess if AI tools meet fairness, privacy, and compliance standards.
- Digital capability planning – Include AI skills and awareness in staff development plans.
- Vendor onboarding – Request documentation on how AI tools are built, tested, and monitored.
- Code of conduct – Introduce practical guidelines for responsible AI use by staff and volunteers.
Tip: Build AI discussions into your regular governance rhythms, rather than creating extra workstreams.
5. Creating a Culture of Ethical Curiosity
Normalise conversations about AI by making them accessible and inclusive, not just a tech or compliance issue. Suggested activities:
- Host casual learning sessions (e.g. AI lunch and learn).
- Run short, anonymous surveys to capture staff views on AI.
- Invite feedback from service users on how AI might affect their experience.
- Include a short, practical module on ethical AI in new staff onboarding.
Tip: Start with one question at your next team meeting: “Where do you see AI popping up in your daily work?” It opens the door for curiosity without pressure.
Building Confidence Through Training and Reviews
Strong governance relies on capable people. As AI tools become more common in the not-for-profit sector, building confidence and understanding among staff and volunteers is essential. Rather than overwhelming teams, focus on accessible training that suits your context. An annual “Responsible AI 101” session can offer a simple foundation for all staff.
For teams working more directly with AI, consider practical, scenario-based learning to explore ethical decision-making, risk spotting, and appropriate use. Short explainer videos tailored to the AI tools used within your organisation can also be helpful, especially for new staff or volunteers. These resources don’t need to be high budget. They just need to be clear and relevant.
Designating an “AI lead” within each team can go a long way in embedding everyday support and encouraging open discussions about AI use. These individuals don’t need to be technical experts but can serve as key contacts for questions, flagging risks, or sharing good practice.
Where To From Here
AI will not replace the human touch that is central to your impact, but it can support it. From reporting to service design, well-governed AI adoption helps your organisation deliver better results, with fewer resources, in increasingly complex contexts.
Your governance work plan doesn’t have to be perfect or high-tech. It simply needs to reflect your mission, values, and responsibilities to the people you serve.
By starting now, you’re future-proofing your organisation and strengthening your reputation as a responsible, thoughtful, and mission-driven player in the social impact space.
The most important step is to start small and build gradually. Begin by mapping where AI already exists in your tools and identify where governance gaps might be. Appointing a working group or nominating internal champions can help build momentum. Governance doesn’t mean slowing down innovation. It means guiding it with care. Keep your board informed, invite staff into the conversation, and learn from your community along the way. The goal isn’t to control every use of AI, but to be deliberate, ethical, and transparent. When used thoughtfully, AI can be an enabler of productivity, impact, and sustainability across your programs.
This article was first published in the 2025 Better Boards Conference Magazine.
Further Reading
Using AI: Has Your Board Done Its Homework
AI and Your Board – Insights into Responsible Governance
5 Cognitive Biases That Quietly Influence Boardroom Decisions
Author
Wenda, Co-Founder, Hoshizora Foundation
Wenda is the co-founder of Hoshizora Foundation, an education not-for-profit, and currently serves as its Board Chair. Based in Sydney, Australia, she is also a Director at a global consulting firm, leading teams that provide services on ESG, risk management, corporate governance, strategy activation, and business model management. She is passionate about social impact and advises several ESG and community investment programs globally. Trilingual in English, Japanese, and Indonesian, Wenda has given lectures and workshops and developed knowledge-sharing programs for local and global institutions.