
Gwendoline Grollier on how female founders are turning the “diversity gap” into a measurable performance advantage, and proving that “Trust is the True Currency of the Future”
“Risk isn’t about saying no, it’s about enabling smarter yeses.”
In this exclusive interview, we speak with Gwendoline Grollier, co-founder of the 99% female-owned consultancy, T3 Consultants, about her remarkable career shift from two decades in global finance to the forefront of technology.
Gwendoline’s journey is defined by a central belief that risk management is not about control, but empowerment, a philosophy she sums up as enabling “smarter yeses.” This mindset underpins her work at the intersection of risk, regulation, and Responsible AI.
She discusses building a practitioner-led firm that is successfully proving a powerful business case: why the sector of Women in Responsible AI is not just about ethics, but about securing a competitive advantage. Using data, she reveals how female founders generate “more revenue per unit of capital invested,” arguing that what’s often called a diversity gap is, in fact, a performance advantage.
Hi Gwendoline, thank you so much for being part of Women Rock. Could you tell us a little about your background and how you made the transition from finance into the world of tech?
When I was two, I stuck a hairpin into a plug just to see what was inside. I got a small shock, but also my first lesson in risk: understand it, don’t fear it. That instinct to look inside, to understand before acting, has guided my career ever since.
I’ve spent nearly twenty years in finance, from Crédit Agricole CIB to Credit Suisse, helping global institutions manage complex risks. As finance evolved, so did risk. Operational resilience brought in cybersecurity, data integrity, and technology. At some point, it stopped being ‘finance versus tech’ and became one conversation.
My work now sits exactly at that intersection, helping organisations innovate safely and responsibly by creating forward-thinking, adaptable risk frameworks aligned with regulatory requirements.
What inspired you to found your own company, and what has that journey been like so far?
I co-founded T3 to build the kind of consultancy I wished existed: one led by practitioners, not just theorists. Everyone here has lived what they now advise on, from regulatory stress tests and AI governance to crisis management. We don’t talk about best practice; we deliver it.
T3 is 99% female-owned. We work across risk, regulation, and Responsible AI. The journey has been demanding but rewarding. We’ve proven that expertise and agility can coexist and that compliance can drive performance rather than stifle it.
Moving from corporate to start-up has also had a profound impact on my identity and my definition of success. It pushed me to rethink what impact means, not just managing change but shaping it.
Looking back, what has been one of the most defining moments in your career?
The turning point was realising that risk isn’t about saying no, it’s about enabling smarter yeses. Once I understood that, I stopped viewing my work as control and started seeing it as empowerment. That mindset shift shaped every role that followed and ultimately how we built T3.
As a female founder and owner in tech, what have been some of the biggest challenges you’ve faced, and how have you overcome them?
The hardest challenge? Being underestimated, quietly, subtly, but persistently. I learned early on not to argue my worth but to show it. Consistency builds credibility faster than any speech ever could. Credibility compounds like capital.
The Performance Advantage: Why Female Leadership Wins
What does being part of a female-founded and female-owned business mean to you personally and professionally?
It means freedom to lead differently, hire differently, and deliver differently. Professionally, it proves that collaboration and precision outperform hierarchy. Personally, it’s about representation, showing that empathy and execution can coexist and that leadership doesn’t have to look traditional to be effective.
Women may still be the minority among start-up founders, only around 15 to 18 per cent globally, yet they consistently outperform expectations. Studies show that female-led firms demonstrate stronger capital efficiency, higher returns on investment, and greater long-term sustainability.
Rigorous academic analysis shows that women-founded companies generate more revenue per unit of capital invested, proving that what’s often framed as a diversity gap is, in reality, a performance advantage.
(Sources: World Economic Forum, Global Gender Gap Report 2024; Morazzoni & Sy (2022), “Female entrepreneurship, financial frictions and capital,” Journal of Economic Behavior & Organization, 197, 303–321.)
Have there been any role models, mentors, or communities that have supported you along the way?
I’ve had mentors and role models in many forms, and they’ve rarely been who you’d expect. Some didn’t give me answers but asked sharper questions, the kind that stretch your thinking instead of comforting it. Others were simply there to listen after long days, helping me see things from a new angle.
What I’ve learned is that role models aren’t always the people on stage. Sometimes they’re the people beside you: co-founders, friends, colleagues, the quiet voices who hold you accountable.
You have to look for them. They’re everywhere if you’re curious enough. They don’t need to be famous or successful entrepreneurs. Anyone whose actions align with their values can be a role model. It’s about perception, persistence, and curiosity, qualities that make ordinary people extraordinary.
The Future of Women in Responsible AI
You’ve mentioned a strong interest in Responsible AI. What does this mean to you, and why do you feel it’s so important for the future of technology?
Responsible AI is about the human side of technology. AI is already part of everything we do; it’s here to stay. The question isn’t whether to adopt it, but how to adopt it responsibly. It’s not about being first; it’s about being first to do it right. Innovation without accountability is risk without control.
AI influences hiring, lending, healthcare, and even human relationships. The implications are vast. Independent data show both the surge in AI adoption and the rise in AI-related incidents. Stanford’s 2025 AI Index reports that 78% of organisations used AI in 2024, while AI-related incidents are rising sharply.
In the UK, the Ada Lovelace Institute and the Alan Turing Institute found that 67% of the public report exposure to AI-related harms. Together, these trends underline why governance and accountability are now business-critical.
Responsible AI means building systems that are explainable, fair, and accountable. It’s not a compliance exercise; it’s the new foundation for trust.
(Source: Stanford University, Human-Centered Artificial Intelligence (HAI), The 2025 AI Index Report)
Why do you think Responsible AI has comparatively higher representation of women, and what can other fields in tech learn from this?
Responsible AI is the intersection of deep technical skill and considered ethical judgment. It’s not enough to know how a model works, one must also understand its social consequences, potential harms, and trade-offs. Women often bring this dual perspective: thinking about both the mechanism and the impact.
A 2025 UNESCO report notes that women make up about 30% of AI professionals globally, with lower representation in the Global South. Interestingly, Responsible AI appears to attract a higher share of women than most other tech disciplines. That’s telling. It shows that when purpose meets technology, diversity follows naturally.
Other tech fields can learn from that. Progress isn’t about how fast you move; it’s about whether you’re moving in the right direction.
(Source: UNESCO, Tackling Gender Bias and Harms in Artificial Intelligence (AI), 2025)
Could you share some positive examples of ethical, fair, or transparent AI in practice that inspire you?
I am most inspired by organisations that build ethics and accountability into design, not as a fix after something goes wrong.
Frameworks like the OECD AI Principles, the NIST AI Risk Management Framework, and the EU’s Ethics Guidelines for Trustworthy AI have turned responsibility into practice. They help teams document data provenance, assess bias, and explain model decisions, building trust through transparency.
NIST defines four core functions (Map, Measure, Manage, and Govern) while the OECD places human oversight and accountability at the heart of innovation. The EU’s Guidelines, now embedded in the AI Act, outline seven pillars of trustworthy AI, from transparency to robustness and privacy.
Alongside these, a growing network of non-profits and research bodies is bridging ethics and engineering. The Alan Turing Institute develops assurance frameworks for AI systems, while the Ada Lovelace Institute advances transparency and accountability through public participation and redress mechanisms.
According to the OECD’s 2023 report Advancing Accountability in AI, trust in AI is built on transparency, accountability, and human oversight. Those who make that trust measurable will lead the next phase of AI adoption.
If you want to start, keep it simple and tangible:
- Select a framework such as NIST, and map where your current AI practices align. Even a basic gap analysis can reveal blind spots and quick wins.
- List all your AI use cases. Knowing where AI already operates, formally or informally, is the foundation of any governance strategy. You cannot manage what you do not see.
- Establish clear accountability at leadership level. Make AI governance a leadership function, not a side project.

Responsible AI maturity begins with awareness. Start by knowing what you have, then build accountability around it. The sooner you start, the sooner responsibility becomes part of innovation itself.
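The three starter steps above can be sketched as a minimal use-case register. This is an illustrative sketch only: the `AIUseCase` class, the example use cases, and the owners are all hypothetical, and the gap analysis simply checks coverage against the four NIST AI RMF core functions named earlier.

```python
from dataclasses import dataclass, field

# The four core functions of the NIST AI Risk Management Framework.
NIST_FUNCTIONS = {"Map", "Measure", "Manage", "Govern"}

@dataclass
class AIUseCase:
    name: str
    owner: str  # clear accountability at leadership level, not a side project
    covered: set = field(default_factory=set)  # functions already addressed

    def gaps(self) -> set:
        """Return the NIST functions this use case does not yet cover."""
        return NIST_FUNCTIONS - self.covered

# Step 1-2: list every AI use case, formal or informal, with its owner.
register = [
    AIUseCase("CV screening", owner="Head of HR", covered={"Map"}),
    AIUseCase("Chatbot triage", owner="COO",
              covered={"Map", "Measure", "Manage", "Govern"}),
]

# Step 3: a basic gap analysis reveals blind spots and quick wins.
for uc in register:
    status = "covered" if not uc.gaps() else f"gaps: {sorted(uc.gaps())}"
    print(f"{uc.name} ({uc.owner}): {status}")
```

Even a register this simple makes the point of the third step concrete: every entry has a named owner, and missing functions surface immediately instead of staying invisible.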
(Sources: OECD AI Principles (2019, updated 2024); OECD (2023), Advancing Accountability in AI; NIST AI Risk Management Framework (2023); Alan Turing Institute, Courses; Ada Lovelace Institute (2023), AI Assurance; Ada Lovelace Institute (2023), Risks and Redress in AI Systems)
What’s a favourite motto that motivates or inspires you?
My favourite motto is that curiosity is the beginning of responsibility. It reminds me of that moment with the plug. Curiosity drives progress, but responsibility ensures it lasts.
Outside of work, what do you enjoy doing that helps you recharge and bring balance?
I recharge outdoors, mostly hiking, or doing anything that forces me to look up from a screen. I also write; storytelling resets my brain differently. I don’t believe work-life balance means switching off. For me, balance is switching perspectives. Life is a pendulum; trying to freeze it in balance is the surest way to feel perpetually unsatisfied.
Finally, what’s next for you – both personally and for your company?
For T3, we’re focused on helping organisations embed AI in ways that create real value, not just adopt technology, but operationalise it responsibly at scale to deliver measurable ROI. From training and adoption to assurance, we’re working to prove that responsible AI isn’t a cost centre; it’s a competitive advantage. We want to make this accessible to everyone, from start-ups to Big Tech.
Personally, I want to keep bridging finance, regulation, and technology. AI will define how we work, but responsibility will define how we trust it, and trust is the true currency of the future. And of course, I want to get a few books published.
Gwendoline’s Quick Takes
- Favourite Quote: “We don’t see things as they are, we see them as we are.” (Anaïs Nin)
- Favourite Book: Tomorrow, and Tomorrow, and Tomorrow by Gabrielle Zevin
- Favourite Song: “Dog Days Are Over” by Florence + The Machine
- The best advice you have ever been given: “Don’t aim to be right, aim to stay curious.” That one sentence changed how I lead. It’s impossible to innovate or build responsibly without curiosity at the core.

Gwendoline Grollier’s insights provide a powerful blueprint for the future of the industry, shifting the conversation from compliance to competitive advantage. She proves that an intersectional career bridging finance, risk, and technology allows for a unique, 360-degree view of innovation.
Her conviction that risk management is empowerment, not restriction, is perhaps her most vital lesson. We leave inspired by her clear-eyed focus on operationalising accountability in AI, reinforcing the core truth that innovation without accountability is risk without control.
Gwendoline’s story is a compelling reminder to be both persistent and curious, and to remember her ultimate conclusion: trust is the true currency of the future.
Want to be featured and share your story, or know someone with a great story? Get in touch at hello@womenrock.tech.
Interviewed by Ryan Loftus






