September Market Update 2025
This month we discuss the continued rally in risk assets, softness in the US labour market and the Fed's policy response, valuations in the technology sector, and challenges to the outlook.
Our In Conversation With series features candid discussions with inspiring individuals and leaders across diverse fields, exploring their journeys, values, and perspectives on wealth and impact.
For Dorothy Chou, Angel Investor and Director of the Public Engagement Lab at Google DeepMind, wealth is not about extraction or quick wins. It is about creating value that endures, embedding resilience into essential systems, and enabling people to follow purpose. This belief shapes both her work at DeepMind—where she builds public trust in artificial intelligence—and her investments in the next generation of innovators.
With a career spanning Google, Dropbox, and Uber, Dorothy has helped design accountability structures that still shape how billions live and work. Today, she reflects on her path at the intersection of technology and society, the power of solidarity in leadership, and why she believes imagination is as important as policy in shaping the future of AI.
Q Can you tell us about your background and what drew you to working at the intersection of technology, policy, and public engagement?
I grew up the daughter of immigrant scientists in drug discovery, so the scientific method was part of daily life. At the same time, being bilingual and drawn to history, geography, and literature meant I was always moving between worlds—scientific and humanistic, empirical and imaginative. That perspective shaped how I think about technology: not just what can be built, but how it reshapes the human story.
At Google, Dropbox, and Uber, I worked with founders whose decisions continue to influence billions of lives. I helped launch accountability measures such as Google’s first Transparency Report and developed ethics standards to address inequities in risks and access.
Today, at DeepMind, I lead the Public Engagement Lab. We work with civil society, policymakers, and decision-makers to imagine what society actually wants from AI—and how to build meaningful ways for people to shape that future. In parallel, I invest in founders tackling urgent but overlooked challenges, embedding themselves into essential systems like healthcare, civic infrastructure, and science.
Q As a female leader working across both technology and investment, what has your journey taught you about leadership and representation?
I didn’t set out to be an advocate for female leadership. My family raised me to believe that if I worked hard enough, gender bias would disappear. But it didn’t. The further I advanced, the harder it became.
Several years ago, after being dismissed in a meeting, a mentor told me: “I can’t promise you it will get better, but I can promise you we’ll do it together.” That line stayed with me. It showed me the impact solidarity can have. One woman alone can’t change much, but several women in leadership positions can shift entire systems.
Women today may have more disposable income than ever, but unless we turn it into market-moving power, even our most basic needs—such as women’s health—will continue to be overlooked. This can’t be solved from one angle. The whole innovation pipeline has to change—from funding rules, to which founders we back, to the incentives shaping their choices.
Q At DeepMind, you lead the Public Engagement Lab. How does broadening the conversation on AI change the way society defines its value and its risks?
Facts and rules on their own aren’t enough to prepare people for AI. Public imagination is as important as public policy. People need to be able to picture the future they want.
That’s why we convene students, cultural figures, policymakers, and technologists. It broadens the conversation beyond tech circles. AI stops being about efficiency or profit and becomes about culture, dignity, and justice. It gives visibility to groups that might otherwise be excluded.
AI also has extraordinary potential in the areas we care most about: health, education, and climate resilience.
Take AlphaFold, DeepMind’s breakthrough in protein folding. What once took years of research can now happen in seconds, a discovery recognised with the 2024 Nobel Prize in Chemistry. We opened a database of more than 200 million protein structures, now used by over two million scientists. For organisations tackling neglected diseases, that has meant entirely new “shots on goal” for treatments that might never have existed.
These are the kinds of returns AI should be chasing. The risk is that our funding systems don’t reward them. They prioritise rapid scalability and short-term returns, not breakthroughs that save lives. If capital doesn’t evolve, we risk missing the opportunity.
Trust, too, depends on participation. It is earned by bringing people into the process early, not by presenting them with finished products. At the Public Engagement Lab, we create spaces where technologists, policymakers, educators, and communities co-create. The act of building together gives people both a voice and a stake. That’s how trust endures.
Q Which personal values guide your approach, whether advancing AI at DeepMind or backing a new founder?
Three pillars guide my approach:
That’s why I joined DeepMind, and it shapes my investing too. I ask: is AI really the right tool for this problem? How will it interact with society through trust, regulation, and ecosystems? What safeguards exist when harm occurs? Those aren’t just diligence questions—they’re values in practice.
Q Wealth can mean very different things to different people. What does wealth mean to you, and how has your perspective evolved?
I used to think wealth meant scaling fast and worrying about impact later. Two decades in tech have changed that.
“Wealth isn’t extraction—it’s creation that endures. It’s resilience, and the freedom to follow purpose while enabling others to do the same.”
The best founders don’t just count revenue; they build resilience. They embed into essential systems, withstand shocks, and keep creating value long after the hype fades.
That’s real wealth: capital aligned with outcomes that still matter fifty years from now.
Q Looking ahead, where do you hope your work, both in AI and investing, will leave its mark?
We’re living through extraordinary advances—from AI predicting protein structures to cars driving themselves. But beyond individual breakthroughs, I care most about systems-level change: shifting markets to reward outcomes, resilience, and public value, not just exits.
This is why I’ve backed ventures like Proxima Fusion, building Europe’s first stellarator reactor, and AUAR, reinventing housing with automated, sustainable construction. Both show how founders can create entirely new markets by embedding into essential systems from the start.
If, ten years from now, AI is helping build healthcare that absorbs shocks, scientific infrastructure that translates knowledge into action, and civic institutions people can trust, I’ll feel I’ve contributed to the right kind of progress.
If you would like to find out more about the themes discussed in this article, please reach out to info@bedrockgroup.ch.