Last week I took part in a roundtable focused on the future of free legal advice in the face of AI, and I left inspired. The conversation centred on something simple but powerful: how we coordinate our work around AI and tech to better serve frontline organisations. Not to invent more tools, but to create the scaffolding that helps people use the ones we’ve already built.
It reminded me how much more we could do if we weren’t so scattered. That thought then collided with the constant barrage of the current news cycle and left me pondering:
“If I had the power to dictate the sector’s AI response—what would I actually do?”
Here’s my manifesto.
We Need Radical Coherence, Not More Tools
This is not about accelerating adoption. It’s about slowing down to align. Right now, the sector is full of energy, experiments, and well-intentioned pilots. But we lack shared scaffolding. We duplicate. We chase shiny tools. We let hype outrun purpose.
1. Start with Doctrine, Not Technology.
Every organisation would adopt Wardley Doctrine Phase 1 as the baseline for its strategy.
That means:
- We stop starting with solutions.
- We invest in situational awareness.
- We act only when we understand user needs, inertia, and strategic play.
Most AI projects will fail not because the tech is wrong, but because the thinking is shallow. Phase 1 doctrine makes that failure harder to repeat.
2. Open IP, Always - Be Transparent
If you build a tool for the public good, the public should be able to inspect it.
No locked-in models. No secret sauce. No proprietary infrastructure acting as a middleman between vulnerable people and critical services… and no extractive profiteering from sector need.
This isn’t idealism. It’s trust, auditability, and collaboration. We can’t build sector-wide confidence on black-box systems.
3. Procurement as a Strategic Filter
We don’t need 40 chatbots doing the same half-useful thing. We need a filter that:
- Funds only projects aligned with doctrine
- Prioritises reuse and interoperability
- Denies duplication unless clearly justified
Procurement sits at the heart of how the sector allocates resources, so it makes sense to use that leverage deliberately. In this dictatorship, funding is tied to strategic clarity. No doctrine, no money.
And here’s the kicker:
Funders must invest in a shared infrastructure that:
- Tracks what’s already been funded, sharing standardised data.
- Signals available tools and resources, maintaining the knowledge base of what already exists so that requests for duplicates can be signposted to it.
- Acts as a coordination layer, not a bottleneck.
This is the memory the sector currently lacks. The funders have the power and resources, so we need that wielded toward a solution. A shared data standard would be a massive win for every funder, and gives us a shot at coordinating spending at the multi-million totals the space needs. It doesn’t have to be complex: ask “is this already funded?” and get back a list of past grantees and providers.
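As a rough sketch of how simple that lookup could be (the fields, registry entries, and function names below are hypothetical illustrations, not an existing standard or tool):

```python
from dataclasses import dataclass

@dataclass
class FundedProject:
    """One record in a hypothetical shared funding registry (illustrative fields only)."""
    grantee: str       # organisation that received the grant
    funder: str        # who funded it
    problem_area: str  # e.g. "housing triage", "benefits referrals"
    tool_type: str     # e.g. "chatbot", "case management"
    status: str        # e.g. "pilot", "live", "retired"

# In practice this would be populated by funders publishing to the same standard;
# the entries here are made up for illustration.
REGISTRY = [
    FundedProject("Advice Org A", "Funder X", "housing triage", "chatbot", "live"),
    FundedProject("Advice Org B", "Funder Y", "benefits referrals", "case management", "pilot"),
]

def already_funded(problem_area: str, tool_type: str) -> list[FundedProject]:
    """Answer "is this already funded?" by returning past grants that match both fields."""
    return [p for p in REGISTRY
            if p.problem_area == problem_area and p.tool_type == tool_type]

if __name__ == "__main__":
    matches = already_funded("housing triage", "chatbot")
    if matches:
        print("Similar work already funded - consider reuse, or justify the duplication:")
        for p in matches:
            print(f"- {p.grantee} (funded by {p.funder}, status: {p.status})")
    else:
        print("No match found in the shared registry.")
```

The point isn’t the code; it’s that a handful of agreed fields is enough to give the sector a working memory.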
4. Know Users and Their Needs by Actually Listening
AI isn’t strategy. It’s just a tool.
If we’re not solving real problems—triage, reporting, data entry, referrals—we’re wasting time.
This means:
- We embed researchers inside services.
- We stop building tools in isolation. Cohorts are better than solo funded orgs.
- We put the painful, messy, brilliant frontline work at the centre.
Design is co-created. Feedback is continuous. Value is real.
5. Train People, Not Just Models - Common Language
Too much AI literacy is abstract. Saying “we’re doing an AI project” is like saying “we’re building a software project”: it tells you almost nothing, and left unchecked it drives duplication. We can use Wardley mapping here to break down systems and have better conversations, but also have courses like CAST’s AI first step learned by all.
So we start from plain language.
- Everyone moves from jargon to terms they understand, and only then down to the next level of detail.
- Everyone learns to map. It only takes a few minutes.
- Everyone can then spot tech theatre.
- Everyone, from caseworkers to commissioners, gets the tools to challenge bad implementations and steer better ones.
We don’t build AI capacity with PDFs. We do it with shared language and practical exposure.
The Sector Already Gets This—We Just Haven’t Joined It Up Yet
That roundtable I attended? It’s already happening:
- Legal advice networks naming their needs.
- Funders wanting to coordinate better, calling for data standards.
- Initiatives trying to collate and centralise support.
- Whiteboards filled with ideas for sharing what works and what doesn’t.
This was the source of my hope. The question isn’t “should we do this?” It’s “how quickly can we align around doing it together?”
Where did this all come from?
I’ve been a massive fan of Wardley Mapping and Doctrine for a while. They’re worth reading; here are the principles to consider from Phase 1 of Simon Wardley’s doctrine:
1. Focus on user needs
Understand and meet the real needs of your users. Start with user research; map needs explicitly before proposing solutions.
2. Use a common language
Create shared terms so teams can collaborate effectively. Establish and document shared definitions; avoid jargon or silos.
3. Be transparent
Make information visible to foster alignment and trust. Share maps, decisions, and rationales openly across teams.
4. Challenge assumptions
Don’t take things at face value—ask why things are done that way. Regularly run “premortems” or ask “what if we’re wrong?” in planning sessions.
5. Remove duplication
Avoid reinventing the wheel; reuse before you build. Inventory internal tools and check what already exists before starting new work.
6. Think small teams
Smaller, autonomous teams move faster and adapt better. Organise around empowered, cross-functional teams with clear goals.
7. Understand the details
Strategy requires grasping the operational ground. Spend time with front-line teams; map workflows before high-level decisions.
8. Be humble
Accept you don’t know everything and may be wrong. Encourage questions, reward learning from mistakes, and avoid “hero” culture.
One of the delegates was calling for a new framework to see this challenge of AI through. I wonder if what we actually need is to rediscover an existing one.
I really want alignment because the waste is heartbreaking to watch. Seeing the hunger in the room last week for a cohesive strategy gave me plenty to think through. Shared doctrine. Open systems. Co-ordinated funding. User-led tools. Plain language. This is what we already know; let’s not allow a two-letter term to trip us up, and instead use it to create radical coherence.