leedab.com
LeedAB · Investor brief · Confidential
2026-05 · Sydney AU

The local-first AI operations layer for SMBs.

What we do in one line

LeedAB is the operations layer SMBs install on their own machines — a roster of specialised AI agents that share one customer-specific brain and run without sending data to anyone's cloud.

Most "AI agent" products are thin wrappers over an LLM. We're building the substrate underneath them.

The problem. Small and mid-sized businesses get the same pitch every quarter: "let AI run your ops." They try the chatbot, run a 30-day trial, and bounce. Why? Because the agents have no memory of their business, no shared context across departments, and all of them require the customer's data to be shipped to someone else's cloud.

The wedge. A LeedAB install drops a per-customer "brain" — a structured knowledge vault — onto the customer's own machine, then runs specialist agents on top of it: Ops, Procurement, Finance, CX, Sales, Marketing, plus a COO orchestrator that fields every message and delegates. Customer data never leaves their hardware.

The moat. Once a customer's brain is populated and their workflows are wired, the cost of switching isn't "buy a different chatbot" — it's "rebuild the substrate." Lock-in is structural, not contractual.

Why now, why us, why local-first.

01

The cloud-AI fatigue is real

SMBs in regulated verticals (logistics, healthcare, legal, trades) are increasingly told they "can't put that data into ChatGPT." Local-first stops being a niche concern; it becomes a procurement requirement.

02

Hardware finally does what we need

A 2026-spec Mac Mini runs Claude-class models well enough for ops tasks. The "must be cloud" argument is losing force; the appliance shape becomes feasible within the next 12 months.

03

Agents need a substrate, not just a wrapper

Today's agent products treat memory as an afterthought. We treat it as the product — every agent reads from and writes to one structured per-customer brain. The agents are interchangeable; the brain compounds.

How it's built — five layers.

From bottom to top: substrate, license, manifest, agents, channels. Each layer is small, well-defined, and replaceable. The full stack runs on a customer's Mac Mini today (still software-only); appliance shape is on the 12-month horizon.

Layer 5
Channels
Telegram · Web console · WhatsApp · iMessage
Layer 4
COO orchestrator + 31 specialist agents
One bot per customer · routes + delegates + synthesises
Layer 3
Per-customer manifest
Which agents · industry-specific overrides · customer playbook
Layer 2
License + tier packs
Ed25519-signed · server-side guards · trial flow
Layer 1
Customer brain (Obsidian-backed vault)
Structured knowledge · agents read + write · customer-owned
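Layers 3 and 4 above can be sketched together: a per-customer manifest that declares which agents are enabled, and a COO orchestrator that fields a message and delegates. All field names and the keyword-matching rule are illustrative assumptions; the shipped routing would be model-driven, with keywords standing in here for the delegation decision.

```python
# Hypothetical per-customer manifest (Layer 3): enabled agents plus
# industry-specific overrides. Field names are invented for illustration.
manifest = {
    "customer": "acme-logistics",
    "agents": ["ops", "procurement", "finance", "cx"],
    "overrides": {"ops": {"industry": "logistics"}},
}

# Layer 4 sketch: the orchestrator delegates to the first enabled
# specialist whose keywords match, else handles the message itself.
KEYWORDS = {
    "ops": ["delivery", "roster", "schedule"],
    "procurement": ["supplier", "purchase", "order"],
    "finance": ["invoice", "margin", "payroll"],
    "cx": ["complaint", "refund", "customer"],
}

def route(message: str, manifest: dict) -> str:
    """Return the agent that should handle this message."""
    text = message.lower()
    for agent in manifest["agents"]:
        if any(kw in text for kw in KEYWORDS.get(agent, [])):
            return agent
    return "coo"  # orchestrator answers directly

print(route("A supplier order never arrived", manifest))  # prints: procurement
```

The point of the shape: the manifest is data, not code, so a new customer is a new manifest against the same agent roster.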

31 specialist agents shipped today across executive (COO, Concierge), growth (Marketing, Content, Campaigns, Email, Brand, Video), build (Product, Engineering, Frontend, Backend, DevOps), automation, research, people, ops + supply, finance + investor + data, support (CX), legal + compliance, and customer lifecycle (Solutions, Onboarding, Implementation, Champion, Delivery).
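The license layer above (Layer 2) names Ed25519 signing. A minimal sketch of that check, using the third-party `cryptography` package; the payload fields and key handling here are invented, not the shipped flow (in practice the public key would be baked into the install, not generated alongside the signature).

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical license payload; field names are invented for illustration.
license_payload = json.dumps(
    {"customer": "acme-logistics", "tier": "ops-pack", "expires": "2027-05-01"},
    sort_keys=True,
).encode()

# Vendor side: sign the canonical payload with the private key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
signature = private_key.sign(license_payload)

# Install side: verify before enabling any agents.
def license_is_valid(payload: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, payload)
        return True
    except InvalidSignature:
        return False

assert license_is_valid(license_payload, signature)
assert not license_is_valid(license_payload + b"tampered", signature)
```

Because verification needs only the public key, the check runs fully offline on the customer's machine, consistent with the local-first constraint.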

Where we are.

2026-Q1
Tarheel Logistics — first paying customer. NSW logistics, ~200 deliveries/day. Pilot conversion via the same install we use ourselves.
Live
2026-04 → 05
13 PRs shipped to LeedAB-OS main + per-customer manifest layer + Solutions agent build flow + COO orchestrator MVP. Foundation locked; customer-specific configurations now ship in minutes.
Shipped
2026-05-03
Rome AI inbound — Rome AI's founder and cofounder DM'd asking to connect about a partnership.
Inbound
2026-05-03
Pillarix licensing inbound — Pillarix CPO floated licensing LeedAB tech to their construction customers (vertical AI × horizontal substrate).
Inbound
Now
YC application drafted; council critique done; pitch deck shipped. Reframe pending the visa picture (NSW 491 invitation in flight).
In flight
Q3 / Q4 2026
Customer brain provisioning template — extract from our own dogfood vault. Lets us provision a fresh customer brain in <30 min.
Queued
2027
Appliance shape — Mac Mini + UGREEN AI NAS substrate, flashed at provisioning; the customer plugs in and is live.
Roadmap
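The brain-provisioning item on the roadmap can be sketched as a script that stamps a fresh vault skeleton out of a template. The folder layout and file names below are assumptions for illustration; the real template would be extracted from the dogfood vault.

```python
import tempfile
from pathlib import Path

# Hypothetical vault skeleton: relative path -> file template.
SKELETON = {
    "00-Company/profile.md": "# {customer}\n",
    "10-Suppliers/index.md": "# Suppliers\n",
    "20-Customers/index.md": "# Customers\n",
    "30-Playbooks/ops.md": "# Ops playbook\n",
}

def provision_brain(root: Path, customer: str) -> Path:
    """Create a fresh customer brain (Obsidian-style vault) from the skeleton."""
    vault = root / f"{customer}-brain"
    for rel, template in SKELETON.items():
        path = vault / rel
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(template.format(customer=customer))
    return vault

vault = provision_brain(Path(tempfile.mkdtemp()), "acme-logistics")
print(sorted(p.name for p in vault.iterdir()))
```

Since the skeleton is plain Markdown on disk, the <30 min provisioning target reduces to template extraction plus a copy-and-fill pass.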

The team.

Mina Aziz

Cofounder · Product + Eng

Ex-Microsoft Azure PM (Sydney, FastTrack + Red Flag, 2022 – 2025). Master's in AI (UTS). Master's in Cybersecurity (Macquarie). Native Arabic + English. Builds the substrate.

Muiez Ahmed

Cofounder · Eng + Infra

Longtime software engineer. Ships the production console + license backend + CI surface. The "this needs to be a product, not a demo" voice.

Ziad (Feb 2026)

Engineering · Joined

Engineering hire who joined in Feb 2026. Onboarding patterns are being formalised against Muiez's join experience.

The bet
SMB ops will run on AI substrates that compound, not chatbots that renew.
A customer who runs LeedAB for 18 months has a brain that knows their suppliers, their margins, their customers, their patterns. A customer who runs a chatbot for 18 months has 18 months of forgotten conversations. The first kind doesn't churn.