The next ten years of higher education won't be built by software.
They'll be built by AI workers who carry context the way good colleagues do, and who treat the student on the other side as a person, not a row.

The students who deserve the best advice rarely get it.
Indian higher education is a market of three hundred and fifty million people, run on stitched-together CRMs, WhatsApp groups, and counsellors who burn out by year two. The ones who can afford the best advice are sold programmes someone is incentivised to push.
We've spent the last two years inside this system, shipping Sensei across nine partner universities. We watched what happens when a counsellor is replaced by software: the student gets a slower, dumber version of the same biased conversation. We watched what happens when a counsellor is given an AI worker: the student gets the answer they came for, in their language, at the hour they had a question.
Five beliefs that shape every agent we ship.
1. AI workers, not chatbots. A chatbot answers a query. A worker owns an outcome, carries context across conversations, and hands off cleanly when it's out of its depth.
2. Neutral by design. We are admissions partners for nine universities. The agent never says "I'd recommend X". It says "here are three programmes that fit; which feels right to you?"
3. Truthful before fluent. If the data doesn't have a fee, the agent says so. We will not ship an agent that fabricates to sound confident.
4. Private as a default, not a setting. Student data lives in India. Audit logs are on by default. The CFO can read every prompt and response.
5. Suite, not silo. The agent that counsels, the agent that nudges, the agent that whispers to the human counsellor, and the agent that watches the ledger all read from the same student profile. The student never starts over.

Three refusals we hold even when the numbers say otherwise.
We won't build a counsellor that pretends to be human. Sensei opens with "I'm an AI counsellor". Students keep talking. The ones who don't, never would have.
We won't sell agents that optimise for the university's margin at the student's cost. If a programme isn't a fit, the agent says so, even when there's a higher commission on the alternative.
We won't replace the work of a great counsellor. We will remove the thirty hours a week of context-switching that stops them from doing it.
Small team, founders on every pilot, no demo theatre.
No SDRs. Founders run every pilot. Every agent goes live in twenty-four hours or doesn't go live at all. We measure the thing the partner university actually cares about, not the metric that makes the dashboard pretty.
We are slow on purpose where it matters: data residency, consent, audit. We are fast on purpose where it should be: shipping the agent, fixing what the partner finds in week one, getting out of the way once it works.

In ten years, every Indian university will run on a stack of AI workers.
The question is whether those workers were built by people who treated the student on the other side as a person, or as a conversion event.
Boole is the bet that the first answer wins.