See Kodori in 60 seconds

Five moments that differentiate an AI-native DMS from a legacy one.

Kodori is built on primitives buyers can verify against the running product before a contract is signed. Below: the five differentiators that change the workflow.

  1. Hybrid search that actually works on your PDFs.

    Type a partial phrase from a contract you barely remember. Kodori runs Postgres FTS and pgvector semantic search in parallel and fuses the results via Reciprocal Rank Fusion. Hits include extracted text from PDFs (including scans, via Azure Document Intelligence plus Claude vision), Office docs, emails, and connected sources (Slack, Outlook, SharePoint, OneDrive, Google Drive, Gmail). The "where did we say that thing about the Brennan staircase incident" question stops requiring three people to pause what they're doing.
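The fusion step is small enough to sketch. This is an illustrative TypeScript sketch of Reciprocal Rank Fusion over two ranked lists; the `k` constant and all names are assumptions, not Kodori's code:

```typescript
// Reciprocal Rank Fusion over two ranked lists of document IDs.
// Illustrative sketch only: names and the k constant are assumptions.
type FusedHit = { docId: string; score: number };

function rrfFuse(ftsRanked: string[], vectorRanked: string[], k = 60): FusedHit[] {
  const scores = new Map<string, number>();
  for (const ranked of [ftsRanked, vectorRanked]) {
    ranked.forEach((docId, i) => {
      // Each list contributes 1 / (k + rank); a document ranked well
      // in either list rises to the top of the fused result.
      scores.set(docId, (scores.get(docId) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .map(([docId, score]) => ({ docId, score }))
    .sort((a, b) => b.score - a.score);
}
```

Because each list contributes a rank-based score rather than a raw relevance score, FTS and vector results fuse cleanly even though their native scores are on incomparable scales.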

  2. An AI agent that operates the system, not just answers about it.

    Press ⌘K. Ask "find every doc on the Murphy matter that hasn't been classified yet, suggest the right sensitivity tier, and pin them to the matter collection." The agent uses the same MCP tool catalog the UI uses and passes the same `canReadDocument` gate — there is no agent bypass. Every tool call emits an event with `actorKind: agent`, so "what did the AI do this week?" is a single audit-log filter. Consequential actions (delete, sensitivity change, retention change, hold change, bulk operations over 10 items) require explicit confirmation.
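The no-bypass invariant is easy to picture: one gate and one confirmation rule regardless of caller. A hypothetical sketch (the ACL shape, tool names, and event fields are all illustrative, not Kodori's published API):

```typescript
// One gate for both surfaces: the agent goes through the same
// read check and confirmation rule as the UI. Names illustrative.
type Actor = { id: string; kind: "user" | "agent" };

const CONSEQUENTIAL = new Set(["delete", "setSensitivity", "setRetention", "setHold"]);

function canReadDocument(actor: Actor, docId: string, acl: Map<string, Set<string>>): boolean {
  return acl.get(docId)?.has(actor.id) ?? false;
}

function callTool(
  actor: Actor,
  tool: string,
  docId: string,
  acl: Map<string, Set<string>>,
  confirmed = false,
) {
  if (!canReadDocument(actor, docId, acl)) throw new Error("denied");
  if (CONSEQUENTIAL.has(tool) && !confirmed) throw new Error("explicit confirmation required");
  // Every call emits an event tagged with who acted, human or agent.
  return { actorKind: actor.kind, tool, docId, at: new Date().toISOString() };
}
```

Filtering the resulting event stream on `actorKind === "agent"` is the single-filter answer to "what did the AI do this week?".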

  3. Tamper-evident audit chain, demonstrable in one click.

    Every consequential mutation appends an event to a per-tenant log. Each event's prev_hash is the SHA-256 of the previous event. /audit has a "Verify chain integrity" button that walks the chain end-to-end and confirms every prev_hash matches its predecessor. A pass returns "verified through 2026-05-02T14:23:11Z"; a fail returns a diagnostic pinpointing the first mismatched event. A weekly Inngest cron runs the same verification across every tenant — auditors asking "do you verify even when no one's watching?" get a yes backed by a continuous, timestamped trail.

  4. Governance that holds at the database edge.

    Permission-trimmed retrieval runs inside the SQL query itself — a user without read access on a document never sees a search hit, dashboard count, extracted-text snippet, API response, or agent reply for that document. Deny rules always win over allow rules, and legal hold deny-wins on destruction: a held document refuses deletion, refuses retention disposal, and refuses sensitivity downgrade. The Cedar policy engine layers customer-defined ABAC on top in shadow mode — divergence telemetry accumulates against the TypeScript gates with no risk to live decisions until the soak window elapses.
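The deny-wins composition reduces to a short decision function. A sketch with assumed rule and document shapes (not Kodori's actual policy model):

```typescript
// Deny-wins authorization: a single deny overrides any number of
// allows, and a legal hold blocks destructive actions outright.
// Rule and document shapes are assumptions for illustration.
type Rule = { effect: "allow" | "deny"; principal: string; action: string };
type Doc = { id: string; onHold: boolean };

const DESTRUCTIVE = new Set(["delete", "dispose", "downgradeSensitivity"]);

function authorize(rules: Rule[], principal: string, action: string, doc: Doc): boolean {
  if (doc.onHold && DESTRUCTIVE.has(action)) return false; // hold wins on destruction
  const relevant = rules.filter(r => r.principal === principal && r.action === action);
  if (relevant.some(r => r.effect === "deny")) return false; // deny beats allow
  return relevant.some(r => r.effect === "allow"); // default deny
}
```

In the product the trimming happens inside the SQL query, so an unauthorized row never leaves the database; the decision composes the same way wherever it runs.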

  5. Multi-step workflows the agent plans, not just executes.

    Open /canvas. Type a goal — "find every confidential doc older than one year and either archive it or escalate to a partner." Click Auto-plan. Claude proposes a tree of pending nodes: a hybrid-search tool call to find candidates, a branch evaluating sensitivity tier against age, a human-approve gate before any deletion, and downstream tool calls per branch path. Every node is pending until you click Advance — the LLM cannot cause a side effect, only suggest one. Branch evaluation is a three-operator predicate language (truthy / eq / neq) on dotted JSON paths. Reduce nodes fan in multiple parents (concat / count / first-truthy). Every advance and cascade lands on the per-run hash-chained audit stream, so a post-mortem on a workflow run produces the same evidence as investigating any other tenant action.
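The branch and reduce semantics are small enough to state completely. A sketch under assumed type and function names:

```typescript
// The three-operator branch predicate language on dotted JSON paths,
// plus the three reduce fan-ins. A sketch, not Kodori's runtime.
type Predicate =
  | { op: "truthy"; path: string }
  | { op: "eq"; path: string; value: unknown }
  | { op: "neq"; path: string; value: unknown };

// Resolve a dotted path like "doc.sensitivity" against a JSON value.
const get = (obj: unknown, path: string): unknown =>
  path.split(".").reduce<any>((o, k) => (o == null ? undefined : o[k]), obj);

function evaluate(pred: Predicate, input: unknown): boolean {
  const v = get(input, pred.path);
  switch (pred.op) {
    case "truthy": return Boolean(v);
    case "eq": return v === pred.value;
    case "neq": return v !== pred.value;
  }
}

// Reduce nodes fan in the outputs of multiple parent nodes.
function reduceNode(mode: "concat" | "count" | "first-truthy", parents: unknown[]): unknown {
  switch (mode) {
    case "concat": return parents.flat();
    case "count": return parents.length;
    case "first-truthy": return parents.find(Boolean);
  }
}
```

Keeping the predicate language this small is what makes every branch decision auditable: there is no arbitrary code in the plan, only three operators over paths into prior node outputs.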

Where Kodori wins

Four head-to-heads with the incumbents.

vs iManage Insight, NetDocuments ndMAX

AI-native, not AI-bolted-on

Incumbent AI features were retrofitted on top of a 20-year-old document store. Kodori was designed assuming agents are the primary surface — every operation is a typed MCP tool with a Zod schema, an authorization gate, and an audit event. The agent has nothing the UI doesn't, and the UI has nothing the agent doesn't.
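That tool-per-operation shape can be sketched generically. Zod is the schema layer in the stack described above; the hand-rolled validator here just keeps the sketch dependency-free, and every name is illustrative:

```typescript
// Every operation as a typed tool: schema validation, then an
// authorization gate, then an audit event. One path for UI and agent.
type ToolDef<A> = {
  name: string;
  parse: (raw: unknown) => A; // schema layer (Zod in the real stack)
  authorize: (actorId: string, args: A) => boolean;
  run: (args: A) => unknown;
};

function invoke<A>(tool: ToolDef<A>, actor: { id: string; kind: "user" | "agent" }, raw: unknown) {
  const args = tool.parse(raw); // reject malformed input before anything else
  if (!tool.authorize(actor.id, args)) throw new Error("denied");
  const result = tool.run(args);
  const audit = { actorKind: actor.kind, tool: tool.name, at: new Date().toISOString() };
  return { result, audit };
}
```

Because `invoke` is the only path to `run`, there is structurally no way to add an agent capability that skips validation, authorization, or auditing.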

vs Every legacy DMS

Hash-chained tamper evidence, not row-level audit

A per-tenant SHA-256 chain detects modification of any prior event without replaying history. SOC 2 Type I, 21 CFR Part 11, and FRCP discovery substrates ship today, with conformance docs published at /legal/21-cfr-part-11 and /legal/sec-17a-4.

vs Most incumbents (one-way file links only)

External connectors with content-not-just-metadata indexing

Six vendor connectors are live: Slack, Gmail, Outlook, SharePoint, OneDrive, Google Drive. Messages and file attachments index into the same FTS + pgvector retrieval Kodori uses for native docs. The vendor stays the source of truth; bytes are never mirrored as Kodori documents. A GDPR Article 17 typed-confirmation purge is available on revocation.

vs Quote-based incumbents with surprise add-ons

Honest pricing with explicit caps

Per-seat pricing with documented caps. AI questions, Opus reasoning, storage, and extractions are all metered. Caps are load-bearing: over-cap requests are refused rather than silently degraded, with one explicit fallback (over-cap Opus reasoning drops to Haiku). Kodori's actual cost per medium-usage seat is ~$5-6 against the $30 you pay on Team — comfortable margin, no gouging.

Want the full side-by-side? vs iManage · vs NetDocuments · vs FileHold.

The product is verifiable. Verify it.

Sign in with Google or Microsoft. The free tier is permanent — 1 GB, 100 docs, 50 agent questions a month, every governance feature live. Run the audit-chain verifier yourself.