docs: document LLM extractor adapter usage

2026-05-11 12:19:58 +09:00
parent 4f877a40fb
commit 3e39d3bbd5


@@ -113,6 +113,29 @@ await db.ingestStatement('Bun makes TypeScript tooling fast.', {
});
```
## LLM-backed extraction
You can bridge any text-generating model into IdentityDB by wrapping it with `LlmFactExtractor`.
```ts
import { LlmFactExtractor } from 'identitydb';
const extractor = new LlmFactExtractor({
  model: {
    async generateText(prompt) {
      return callYourFavoriteLlm(prompt);
    },
  },
  instructions: 'Prefer technology, product, and time topics over generic nouns.',
});

await db.ingestStatement('I have worked with Bun and TypeScript since 2025.', {
  extractor,
});
```
The adapter expects the model to return JSON and validates the structured response before IdentityDB writes any fact, so malformed model output is rejected rather than stored.
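To make the contract concrete, here is a minimal, self-contained sketch of the parse-then-validate step described above. The fact schema (`subject`/`topic`/`value`), the `stubModel`, and the `parseFacts` helper are all hypothetical illustrations, not the library's actual internals:

```typescript
// Hypothetical shape of the structured response; the real schema
// used by LlmFactExtractor may differ.
interface ExtractedFact {
  subject: string;
  topic: string;
  value: string;
}

// A stub model that satisfies the generateText contract by returning
// deterministic JSON instead of calling a real LLM.
const stubModel = {
  async generateText(_prompt: string): Promise<string> {
    return JSON.stringify([
      { subject: 'user', topic: 'technology', value: 'Bun' },
      { subject: 'user', topic: 'technology', value: 'TypeScript' },
    ]);
  },
};

// A sketch of the validation pass: parse the JSON and reject any
// entry that is missing one of the expected string fields.
function parseFacts(raw: string): ExtractedFact[] {
  const parsed: unknown = JSON.parse(raw);
  if (!Array.isArray(parsed)) {
    throw new Error('expected a JSON array of facts');
  }
  for (const fact of parsed) {
    for (const key of ['subject', 'topic', 'value'] as const) {
      if (typeof (fact as Record<string, unknown>)[key] !== 'string') {
        throw new Error(`fact is missing string field "${key}"`);
      }
    }
  }
  return parsed as ExtractedFact[];
}

async function demo() {
  const raw = await stubModel.generateText('extract facts from the statement');
  const facts = parseFacts(raw);
  console.log(facts.length);    // 2
  console.log(facts[0].value);  // Bun
}

demo();
```

Because `generateText` only has to return a string of JSON, the same wrapper pattern works for any provider SDK or a local model.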
## Development
```bash