# @motioneffector/llm

A TypeScript client for LLM APIs with OpenRouter support, streaming responses, conversation management, and automatic retries.


## Features

- Chat completions via OpenRouter
- Streaming responses
- Conversation management
- Automatic retries
- Written in TypeScript

Read the full manual →

## Quick Start

```typescript
import { createLLMClient } from '@motioneffector/llm'

// Create a client
const client = createLLMClient({
  apiKey: process.env.OPENROUTER_KEY,
  model: 'anthropic/claude-sonnet-4'
})

// Send a chat completion request
const response = await client.chat([
  { role: 'user', content: 'Explain quantum computing in simple terms' }
])

console.log(response.content)
console.log(`Used ${response.usage.totalTokens} tokens in ${response.latency}ms`)
```
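The feature list mentions automatic retries. As an illustration of the general technique only (the library's internal retry logic is not documented here and may differ), a minimal exponential-backoff wrapper looks like this; `withRetries` is a hypothetical helper, not part of the package's API:

```typescript
// Illustrative sketch: retry an async operation with exponential backoff.
// Delays double on each failed attempt: base, 2*base, 4*base, ...
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 250
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      // Wait before the next attempt (skipped if this was the last one).
      if (attempt < maxAttempts - 1) {
        const delay = baseDelayMs * 2 ** attempt
        await new Promise((resolve) => setTimeout(resolve, delay))
      }
    }
  }
  throw lastError
}
```

With a client in scope you could wrap a call as `withRetries(() => client.chat(messages))`, so transient network or rate-limit errors are retried before surfacing.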

## Testing & Validation

## License

MIT © motioneffector