Chat with Conversation History
Build a chat agent that remembers previous messages using thread state
Use thread state to maintain conversation history across multiple requests. The thread persists for up to 1 hour, making it ideal for chat sessions.
The Pattern
Thread state stores conversation history automatically. Each browser session gets its own thread, and messages persist across requests.
import { createAgent } from '@agentuity/runtime';
import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { s } from '@agentuity/schema';
interface Message {
role: 'user' | 'assistant';
content: string;
}
const agent = createAgent('Chat Agent', {
description: 'Conversational agent with memory',
schema: {
input: s.object({
message: s.string(),
}),
stream: true,
},
handler: async (ctx, input) => {
// Get or initialize conversation history (async)
const messages = (await ctx.thread.state.get<Message[]>('messages')) || [];
// Add user message
messages.push({ role: 'user', content: input.message });
// Generate streaming response
const { textStream, text } = streamText({
model: anthropic('claude-sonnet-4-5'),
system: 'You are a helpful assistant. Be concise but friendly.',
messages,
});
// Save assistant response after streaming completes
ctx.waitUntil(async () => {
const fullResponse = await text;
messages.push({ role: 'assistant', content: fullResponse });
await ctx.thread.state.set('messages', messages);
ctx.logger.info('Conversation updated', {
messageCount: messages.length,
threadId: ctx.thread.id,
});
});
return textStream;
},
});
export default agent;
Route Example
import { createRouter } from '@agentuity/runtime';
import chatAgent from '@agent/chat';
const router = createRouter();
router.post('/chat', chatAgent.validator(), async (c) => {
const { message } = c.req.valid('json');
return chatAgent.run({ message });
});
// Reset conversation
router.delete('/chat', async (c) => {
await c.var.thread.destroy();
return c.json({ reset: true });
});
export default router;
Frontend
A simple chat interface that displays streaming responses:
import { useState, useRef, useEffect } from 'react';
interface Message {
role: 'user' | 'assistant';
content: string;
}
export function App() {
const [messages, setMessages] = useState<Message[]>([]);
const [input, setInput] = useState('');
const [isStreaming, setIsStreaming] = useState(false);
const messagesEndRef = useRef<HTMLDivElement>(null);
useEffect(() => {
messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
}, [messages]);
const sendMessage = async () => {
if (!input.trim() || isStreaming) return;
const userMessage = input.trim();
setInput('');
setMessages((prev) => [...prev, { role: 'user', content: userMessage }]);
setIsStreaming(true);
// Add placeholder for assistant response
setMessages((prev) => [...prev, { role: 'assistant', content: '' }]);
const response = await fetch('/chat', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ message: userMessage }),
});
// Stream the response
const reader = response.body?.getReader();
const decoder = new TextDecoder();
while (reader) {
const { done, value } = await reader.read();
if (done) break;
const chunk = decoder.decode(value, { stream: true });
setMessages((prev) => {
// Replace the last message immutably instead of mutating prev
const updated = [...prev];
const last = updated[updated.length - 1];
updated[updated.length - 1] = { ...last, content: last.content + chunk };
return updated;
});
}
setIsStreaming(false);
};
return (
<div style={{ maxWidth: '600px', margin: '0 auto', padding: '1rem' }}>
<div style={{ height: '400px', overflowY: 'auto', marginBottom: '1rem' }}>
{messages.map((msg, i) => (
<div
key={i}
style={{
padding: '0.75rem',
margin: '0.5rem 0',
borderRadius: '8px',
background: msg.role === 'user' ? '#e3f2fd' : '#f5f5f5',
marginLeft: msg.role === 'user' ? '20%' : '0',
marginRight: msg.role === 'assistant' ? '20%' : '0',
}}
>
{msg.content || '...'}
</div>
))}
<div ref={messagesEndRef} />
</div>
<div style={{ display: 'flex', gap: '0.5rem' }}>
<input
value={input}
onChange={(e) => setInput(e.target.value)}
onKeyDown={(e) => e.key === 'Enter' && sendMessage()}
placeholder="Type a message..."
disabled={isStreaming}
style={{ flex: 1, padding: '0.75rem' }}
/>
<button onClick={sendMessage} disabled={isStreaming || !input.trim()}>
{isStreaming ? '...' : 'Send'}
</button>
</div>
</div>
);
}
The frontend reads the streaming response chunk by chunk and updates the UI in real time.
Key Points
- Thread state (ctx.thread.state) persists for up to 1 hour
- Async API: all thread state methods are async (await ctx.thread.state.get())
- The messages array stores the full conversation history
- waitUntil saves the response after streaming completes
- Thread ID (ctx.thread.id) identifies the conversation
Simpler with push()
For append-only patterns like chat history, use push() with maxRecords for automatic sliding window behavior:
await ctx.thread.state.push('messages', newMessage, 100); // Keeps last 100
See Also
- State Management for all state scopes
- Streaming Responses for streaming patterns
Need Help?
Join our Community for assistance or just to hang with other humans building agents.
Send us an email at hi@agentuity.com if you'd like to get in touch.
If you haven't already, please Signup for your free account now and start building your first agent!