Introduction
Agent is the Signal-native streaming library for Angular 20+, built natively for LangGraph without React translation layers. Build streaming AI applications with Angular Signals, connect to LangGraph agents, and ship production-ready frontends for your AI products.
This guide walks you through the complete workflow: build a LangGraph agent in Python, run it locally, connect it to an Angular app with agent(), and deploy to production.
What is agent()?
agent() is an Angular function that creates a reactive, streaming connection to a LangGraph agent. It returns an object whose properties are Angular Signals, meaning your templates update automatically as the agent streams responses, token by token.
No RxJS. No manual subscriptions. No async pipes. Just Signals that work with Angular's OnPush change detection out of the box.
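To make the mental model concrete, here is a toy signal implementation in plain TypeScript (not the library's or Angular's actual code), showing how a streaming response can update a reactive value token by token:

```typescript
// Toy signal: a readable function with set/subscribe attached.
// This only illustrates the reactive model; real apps use Angular's signal().
type Signal<T> = { (): T; set(value: T): void; subscribe(fn: () => void): void };

function signal<T>(initial: T): Signal<T> {
  let value = initial;
  const subscribers: Array<() => void> = [];
  const read = (() => value) as Signal<T>;
  read.set = (next: T) => {
    value = next;
    // Notify every subscriber, like change detection re-rendering a template.
    subscribers.forEach((fn) => fn());
  };
  read.subscribe = (fn: () => void) => {
    subscribers.push(fn);
  };
  return read;
}

// Simulate token-by-token streaming into a message signal.
const message = signal("");
const rendered: string[] = [];
message.subscribe(() => rendered.push(message()));
for (const token of ["Hel", "lo", "!"]) {
  message.set(message() + token); // each token triggers a re-render
}
// rendered now holds one snapshot per token: "Hel", "Hello", "Hello!"
```

In a real app you would use `signal()` from @angular/core; agent() exposes Signals of that kind, which is why templates update on every streamed token without subscriptions.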
The Architecture
Watch a full conversation turn flow through the stack, from user input to rendered response:
Build Your Agent
LangGraph agents are Python programs defined as directed graphs. Here's a minimal chat agent using the example from this repository:
MessagesState manages a list of messages. The call_model node takes the current messages, adds a system prompt, and calls the LLM. The graph runs this single node and returns the response. LangGraph handles streaming, checkpointing, and thread management automatically.
Run Your Agent Locally
Create a .env file with your API keys:
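For example (the key names below are assumptions; include whichever providers your agent actually calls):

```shell
# .env — never commit this file
OPENAI_API_KEY=sk-...
LANGSMITH_API_KEY=...
```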
Then start the local development server with langgraph dev. Your agent is now running at http://localhost:2024, and you can test it in LangGraph Studio at https://smith.langchain.com/studio/.
Connect with Angular
Now connect your Angular app to the running agent using agent().
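The sketch below shows what a minimal chat component wired up with agent() could look like. Every identifier on the options object and return value here (apiUrl, assistantId, messages, sendMessage) is an assumption for illustration, not confirmed API; consult the agent() reference for the real names:

```
// Hypothetical sketch: option and signal names are assumptions, not the
// library's confirmed API. The agent import path is omitted deliberately.
@Component({
  selector: 'app-chat',
  changeDetection: ChangeDetectionStrategy.OnPush,
  template: `
    @for (msg of chat.messages(); track $index) {
      <p>{{ msg.content }}</p>
    }
    <input #box (keyup.enter)="chat.sendMessage(box.value); box.value = ''" />
  `,
})
export class ChatComponent {
  // Connect to the locally running LangGraph server.
  chat = agent({
    apiUrl: 'http://localhost:2024', // assumed option name
    assistantId: 'agent',            // assumed; matches your graph's name
  });
}
```

Because the returned properties are Signals, the @for block re-renders on every streamed token with OnPush change detection and no subscriptions.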
Open http://localhost:4200 and start chatting with your agent. Messages stream in real time as the LLM generates them.
Key Concepts
Everything agent() gives you out of the box (click any to learn more):
Deploy to Production
When you're ready to go live, deploy your agent to LangGraph Cloud and point your Angular app to the deployment URL.
Your agent code (the Python project with langgraph.json) needs to be in a GitHub repository. Make sure your langgraph.json references the correct graph entry point.
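For reference, a typical langgraph.json looks like this; the module path and graph name below are assumptions, and yours will differ:

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/agent.py:graph"
  },
  "env": ".env"
}
```

Each entry under graphs maps a graph name to a module path and the compiled graph variable inside it, separated by a colon.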
Go to LangSmith Deployments and click + New Deployment. Connect your GitHub account, select your repository, and deploy. The first deployment takes about 15 minutes.
You'll receive a deployment URL like https://my-agent-abc123.langsmith.dev.
Point apiUrl to your deployment URL and set up environment-based configuration:
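A common pattern is Angular's standard environments files. The apiUrl property name here is an assumption about how your app passes the deployment URL into agent():

```typescript
// src/environments/environment.ts (used for production builds)
export const environment = {
  production: true,
  // Deployment URL from LangGraph Cloud (example value)
  apiUrl: 'https://my-agent-abc123.langsmith.dev',
};
```

During development, the corresponding environment.development.ts would point apiUrl at http://localhost:2024 instead.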
Deploy your Angular frontend to any hosting platform: Vercel, Netlify, AWS, or your own infrastructure. Since agent() is a stateless client, your frontend has no server-side state requirements.
Your Angular app is a stateless client. All agent state (threads, checkpoints, memory) lives on LangGraph Platform. This means you can deploy your frontend anywhere (CDN, edge, SSR) without state management concerns. Scale your frontend independently of your agent infrastructure.
What's Next
Detailed 5-minute walkthrough with a complete chat component
Token-by-token updates, stream modes, and status tracking
Thread persistence across sessions and reactive thread switching
Human-in-the-loop approval and confirmation flows
Deterministic testing with MockAgentTransport
Deep dive into how Signals power agent
Graphs, nodes, edges, and state for Angular developers
Complete agent() function reference