Both servers stream responses from POST /kickoff as NDJSON with tool call events.
What You’ll Build
- A LangGraph with assistant + tool executor nodes.
- Two tools: `list_top_posts` (sorted by votes) and `search_launches` (keyword/topic search).
- Streaming runs keyed by `thread_id`, kept in sync via `MemorySaver`.
- A template you can expose over HTTP/SSE to connect with CometChat’s AI Agents.
Prerequisites
- TypeScript: Node.js 18+ (Node 20 recommended); `OPENAI_API_KEY` in `.env` (optional `PRODUCT_OPENAI_MODEL`, default `gpt-4o-mini`).
- Python: Python 3.10+; `OPENAI_API_KEY` in `.env` (optional `MODEL`, default `gpt-4o-mini`).
- CometChat app + AI Agent entry.
Quick links
- Repo root: ai-agent-lang-graph-examples
- TypeScript project: typescript/langgraph-product-hunt-agent (`src/graph.ts`, `src/server.ts`, `.env.example`)
- Python project: python/langgraph_product_hunt_agent (`agent.py`, `server.py`, `.env`)
How it works
- Tools — `list_top_posts` and `search_launches` live in `src/graph.ts`, backed by helpers in `src/data/search.ts`. Both return markdown bullets so the assistant can cite results directly.
- Graph — `StateGraph(MessagesAnnotation)` alternates between the assistant node and the `tools` node based on `shouldCallTools`; tool outputs are fed back as `ToolMessage` objects (see the sketch after this list).
- State — `MemorySaver` checkpoints per `configurable.thread_id` let you run multi-turn conversations on the same graph instance.
- Streaming — `app.stream` emits state snapshots (`streamMode: "values"`). The console demo prints each message as the model calls tools and drafts the response.
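Put together, the wiring looks roughly like the sketch below. This is a minimal reconstruction from the bullets above, not the repo's exact code: the tool import path (`./tools`) and the `runTurn` helper are assumptions, so treat `src/graph.ts` and `src/index.ts` as the source of truth.

```ts
import { StateGraph, MessagesAnnotation, MemorySaver, START, END } from "@langchain/langgraph";
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { listTopPosts, searchLaunches } from "./tools"; // assumed export location

const tools = [listTopPosts, searchLaunches];

const model = new ChatOpenAI({
  model: process.env.PRODUCT_OPENAI_MODEL ?? "gpt-4o-mini",
  temperature: 0,
}).bindTools(tools);

// Assistant node: call the model with the accumulated message history.
async function assistant(state: typeof MessagesAnnotation.State) {
  return { messages: [await model.invoke(state.messages)] };
}

// Route to the tools node while the latest reply still requests tool calls.
function shouldCallTools(state: typeof MessagesAnnotation.State) {
  const last = state.messages[state.messages.length - 1] as AIMessage;
  return last.tool_calls?.length ? "tools" : END;
}

export function buildProductHuntGraph() {
  return new StateGraph(MessagesAnnotation)
    .addNode("assistant", assistant)
    .addNode("tools", new ToolNode(tools))
    .addEdge(START, "assistant")
    .addConditionalEdges("assistant", shouldCallTools)
    .addEdge("tools", "assistant")
    .compile({ checkpointer: new MemorySaver() });
}

// Multi-turn streaming: reuse the same thread_id so MemorySaver restores state.
export async function runTurn(app: ReturnType<typeof buildProductHuntGraph>, text: string) {
  const stream = await app.stream(
    { messages: [new HumanMessage(text)] },
    { configurable: { thread_id: "demo-thread" }, streamMode: "values" },
  );
  for await (const snapshot of stream) {
    const last = snapshot.messages[snapshot.messages.length - 1];
    console.log(last.content);
  }
}
```

Reusing the same `thread_id` on later `runTurn` calls is what lets `MemorySaver` replay earlier turns into the prompt, giving you multi-turn conversations on one graph instance.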
Setup (TypeScript)
1. Install: `cd typescript/langgraph-product-hunt-agent && npm install`
2. Env: copy `../.env.example` to `.env`; set `OPENAI_API_KEY` (optional `PRODUCT_OPENAI_MODEL`).
3. Run demo: `npm run demo` — “Top Product Hunt style launches right now?”
4. Run server: `npm run server` → POST `/kickoff` on `http://localhost:3000`.

Setup (Python)
1. Install: `cd python && python -m venv .venv && source .venv/bin/activate && pip install -r requirements.txt`
2. Env: create `.env` with `OPENAI_API_KEY` (optional `MODEL`).
3. Run server: `python -m langgraph_product_hunt_agent.server` → POST `/kickoff` on `http://localhost:8000`.

Project structure
- TypeScript: Graph `src/graph.ts`, Demo `src/index.ts`, Server `src/server.ts`, Data `src/data`, Config `.env.example` + `package.json`
- Python: Graph `agent.py`, Server `server.py`, Data `data/`, Config `.env` + `requirements.txt`
Step 1 - Understand the agent tools
`buildProductHuntGraph` binds both tools to `ChatOpenAI` (temperature 0 by default). The graph checks each assistant reply for `tool_calls`, routes to the `tools` node to execute them, then loops back until there are none left.
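For a concrete picture, here is a hedged sketch of what one tool definition could look like with the `tool` helper from `@langchain/core/tools`. The `searchPosts` helper and the record fields (`name`, `votes`, `tagline`) are illustrative assumptions; the actual definitions live in `src/graph.ts` and `src/data/search.ts`.

```ts
import { tool } from "@langchain/core/tools";
import { z } from "zod";
import { searchPosts } from "./data/search"; // assumed helper name

// search_launches: keyword/topic search over the launch dataset.
export const searchLaunches = tool(
  async ({ query }) => {
    const hits = searchPosts(query); // assumed to return launch records
    // Return markdown bullets so the assistant can cite results directly.
    return hits
      .map((p) => `- **${p.name}** (${p.votes} votes): ${p.tagline}`)
      .join("\n");
  },
  {
    name: "search_launches",
    description: "Search Product Hunt style launches by keyword or topic.",
    schema: z.object({
      query: z.string().describe("Keyword or topic to search for"),
    }),
  },
);
```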
Streaming API (HTTP)
Event order (TypeScript and Python servers): `text_start` → `text_delta` chunks → `tool_call_start` → `tool_call_args` → `tool_call_end` → `tool_result` → `text_end` → `done` (`error` on failure). Each event includes `message_id`; echo `thread_id`/`run_id` from the client if you want threading.
Example requests:
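A minimal TypeScript client is sketched below. The JSON body (`message` plus optional `thread_id`) and the `type`/`text` fields on each event are assumptions based on the event list above; check `src/server.ts` (or `server.py`) for the exact request and event schema.

```ts
// Minimal NDJSON streaming client for POST /kickoff (Node 18+ global fetch).
async function kickoff(message: string): Promise<void> {
  const res = await fetch("http://localhost:3000/kickoff", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message, thread_id: "demo-thread" }),
  });
  if (!res.ok || !res.body) throw new Error(`kickoff failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep a partial trailing line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const event = JSON.parse(line); // one JSON event per NDJSON line
      if (event.type === "text_delta") process.stdout.write(event.text ?? "");
      if (event.type === "done") console.log("\n[run complete]");
    }
  }
}

kickoff("Top Product Hunt style launches right now?").catch(console.error);
```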