What's Included
Prax includes all the components required to build, test, and deploy LLM-based applications. The platform is modular and can be extended with custom code when needed.
Flow Builder
Visual interface for chaining nodes
Supports conditional routing, looping (planned), and parallel branches
Save and version flows for reuse across environments
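A saved flow is essentially a graph of nodes with versioned metadata. The sketch below is illustrative only: the schema, node kinds, and the `version` field are assumptions for the example, not Prax's actual file format.

```typescript
// Hypothetical versioned flow definition. Schema and field names are
// assumptions for illustration, not Prax's real format.
type NodeKind = "llm" | "tool" | "if" | "output";

interface FlowNode {
  id: string;
  kind: NodeKind;
  // Edges to downstream nodes. Two edges on an "if" node model a
  // conditional branch; two edges on any other node model parallel branches.
  next: string[];
}

interface Flow {
  name: string;
  version: number; // bumped each time the flow is saved for reuse
  nodes: FlowNode[];
}

const supportFlow: Flow = {
  name: "support-triage",
  version: 3,
  nodes: [
    { id: "classify", kind: "llm", next: ["route"] },
    { id: "route", kind: "if", next: ["search", "answer"] }, // conditional routing
    { id: "search", kind: "tool", next: ["answer"] },
    { id: "answer", kind: "llm", next: ["done"] },
    { id: "done", kind: "output", next: [] },
  ],
};

// Sanity check: every edge should point at a declared node.
const ids = new Set(supportFlow.nodes.map((n) => n.id));
const dangling = supportFlow.nodes
  .flatMap((n) => n.next)
  .filter((id) => !ids.has(id));
console.log(dangling); // → []
```

Versioning a structure like this per environment is what lets the same flow be promoted from development to production without edits.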
Node Library
LLM nodes: OpenAI, Azure OpenAI, Hugging Face, Cohere
Tool nodes: Calculator, web search, scraping, file reader
Data nodes: JSON parser, transformers, input/output adapters
Memory nodes: Short-term memory, vector stores (Pinecone, Supabase, Redis)
Control nodes: If/else branching, wait/sleep, error handling
API nodes: External request, webhook response
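Conceptually, chaining nodes means threading each node's output into the next node's input. The executor below is purely hypothetical (Prax wires nodes visually rather than in code); the node names are stand-ins for library nodes.

```typescript
// Minimal sketch of sequential node execution, assuming each node is a
// function from input to output. Names and the executor are hypothetical.
type Node = (input: string) => string;

// Stand-ins for library nodes: a transformer, a tool, an output adapter.
const trim: Node = (s) => s.trim();
const upper: Node = (s) => s.toUpperCase();
const wrap: Node = (s) => JSON.stringify({ result: s });

// Run nodes left to right, threading each output into the next input.
function runChain(nodes: Node[], input: string): string {
  return nodes.reduce((acc, node) => node(acc), input);
}

console.log(runChain([trim, upper, wrap], "  hello  ")); // → {"result":"HELLO"}
```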
Chat Interface
Console for testing any flow with natural language input
Shows step-by-step node output and intermediate data
Useful for debugging, verifying flow logic, and prompt iteration
REST API
Each flow is exposed at a unique endpoint
Accepts POST requests with custom payloads
Responses are returned in structured JSON
Can be secured with tokens or integrated into external services
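A call to a deployed flow can be sketched as below. The base URL, the `/api/flows/:id` path, and the bearer-token header are assumptions for the example; check your deployment for the actual endpoint and auth scheme.

```typescript
// Sketch of calling a deployed flow over REST. URL shape and header
// names are assumptions, not Prax's documented API.
interface FlowRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

function buildFlowRequest(
  flowId: string,
  payload: object,
  token: string
): FlowRequest {
  return {
    url: `https://prax.example.com/api/flows/${flowId}`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${token}`, // optional token-based security
      },
      body: JSON.stringify(payload),
    },
  };
}

const req = buildFlowRequest(
  "support-triage",
  { question: "How do I reset my password?" },
  "my-token"
);
// The request would be sent with fetch(req.url, req.init); the response
// body is structured JSON describing the flow's output.
console.log(req.init.method); // → POST
```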
Deployment Options
Run locally with Node.js
Docker container for consistent deployment
Support for Vercel, Railway, or custom cloud infrastructure
Environment variable management for secrets and API keys
Optional authentication for user-level flow access
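Environment-variable management usually comes down to failing fast when a secret is missing. A minimal sketch, assuming Node.js; the variable name `OPENAI_API_KEY` is just an example of the kind of secret a flow might need.

```typescript
// Read a required secret from the environment, throwing if it is absent
// so misconfiguration surfaces at startup rather than mid-flow.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

process.env.OPENAI_API_KEY = "sk-demo"; // normally set by the shell or Docker
console.log(requireEnv("OPENAI_API_KEY")); // → sk-demo
```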