B3OS

Tag: llm

2 integrations tagged with "llm"

AI Prompt (Workflow Action)

Prompt an AI model with custom instructions and optional context data via OpenRouter (GPT-4, Claude, Llama, etc.). Supports structured JSON output with schema enforcement (works best with OpenAI models). Pass in data from previous workflow steps as context. Ideal for AI-driven decisions, data analysis, content generation, and automated reasoning in workflows.

8 workflows · 463 runs · 98% success
Tags: ai, llm, chat (+5 more)
Type: ai-llm-chat
Fields: 8 (1 required)
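The AI Prompt action described above maps onto an OpenAI-compatible chat-completions request. A minimal sketch of building such a payload, assuming OpenRouter's OpenAI-compatible endpoint and its `response_format` JSON-schema option; the schema itself and the `build_prompt_payload` helper are illustrative, not part of the integration:

```python
import json

# OpenRouter's OpenAI-compatible chat-completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_prompt_payload(instructions: str, context: dict,
                         model: str = "openai/gpt-4o") -> dict:
    """Build a chat-completions payload with schema-enforced JSON output.

    `context` stands in for data passed from previous workflow steps;
    the schema below is an illustrative example.
    """
    schema = {
        "name": "decision",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "answer": {"type": "string"},
                "confidence": {"type": "number"},
            },
            "required": ["answer", "confidence"],
            "additionalProperties": False,
        },
    }
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": instructions},
            # Context from earlier workflow steps, serialized into the user turn.
            {"role": "user", "content": json.dumps(context)},
        ],
        "response_format": {"type": "json_schema", "json_schema": schema},
    }

payload = build_prompt_payload("Classify the ticket priority.",
                               {"ticket": "Server is down"})
# Send with e.g. requests.post(OPENROUTER_URL, json=payload,
#                              headers={"Authorization": f"Bearer {API_KEY}"})
```

Enforcing a schema in `response_format` lets downstream workflow steps consume the model's reply as structured data instead of free text.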
Chat with NanoChat LLM (d32 Model) (Action)

Stream real-time responses from NanoChat, a 32-layer transformer language model built by Andrej Karpathy in 2025. Access this lightweight LLM via the NanoBrain platform for conversational AI tasks with streaming token-by-token output.

Input: an empty request body (no parameters required) initiates a chat session.
Output: streaming server-sent events (SSE) with token-by-token text generation from the NanoChat d32 model, including model identification and completion markers.
Use cases: chatting with a lightweight language model for quick responses; streaming AI responses for conversational tasks; integrating a compact LLM into an application; generating text with the NanoChat d32 model.
Cost: 0.01 USDC on Base.

Tags: x402, payment-gated, ai (+7 more)
Type: x402ep_d5s9k9smd1grabgljr70
Fields: 1 (1 required)
Status: unavailable
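The card above says the endpoint returns server-sent events carrying token-by-token text. A minimal sketch of parsing such a stream; the `data:` framing follows the SSE specification, while the `[DONE]` terminator and per-event JSON shape are assumptions (common in LLM streaming APIs, not documented here):

```python
import json

def parse_sse_stream(raw: str) -> list:
    """Extract data payloads from a server-sent-events stream.

    Per the SSE spec, event payloads arrive on lines prefixed with
    "data:". The "[DONE]" sentinel is an assumed end-of-generation
    marker, as used by several LLM streaming APIs.
    """
    events = []
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank separators, comments, and event names
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        try:
            events.append(json.loads(payload))
        except json.JSONDecodeError:
            events.append(payload)  # fall back to raw text tokens
    return events

sample = 'data: {"token": "Hello"}\n\ndata: {"token": " world"}\n\ndata: [DONE]\n'
events = parse_sse_stream(sample)
# events -> [{"token": "Hello"}, {"token": " world"}]
```

In practice the response body would be read incrementally (e.g. iterating over lines of an HTTP stream) rather than parsed from a complete string, but the framing logic is the same.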