LLM Completion Request
Function: LLM Completion Request
This action allows you to send a question or instruction (a "prompt") to an Artificial Intelligence (AI) model and receive a generated response. It's like having a smart assistant that can understand your requests and provide text-based answers, summaries, or even structured data. You can guide the AI with specific instructions and even provide files for it to analyze.
Input
- User prompt (Text, Required): This is your main question or instruction for the AI. Be clear and specific to get the best results.
- User prompt placeholders (Key-Value Pairs, Optional): If your "User prompt" contains dynamic information (e.g., {{PRODUCT_NAME}}), you can define these placeholders here. The AI will see the prompt with these values filled in; a short illustration follows this input list.
  - Example: If your prompt is "Summarize the benefits of {{PRODUCT_NAME}}", you could set PRODUCT_NAME to "NoCode-X Platform".
- System prompt (Text, Required): This sets the overall behavior and tone for the AI. Use it to give the AI a role (e.g., "You are a helpful marketing assistant") or specific guidelines (e.g., "Always respond in a friendly tone").
- System prompt placeholders (Key-Value Pairs, Optional): Similar to user prompt placeholders, these replace dynamic parts within your "System prompt".
- Model (Dropdown, Required): Choose which AI model you want to use. Different models have varying capabilities, costs, and performance.
- Available Models: gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.5-preview, o3-mini, o1, o1-mini, gpt-4o, gpt-4o-mini, gpt-4-turbo, gpt-3.5-turbo, claude-sonnet-4-5-20250929, claude-sonnet-4-20250514, claude-haiku-4-5-20251001, claude-opus-4-1-20250805, claude-opus-4-20250514, claude-3-7-sonnet-latest, claude-3-5-sonnet-latest, claude-3-5-haiku-latest, claude-3-haiku-20240307, claude-3-opus-latest, mistral-small, mistral-medium, mistral-large-latest, open-mistral-7b, open-mixtral-8x7b, open-mixtral-8x22b, open-mistral-nemo, meta-llama/llama-4-scout-17b-16e-instruct, meta-llama/llama-4-maverick-17b-128e-instruct, llama-3.3-70b-versatile, llama-3.2-1b-preview, llama-3.2-3b-preview, llama-3.1-8b-instant, deepseek-r1-distill-llama-70b, gemma2-9b-it, qwen-2.5-32b, gemini-2.5-flash, gemini-2.5-pro, gemini-2.5-flash-lite.
- Files (List of Files, Optional): You can upload one or more files for the AI to analyze or reference in its response. This option is only available for specific models that support file input (e.g., certain Claude, GPT, Llama, Mistral, and Gemini models).
- API token (Password, Optional): If you have your own API key for an AI service, you can provide it here. This allows you to use your own credits instead of the platform's AI credits.
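Placeholders are plain text substitutions. The sketch below is not NoCode-X's internal code, just a rough Python illustration of the idea, so you can see exactly what the AI ends up receiving when a placeholder is filled in.

```python
# Rough illustration of {{PLACEHOLDER}} substitution (assumed behaviour,
# not the platform's actual implementation).
def fill_placeholders(prompt: str, placeholders: dict[str, str]) -> str:
    """Replace every {{KEY}} in the prompt with its configured value."""
    for key, value in placeholders.items():
        prompt = prompt.replace("{{" + key + "}}", value)
    return prompt

user_prompt = "Summarize the benefits of {{PRODUCT_NAME}}"
print(fill_placeholders(user_prompt, {"PRODUCT_NAME": "NoCode-X Platform"}))
# -> Summarize the benefits of NoCode-X Platform
```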
Output
- Result (Variable): This is the variable where the AI's generated response will be stored. By default, it's named "RESULT", but you can change it.
- Response format (Data Format): Use this to define a specific structure (like a JSON schema) for the AI's response. If you leave this blank, the AI will return plain text. If specified, the AI will attempt to format its output according to your defined structure, and the Result variable will contain structured data (e.g., a list of key-value pairs).
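To make the difference concrete: without a Response format the Result is one block of text, while with a Response format it behaves like a parsed object whose fields you can read individually. The snippet below only illustrates those two shapes (the variable names are invented for this sketch); see Example 2 for a full schema.

```python
# Illustration of the two possible shapes of the Result variable.
# The names plain_result / structured_result are made up for this sketch.

# No Response format set: Result is a single text string.
plain_result = "Our Smart Home Hub makes everyday life simpler..."

# With a Response format (JSON schema): Result is structured data,
# so individual fields can be accessed directly.
structured_result = {
    "sentiment": "negative",
    "product_features_mentioned": ["interface", "customer support"],
}
print(structured_result["sentiment"])  # -> negative
```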
Real-Life Examples
Example 1: Drafting a Marketing Email
Scenario: You need to quickly draft a marketing email for a new product launch.
Inputs:
- User prompt: "Draft a short, engaging marketing email announcing the launch of our new 'Smart Home Hub'. Highlight its ease of use, security features, and compatibility with other devices."
- User prompt placeholders: (None)
- System prompt: "You are a professional marketing copywriter. Your goal is to create compelling and concise email content."
- System prompt placeholders: (None)
- Model: gpt-4o-mini
- Files: (None)
- API token: (Leave blank to use platform credits)
- Response format: (Leave blank for plain text)
Result: The RESULT variable will contain a text string with a draft marketing email, ready for review and sending.
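If you are curious what this action does behind the scenes (for example, when supplying your own API token), it is roughly equivalent to the direct API call sketched below using the OpenAI Python SDK. This is an illustration, not the platform's internal code, and your key would normally stay in the API token field rather than in code.

```python
# Sketch of the roughly equivalent direct call for Example 1, using the
# OpenAI Python SDK (illustration only; not NoCode-X internals).
from openai import OpenAI

client = OpenAI(api_key="YOUR_OWN_API_KEY")  # corresponds to the optional API token input

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a professional marketing copywriter. "
                                      "Your goal is to create compelling and concise email content."},
        {"role": "user", "content": "Draft a short, engaging marketing email announcing the launch "
                                    "of our new 'Smart Home Hub'. Highlight its ease of use, security "
                                    "features, and compatibility with other devices."},
    ],
)
print(response.choices[0].message.content)  # this text is what lands in RESULT
```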
Example 2: Extracting Customer Feedback into a Structured Format
Scenario: You have collected raw customer feedback from a survey and want to extract key sentiment and product mentions into a structured format for analysis.
Inputs:
- User prompt: "Analyze the following customer feedback and extract the overall sentiment (positive, negative, neutral) and any mentioned product features. If no product features are mentioned, leave that field empty. Feedback: 'The new update is terrible, it crashes constantly and the interface is confusing. I miss the old version. However, the customer support was very helpful in trying to resolve my issues.'"
- User prompt placeholders: (None)
- System prompt: "You are a data extraction assistant. Your task is to accurately parse text into the specified JSON format."
- System prompt placeholders: (None)
- Model: claude-3-5-sonnet-latest
- Files: (None)
- API token: (Leave blank)
- Response format:
  {
    "type": "object",
    "properties": {
      "sentiment": {
        "type": "string",
        "enum": ["positive", "negative", "neutral"]
      },
      "product_features_mentioned": {
        "type": "array",
        "items": {
          "type": "string"
        }
      }
    },
    "required": ["sentiment"]
  }
Result: The RESULT variable will contain a structured object (or map) like this:
  {
    "sentiment": "negative",
    "product_features_mentioned": ["interface", "customer support"]
  }
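The AI attempts to follow your schema, and you can verify the Result against it. The snippet below shows one way to check that the Result from this example matches the Response format, using the third-party jsonschema package purely for illustration (the platform performs the formatting for you).

```python
# Checking Example 2's Result against its Response format schema.
# Uses the third-party "jsonschema" package for illustration only;
# inside NoCode-X you would simply read the fields from RESULT.
from jsonschema import validate

schema = {
    "type": "object",
    "properties": {
        "sentiment": {"type": "string", "enum": ["positive", "negative", "neutral"]},
        "product_features_mentioned": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["sentiment"],
}

result = {
    "sentiment": "negative",
    "product_features_mentioned": ["interface", "customer support"],
}

validate(instance=result, schema=schema)  # raises a ValidationError if the shapes don't match
print(result["sentiment"])  # -> negative
```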
Example 3: Summarizing a Document with File Input
Scenario: You have a PDF document (e.g., a project proposal) and need a quick summary of its main points.
Inputs:
- User prompt: "Provide a concise summary of the attached document, highlighting the main objectives, proposed timeline, and key deliverables."
- User prompt placeholders: (None)
- System prompt: "You are an expert project manager. Summarize documents clearly and professionally."
- System prompt placeholders: (None)
- Model: gpt-4o (or any other model that supports file input)
- Files: Upload your project_proposal.pdf file.
- API token: (Leave blank)
- Response format: (Leave blank for plain text)
Result: The RESULT variable will contain a text summary of the project_proposal.pdf, outlining its objectives, timeline, and deliverables.
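If you are wondering what "Files" means in practice, the file handling happens on the AI provider's side. The sketch below shows a roughly equivalent direct call with the Anthropic Python SDK and a Claude model from the list above (chosen only because its PDF input is easy to illustrate); the exact request format differs per provider and SDK version, so treat this as an assumption-laden illustration rather than the platform's internal code.

```python
# Illustration of file input at the provider level, using the Anthropic
# Python SDK and a Claude model from the list above. This is NOT the
# platform's internal code; request formats vary per provider and SDK version.
import base64
import anthropic

client = anthropic.Anthropic(api_key="YOUR_OWN_API_KEY")

with open("project_proposal.pdf", "rb") as f:
    pdf_data = base64.standard_b64encode(f.read()).decode("utf-8")

message = client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=1024,
    system="You are an expert project manager. Summarize documents clearly and professionally.",
    messages=[{
        "role": "user",
        "content": [
            {"type": "document",
             "source": {"type": "base64", "media_type": "application/pdf", "data": pdf_data}},
            {"type": "text",
             "text": "Provide a concise summary of the attached document, highlighting the main "
                     "objectives, proposed timeline, and key deliverables."},
        ],
    }],
)
print(message.content[0].text)  # this summary is what lands in RESULT
```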