AI Agent With Workflows Template

```python
import os
from flask import render_template, request, jsonify
from gunicorn.app.base import BaseApplication
from abilities import llm
from app_init import app, db
from models import ConversationMessage


@app.route("/")
def home_route():
    return render_template("home.html")

@app.route("/chat", methods=["POST"])
def chat_route():
    from agent import AIAgent

    user_input = request.json.get("message", "")

    # Create an AI agent to handle the request
    agent = AIAgent()

    # Process the user input through the agent
    agent_response = agent.handle_request(user_input)

    # Return the agent's response to the chat UI as JSON
    return jsonify(agent_response)
```

Frequently Asked Questions

Enables 24/7 operation without additional staffing costs. Companies implementing this template typically see a 30-50% reduction in customer service operational costs within the first year.

Q3: What sets this template apart from standard chatbot solutions in terms of business value?

A: The AI Agent With Workflows Template distinguishes itself through:

  • Intelligent workflow routing based on conversation context
  • Persistent memory management for coherent long-term interactions
  • Dynamic adaptation to different business processes
  • Integration capabilities with existing business systems
  • Scalable architecture that grows with business needs
  • Detailed analytics and conversation tracking

Q4: How can I extend the template to add custom API integrations? Can you provide a code example?

A: The AI Agent With Workflows Template can be extended by creating a new workflow class that implements API integrations. Here's an example:

```python
import os

import requests

from workflows.base_workflow import BaseWorkflow


class APIIntegrationWorkflow(BaseWorkflow):
    description = "Workflow for handling external API integrations"

    def __init__(self):
        super().__init__()
        self.api_key = os.getenv('API_KEY')
        self.base_url = 'https://api.example.com'

    def next_step(self, user_input, think_output=None):
        try:
            # Call the external API with the user's input as the query
            response = requests.get(
                f"{self.base_url}/endpoint",
                headers={'Authorization': f'Bearer {self.api_key}'},
                params={'query': user_input},
                timeout=10
            )
            response.raise_for_status()

            return {
                "response": self._format_api_response(response.json()),
                "workflow_step": "api_integration",
                "workflow_objective": "External API Integration"
            }
        except Exception as e:
            self.logger.error(f"API Integration error: {e}")
            return {"response": "Error processing API request"}
```

Q5: How can I implement custom memory management in the template? Please provide an example.

A: The template's memory management can be customized by extending the ConversationMessage model and implementing custom retrieval methods. Here's an example:

```python
from sqlalchemy import Column, JSON

from models import db, ConversationMessage


class EnhancedConversationMessage(ConversationMessage):
    # 'metadata' is reserved by SQLAlchemy's declarative base, so the
    # JSON column gets a distinct attribute name here
    message_metadata = Column(JSON)

    @classmethod
    def get_contextual_memory(cls, context_type, limit=5):
        # Note: .astext requires a backend with JSON operator support
        # (e.g. PostgreSQL); plain SQLite needs a different filter
        return cls.query.filter(
            cls.message_metadata['context_type'].astext == context_type
        ).order_by(
            cls.timestamp.desc()
        ).limit(limit).all()


# Usage in a workflow
def retrieve_context(self, context_type):
    relevant_messages = EnhancedConversationMessage.get_contextual_memory(
        context_type=context_type
    )
    return self._process_memory(relevant_messages)
```

This implementation allows for more sophisticated memory management with metadata tagging and contextual retrieval.


Flask-based AI agent template with modular design, LLM workflows, web UI, task execution, memory management, and API integrations.

AI Agent With Workflows Template Guide

This template provides a sophisticated AI agent system with modular workflows, a web-based chat interface, and conversation memory management. The agent can analyze user input, select appropriate workflows, and generate contextual responses.
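
The analyze-select-respond loop described above can be sketched as follows. This is a minimal illustration, not the template's actual implementation: the `BaseWorkflow`, `SmallTalkWorkflow`, and `AIAgent` bodies here are assumptions based on the names that appear in the code snippets on this page.

```python
# Minimal sketch of an agent that routes user input to a workflow.
class BaseWorkflow:
    description = "Base workflow"

    def next_step(self, user_input):
        raise NotImplementedError


class SmallTalkWorkflow(BaseWorkflow):
    description = "Handles greetings and casual conversation"

    def next_step(self, user_input):
        return {"response": "Hello! How can I help you today?"}


class AIAgent:
    def __init__(self, workflows=None):
        # The real template discovers workflows automatically; here they
        # are registered explicitly for clarity.
        self.workflows = workflows or [SmallTalkWorkflow()]

    def select_workflow(self, user_input):
        # A production agent would ask the LLM to choose a workflow by
        # comparing user_input against each workflow's description; this
        # stub simply returns the first registered workflow.
        return self.workflows[0]

    def handle_request(self, user_input):
        workflow = self.select_workflow(user_input)
        return workflow.next_step(user_input)


agent = AIAgent()
print(agent.handle_request("hi")["response"])
```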

Getting Started

  • Click "Start with this Template" in the Lazy Builder interface to begin

Testing the Application

  • Click the "Test" button in the Lazy Builder interface
  • Lazy will deploy the application and provide you with a server link to access the chat interface

Using the Chat Interface

The web interface provides several key features:

  • A clean, modern chat interface for interacting with the AI agent
  • Real-time message updates
  • Thought process visibility for understanding the agent's decision-making
  • Chat history management with a clear chat option

To use the interface:

  • Navigate to the provided server link
  • Type your message in the input field
  • Press "Send" or hit Enter to submit
  • View the AI's response, including its thought process
  • Use the "Clear Chat History" button to reset the conversation
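
Under the hood, the "Send" button posts JSON to the `/chat` route shown at the top of this page. If you prefer to drive the endpoint programmatically, the request can be built like this (the server URL is a placeholder; substitute the link Lazy gives you after deployment):

```python
import json
import urllib.request

BASE_URL = "https://your-app.example.com"  # placeholder deployment URL

# The /chat route expects a JSON body with a "message" field
payload = json.dumps({"message": "What can you help me with?"}).encode()

req = urllib.request.Request(
    f"{BASE_URL}/chat",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# With a live deployment, send the request and read the JSON reply:
# reply = json.load(urllib.request.urlopen(req))
```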

Customizing Workflows

The template supports custom workflow creation for specific use cases. To create a new workflow:

  1. Create a new file in the workflows directory
  2. Inherit from BaseWorkflow
  3. Implement the required methods

Example workflow structure:

```python
from workflows.base_workflow import BaseWorkflow


class CustomWorkflow(BaseWorkflow):
    description = "Description of what this workflow handles"

    def __init__(self):
        super().__init__()
        self.steps = ["step1", "step2", "step3"]
        self.current_step = 0

    def next_step(self, user_input):
        # Implement your workflow logic here
        pass
```

The agent will automatically discover and incorporate new workflows into its decision-making process.
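
One common way to implement that automatic discovery is to collect every subclass of `BaseWorkflow` that has been imported. The sketch below shows the idea in self-contained form; the template's actual discovery code may instead scan the `workflows` directory, and the workflow names here are made up for illustration:

```python
class BaseWorkflow:
    description = "Base workflow"


class GreetingWorkflow(BaseWorkflow):
    description = "Handles greetings"


class OrderStatusWorkflow(BaseWorkflow):
    description = "Checks the status of an order"


def discover_workflows():
    # Python tracks subclasses automatically, so every imported workflow
    # class is visible via __subclasses__() without a manual registry.
    return {cls.__name__: cls() for cls in BaseWorkflow.__subclasses__()}


registry = discover_workflows()
print(sorted(registry))  # ['GreetingWorkflow', 'OrderStatusWorkflow']
```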

Key Features

  • Modular workflow system for handling different types of interactions
  • Conversation memory with SQLite database storage
  • Dynamic workflow selection based on user input analysis
  • Thought process transparency
  • Clear chat history functionality
  • Responsive web interface
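
The conversation-memory feature boils down to persisting each message as a row and reading recent rows back when building context. Here is a stripped-down stdlib `sqlite3` sketch of that idea; the template itself goes through the `ConversationMessage` SQLAlchemy model, and the column names below are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # the template uses a file-backed SQLite DB
conn.execute("""
    CREATE TABLE conversation_message (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        role TEXT NOT NULL,       -- 'user' or 'assistant'
        content TEXT NOT NULL,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

# Record one exchange
conn.execute("INSERT INTO conversation_message (role, content) VALUES (?, ?)",
             ("user", "Hello"))
conn.execute("INSERT INTO conversation_message (role, content) VALUES (?, ?)",
             ("assistant", "Hi! How can I help?"))

# Fetch recent history, newest first, to feed back into the agent's prompt
rows = conn.execute(
    "SELECT role, content FROM conversation_message ORDER BY id DESC LIMIT 5"
).fetchall()
print(rows[0])
```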

The template provides a foundation for building sophisticated conversational AI applications with structured workflow management and a polished user interface.



Template Benefits

  1. Intelligent Customer Service Automation
     • Provides a ready-to-deploy AI customer service solution
     • Maintains conversation context and history for personalized responses
     • Reduces customer service costs while providing 24/7 availability
     • Handles multiple conversation types through dynamic workflow selection

  2. Enterprise Process Automation
     • Modular workflow system enables automation of complex business processes
     • Easily extendable for specific business use cases and requirements
     • Maintains audit trails and conversation logs for compliance
     • Integrates with existing business systems through API capabilities

  3. Knowledge Management & Support
     • Creates an intelligent knowledge base interface for employees
     • Provides consistent and accurate information across the organization
     • Reduces training time and support ticket volume
     • Enables self-service support with contextual understanding

  4. Sales & Lead Qualification
     • Engages potential customers through intelligent conversation flows
     • Qualifies leads based on customizable criteria
     • Maintains detailed interaction history for sales follow-up
     • Scales sales operations without proportional staff increases

  5. Data-Driven Decision Support
     • Provides intelligent analysis of user queries and requests
     • Maintains conversation history for pattern analysis
     • Enables data-backed decision making through structured workflows
     • Offers insights into user needs and common request patterns

Technologies

Optimize Your Django Web Development with CMS and Web App
Flask Templates from Lazy AI – Boost Web App Development with Bootstrap, HTML, and Free Python Flask
Streamline JavaScript Workflows with Lazy AI: Automate Development, Debugging, API Integration and More
FastAPI Templates and Webhooks
Python App Templates for Scraping, Machine Learning, Data Science and More

Similar templates

FastAPI endpoint for Text Classification using OpenAI GPT 4

This API classifies incoming text items into categories using OpenAI's GPT-4 model. The categories are parameters that the endpoint accepts, and the model classifies each item with a prompt like: "Classify the following item {item} into one of these categories {categories}". If the model is unsure about an item's category, it responds with an empty string. The model's temperature is set to a minimal value to make the output more deterministic.

Classification of text items is parallelized with Python's concurrent.futures module; timeouts and exceptions are handled by leaving the affected items unclassified. In single-category classification, the API takes the LLM's response as-is and does not handle cases where the model identifies multiple categories for an item. In multiple-category classification, there is no maximum number of categories an item can belong to, all matching categories are returned, and an unsure model responds with an empty string for that item.

To match the LLM's response against the category list supplied in the API parameters, the API lowercases both the response and the categories, splits the response on ':' and ',' to remove the "Category" word, and strips leading and trailing whitespace from each category before matching. The API also accepts lists as answers from the LLM: if the response is a string formatted like a list, it is parsed and matched against the provided categories.
