Context Aware Chat API

import logging

from fastapi import FastAPI
from fastapi.responses import RedirectResponse

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

app = FastAPI()


@app.get("/", include_in_schema=False)
def root():
    return RedirectResponse(url="/docs")


from chat_api import router as chat_router

app.include_router(chat_router, prefix="/api")


# Do not remove the main function while updating the app.
if __name__ == "__main__":
    import uvicorn

    # Host and port here are illustrative; the Lazy platform supplies its own runtime values.
    uvicorn.run(app, host="0.0.0.0", port=8080)


A simple chat API for user interaction with a large language model.

Introduction to the Context Aware Chat API Template

Welcome to the Context Aware Chat API Template! This template is designed to help you build a simple chat API that interacts with a large language model (LLM) to provide context-aware responses. Ideal for customer support or interactive applications, this API keeps track of the conversation history to maintain context and deliver more relevant replies.

Getting Started

To begin using this template, simply click on "Start with this Template" on the Lazy platform. This will pre-populate the code in the Lazy Builder interface, so you won't need to copy, paste, or delete any code manually.

Test: Deploying the App

Once you have started with the template, press the "Test" button to deploy the app. The Lazy platform handles all aspects of deployment, so you don't need to install libraries or set up your environment. Deployment launches the Lazy CLI, which will prompt you for any required user input after you press the Test button.

Using the App

After deployment, Lazy will provide you with a dedicated server link to use the API. Since this template uses FastAPI, you will also receive a link to the API documentation. You can interact with the API by sending POST requests to the "/api/chat" endpoint with a JSON payload containing your message. Here's a sample request you might send:

{
  "message": "Hello, can you help me with my issue?"
}

And here's an example of the response you might receive:

{
  "session_id": "a unique session identifier",
  "response": "Of course, I'm here to help. What seems to be the problem?"
}

The API maintains a history of the conversation to ensure that context is preserved throughout the interaction.
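As a rough sketch, the request/response flow above could be exercised from Python like this. Note the assumptions: the `build_payload` and `send_message` helpers are illustrative names, and passing `session_id` back in the request body to continue a conversation is inferred from the response shape rather than confirmed by the source, so check the generated API docs for the exact schema.

```python
import json
import urllib.request


def build_payload(message, session_id=None):
    """Build the JSON body for POST /api/chat.

    Including session_id to continue an earlier conversation is an
    assumption based on the response format shown above.
    """
    body = {"message": message}
    if session_id is not None:
        body["session_id"] = session_id
    return body


def send_message(base_url, message, session_id=None):
    """POST a chat message and return the decoded JSON response."""
    data = json.dumps(build_payload(message, session_id)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

In use, you would call `send_message("https://your-lazy-server-link", "Hello")` for the first turn, then pass the returned `session_id` into subsequent calls to keep the conversation context.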

Integrating the App

If you wish to integrate this chat API into an existing service or frontend, you will need to send HTTP requests to the server link provided by Lazy. You can use the API endpoints to send and receive messages programmatically from your application.

For example, if you're building a web application, you can use JavaScript to make AJAX calls to the chat API, sending user messages and displaying the API's responses in your application's UI.

Remember, no additional setup is required for the 'abilities' module, as it is a built-in module within the Lazy platform.

By following these steps, you can quickly set up and integrate the Context Aware Chat API into your application, providing an interactive and engaging user experience.

Technologies

OpenAI