Claude 3 Quickstart Chat API

import logging
import os
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from typing import Optional
import anthropic

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

app = FastAPI()

client = anthropic.Anthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY")
)

class ChatRequest(BaseModel):
    system_prompt: Optional[str] = "You are a helpful assistant. Respond to the following question/statement."
    user_prompt: str
    model: Optional[str] = "claude-3-opus-20240229"
    temperature: Optional[float] = 1.0
    max_tokens: Optional[int] = 1000

@app.post("/ask-claude/")
def ask_claude(request: ChatRequest):
    try:
        # Forward the request to the Anthropic Messages API.
        message = client.messages.create(
            model=request.model,
            max_tokens=request.max_tokens,
            temperature=request.temperature,
            system=request.system_prompt,
            messages=[{"role": "user", "content": request.user_prompt}],
        )
        return {"message": message.content[0].text}
    except anthropic.APIError as e:
        logger.error(f"Anthropic API call failed: {e}")
        raise HTTPException(status_code=500, detail=str(e))


A Chat API that lets you integrate any application you're building with Anthropic's largest model (Claude Opus) using only your API key. Once the app is running, you can use its "/ask-claude" endpoint to send requests to Claude 3 from any application you build.

Introduction to the Template

Welcome to the Claude 3 Quickstart Chat API template! This template allows you to integrate Anthropic's largest model, Claude Opus, into any application you're building with just your API key. The /ask-claude endpoint lets you send requests to Claude 3 and receive responses, making it a useful building block for a wide range of applications.

Getting Started

To get started with this template, click Start with this Template.

Test

After starting with the template, press the Test button. This will begin the deployment of the app and launch the Lazy CLI. The Lazy CLI will guide you through any required user input.

Entering Input

The code does not prompt for any user input through the CLI. It does, however, read your Anthropic API key from the ANTHROPIC_API_KEY environment variable, so make sure that secret is set before the app starts.
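Since a missing key only surfaces as an error on the first request, it can help to fail fast at startup. A minimal sketch (the helper name is my own, not part of the template):

```python
import os

def require_api_key(env_var="ANTHROPIC_API_KEY"):
    """Return the Anthropic API key from the environment, or fail loudly if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before starting the app.")
    return key
```

Calling this once before constructing the anthropic.Anthropic client turns a confusing mid-request failure into a clear startup error.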

Using the App

Once the app is deployed, you can use the /ask-claude endpoint to communicate with the AI. Here’s how you can interact with the API:

  1. Endpoint: /ask-claude/
  2. Method: POST
  3. Request Body (JSON):

     {
       "system_prompt": "You are a helpful assistant. Respond to the following question/statement.",
       "user_prompt": "Your question or statement here",
       "model": "claude-3-opus-20240229",
       "temperature": 1.0,
       "max_tokens": 1000
     }

  4. Sample Request (bash):

     curl -X POST "http://<your-server-link>/ask-claude/" \
       -H "Content-Type: application/json" \
       -d '{
         "system_prompt": "You are a helpful assistant. Respond to the following question/statement.",
         "user_prompt": "What is the capital of France?",
         "model": "claude-3-opus-20240229",
         "temperature": 1.0,
         "max_tokens": 1000
       }'

  5. Sample Response (JSON):

     {
       "message": "The capital of France is Paris."
     }
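The same request/response cycle can be driven from Python using only the standard library. A sketch of a minimal client, assuming a placeholder base URL (substitute the server link the Lazy CLI gives you; the function names here are my own):

```python
import json
from urllib import request

# Placeholder base URL -- replace with the server link the Lazy CLI provides.
BASE_URL = "http://localhost:8000"

def build_payload(user_prompt, **overrides):
    """Build the /ask-claude/ request body. Omitted fields fall back to the
    API's defaults (Claude Opus, temperature 1.0, max_tokens 1000)."""
    payload = {"user_prompt": user_prompt}
    payload.update(overrides)
    return payload

def ask_claude(user_prompt, **overrides):
    """POST the payload to /ask-claude/ and return the model's reply text."""
    body = json.dumps(build_payload(user_prompt, **overrides)).encode("utf-8")
    req = request.Request(
        f"{BASE_URL}/ask-claude/",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["message"]

if __name__ == "__main__":
    print(ask_claude("What is the capital of France?", temperature=0.2))
```

Because every field except user_prompt has a server-side default, the smallest valid request body is just {"user_prompt": "..."}.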

Integrating the App

To integrate this app into your service or frontend, follow these steps:

  1. Get the Server Link: After pressing the Test button, the Lazy CLI will provide you with a dedicated server link to use the API.
  2. API Documentation: You will also receive a link to the FastAPI documentation, which you can use to explore and test the endpoints.

Example Integration

If you are integrating this API into another service, you might need to add the server link and endpoints to your external tool. Here’s an example of how to do this:

  • External Tool: Postman
    1. Open Postman and create a new request.
    2. Set the request method to POST.
    3. Enter the server link followed by /ask-claude/ in the request URL.
    4. In the Body tab, select raw and set the type to JSON.
    5. Paste the sample request body into the input field.
    6. Click Send to test the request and view the response.

By following these steps, you can easily integrate the Claude 3 Quickstart Chat API into your application and start leveraging the power of the Claude Opus model.

Technologies

Claude by Anthropic
Python