Claude 3 Quickstart Chat API
```python
import logging
import os
from typing import Optional

import anthropic
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

app = FastAPI()

client = anthropic.Anthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY")
)

class ChatRequest(BaseModel):
    system_prompt: Optional[str] = "You are a helpful assistant. Respond to the following question/statement."
    user_prompt: str
    model: Optional[str] = "claude-3-opus-20240229"
    temperature: Optional[float] = 1.0
    max_tokens: Optional[int] = 1000

@app.post("/ask-claude/")
def ask_claude(request: ChatRequest):
    # Forward the request fields to the Anthropic Messages API
    # and return the model's text reply.
    try:
        response = client.messages.create(
            model=request.model,
            max_tokens=request.max_tokens,
            temperature=request.temperature,
            system=request.system_prompt,
            messages=[{"role": "user", "content": request.user_prompt}],
        )
        return {"message": response.content[0].text}
    except anthropic.APIError as e:
        logger.error(f"Anthropic API error: {e}")
        raise HTTPException(status_code=500, detail=str(e))
```
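Under the hood, the endpoint forwards the request fields to Anthropic's Messages API. The mapping can be sketched with a small helper (`build_messages_kwargs` is a hypothetical name used here for illustration, not part of the template):

```python
def build_messages_kwargs(req: dict) -> dict:
    """Map /ask-claude/ request fields to anthropic messages.create arguments.

    Defaults mirror the ChatRequest model above. Hypothetical helper,
    shown only to illustrate the field mapping.
    """
    return {
        "model": req.get("model", "claude-3-opus-20240229"),
        "max_tokens": req.get("max_tokens", 1000),
        "temperature": req.get("temperature", 1.0),
        "system": req.get(
            "system_prompt",
            "You are a helpful assistant. Respond to the following question/statement.",
        ),
        # The Messages API takes the user's text as a single user-role message.
        "messages": [{"role": "user", "content": req["user_prompt"]}],
    }
```

Note that the system prompt travels in the separate `system` parameter rather than in the `messages` list, which is how the Anthropic Messages API expects it.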
Introduction to the Template
Welcome to the Claude 3 Quickstart Chat API template! This template lets you integrate Anthropic's largest model, Claude 3 Opus, into any application you're building with just your API key. The /ask-claude/ endpoint sends requests to Claude 3 and returns its responses, making it a simple building block for a wide range of applications.
Getting Started
To get started with this template, click Start with this Template.
Test
After starting with the template, press the Test button. This will begin the deployment of the app and launch the Lazy CLI. The Lazy CLI will guide you through any required user input.
Entering Input
The code does not require any user input through the CLI, so you can skip this section.
Using the App
Once the app is deployed, you can use the /ask-claude/ endpoint to communicate with the AI. Here's how to interact with the API:

- Endpoint: /ask-claude/
- Method: POST
- Request Body:

```json
{
  "system_prompt": "You are a helpful assistant. Respond to the following question/statement.",
  "user_prompt": "Your question or statement here",
  "model": "claude-3-opus-20240229",
  "temperature": 1.0,
  "max_tokens": 1000
}
```

- Sample Request:

```bash
curl -X POST "http://<your-server-link>/ask-claude/" \
  -H "Content-Type: application/json" \
  -d '{
    "system_prompt": "You are a helpful assistant. Respond to the following question/statement.",
    "user_prompt": "What is the capital of France?",
    "model": "claude-3-opus-20240229",
    "temperature": 1.0,
    "max_tokens": 1000
  }'
```

- Sample Response:

```json
{
  "message": "The capital of France is Paris."
}
```
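The curl call above can also be issued from Python. Here is a minimal client sketch using only the standard library; the `SERVER` value is a placeholder, so substitute the server link the Lazy CLI gives you:

```python
import json
import urllib.request

SERVER = "http://localhost:8000"  # placeholder: replace with your server link

def build_request(user_prompt,
                  system_prompt="You are a helpful assistant. Respond to the following question/statement.",
                  model="claude-3-opus-20240229",
                  temperature=1.0,
                  max_tokens=1000):
    """Build the JSON body expected by the /ask-claude/ endpoint."""
    return {
        "system_prompt": system_prompt,
        "user_prompt": user_prompt,
        "model": model,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def ask_claude(user_prompt, **kwargs):
    """POST to /ask-claude/ and return the 'message' field of the response."""
    body = json.dumps(build_request(user_prompt, **kwargs)).encode("utf-8")
    req = urllib.request.Request(
        f"{SERVER}/ask-claude/",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]

# Example (requires a running server):
# print(ask_claude("What is the capital of France?"))
```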
Integrating the App
To integrate this app into your service or frontend, follow these steps:
- Get the Server Link: After pressing the Test button, the Lazy CLI will provide you with a dedicated server link to use the API.
- API Documentation: You will also receive a link to the FastAPI documentation, which you can use to explore and test the endpoints.
Example Integration
If you are integrating this API into another service, you might need to add the server link and endpoints to your external tool. Here’s an example of how to do this:
- External Tool: Postman
- Open Postman and create a new request.
- Set the request method to POST.
- Enter the server link followed by /ask-claude/ in the request URL.
- In the Body tab, select raw and set the type to JSON.
- Paste the sample request body into the input field.
- Click Send to test the request and view the response.
By following these steps, you can easily integrate the Claude 3 Quickstart Chat API into your application and start leveraging the power of the Claude 3 Opus model.