SummarizeMe API: FastAPI Endpoint with LLM Text Summarization
The template's main application code looks like this:

```python
import logging

from fastapi import FastAPI, HTTPException
from fastapi.responses import RedirectResponse
from fastapi.openapi.utils import get_openapi
from pydantic import BaseModel

from abilities import llm_prompt

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

app = FastAPI()

class TextInput(BaseModel):
    text: str

@app.get("/", include_in_schema=False)
def root():
    return RedirectResponse(url="/docs")

@app.post("/summarize")
async def summarize_text(input_data: TextInput):
    try:
        prompt = f"Summarize the following text in under 200 words:\n\n{input_data.text}"
        summary = llm_prompt(prompt=prompt, image_url=None, response_type="text", model="gpt-4o", temperature=0.7)
        return {"summary": summary}
    except Exception as e:
        logger.error(f"Error in summarization: {str(e)}")
        raise HTTPException(status_code=500, detail="An error occurred during summarization")
```
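The code imports get_openapi but the wiring is not shown above; the Template Benefits section below mentions a custom OpenAPI schema. A minimal sketch of the usual FastAPI hookup, where the title, version, and description strings are illustrative assumptions rather than the template's actual metadata:

```python
def custom_openapi():
    # Reuse the cached schema if it has already been generated.
    if app.openapi_schema:
        return app.openapi_schema
    # Build the schema from the app's routes; the metadata below is illustrative.
    openapi_schema = get_openapi(
        title="SummarizeMe API",
        version="1.0.0",
        description="FastAPI endpoint with LLM text summarization",
        routes=app.routes,
    )
    app.openapi_schema = openapi_schema
    return app.openapi_schema

app.openapi = custom_openapi
```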
Frequently Asked Questions
What are some potential business applications for the SummarizeMe API?
The SummarizeMe API has numerous business applications across various industries. Some potential use cases include:

- Content curation: Quickly summarize articles, reports, or research papers for content aggregators or news platforms.
- Customer support: Summarize lengthy customer inquiries or support tickets to help agents respond more efficiently.
- Market research: Condense large volumes of market data or competitor analysis into concise summaries.
- Legal document processing: Summarize contracts, legal briefs, or case studies for faster review and analysis.
- Educational tools: Create summaries of textbooks or lecture notes to aid students in their studies.
How can the SummarizeMe API improve productivity in a business setting?
The SummarizeMe API can significantly boost productivity by:

- Saving time: Quickly generate summaries of long documents, allowing employees to grasp key points without reading entire texts.
- Enhancing decision-making: Provide concise summaries of reports or data, enabling faster and more informed decision-making.
- Improving communication: Create brief summaries of complex ideas or projects for easier sharing across teams or departments.
- Streamlining research: Quickly summarize multiple sources of information, accelerating the research process.
- Facilitating knowledge management: Generate summaries of important documents for easier archiving and retrieval.
What are the potential cost savings associated with implementing the SummarizeMe API?
Implementing the SummarizeMe API can lead to cost savings in several ways:

- Reduced labor costs: Automating summarization tasks can free up employees' time for higher-value activities.
- Improved efficiency: Faster information processing can lead to quicker project completions and reduced overhead.
- Lower training costs: Summarized documents can make onboarding and training processes more efficient.
- Decreased information overload: By providing concise summaries, employees can focus on essential information, potentially reducing stress and burnout.
- Optimized resource allocation: With faster access to key information, businesses can make more informed decisions about resource allocation.
How can I customize the summarization prompt in the SummarizeMe API?
You can customize the summarization prompt in the SummarizeMe API by modifying the `prompt` variable in the `summarize_text` function. Here's an example of how you might change it to request a shorter summary:
```python
@app.post("/summarize")
async def summarize_text(input_data: TextInput):
    try:
        prompt = f"Provide a concise summary of the following text in no more than 100 words:\n\n{input_data.text}"
        summary = llm_prompt(prompt=prompt, image_url=None, response_type="text", model="gpt-4o", temperature=0.7)
        return {"summary": summary}
    except Exception as e:
        logger.error(f"Error in summarization: {str(e)}")
        raise HTTPException(status_code=500, detail="An error occurred during summarization")
```
This modification changes the prompt to request a summary of no more than 100 words, instead of the original 200-word limit.
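If you want API callers to choose the summary length instead of hard-coding it in the prompt, one option is to extend the request model. This is a sketch only; the `max_words` field is a hypothetical addition and is not part of the original template:

```python
from typing import Optional

class TextInput(BaseModel):
    text: str
    max_words: Optional[int] = 200  # hypothetical field, not in the original template

@app.post("/summarize")
async def summarize_text(input_data: TextInput):
    try:
        prompt = f"Summarize the following text in under {input_data.max_words} words:\n\n{input_data.text}"
        summary = llm_prompt(prompt=prompt, image_url=None, response_type="text", model="gpt-4o", temperature=0.7)
        return {"summary": summary}
    except Exception as e:
        logger.error(f"Error in summarization: {str(e)}")
        raise HTTPException(status_code=500, detail="An error occurred during summarization")
```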
How can I add error handling for specific exceptions in the SummarizeMe API?
To add more specific error handling in the SummarizeMe API, you can catch and handle different types of exceptions separately. Here's an example of how you might modify the `summarize_text` function to handle specific exceptions:
```python
from fastapi import FastAPI, HTTPException
import requests

@app.post("/summarize")
async def summarize_text(input_data: TextInput):
    try:
        prompt = f"Summarize the following text in under 200 words:\n\n{input_data.text}"
        summary = llm_prompt(prompt=prompt, image_url=None, response_type="text", model="gpt-4o", temperature=0.7)
        return {"summary": summary}
    except requests.RequestException as e:
        logger.error(f"Network error during summarization: {str(e)}")
        raise HTTPException(status_code=503, detail="Service unavailable. Please try again later.")
    except ValueError as e:
        logger.error(f"Invalid input for summarization: {str(e)}")
        raise HTTPException(status_code=400, detail="Invalid input. Please check your text and try again.")
    except Exception as e:
        logger.error(f"Unexpected error in summarization: {str(e)}")
        raise HTTPException(status_code=500, detail="An unexpected error occurred during summarization")
```
This modification adds specific handling for network errors (`RequestException`) and invalid input (`ValueError`), providing more informative error messages to API users.
Introduction to the Template
Welcome to the "SummarizeMe API: FastAPI Endpoint with LLM Text Summarization" template. This template helps you create a FastAPI endpoint for text summarization using an LLM prompt, with OpenGraph integration for documentation preview.
Getting Started
To get started with this template, click Start with this Template.
Test
After starting with the template, press the Test button. This will begin the deployment of the app and launch the Lazy CLI.
Entering Input
Once you press the Test button, the Lazy App's CLI interface will appear, and you will be prompted to provide the input. The input required for this template is the text you want to summarize.
Using the App
The app provides an API endpoint for text summarization. Here’s how you can use it:
- Access the API Documentation: After deployment, you will be provided with a link to the FastAPI documentation. This link will look something like `http://<your-app-url>/docs`. Open this link in your browser to access the interactive API documentation.
- Summarize Text: Use the `/summarize` endpoint to summarize your text. You can test this endpoint directly from the FastAPI documentation interface, or programmatically (see the sketch below).
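If you prefer to exercise the endpoint from code rather than the interactive docs, FastAPI's TestClient can call the route directly. This is a minimal sketch that assumes the app object from the template is importable and that the `llm_prompt` ability is available in your environment:

```python
from fastapi.testclient import TestClient
# The module name "main" is an assumption; import the FastAPI app defined in the template.
from main import app

client = TestClient(app)

response = client.post("/summarize", json={"text": "Your long text goes here."})
print(response.status_code)
print(response.json())
```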
Sample Request
Here is a sample request to the /summarize
endpoint:
```json
{
  "text": "Your long text goes here."
}
```
Sample Response
The response will contain the summarized text:
```json
{
  "summary": "This is the summarized text."
}
```
Integrating the App
To integrate this app into your existing workflow or external tools, follow these steps:
- API Endpoint: Use the provided API endpoint URL to make requests from your application or tool. For example, you can use `http://<your-app-url>/summarize` to send text for summarization.
- Scopes and Permissions: Ensure that any external tool or service you are integrating with has the necessary permissions to access the API endpoint.
- Sample Code for Integration: If you are integrating this API with another tool, here is a sample code snippet to help you get started:
```python
import requests

url = "http://<your-app-url>/summarize"  # replace with your deployed app URL
data = {"text": "Your long text goes here."}

response = requests.post(url, json=data)
print(response.json())
```
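Because the endpoint returns HTTP error codes on failure (for example the 500 raised in the `summarize_text` handler), the integrating client can check for them explicitly. A minimal sketch building on the snippet above; the timeout value is an arbitrary choice:

```python
import requests

url = "http://<your-app-url>/summarize"  # replace with your deployed app URL
data = {"text": "Your long text goes here."}

try:
    response = requests.post(url, json=data, timeout=30)
    response.raise_for_status()  # raises requests.HTTPError on 4xx/5xx responses
    print(response.json()["summary"])
except requests.HTTPError as e:
    print(f"API error {e.response.status_code}: {e.response.text}")
except requests.RequestException as e:
    print(f"Request failed: {e}")
```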
By following these steps, you can successfully deploy and integrate the "SummarizeMe API: FastAPI Endpoint with LLM Text Summarization" template into your project.
Template Benefits
- Rapid API Development: This FastAPI template enables quick deployment of a text summarization service, allowing businesses to quickly integrate advanced NLP capabilities into their applications or workflows.
- Scalable Text Processing: By leveraging large language models for summarization, this API can handle a wide range of text inputs and volumes, making it suitable for businesses dealing with large amounts of textual data.
- Improved Documentation: The custom OpenAPI schema and the RedirectResponse to `/docs` provide clear, accessible API documentation, making it easier for developers to integrate and use the service.
- Error Handling and Logging: Built-in error handling and logging help maintain service reliability and assist in troubleshooting, reducing downtime and improving overall service quality.
- Flexible Deployment: The use of FastAPI and Uvicorn allows for easy deployment in various environments, from local development to cloud platforms, giving businesses flexibility in how they host and scale the service (a minimal local-run sketch follows below).
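If you host the app yourself outside the Lazy platform, a typical local entry point looks like the sketch below; the host and port values are assumptions, and the `abilities.llm_prompt` helper is provided by the Lazy platform, so you would need a local stand-in for it:

```python
# Hypothetical local entry point; host and port values are assumptions.
if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8080)
```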