Old Bird Tweet Fetcher

import logging
from gunicorn.app.base import BaseApplication
from app_init import create_app

# Setup logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

class StandaloneApplication(BaseApplication):
    def __init__(self, app, options=None):
        self.application = app
        self.options = options or {}
        super().__init__()

    def load_config(self):
        # Apply configuration to Gunicorn
        for key, value in self.options.items():
            if key in self.cfg.settings and value is not None:
                self.cfg.set(key.lower(), value)

    def load(self):
        return self.application

if __name__ == "__main__":
    # Example launch options; the bind address and worker count here are
    # illustrative defaults -- adjust them for your deployment.
    options = {
        "bind": "0.0.0.0:8000",
        "workers": 2,
    }
    StandaloneApplication(create_app(), options).run()


Setup instructions here: https://docs.google.com/document/d/1dB8JCJA6wecU5VbtVZ417V2gkKFFTHO5wav_XPLgUeQ/edit?tab=t.0#heading=h.1m6ups7md5vv

Old Bird Tweet Fetcher is an intelligent content discovery and curation platform designed to help professionals and enthusiasts effortlessly discover and explore the most relevant and engaging social media content. By leveraging AI-driven filtering and personalized keyword tracking, it transforms the overwhelming world of social media into a targeted, meaningful stream of insights.

Here's a guide for using the Old Bird Tweet Fetcher template:

Introduction

The Old Bird Tweet Fetcher is a powerful web application that helps discover and curate relevant tweets based on keywords. It features an intelligent filtering system that identifies viral and relevant content, with a clean interface for managing keywords and viewing tweet history.

Getting Started

  • Click "Start with this Template" to begin using the Old Bird Tweet Fetcher template

Initial Setup

  • In the Environment Secrets tab, add your RAPIDAPI_API_KEY
  • To get your RAPIDAPI_API_KEY:
      • Go to RapidAPI
      • Sign up or log in
      • Subscribe to the Twitter API v1.5.4 package
      • Copy your API key from your RapidAPI dashboard
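Once the secret is set, the app can read it from the environment at request time. The helper below is an illustrative sketch (not taken from the template's source): it builds the standard `x-rapidapi-key` / `x-rapidapi-host` headers that RapidAPI endpoints expect, and the host value is a placeholder you would copy from your RapidAPI dashboard:

```python
import os

def rapidapi_headers(host: str) -> dict:
    """Build the standard RapidAPI auth headers from the environment.

    Fails fast with a clear error if RAPIDAPI_API_KEY was never configured.
    """
    key = os.environ.get("RAPIDAPI_API_KEY")
    if not key:
        raise RuntimeError("RAPIDAPI_API_KEY is not set in the environment")
    return {
        "x-rapidapi-key": key,
        # Hypothetical host value; use the one shown on your RapidAPI dashboard.
        "x-rapidapi-host": host,
    }
```

Failing fast here is deliberate: a missing key otherwise surfaces only as opaque 401/403 responses from the API.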

Test the Application

  • Click the "Test" button to deploy the application
  • The Lazy CLI will provide you with a server link to access your Tweet Fetcher interface

Using the Application

Managing Keywords

  • Navigate to the Keywords page
  • Add keywords related to topics you want to track
  • Keywords help the AI find relevant tweets about software development, coding, and productivity
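The template's filtering is AI-driven, but a first-pass keyword match like the sketch below illustrates the idea; this function is an assumption for illustration, not the app's actual code:

```python
def matches_keywords(text: str, keywords: list[str]) -> bool:
    """Case-insensitive substring match of any tracked keyword in a tweet."""
    lowered = text.lower()
    return any(keyword.lower() in lowered for keyword in keywords)
```

In practice such a cheap pre-filter would run before any AI relevance check, so the model is only invoked on tweets that mention a tracked topic at all.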

Viewing Tweets

  • On the home page, click "Click to view tweet" to see the selected tweet
  • The app automatically filters for viral and relevant content
  • View your tweet history in the History tab
  • Track performance metrics in the Admin dashboard

Customizing Relevance Criteria

  • Use the Admin dashboard to customize the AI prompt that determines tweet relevance
  • Adjust the criteria to better match your content preferences
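A relevance prompt of the kind the Admin dashboard lets you customize might be assembled as follows; the template text and function name here are hypothetical, shown only to make the mechanism concrete:

```python
def build_relevance_prompt(tweet_text: str, criteria: str) -> str:
    """Combine editable relevance criteria with a tweet into an LLM prompt.

    The criteria string is the part an admin would tune in the dashboard.
    """
    return (
        "You are a content curator.\n"
        f"Relevance criteria:\n{criteria}\n\n"
        f"Tweet: {tweet_text}\n"
        "Answer YES if the tweet meets the criteria, otherwise NO."
    )
```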

The application will continuously fetch and analyze new tweets based on your keywords, maintaining a curated feed of relevant content.



Template Benefits

  1. Intelligent Content Curation - Automatically filters and surfaces the most relevant tweets based on customizable keywords and AI-powered relevance scoring, saving professionals hours of manual content discovery time.

  2. Engagement Analytics Dashboard - Provides comprehensive analytics on tweet performance, viral factors, and user engagement metrics, enabling data-driven content strategy decisions and audience understanding.

  3. Multi-Agent Support - Supports multiple user agents/accounts with separate tracking and history, making it ideal for team collaboration and managing multiple brand or department social media monitoring needs.

  4. Viral Content Detection - Uses sophisticated algorithms to calculate and identify viral content potential through metrics like views, engagement rates and timing, helping businesses spot trending topics early.

  5. Customizable Relevance Criteria - Features an adaptable prompt system that allows organizations to fine-tune content relevance criteria based on their specific industry needs and business objectives, ensuring highly targeted content discovery.

These benefits make the template particularly valuable for digital marketers, content strategists, social media managers, and businesses looking to maintain an effective social media presence while saving time and resources.
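The viral-content detection described in benefit 4 can be sketched as a score combining engagement rate, reach, and recency. The formula and weights below are illustrative assumptions, not the template's actual algorithm:

```python
def viral_score(views: int, likes: int, retweets: int, age_hours: float) -> float:
    """Hypothetical viral score: engagement rate scaled by reach, decayed by age."""
    engagement = likes + 2 * retweets          # retweets weighted above likes
    rate = engagement / max(views, 1)          # engagement rate, guard divide-by-zero
    return rate * views / max(age_hours, 1.0)  # newer tweets score higher
```

The recency divisor is what lets the score "spot trending topics early": a tweet accumulating the same engagement in a quarter of the time scores four times higher.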

Technologies

  • Maximize OpenAI Potential with Lazy AI: Automate Integrations, Enhance Customer Support and More
  • Flask Templates from Lazy AI – Boost Web App Development with Bootstrap, HTML, and Free Python Flask
  • Streamline X (Twitter) Workflows with Lazy AI: Automate Tasks, Optimize Processes, Integrate using API and More
  • Streamline JavaScript Workflows with Lazy AI: Automate Development, Debugging, API Integration and More
  • Python App Templates for Scraping, Machine Learning, Data Science and More
  • Optimize SQL Workflows with Lazy AI: Automate Queries, Reports, Database Management and More

Similar templates

Open Source LLM based Web Chat Interface

This app will be a web interface that lets the user send prompts to open-source LLMs. It requires an OpenRouter API key to work; the key is free to get on openrouter.ai, and there are plenty of free open-source models there, so you can build a free chatbot. The user will be able to choose from a list of models and hold a conversation with the chosen model.

The conversation history will be displayed in chronological order, with the oldest message on top and the newest below, and the app will indicate who said each message. The app will show a loader and block the send button while waiting for the model's response.

The chat bar will be a sticky bar at the bottom of the page, with 10 pixels of padding below it. The input field will be 3 times wider than the default size but will not exceed the width of the page. The send button will sit on the right side of the input field, will always fit on the page, and will have padding on the right side to match the left. The user will be able to press Enter to send a message in addition to clicking the send button, and the message will be cleared from the input bar after sending.

The last message will be displayed above the sticky input block, and the conversation div will have a height of 80% to leave space for the model selection and input fields. There will be some space between messages; user messages will be colored green and model messages grey. The input will be blocked while waiting for the model's response, and a spinner will be displayed on the send button during that time.


FastAPI endpoint for Text Classification using OpenAI GPT 4

This API will classify incoming text items into categories using OpenAI's GPT-4 model. If the model is unsure about an item's category, it will respond with an empty string. The categories are parameters that the API endpoint accepts. GPT-4 will classify the items on its own with a prompt like this: "Classify the following item {item} into one of these categories {categories}". There is no maximum number of categories a text item can belong to in the multiple-categories classification.

The API will use the llm_prompt ability to ask the LLM to classify the item and respond with the category. It will take the LLM's response as is and will not handle situations where the model identifies multiple categories for a text item in the single-category classification. The API will use Python's concurrent.futures module to parallelize the classification of text items, and will handle timeouts and exceptions by leaving the items unclassified.

For the multiple-categories classification, the API will parse the LLM's response and match it to the list of categories provided in the API parameters: both the response and the categories are converted to lowercase, the response is split on both ':' and ',' to remove the "Category" word, and any leading or trailing whitespace is stripped before matching. The temperature of the GPT model is set to a minimal value to make the output more deterministic, and all matching categories are returned for an item. The API will also accept lists as answers from the LLM; if the LLM responds with a string formatted like a list, the API will parse it and match it to the provided categories.
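The response-parsing rules described above (lowercase, split on ':' and ',', strip whitespace, match against the allowed list) can be sketched like this; the function name is illustrative and the sketch does not cover the bracketed-list response format:

```python
def parse_categories(llm_response: str, categories: list[str]) -> list[str]:
    """Match a free-form LLM reply against an allowed category list.

    Splitting on ':' as well as ',' discards a leading "Category:" label.
    """
    cleaned = llm_response.lower().replace(":", ",")
    parts = [part.strip() for part in cleaned.split(",")]
    allowed = {category.lower() for category in categories}
    return [part for part in parts if part in allowed]
```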

