by Luhanm

Email Sender Pro

import os
from flask import Flask, request, render_template, redirect, url_for, session
from flask_bootstrap import Bootstrap
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from abilities import llm_prompt

app = Flask(__name__)
Bootstrap(app)
# NOTE: use a long, random value in production rather than a hardcoded key.
app.config['SECRET_KEY'] = 'a_very_secret_key'

EMAIL_ADDRESS = os.environ.get('EMAIL_ADDRESS')
EMAIL_PASSWORD = os.environ.get('EMAIL_PASSWORD')

@app.route('/', methods=['GET', 'POST'])
def index():
    if request.method == 'POST':
        user_prompt = request.form['prompt']
        recipient = request.form['recipient']
        signature = request.form.get('signature', '')
        # Persist the signature in the session so it survives later requests;
        # a newly submitted signature overwrites the stored one.
        if 'signature' not in session or signature:
            session['signature'] = signature
        signature = session.get('signature', '')
        # ... (snippet truncated; the full app is available via "Get full code" on Lazy)


An app that generates and sends emails using a language model, allowing users to preview and customize the content and subject before sending.

How to Use the Email Sender Pro Template on Lazy

Introduction to the Email Sender Pro Template

The Email Sender Pro template is a powerful tool that allows you to generate and send emails using a language model. This template is perfect for users who want to automate their email sending process with the ability to preview and customize the content and subject before dispatching the emails. Whether you need to send casual or formal emails, this template has got you covered.

Getting Started

To begin using the Email Sender Pro template, simply click on Start with this Template on the Lazy platform. This will pre-populate the code in the Lazy Builder interface, so you won't need to copy, paste, or delete any code.

Initial Setup

Before you can start sending emails, you'll need to set up a couple of environment secrets within the Lazy Builder. These are the EMAIL_ADDRESS and EMAIL_PASSWORD, which the application will use to authenticate with the email server and send out emails.

  • Go to the Environment Secrets tab in the Lazy Builder.
  • Click on the 'Add Secret' button.
  • Enter 'EMAIL_ADDRESS' as the key and your email address as the value.
  • Repeat the process to add the 'EMAIL_PASSWORD' secret with your email password as the value.

Please ensure you have the correct permissions and that you're using a secure and private email address for this purpose.
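To illustrate how the app can use these two secrets to send mail, here is a minimal sketch. The helper names, and the SMTP host and port (`smtp.gmail.com:587`), are assumptions for illustration, not the template's exact code; adjust the server details to match your email provider.

```python
import os
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_message(sender, recipient, subject, body):
    # Assemble a plain-text MIME message with the usual headers.
    msg = MIMEMultipart()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.attach(MIMEText(body, "plain"))
    return msg

def send_email(recipient, subject, body):
    # Credentials come from the environment secrets set in the Lazy Builder.
    sender = os.environ["EMAIL_ADDRESS"]
    password = os.environ["EMAIL_PASSWORD"]
    msg = build_message(sender, recipient, subject, body)
    # smtp.gmail.com:587 is an assumption; swap in your provider's server.
    with smtplib.SMTP("smtp.gmail.com", 587) as server:
        server.starttls()  # upgrade the plain connection to TLS
        server.login(sender, password)
        server.send_message(msg)
```

Note that many providers (Gmail included) require an app-specific password rather than your normal account password when logging in over SMTP.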

Test: Pressing the Test Button

Once you have set up the environment secrets, press the Test button to begin the deployment of the app. The Lazy CLI will handle the deployment process, and you won't need to install any libraries or set up your environment.

Entering Input

If the template requires user input, the Lazy App's CLI interface will prompt you to provide the necessary information after you press the test button. Follow the prompts to enter any required information.

Using the App

After deployment, Lazy will provide you with a dedicated server link to use the app. Navigate to this link to access the Email Sender Pro interface where you can compose and preview your emails.

Integrating the App

If you need to integrate the Email Sender Pro app into another service or frontend, you can use the server link provided by Lazy. Add this link to your external tool where necessary, and configure any additional settings as required by that tool.

Here's a sample request you might make to the app's API:

POST /send HTTP/1.1
Host: [Your Lazy Server Link]
Content-Type: application/x-www-form-urlencoded

email_content=Your%20email%20content&recipient=recipient@example.com&email_subject=Your%20Subject

And a sample response you would receive:

HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8

Email sent successfully!
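The same request can be issued from Python with the standard library. The base URL below is a placeholder for your Lazy server link, and the field names mirror the sample request above; this is a sketch of a client call, not part of the template itself.

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_form_body(email_content, recipient, email_subject):
    # Form-encode the three fields the /send endpoint expects.
    return urlencode({
        "email_content": email_content,
        "recipient": recipient,
        "email_subject": email_subject,
    })

def post_send(base_url, email_content, recipient, email_subject):
    # POST the form body to /send and return the response text.
    data = build_form_body(email_content, recipient, email_subject).encode("utf-8")
    req = Request(
        f"{base_url}/send",
        data=data,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urlopen(req) as resp:
        return resp.read().decode("utf-8")

# Example (replace the URL with your actual Lazy server link):
# post_send("https://your-app.example", "Your email content",
#           "recipient@example.com", "Your Subject")
```

`urlencode` encodes spaces as `+` rather than `%20`; both are valid for the `application/x-www-form-urlencoded` content type.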

If you encounter any issues or need further assistance, refer to the documentation provided in the template or reach out to the Lazy support team for help.

Technologies

Maximize OpenAI Potential with Lazy AI: Automate Integrations, Enhance Customer Support and More

Similar templates

Open Source LLM based Web Chat Interface

This app is a web interface that lets the user send prompts to open-source LLMs. It requires an OpenRouter API key, which is free to obtain on openrouter.ai; since OpenRouter also hosts a number of free open-source models, you can build a free chatbot. The user can choose from a list of models and hold a conversation with the chosen model. The conversation history is displayed in chronological order, with the oldest message on top and the newest below, and the app indicates who said each message. While waiting for the model's response, the app blocks the input, disables the send button, and shows a spinner on it. The chat bar is a sticky bar at the bottom of the page with 10 pixels of padding below it; the input field is 3 times wider than the default size but never exceeds the page width. The send button sits on the right side of the input field, always fits on the page, and has padding on the right to match the left. The user can press Enter to send a message in addition to clicking the send button, and the message is cleared from the input bar after sending. The last message is displayed above the sticky input block, and the conversation div has a height of 80% to leave space for the model selection and input fields. There is some space between messages; user messages are colored green and model messages grey.

FastAPI endpoint for Text Classification using OpenAI GPT 4

This API classifies incoming text items into categories using OpenAI's GPT-4 model. The categories are parameters the API endpoint accepts, and the model classifies items with a prompt like: "Classify the following item {item} into one of these categories {categories}". If the model is unsure about an item's category, it responds with an empty string for that item; there is no maximum number of categories a text item can belong to in multiple-category classification. The API uses the llm_prompt ability to ask the LLM to classify each item, and the temperature of the GPT model is set to a minimal value to make the output more deterministic. In single-category classification the API takes the LLM's response as is and does not handle cases where the model identifies multiple categories. Classification is parallelized with Python's concurrent.futures module; timeouts and exceptions leave the affected items unclassified. For multiple-category classification, the API parses the LLM's response and matches it to the list of categories provided in the API parameters: both sides are converted to lowercase, the response is split on both ':' and ',' to remove the "Category" word, leading and trailing whitespace is stripped from each candidate, and all matching categories are returned. The API also accepts lists as answers: if the LLM responds with a string formatted like a list, it is parsed and matched against the provided categories.
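The response-parsing rules described above (lowercase both sides, split on ':' and ',', strip whitespace, keep only known categories) can be sketched as follows; the function name is illustrative, not the API's actual code:

```python
import re

def match_categories(llm_response, categories):
    # Lowercase the reply, split on ':' and ',' so the leading "Category"
    # word becomes its own token, strip whitespace from each token,
    # then keep only tokens that appear in the provided category list.
    parts = re.split(r"[:,]", llm_response.lower())
    cleaned = {part.strip() for part in parts}
    return [c for c in categories if c.lower() in cleaned]
```

With this scheme a reply like `"Category: news, sports"` matched against `["News", "Sports", "Tech"]` yields the known categories in their original casing, while an unsure or unrelated reply yields an empty list.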
