Summarize Text with PaLM API
Made by Google Cloud
Summarizes text in Firestore documents using PaLM API.
How this extension works
This extension allows you to summarize a field in a Firestore document using the PaLM API.
On installation, you will need to specify the following information:
- PaLM API provider: This extension uses the PaLM large language model, which is available from more than one provider. See the section below for more details.
- Firestore collection path: The path to the Firestore collection that contains the documents to summarize.
- Document field to summarize: The name of the document field to summarize.
- Target summary length (number of sentences): The desired length of the summary in sentences.
- Response field: The name of the field in the document to store the summary.
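The installation parameters above can be pictured as a small configuration object. The field names and values below are illustrative assumptions, not the extension's actual parameter keys:

```typescript
// Sketch of the values this extension collects at install time.
// Property names here are illustrative, not the real parameter keys.
interface SummarizeConfig {
  collectionPath: string;   // Firestore collection to watch
  textField: string;        // document field to summarize
  targetSentences: number;  // desired summary length, in sentences
  responseField: string;    // field that will receive the summary
}

const config: SummarizeConfig = {
  collectionPath: "reviews",
  textField: "body",
  targetSentences: 3,
  responseField: "summary",
};
```

With this configuration, a document written to `reviews` with a `body` field would get a three-sentence summary written back to its `summary` field.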
This extension will listen to the specified collection for new document writes and execute the following logic:
- Call the PaLM API to generate a summary of the document field.
- Write the summary back to the triggering document in the response field.
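The two steps above can be sketched as a small handler with the PaLM call stubbed out as a callback. The function names and prompt wording are assumptions for illustration, not the extension's internal implementation:

```typescript
// A document is modeled as a simple map of string fields.
type Doc = Record<string, string>;

// Build a summarization prompt from the configured field and length.
// The prompt text is an illustrative guess, not the extension's own.
function buildPrompt(text: string, sentences: number): string {
  return `Summarize the following text in ${sentences} sentences:\n\n${text}`;
}

// On a document write: read the text field, call the model, and return
// the document with the summary written to the response field.
async function handleWrite(
  doc: Doc,
  textField: string,
  responseField: string,
  callPalm: (prompt: string) => Promise<string>,
): Promise<Doc> {
  const prompt = buildPrompt(doc[textField], 3);
  const summary = await callPalm(prompt);
  return { ...doc, [responseField]: summary };
}
```

In the deployed extension this logic runs inside a Cloud Function triggered by Firestore writes; here the model call is injected so the flow can be followed without network access.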
This extension can be used for a variety of use cases, including:
- Summarizing customer feedback
- Abstracting long articles
- Condensing user-generated content
Here are some examples of how this extension can be used:
- An e-commerce platform could use this extension to automatically generate summaries of customer feedback, giving customers a quick overview of the feedback before they decide whether to read the full content.
- A news website could use this extension to summarize articles, helping readers decide whether they want to invest their time in reading the full article.
- A social media platform could use this extension to provide summaries of user-generated content, improving content discoverability and user engagement.
Choosing a PaLM Provider
There are currently two different APIs providing access to PaLM large language models: the PaLM Developer (Generative Language) API and Vertex AI. This extension will prompt you to pick an API on installation. For production use cases we recommend Vertex AI, as the Generative Language API is still in public preview.
For more details on the Vertex AI PaLM API, see the Vertex AI documentation.
Harm filter thresholds
PaLM provides content filters in different categories. For simplicity, this extension allows you to set a global threshold for all categories in one configuration parameter, specified during installation. Note that the filtering is based on the probability that the prompt or response contains the category of content, and not necessarily the severity of the content.
Currently, the extension only supports this feature for the Generative AI for Developers PaLM provider.
For more information, see the documentation for the Generative AI for Developers PaLM API.
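A single global threshold expanding into per-category safety settings might look like the sketch below. The category and threshold names follow the Generative Language API's PaLM safety settings; the helper function itself is an illustration, not the extension's code:

```typescript
// Safety categories defined by the Generative Language (PaLM) API.
const categories = [
  "HARM_CATEGORY_DEROGATORY",
  "HARM_CATEGORY_TOXICITY",
  "HARM_CATEGORY_VIOLENCE",
  "HARM_CATEGORY_SEXUAL",
  "HARM_CATEGORY_MEDICAL",
  "HARM_CATEGORY_DANGEROUS",
];

// Expand one global threshold into a per-category setting for each
// category, as this extension's single configuration parameter implies.
function safetySettings(threshold: string) {
  return categories.map((category) => ({ category, threshold }));
}

const settings = safetySettings("BLOCK_MEDIUM_AND_ABOVE");
```

Because the thresholds are probability-based, `BLOCK_MEDIUM_AND_ABOVE` blocks content that is *likely* to belong to a category, regardless of how severe that content is.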
Regenerating a response
Changing the state field of a completed document’s status from COMPLETED to anything else will retrigger the extension for that document.
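The retrigger condition can be sketched as a predicate over the before and after states of a write. The shape of the status object and the non-COMPLETED value used below are assumptions for illustration:

```typescript
// The extension records progress in a status object on each document;
// its state field reads "COMPLETED" once a summary has been written.
interface Status {
  state: string;
}

// The extension re-runs for a document when its state moves away from
// COMPLETED on a write.
function shouldRetrigger(before: Status, after: Status): boolean {
  return before.state === "COMPLETED" && after.state !== "COMPLETED";
}
```

So, for example, a client that overwrites a completed document's state with any placeholder value (here `"REGENERATE"` is a hypothetical choice) causes a fresh summary to be generated.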
If you have not already done so, you will first need to apply for access to the PaLM API via this waitlist.
Once you have access, ensure you have a Cloud Firestore database set up in your Firebase project, and enable the Generative Language API in your Google Cloud project, before installing this extension.
To install an extension, your project must be on the Blaze (pay as you go) plan. You will be charged a small amount (typically around $0.01/month) for the Firebase resources required by this extension (even if it is not used).
This extension uses other Firebase and Google Cloud Platform services, which have associated charges if you exceed the service’s no-cost tier:
- Cloud Firestore
- Cloud Functions (See FAQs)
Additionally, this extension uses the PaLM API, which is currently in public preview. During the preview period, developers can try the PaLM API at no cost. Pricing will be announced closer to general availability. For more information on the PaLM API public preview, see the PaLM API documentation.