Language Tasks with PaLM API

Made by Google Cloud

Performs AI/ML tasks on text, customizable with prompt engineering, using PaLM API and Firestore.

Installs: 200+
Works with: Cloud Firestore
Version: 0.1.10 | Source code
Tags: ai, palm, generative-ai, text-ai, language-ai, large-language-models, llm, nlp, google-ai
License: Apache-2.0
Publisher: Google Cloud

How this extension works

This extension allows you to perform language tasks using the PaLM API, a custom prompt, and Firestore.

On installation, you will be asked to provide the following information:

  • PaLM API Provider: This extension makes use of the PaLM large language model. There is a choice of provider for this API; see the section below for more details.
  • Prompt: This is the text that you want the PaLM API to generate a response for. It can be free-form text or it can use handlebars variables to substitute values from the Firestore document.
  • Firestore collection path: This is the path to the Firestore collection that contains the documents that you want to perform the language task on.
  • Response field: This is the name of the field in the Firestore document where you want the extension to store the response from the PaLM API. Responses are subject to a token limit and may be truncated if they exceed it; the standard limit is approximately 4096 characters, but this can vary depending on language and context. For more information, see the PaLM API documentation.

This extension will listen to the specified collection for new documents. When such a document is added, the extension will:

  1. Substitute any variables from the document into the prompt.
  2. Query the PaLM API to generate a response based on the prompt.
  3. Write the response from the PaLM API back to the triggering document in the response field.
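For illustration, here is a minimal sketch of that flow using the Firebase Admin SDK for Node.js. The collection path ("reviews"), the document field ("review_text"), and the response field ("output") are placeholders standing in for whatever you configure at install time.

```ts
// Minimal sketch: write a document to the configured collection and watch for
// the extension to fill in the response field. Field and collection names here
// are assumptions, not values fixed by the extension.
import { initializeApp } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";

initializeApp();
const db = getFirestore();

async function submitReview(reviewText: string): Promise<void> {
  // Step 1: adding a document to the configured collection triggers the extension.
  const ref = await db.collection("reviews").add({ review_text: reviewText });

  // Steps 2-3: the extension substitutes review_text into the prompt, queries
  // the PaLM API, and writes the result back into the configured response field.
  const unsubscribe = ref.onSnapshot((snap) => {
    const output = snap.get("output");
    if (output !== undefined) {
      console.log("PaLM response:", output);
      unsubscribe();
    }
  });
}

submitReview("This water bottle leaks into my backpack all the time.").catch(
  console.error
);
```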

Each instance of the extension should be configured to perform one particular task. If you have multiple tasks, you can install multiple instances.

For example, you could use this extension to:

  • Predict star ratings on a collection of product reviews.
  • Classify customer feedback as positive, negative, or neutral.
  • Summarize long articles.
  • Extract named entities from text.
  • Generate creative text, such as poems or code.

Here’s an example prompt used for predicting star ratings on a collection of product reviews:

Provide a star rating from 1-5 of the following review text: “This is a truly incredible water bottle, I keep it with me all the time when I’m traveling and it has never let me down.”
5

Provide a star rating from 1-5 of the following review text: “I really enjoyed the water bottle, I just wish they carried this in a larger size as I go for long hikes. But overall the aesthetic, manufacturing, and functional design are great for what I needed.”
4

Provide a star rating from 1-5 of the following review text: “The water bottle was fine, although the design was a bit lacking and could be improved.”
3

Provide a star rating from 1-5 of the following review text: “Please don’t get this water bottle, there are major design flaws, for example the cap doesn’t screw on fully so water leaks into my backpack all the time.”
1

Provide a star rating from 1-5 of the following review text: “{{review_text}}”

In this case, review_text is a field of the Firestore document and will be substituted into the prompt when querying PaLM.
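As a small sketch of that substitution step (the document shape below is an assumption based on the prompt above, not a required schema):

```ts
// Hypothetical triggering document: only the field referenced by the
// handlebars variable in the prompt is required.
const reviewDoc = {
  review_text: "The cap cracked after a week, but the insulation is excellent.",
};

// Before querying PaLM, the extension renders the template, so the final
// prompt ends with:
//
//   Provide a star rating from 1-5 of the following review text:
//   "The cap cracked after a week, but the insulation is excellent."
```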

Choosing a PaLM Provider

There are currently two different APIs that provide access to PaLM large language models: the PaLM Developer (Generative Language) API and Vertex AI. This extension will prompt you to pick an API on installation. For production use cases we recommend Vertex AI, as the Generative Language API is still in public preview.

  • The PaLM Developer (Generative Language) API is currently in public preview, and you will need to sign up for the waitlist if you want to use it. For details and limitations, see the PaLM API documentation.

  • For more details on the Vertex AI PaLM API, see the Vertex AI documentation.

Safety Thresholds

Both the Generative Language for Developers and Vertex AI models have safety thresholds to block inappropriate content. You can read the details in each provider's documentation.

At this time, only the Generative Language for Developers API allows configuring safety thresholds via its API, and only for its text generation models, not its chat-bison models.

Regenerating a response

Changing a completed document's status state field from COMPLETED to anything else will retrigger the extension for that document.
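For example, here is a hedged sketch of forcing a regeneration with the Admin SDK. The exact field layout (a status map with a state value) and the collection path are assumptions; check the fields the extension actually writes on your install.

```ts
import { initializeApp } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";

initializeApp();
const db = getFirestore();

async function regenerate(docId: string): Promise<void> {
  // Assumed field layout: the extension tracks progress in a "status" map with
  // a "state" value. Setting it to anything other than COMPLETED retriggers
  // processing. The collection path "reviews" is a placeholder for whatever
  // you configured at install time.
  await db
    .collection("reviews")
    .doc(docId)
    .update({ "status.state": "REGENERATE" });
}

regenerate("some-review-id").catch(console.error);
```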

Additional Setup

If you have not already done so, you will first need to apply for access to the PaLM API via this waitlist.

Once you have access, enable the Generative Language API in your Google Cloud project before installing this extension.

Also ensure you have a Cloud Firestore database set up in your Firebase project.

Billing

To install an extension, your project must be on the Blaze (pay as you go) plan. You will be charged a small amount (typically around $0.01/month) for the Firebase resources required by this extension (even if it is not used).
This extension uses other Firebase and Google Cloud Platform services, which have associated charges if you exceed the service’s no-cost tier:

  • Cloud Firestore
  • Cloud Functions (See FAQs)

Learn more about Firebase billing.

Additionally, this extension uses the PaLM API, which is currently in public preview. During the preview period, developers can try the PaLM API at no cost. Pricing will be announced closer to general availability. For more information on the PaLM API public preview, see the PaLM API documentation.