Published: May 16, 2024
When shopping online, the sheer volume of products and product reviews can be overwhelming. How can we sort through all of this noise to find the product that actually meets our specific needs?
For example, say we're shopping for a work backpack. A work backpack needs to strike a balance between function, aesthetics, and practicality. The number of reviews makes it nearly impossible to know whether you've found the perfect bag. What if we could use AI to sift through the noise and find the perfect product?
What would be helpful is a summary of all reviews, alongside a list of most common pros and cons.
To build this, we use server-side generative AI; inference occurs on a server.
In this document, you can follow along with a tutorial for the Gemini API with Node.js, using the Google AI JavaScript SDK to summarize data from many reviews. We focus on the generative AI portion of this work; we won't cover how to store results or create a job queue.
In practice, you could use any LLM API with any SDK. However, the suggested prompt may need to be adapted for the model you choose.
Prerequisites
Create a key for the Gemini API, and define it in your environment file.
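For example, if you store the key in a .env file as API_KEY_GEMINI=your-api-key, one common approach (an assumption in this sketch, not a requirement of the SDK) is to load it with the dotenv package before initializing the SDK:
// Load variables from .env into process.env (assumes you've run `npm install dotenv`)
require("dotenv").config();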
Install the Google AI JavaScript SDK, for example with npm:
npm install @google/generative-ai
Build a review summarizer application
- Initialize a generative AI object.
- Create a function to generate review summaries.
- Select the generative AI model. For our use case, we'll use Gemini Pro. Use a model that's specific to your use case (for example, gemini-pro-vision is for multimodal input).
- Add a prompt.
- Call generateContent to pass the prompt as an argument.
- Generate and return the response.
const { GoogleGenerativeAI } = require("@google/generative-ai");
// Access your API key as an environment variable
const genAI = new GoogleGenerativeAI(process.env.API_KEY_GEMINI);
async function generateReviewSummary(reviews) {
// Use gemini-pro model for text-only input
const model = genAI.getGenerativeModel({ model: "gemini-pro" });
// Shortened for legibility. See "Write an effective prompt" for
// writing an actual production-ready prompt.
const prompt = `Summarize the following product reviews:\n\n${reviews}`;
const result = await model.generateContent(prompt);
const response = await result.response;
const summary = response.text();
return summary;
}
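Here's a minimal sketch of how you might call this function; the sample reviews and the newline-joined string format are assumptions for illustration, so adapt them to however you store your reviews:
// Hypothetical sample reviews; in practice these would come from your data store.
const reviews = [
  "Love the color and the front pocket, but the zipper broke after a month.",
  "Fits my laptop and lunch, though it's smaller than I expected.",
].join("\n\n");

generateReviewSummary(reviews).then((summary) => console.log(summary));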
Write an effective prompt
The best way to be successful with generative AI is to create a thorough prompt. In this example, we use the one-shot prompting technique to get consistent outputs.
One-shot prompting means the prompt includes a single example of the expected output for Gemini to model.
const prompt =
`I will give you user reviews for a product. Generate a short summary of the
reviews, with focus on the common positive and negative aspects across all of
the reviews. Use the exact same output format as in the example (list of
positive highlights, list of negative aspects, summary). In the summary,
address the potential buyer with second person ("you", "be aware").
Input (list of reviews):
// ... example
Output (summary of reviews):
// ... example
**Positive highlights**
// ... example
**Negative aspects**
// ... example
**Summary**
// ... example
Input (list of reviews):
${reviews}
Output (summary of all input reviews):`;
Here's an example output from this prompt, which includes a summary of all reviews, alongside a list of common pros and cons.
## Summary of Reviews:
**Positive highlights:**
* **Style:** Several reviewers appreciate the backpack's color and design.
* **Organization:** Some users love the compartments and find them useful for
organization.
* **Travel & School:** The backpack seems suitable for both travel and school
use, being lightweight and able to hold necessary items.
**Negative aspects:**
* **Durability:** Concerns regarding the zipper breaking and water bottle holder
ripping raise questions about the backpack's overall durability.
* **Size:** A few reviewers found the backpack smaller than expected.
* **Material:** One user felt the material was cheap and expressed concern about
its longevity.
**Summary:**
This backpack seems to be stylish and appreciated for its organization and
suitability for travel and school. However, you should be aware of potential
durability issues with the zippers and water bottle holder. Some users also
found the backpack smaller than anticipated and expressed concerns about the
material's quality.
Token limits
A large number of reviews can hit the model's token limit. Tokens aren't always equal to a single word; a token can be part of a word or several words together. For example, Gemini Pro has a 30,720 token limit. This means the prompt can include at most around 600 average 30-word reviews in English, minus the rest of the prompt instructions.
Use countTokens() to check the number of tokens and reduce the input if the prompt is larger than allowed.
const MAX_INPUT_TOKENS = 30720;
const { totalTokens } = await model.countTokens(prompt);
if (totalTokens > MAX_INPUT_TOKENS) {
// Shorten the prompt.
}
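As a rough sketch of what shortening the prompt could look like inside generateReviewSummary (the allReviews array and the rebuildPrompt() helper are hypothetical; countTokens() is the only SDK call used), you could drop reviews and re-count until the prompt fits:
// Hypothetical: allReviews is an array of review strings and rebuildPrompt()
// recreates the full prompt text from whatever reviews remain.
let reviewList = allReviews.slice();
let prompt = rebuildPrompt(reviewList);
let { totalTokens } = await model.countTokens(prompt);

while (totalTokens > MAX_INPUT_TOKENS && reviewList.length > 0) {
  // Drop the last review and re-check the token count.
  reviewList.pop();
  prompt = rebuildPrompt(reviewList);
  ({ totalTokens } = await model.countTokens(prompt));
}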
Build for enterprise
If you're a Google Cloud user or otherwise need enterprise support, you can access Gemini Pro, as well as other models such as Anthropic's Claude models, with Vertex AI. You may want to use Model Garden to determine which model best matches your specific use case.
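As a rough sketch, not covered by this tutorial, an equivalent call through the Vertex AI Node.js SDK (@google-cloud/vertexai) could look like the following; the project ID, location, and model name are placeholders to replace with your own values, and authentication relies on your Google Cloud application default credentials:
const { VertexAI } = require("@google-cloud/vertexai");

async function generateReviewSummaryVertex(reviews) {
  // Placeholder project and location for this sketch.
  const vertexAI = new VertexAI({ project: "your-project-id", location: "us-central1" });
  const model = vertexAI.getGenerativeModel({ model: "gemini-pro" });

  const result = await model.generateContent({
    contents: [{ role: "user", parts: [{ text: `Summarize the following product reviews:\n\n${reviews}` }] }],
  });
  return result.response.candidates[0].content.parts[0].text;
}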
Next steps
The application we built relies heavily on quality reviews to generate the most effective summaries. To collect those quality reviews, read the next article in this series, Help users write useful product reviews with on-device web AI.
We want to hear from you about this approach. Tell us what use cases most interest you. You can share your feedback and join the early preview program to test this technology with local prototypes.
Your contribution can help us make AI a powerful, yet practical, tool for everyone.