I have an app that suggests kids’ activities to parents. I’d like to set up a feature where AI takes the title of the activity (like play mini-golf) and the zip code of the user and suggests local places for that idea. I don’t think I want this to happen automatically, because that would rack up tokens. I was thinking of a button the user could push to get local suggestions. How would I set this up? How do I minimize token use?
I would suggest using the Perplexity API for this, since you would want something real-time.
Assuming you have access to Call API, you would want to construct an API call that includes a prompt steering the model to search for [activity name] in [place], and write the result into a column of your choice.
If you don’t have Call API, you can use the Make integration instead: send the inputs to Make, run them through a Perplexity module, and write the results back with the Glide module in Make.
In both cases, putting the call behind a button keeps token use down, since you only pay for queries the user explicitly requests.
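To make that concrete, here is a minimal sketch of the request the button could trigger, assuming Perplexity’s OpenAI-compatible chat completions endpoint. The model name, the `max_tokens` cap, and the response shape are assumptions to check against the current Perplexity docs; in a Glide app, the Call API column would send an equivalent request rather than running Python.

```python
# Sketch of a button-triggered Perplexity call. Assumes the
# OpenAI-compatible chat completions endpoint; model name and
# response shape are assumptions -- verify against current docs.
import json
import urllib.request

API_URL = "https://api.perplexity.ai/chat/completions"

def build_payload(activity: str, zip_code: str) -> dict:
    """Build the request body: a short, tightly scoped prompt keeps token use low."""
    prompt = (
        f"List up to 3 places to {activity} near US zip code {zip_code}. "
        "For each, give only the name, address, and a one-sentence description."
    )
    return {
        "model": "sonar",           # assumed model name
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 300,          # hard cap on output tokens
    }

def fetch_suggestions(activity: str, zip_code: str, api_key: str) -> str:
    """Perform the actual network call (only when the user presses the button)."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(activity, zip_code)).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Show the payload without spending any tokens:
payload = build_payload("play mini-golf", "94103")
print(payload["messages"][0]["content"])
```

Capping `max_tokens` and asking for exactly three terse results is where most of the token savings come from, on top of only calling on button press.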
Real-time: using an API plus some sort of integration for this, sure, why not.
Perplexity: using AI to replace search feels like flushing the toilet ten times to eliminate a speck of dust you found on the floor. Just because it can be done doesn’t mean it should. It would probably work, but it’s overkill, unnecessary, and by many standards probably wrong.
I wouldn’t know how to do it, but if it can be done, I would recommend integrating your app with a search API rather than an LLM API.
I might be understanding you wrong here, but I didn’t recommend it because it’s an LLM; I recommended it because it’s a search-centered LLM that fits this use case (searching for real-time info).
It’s my understanding that Perplexity remains a search-focused LLM, and that its per-query resource consumption is much larger than that of a conventional search engine. If that’s correct, then since I believe responsible and sustainable use of resources matters more than adding a feature to an app, I have to stand by my previous comment: I don’t recommend using AI to replace search, even if a search-centered LLM would do the trick. Not in a Glide app, and not for professional or personal use.
I also recognize that I don’t understand the technical differences and intricacies of LLMs versus search tools. I would be more than happy to be corrected, because I do find AI magical, including as a replacement for search, if using it that way were responsible and sustainable.
I assume you care about the sustainability as in the environmental impact?
I might not be a good source of info on environmental impact, but here are my two cents on the technical difference between using Perplexity and a traditional “search” API.
Perplexity is built on top of a search API; running a search is one of the steps it performs to gather the sources it needs to answer your question.
Think of it this way:
- A Traditional Search API (like Google’s): Using a conventional search API is like asking a librarian for information on “local mini-golf courses.” The librarian points you to an aisle and gives you a list of books (the search results/links) that match or partially match your phrase/keyword. It is then your responsibility as the app developer to go through each book, extract the relevant names and addresses, and compile a list for your user. That process is slow to build.
- An “Answer Engine” API (like Perplexity): Using Perplexity is like asking the same librarian the same question, but instead of handing you a reading list, the librarian reads the relevant parts of the best books, synthesizes the key information, and hands you a neatly formatted note: “Here are three local mini-golf courses with their addresses and a short description.”
The Perplexity API does the heavy lifting that comes after the initial search. Its Large Language Model (LLM) capabilities are used to:
- Understand the results: It analyzes the content of the top-ranking pages to figure out which ones actually contain the specific information you need.
- Extract the key data: It identifies and pulls out the relevant details, such as business names, locations, and descriptions, from the unstructured text of those websites.
- Synthesize and format the answer: It combines the information from multiple sources into a single, coherent, and easy-to-use response. For a developer, this response often comes in a structured format (like JSON), which can be directly and easily displayed in your app.
This saves a massive amount of development time and computing resources on your end, and results in a much better and faster experience for your user. For your specific feature, you wouldn’t just be getting a list of links; you’d be getting a clean, usable list of suggested locations, possibly with some reasoning text on top.
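To illustrate that last point, here is a small sketch of parsing such a structured answer into rows your app could display. The JSON shape and the place names below are invented purely for illustration; the real shape depends on the structured output you ask the model for in your prompt.

```python
# Hypothetical example of parsing a structured answer. The JSON shape
# and all place data here are invented for illustration only -- the
# actual shape depends on the prompt you use.
import json

raw_answer = json.dumps({
    "places": [
        {"name": "Putt Paradise", "address": "123 Main St",
         "blurb": "Indoor 18-hole course, good for ages 4+."},
        {"name": "Greenway Mini Golf", "address": "456 Oak Ave",
         "blurb": "Outdoor course with a snack bar."},
    ]
})

def to_display_rows(answer_json: str) -> list:
    """Turn the model's JSON answer into one display line per place."""
    places = json.loads(answer_json)["places"]
    return [f"{p['name']} ({p['address']}): {p['blurb']}" for p in places]

for row in to_display_rows(raw_answer):
    print(row)
```

Compare this to a raw search API, where you would be parsing ten HTML pages yourself to assemble the same three lines.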
Thanks for the explanation, Thinh