If you’re working on a new app in 2023, you’re likely thinking about how to integrate Large Language Model (LLM)-based features into your code. The hottest API for this kind of technology is ChatGPT from OpenAI. It’s 10x cheaper than the earlier GPT-3 models and apparently just as good. In its own version of Moore’s law, the technology is moving fast!
There is a wide variety of starter templates out there for interacting with ChatGPT. Our goal in releasing another one is to:
Deploying on Coherence also gives you an easy path to using the resources (see our docs) required to build a real app on top of this technology (e.g. a Postgres database, a Redis cache). For many apps, you’ll still need things like users, data storage, and various processing logic to get the right inputs and outputs for ChatGPT SDK calls. The integrated cloud support that Coherence provides for these resources makes them easy to use in your app.
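To make that concrete, here is a minimal sketch (not the starter repo’s actual code) of how a Flask route might combine a Redis cache with a ChatGPT SDK call; the route name, environment variables, caching policy, and the omitted Postgres lookup are all assumptions for illustration:

```python
import os

import openai
import redis
from flask import Flask, jsonify, request

app = Flask(__name__)
openai.api_key = os.environ["OPENAI_API_KEY"]
cache = redis.from_url(os.environ.get("REDIS_URL", "redis://localhost:6379"))


@app.post("/ask")
def ask():
    prompt = request.json["prompt"]

    # Serve a cached answer if we've already seen this prompt.
    cached = cache.get(prompt)
    if cached:
        return jsonify({"answer": cached.decode(), "cached": True})

    # A real app would also pull per-user context from Postgres here
    # (e.g. via SQLAlchemy) and fold it into the messages list.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    )
    answer = response["choices"][0]["message"]["content"]

    cache.set(prompt, answer, ex=3600)  # keep the answer for an hour
    return jsonify({"answer": answer, "cached": False})
```

On Coherence, the Postgres and Redis connection details above would come from the managed resources you configure, rather than from values you wire up by hand.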
The OpenAI GPT starter repo we’re showing here is written in Python, so it should be easy to adapt if you’re using Django, Flask, or FastAPI. Other languages and frameworks, such as Ruby on Rails or Next.js, are easy to deploy on Coherence as well. We’re happy to help, or you can follow the other examples in our docs.
The app in the starter repo is an AI-powered QA engineer that writes automated tests using Playwright. You can see a working demo of the app here. If you want some example code to run against the demo, you can use this simple HTTP server in Go. Check out the open-source starter on GitHub here. To get started, just: