OpenAI 

What does the OpenAI connection provide? 

The OpenAI connection allows you to interact with the OpenAI API using a pre-authenticated client. It's designed to simplify the process of integrating OpenAI's powerful language models into your Gadget application.

Setting up the OpenAI connection 

In your Gadget application, click Plugins in the sidebar to view the list of connections, then select the OpenAI connection to configure it.

A view of Gadget's connections page with an arrow pointing to the OpenAI connection

Choose between using Gadget's managed OpenAI API keys and using your own.

A view of Gadget's OpenAI connection modal with the Use Gadget's keys option selected

Option 1: Use Gadget's keys 

For a quick start without signing up for an OpenAI account, choose Gadget managed keys. They come pre-configured and include $50 in OpenAI credits to get you started. Keep in mind that Gadget managed keys are rate limited and intended for development only. When you're ready for production, replace them with your own OpenAI keys.

Option 2: Use your own API keys 

If you already have an OpenAI account or want more control over the API, use your own OpenAI keys. This lets you manage your own rate limits and access the full range of OpenAI platform features.

How does the OpenAI connection work? 

With the OpenAI connection set up, you can now start using the OpenAI API in your Gadget actions and routes to leverage the power of OpenAI's models.

Example: Using a chat completion in a global action 

Create a new global action with the API Identifier useOpenAI and add the following code to the action's run function:

api/actions/useOpenAI.js
JavaScript
export const run = async ({ connections }) => {
  // connections.openai is a pre-authenticated OpenAI client
  const chatCompletion = await connections.openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello from Gadget!" }],
  });

  return { message: chatCompletion.choices[0].message };
};
A view of a global action using the OpenAI connection.

Now click the Run action button to go to your API playground and test the action. You should see a result like this:

json
{
  "data": {
    "useOpenAI": {
      "success": true,
      "errors": null,
      "result": {
        "message": {
          "role": "assistant",
          "content": "Hello Gadget! How can I assist you today?"
        }
      }
    }
  },
  "extensions": {
    "logs": "https://links.ggt.dev/logs/3/bda7ac999cd8ec291ecdb47f12319720",
    "traceId": "bda7ac999cd8ec291ecdb47f12319720"
  }
}
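The same pre-authenticated client is available in HTTP routes. Here is a minimal sketch, assuming a route file named api/routes/GET-hello-openai.js (the file name and response shape are illustrative, not part of the connection):

api/routes/GET-hello-openai.js
JavaScript
// Hypothetical route that returns a chat completion as JSON
export default async function route({ request, reply, connections }) {
  const chatCompletion = await connections.openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello from a Gadget route!" }],
  });

  await reply.send({ message: chatCompletion.choices[0].message.content });
}

Global actions can also be called from your frontend through your app's generated API client, for example await api.useOpenAI().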

Using LangChain with Gadget-managed credentials 

If you are working with LangChain, note that with Gadget-managed keys, passing the API key alone is not enough; you also need to pass the connection's base URL. The snippets below show how to set up two models with LangChain: ChatOpenAI and OpenAIEmbeddings.

Chat model integration 

JavaScript
import { ChatOpenAI } from "@langchain/openai";

const chatModel = new ChatOpenAI({
  streaming: true,
  modelName: "gpt-3.5-turbo-0613",
  temperature: 0,
  openAIApiKey: connections.openai.configuration.apiKey,
  configuration: {
    // Set the base URL for OpenAI API calls
    basePath: connections.openai.configuration.baseUrl,
  },
});

Text embedding integration 

JavaScript
import { OpenAIEmbeddings } from "@langchain/openai";

const openAIEmbedding = new OpenAIEmbeddings(
  {
    openAIApiKey: connections.openai.configuration.apiKey,
  },
  {
    // Set the base URL for OpenAI API calls
    basePath: connections.openai.configuration.baseUrl,
  }
);
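Once constructed, these models are used like any other LangChain model. A minimal usage sketch (the prompt text is only an example):

JavaScript
// Ask the chat model a question; invoke accepts a string or a list of messages
const response = await chatModel.invoke("Say hello from Gadget!");
console.log(response.content);

// Embed a piece of text; embedQuery returns an array of numbers
const vector = await openAIEmbedding.embedQuery("Hello from Gadget!");
console.log(vector.length);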
