
Polling Third-Party Endpoints with a Microservice

A lightweight, task-specific microservice can dramatically extend the capabilities of Conscia's low-code platform while keeping the business-user experience streamlined and clear, even when the workflow executes complex programmed functions. This recipe demonstrates one such microservice.

While calls to DX Engine's Experience API support both synchronous and asynchronous calls to web services, some workflows cannot be adequately served by a single synchronous or asynchronous call. Strategies exist to fulfill this need using Conscia alone, but it is typical for enterprises to implement a number of cloud-hosted serverless functions peripheral to their orchestration platform instance.

This recipe demonstrates a typical microservice interaction: a function called Relay. It requests that a job be enqueued on another system, actively polls a job status endpoint, and then queries the job result once the job has concluded. In this example, we will send a query to a specific OpenAI (ChatGPT) Assistant and await the response created by that LLM.

Additionally, we take advantage of Conscia's Experience Rules to capture the majority of the required data fields, so that business users can call upon "pre-fab" configuration combinations instead of managing complex state themselves.

By following this pattern, any necessary tools and capabilities, such as real-time LLM content, can supplement and extend Conscia's platform.

Mapping Out The Recipe

(Diagram: Polling with a Microservice Visualizer)

When the frontend calls Conscia's Experience API, it will pass one Context field:

  • The relayConnection value, which is the name of a Connection. This will provide the Relay Component both its actual connection details and the rules by which the outgoing payload is populated.

An example call looks like this:

POST {{engineUrl}}/experience/components/_query
X-Customer-Code: {{customerCode}}
Authorization: Bearer {{dxEngineToken}}

{
  "componentCodes": ["relay"],
  "context": {
    "relayConnection": "relay-openai"
  }
}

Based on the context provided, we will reach out to OpenAI with a pre-populated query ("Explain deep learning to a 5 year old."), and the LLM will respond:

{
  "duration": 6308,
  "components": {
    "relay": {
      "status": "VALID",
      "response": "Imagine your brain is like a big team of tiny helpers who work together to figure things out. Deep learning is like teaching a computer to have its own team of tiny helpers, called neurons, that work together to learn new things. Just like you learn by looking at stuff and practicing, the computer looks at lots of examples, practices a lot, and gets smarter over time."
    }
  },
  "errors": []
}

Microservice Configuration Details

For this recipe, we hosted the following JavaScript application on Google Cloud, using Cloud Run. However, it is a "vanilla" Node.js application (using only common packages such as Express and Axios, plus the built-in https module) that can be hosted on any cloud or server.

The inputs to this application are detailed in the Relay Rules Component. In summary, the service calls apiUrl1 with apiBody1 and headers, attaching an Authorization header if one is supplied via the downstream-authorization request header. Once that call completes, it substitutes any response variables into the subsequent URLs, then polls pollUrl every pollInterval milliseconds until the response matches doneRegex. Finally, it calls apiUrl2: a POST with apiBody2 if apiBody2 was provided, otherwise a GET. The second API response is sent back to Conscia. Logging and error handling are included.
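The variable-substitution step can be sketched in isolation. The {!{name}!} delimiter syntax is the one the recipe uses; the IDs below are hypothetical examples:

```javascript
// Sketch of the response-variable substitution described above.
// The {!{name}!} delimiters match the microservice code; the IDs are made up.
const substitute = (template, context) =>
  Object.entries(context).reduce(
    (result, [name, value]) => result.replace(`{!{${name}}!}`, value),
    template
  );

const context = { thread_id: 'thread_abc123', id: 'run_xyz789' };
const pollUrl = substitute(
  'https://api.openai.com/v1/threads/{!{thread_id}!}/runs/{!{id}!}',
  context
);
console.log(pollUrl);
// → https://api.openai.com/v1/threads/thread_abc123/runs/run_xyz789
```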

index.js
const express = require('express');
const axios = require('axios');
const https = require('https');

const logging = process.env.NODE_ENV !== 'production';

const log = (message) => {
  if (logging) {
    console.log(message);
  }
};

// Create a new HTTPS agent.
// Note: rejectUnauthorized: false disables SSL certificate verification;
// only use this against endpoints whose certificates you cannot validate.
const agent = new https.Agent({
  secureProtocol: 'TLSv1_2_method',
  rejectUnauthorized: false
});

const app = express();
app.use(express.json());

// Endpoint to receive requests from your program
app.post('/start-job', async (req, res) => {
  let {
    apiUrl1,
    apiBody1,
    headers = {},
    responseVariables = [],
    pollUrl,
    pollInterval = 1000,
    doneRegex,
    apiUrl2,
    apiBody2
  } = req.body;

  if (req.headers['downstream-authorization']) {
    headers['authorization'] = req.headers['downstream-authorization'];
  }
  log("req.body: " + JSON.stringify(req.body));
  log("headers: " + JSON.stringify(headers));
  log("req.headers: " + JSON.stringify(req.headers));

  try {
    // Step 1: Make a request to API URL #1 to start a job
    const response1 = await axios.post(apiUrl1, apiBody1, {
      headers: { ...headers },
      httpsAgent: agent
    });
    log("response1: " + JSON.stringify(response1.data));

    // Step 2: Dynamically retrieve variables from the response and
    // substitute them into the subsequent URLs
    const context = {};
    responseVariables.forEach(variable => {
      context[variable] = response1.data[variable];
      pollUrl = pollUrl.replace(`{!{${variable}}!}`, context[variable]);
      apiUrl2 = apiUrl2.replace(`{!{${variable}}!}`, context[variable]);
    });

    log("context: " + JSON.stringify(context));
    log("pollUrl: " + pollUrl);

    // Step 3: Poll until the response matches doneRegex
    let pollResponse;
    const doneRegexObj = new RegExp(doneRegex);
    let isJobDone = false;

    while (!isJobDone) {
      await new Promise(resolve => setTimeout(resolve, pollInterval));

      pollResponse = await axios.get(pollUrl, {
        headers: { ...headers },
        httpsAgent: agent
      });

      log("pollResponse: " + JSON.stringify(pollResponse.data) + ", " + JSON.stringify(pollResponse.status) + ", " + JSON.stringify(pollResponse.headers));

      // Check the entire pollResponse for the key-value pair in doneRegex.
      if (doneRegexObj.test(JSON.stringify(pollResponse.data))) {
        isJobDone = true;
        log("isJobDone: " + isJobDone);
      }
    }

    // Step 4: Run API URL #2: POST if a body was provided, otherwise GET
    let apiResponse2;
    if (apiBody2) {
      apiResponse2 = await axios.post(apiUrl2, apiBody2, {
        headers: { ...headers },
        httpsAgent: agent
      });
    } else {
      apiResponse2 = await axios.get(apiUrl2, {
        headers: { ...headers },
        httpsAgent: agent
      });
    }

    log("apiResponse2: " + JSON.stringify(apiResponse2.data) + ", " + JSON.stringify(apiResponse2.status) + ", " + JSON.stringify(apiResponse2.headers));
    res.status(apiResponse2.status).json(apiResponse2.data);

  } catch (error) {
    console.error('Error handling job request:', error);

    // Log detailed error information
    if (error.response) {
      console.error('Response data:', error.response.data);
      console.error('Response status:', error.response.status);
      console.error('Response headers:', error.response.headers);
      res.status(error.response.status).json({ error: error.response.data });
    } else if (error.request) {
      console.error('Request data:', error.request);
      res.status(500).json({ error: 'No response received from the server' });
    } else {
      console.error('Error message:', error.message);
      res.status(500).json({ error: 'An unexpected error occurred' });
    }
  }
});

// Start the server
const PORT = process.env.PORT || 8080;
app.listen(PORT, () => {
  log(`Microservice listening on port ${PORT}`);
});
package.json
{
  "name": "relay",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/conscia/relay.git"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "bugs": {
    "url": "https://github.com/conscia/relay/issues"
  },
  "homepage": "https://github.com/conscia/relay#readme",
  "dependencies": {
    "axios": "^1.7.7",
    "dotenv": "^16.4.5",
    "express": "^4.21.0"
  }
}

OpenAI Configuration Details

OpenAI Assistants are the API-only equivalent of Custom GPTs, which are available only through the web UI. They cannot be accessed in a single-call Completion; instead, they are accessed by creating a new threaded conversation. This, however, unlocks additional capabilities, such as adding further back-and-forth to the dialog, redirecting a conversation to a more appropriate Assistant, and so on.

OpenAI API Key

On the OpenAI Platform API Keys page, create an API Key that allows for, at a minimum, "Assistants" and "Threads" write permissions.

OpenAI Assistant

On the OpenAI Platform Assistants page, create an Assistant to receive the incoming request. This example uses a simple, to-the-point Assistant:

| Field | Value |
| --- | --- |
| Name | Succinct Jimmy |
| System instructions | You're an educational service for extremely smart people. Give brief, technically-sound explanations. |
| Model | gpt-4o |
| Response format | text |
| Temperature | 1 |
| Tools | None |

Retain the API key and the Assistant ID you just generated. We will execute the remainder of the recipe in DX Engine.

DX Engine Configuration Details

The topics in this section explain how to implement the elements involved in this recipe.

Context Fields

We need to create a Context Field for managing Relay's Connection.

  1. In the top navigation, click Settings, and then click Context Fields. The Manage Context Fields page appears.
  2. Click Add Context Field. The Create Context Field wizard appears.
  3. For Context Field Name, enter relay-openai.
  4. For Display Name, enter Relay OpenAI.
  5. Optionally, enter a Description for the Context Field.
  6. For Data Type, select String.
  7. Click Submit.

Secrets

We need to create a Secret to store the OpenAI API Key.

  • Navigate to the Secrets page (Settings --> Secrets).
  • Click the + Add Secret button.
  • Enter the following and click Submit:
| Field | Value |
| --- | --- |
| Secret Code | openAI |
| Secret Name | openAI (ChatGPT) Key |
| Secret Value | Enter your openAI API key. |

Connections

Relay OpenAI Connection

To create a Connection used to direct Relay to engage with OpenAI, in the DX Engine UI:

  • Navigate to the Connections page (Manage Experiences --> Connections).
  • Click the + Add Connection button.
  • Enter the following and click Submit.
| Field | Value |
| --- | --- |
| Connection Code | relay-openai |
| Connection Name | Relay: OpenAI |
| Connector | Universal API Connector |
| Base URL | The URL provided by the host of your microservice |
| Connection | Get value from: Literal (DX Graph) |
| Path | /start-job |
| Method | POST |

Components

Relay Rules Component

  • Navigate to the Experience Components page (Manage Experiences --> Components).
  • Click the + Add Component button.
  • Enter the following and click Submit.
| Field | Form Tab | Value |
| --- | --- | --- |
| Component Code | Main | relay-rules |
| Component Name | Main | Relay Rules |
| No Rules | Main | Not Checked |
| Component Type | Main | Conscia - Metadata |
| Execution based on dependencies | Conditions | Skip on Failed Dependency; Do not Skip on Skipped Dependency; Skip on Invalid Dependency |
  • Now, establish the following Experience Rule Attributes on the Attribute Definition tab:
| Attribute Name & Attribute Property | Attribute Description | Attribute Required | Type |
| --- | --- | --- | --- |
| apiUrl1 | The first API that Relay will call. | Required | Text |
| apiBody1 | The contents of the first call. | Required | Text |
| headers | Headers for all calls made by Relay. | Required | Text |
| responseVariables | Variables in subsequent Relay URLs. Write them like this when in use: /check-status/{!{job-id}!}/ | Not Required | Text |
| pollUrl | The URL we will poll to. | Required | Text |
| pollInterval | The interval at which Relay will poll, in ms. | Not Required | Number - default 1000 |
| doneRegex | The regex which we will match against the poll response. | Required | Text |
| apiUrl2 | The second API we call, once the pollUrl matches doneRegex. | Required | Text |
| apiBody2 | The second API call's body. If absent, Relay will GET; if present, Relay will POST. | Not Required | Text |

Relay Rules Transform Component

  • Navigate to the Experience Components page (Manage Experiences --> Components).
  • Click the + Add Component button.
  • Enter the following and click Submit.
| Field | Form Tab | Value |
| --- | --- | --- |
| Component Code | Main | relay-rules-transform |
| Component Name | Main | Relay Rules Transform |
| No Rules | Main | Checked |
| Component Type | Main | Conscia - Data Transformation Script |
| Execution based on dependencies | Conditions | Skip on Failed Dependency; Do not Skip on Skipped Dependency; Skip on Invalid Dependency |
| Data to modify | Main | Get value from: JS Expression; componentExtras('relay-rules').rule.attributes |
| Script | Main | Below: |
function isJSON(str) {
  try {
    JSON.parse(str);
    return true;
  } catch (e) {
    return false;
  }
}

var retval = Object.entries(data).reduce((acc, [key, value]) => {
  acc[key] = isJSON(value) ? JSON.parse(value) : value;
  return acc;
}, {});

retval;
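To illustrate what this script does, here is the same logic run against a hypothetical set of rule attributes: values that parse as JSON (numbers, arrays, objects) are converted, and plain strings pass through unchanged:

```javascript
// Standalone sketch of the transformation script above, applied to
// hypothetical attribute values from an Experience Rule.
function isJSON(str) {
  try {
    JSON.parse(str);
    return true;
  } catch (e) {
    return false;
  }
}

const data = {
  pollInterval: '300',                      // parses as a number
  responseVariables: '["thread_id", "id"]', // parses as an array
  doneRegex: 'completed'                    // stays a plain string
};

const retval = Object.entries(data).reduce((acc, [key, value]) => {
  acc[key] = isJSON(value) ? JSON.parse(value) : value;
  return acc;
}, {});

console.log(retval);
// → { pollInterval: 300, responseVariables: [ 'thread_id', 'id' ], doneRegex: 'completed' }
```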

Relay Standalone Run Component

  • Navigate to the Experience Components page (Manage Experiences --> Components).
  • Click the + Add Component button.
  • Enter the following and click Submit.
| Field | Form Tab | Value |
| --- | --- | --- |
| Component Code | Main | relay |
| Component Name | Main | Relay: Run Relay Standalone |
| Component Description | Main | Run Relay once, making polling calls to third party endpoints. |
| No Rules | Main | Not Checked |
| Component Type | Main | Conscia - Universal API Connector |
| Connection | Main | Get value from: JS Expression; contextField('relayConnection') |
| Body | Main | Get value from: Component Response - Relay: Rules Transform |
| Response Transform | Main | response.data[0].content[0].text.value |
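The Response Transform above navigates OpenAI's "list messages" response. A sketch of the shape it assumes (the field path is from the recipe; the sample values are made up):

```javascript
// Hypothetical OpenAI "list messages" response; only the fields the
// Response Transform touches are shown, and the values are invented.
const response = {
  data: [
    {
      role: 'assistant',
      content: [
        { type: 'text', text: { value: 'Imagine your brain is like a big team of tiny helpers...' } }
      ]
    }
  ]
};

// The Response Transform expression from the table above:
const answer = response.data[0].content[0].text.value;
```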

Experience Rules

Experience Rules for Relay to OpenAI LLM

  • Navigate to the Omnichannel Experience Rules page (Manage Experiences --> Experience).
  • Navigate to your Relay: Rules Component under All Components.
  • Click the + Add Experience Rule button.
  • Enter the following and click Submit.
| Field | Form Tab | Value |
| --- | --- | --- |
| Rule ID | Main | openAI |
| Rule Name | Main | openAI |
| Priority | Main | 10 |
| Active | Experience | Checked |
| Real-time Context Evaluation | Experience | relayConnection is equal to (=) relay-openai |
| apiUrl1 Attribute | Experience | https://api.openai.com/v1/threads/runs |
| apiBody1 Attribute | Experience | Below |
| headers Attribute | Experience | { "OpenAI-Beta": "assistants=v2", "Content-Type": "application/json" } |
| responseVariables Attribute | Experience | ["thread_id", "id"] |
| pollUrl Attribute | Experience | https://api.openai.com/v1/threads/{!{thread_id}!}/runs/{!{id}!} |
| pollInterval Attribute | Experience | 300 |
| doneRegex Attribute | Experience | (completed |
| apiUrl2 Attribute | Experience | https://api.openai.com/v1/threads/{!{thread_id}!}/messages |
| apiBody2 Attribute | Experience | blank |

apiBody1:

{
  "assistant_id": "asst_lECeouTchuK22F5nuH43Kbwk",
  "thread": {
    "messages": [
      {
        "role": "user",
        "content": "Explain deep learning to a 5 year old."
      }
    ]
  }
}
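For reference, the done check in the microservice stringifies each poll response and tests doneRegex against it. A sketch of that match, using an assumed regex that fires when the run reaches the completed status (the run objects below are simplified, invented examples):

```javascript
// How the microservice applies doneRegex: the poll response body is
// stringified and tested against the regex. The regex and run objects
// here are assumptions for illustration.
const doneRegexObj = new RegExp('"status":\\s*"completed"');

const inProgress = { id: 'run_xyz789', status: 'in_progress' };
const completed = { id: 'run_xyz789', status: 'completed' };

console.log(doneRegexObj.test(JSON.stringify(inProgress))); // → false
console.log(doneRegexObj.test(JSON.stringify(completed)));  // → true
```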

Extending the Recipe

From this point, we could extend the recipe to allow Relay to engage with a second remote system by creating and populating a new Connection, and then creating an Experience Rule that populates the payload when that connection is provided in context. It could even be a very similar or identical set of connection attributes, but interacting with a different Assistant or sending a different message across.

A trivial modification to the recipe would enable the response body to be delivered from a contextField instead of from an Experience Rule, allowing a "choose-your-own-adventure" interface to any remote endpoint.

Finally, modifications to the microservice, such as creating a second function that could be called, can enable even more complex behaviours such as call-poll-poll-call, call-stream-return, or even circular or tree-traversing workflows where a function calls itself or another microservice.
