Integrating Colang with Cognigy

· 2 min read
Razvan Dinu

In this article, we look at a proof-of-concept integration of a Colang model into an existing conversational AI platform using the CoAPI.

The video above shows how a customer support bot, built with Cognigy, handles a refund request for an order by handing the conversation over to a Colang model.


First, to use a Colang model in a Cognigy flow, we set the context variable colang_model and call a separate flow that uses the CoAPI to communicate with the deployed bot:
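As a rough sketch of this step, the context handed to the Colang Model flow can be assembled as a small dictionary. The helper below is illustrative only; `colang_model` mirrors the context variable named in the article, while `build_colang_context` and the `order_id` field are hypothetical stand-ins for whatever business data the bot needs.

```python
def build_colang_context(model_name, order_id=None):
    """Assemble the context passed to the Colang Model flow (illustrative)."""
    context = {"colang_model": model_name}
    if order_id is not None:
        # Relevant business data the Colang model may need,
        # e.g. the order being refunded.
        context["order_id"] = order_id
    return context


ctx = build_colang_context("customer-support", order_id="A-1042")
```

In Cognigy itself this corresponds to setting the context variables in the flow editor before invoking the Colang Model flow.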

Colang Model Flow

The Colang Model flow has two main stages:

  1. Initializing the conversation by passing all relevant context.
  2. Forwarding all relevant events and retrieving the bot responses.

We use a context variable conversation_id to track whether the conversation has already been initialized:
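The two-stage logic can be sketched as follows. `init_conversation` and `forward_event` are hypothetical stand-ins for the underlying CoAPI calls (the real endpoint names may differ); only the branching on `conversation_id` reflects the flow described above.

```python
def handle_turn(context, user_text, init_conversation, forward_event):
    """Initialize the CoAPI conversation once, then forward user input.

    `init_conversation` and `forward_event` are injected stand-ins for
    the actual CoAPI calls (assumed names, not a documented SDK).
    """
    if context.get("conversation_id") is None:
        # First turn: create the conversation, passing all relevant context.
        context["conversation_id"] = init_conversation(context)
    # Every turn: forward the user message and return the bot responses.
    return forward_event(context["conversation_id"], user_text)
```

Because the CoAPI calls are injected as functions, the same branching logic works regardless of how the HTTP layer is implemented.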

After the user message is posted via the API, we wait for the bot responses, forward them back to the user, and wait for additional input:
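The wait-then-forward step can be sketched as a simple polling loop. `fetch_responses` is a hypothetical stand-in for the CoAPI call that retrieves pending bot responses; the timeout and interval values are arbitrary illustrative defaults.

```python
import time


def wait_for_responses(conversation_id, fetch_responses,
                       timeout=10.0, interval=0.5):
    """Poll until the bot has produced responses or the timeout expires.

    `fetch_responses` is an injected stand-in for the CoAPI call that
    returns the bot's pending responses (assumed behavior: empty list
    while the bot is still thinking).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        responses = fetch_responses(conversation_id)
        if responses:
            return responses
        time.sleep(interval)
    # No responses within the timeout; the caller can fall back to a
    # default Cognigy message.
    return []
```

In the actual flow this polling happens inside Cognigy nodes rather than a Python loop, but the control flow is the same: post, wait, forward, repeat.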

Wrapping Up

Integrating a Colang model into an existing conversational AI platform like Cognigy is straightforward. This proof-of-concept integration uses the CoAPI for models deployed with the Colang Playground. In a future article, we will look at more advanced patterns, including passing context information and handoff.