Introduction to CoAPI

· 4 min read
Razvan Dinu

Colang was built to address the core challenges in building advanced conversational bots. One of these challenges is interoperability. In this article, we look at the motivation and the high-level design of the CoAPI.

Many companies that use conversational AI have invested in internal systems built with open-source technologies, existing conversational AI platforms, or both. Like web and mobile application development, building conversational bots requires many tools and techniques, each best suited to specific use cases. Colang aims to be the best at handling advanced conversational scenarios, i.e., conversations spanning tens of turns, with many potential interruptions, requiring complex context tracking.


Achieving interoperability in conversational AI is an open challenge. In software development, interoperability is the ability of a product or system to work with other products or systems, and it is mainly achieved through open standards and APIs. So, what does interoperability mean in the context of conversational AI?

We can talk about interoperability at every stage in the lifecycle of a conversational AI project, i.e., design, development, testing, deployment, live operation, monitoring, analytics, etc. In the following sections, we focus on live interoperability, i.e., the ability of multiple bots to handle a conversation with a human collaboratively.


Live Conversational AI Interoperability is the ability of multiple bots to handle a conversation with a human collaboratively by passing the control, the conversation context, and the relevant events to each other.

Models, Bots, and Environments

Before continuing, let's look at the relationship between models, bots, and environments.

A Colang model can represent a specific behavior, such as greeting, user authentication, or choosing a time slot, or a complete virtual assistant, e.g., for customer support. Any Colang model can be deployed as a bot in a conversational AI environment.

A conversational AI environment is a backend environment, managed or self-hosted, in which Colang models can be deployed as bots.


The CoAPI enables the configuration and deployment of bots in a conversational AI environment, as well as interaction with them. In the following sections, we look at how the CoAPI was designed to enable live interoperability as defined above.


From the control perspective, the API uses a standard REST architecture with HTTP Basic authentication. It can be used to start a conversation with a bot, post messages or events, and wait for bot responses.
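
To make this concrete, here is a minimal sketch of building such authenticated REST requests with the Python standard library. The base URL, endpoint paths, and payload fields are hypothetical placeholders; the actual ones are defined in the CoAPI documentation.

```python
import base64
import json
import urllib.request

# Hypothetical base URL; the real one comes from the CoAPI documentation.
BASE_URL = "https://api.example.com/v1"

def build_request(method, path, user, password, payload=None):
    """Build (but do not send) a CoAPI-style request with Basic auth."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(BASE_URL + path, data=data, method=method)
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Content-Type", "application/json")
    return req

# Hypothetical endpoints: start a conversation, then post a user message.
start = build_request("POST", "/conversations", "user", "secret")
post = build_request("POST", "/conversations/123/messages", "user", "secret",
                     {"text": "Hello!"})
```

Sending the prepared requests with `urllib.request.urlopen` (or any HTTP client) then follows the usual REST request/response pattern.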

The API assumes an asynchronous communication model, i.e., at any point in time, messages and events can be added to the conversation by any party. This is different from the typical request-response model used by many existing platforms and provides more flexibility, especially when the bot needs to perform background work.
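The difference from request-response can be simulated locally: model the conversation as a shared event log that any party may append to at any time, while another party blocks waiting for events of interest. This is an illustrative sketch, not the CoAPI client itself.

```python
import threading

class Conversation:
    """Sketch of an asynchronous event log: any party may append events
    at any time; consumers can block until a matching event arrives."""

    def __init__(self):
        self._events = []
        self._cond = threading.Condition()

    def post(self, event):
        with self._cond:
            self._events.append(event)
            self._cond.notify_all()

    def wait_for(self, event_type, timeout=2.0):
        """Block until an event of the given type is present, or time out."""
        with self._cond:
            while True:
                for e in self._events:
                    if e["type"] == event_type:
                        return e
                if not self._cond.wait(timeout):
                    return None

conv = Conversation()
# The bot may finish background work and post its response at any moment.
threading.Thread(
    target=lambda: conv.post({"type": "bot_said", "text": "Done!"})
).start()
event = conv.wait_for("bot_said")
```

The same shape applies over HTTP: a client can post events whenever it likes and separately poll (or long-poll) for new bot events, rather than pairing each message with a single synchronous reply.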


The context of a conversation comprises the user's information, rights, and preferences, the communication channel(s), data from previous conversations, and any domain-specific data required. The CoAPI accepts a conversation context as a standard JSON object, and it can be updated and read at any time.
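
As a sketch, a context of this shape is just a JSON object that the client can read, patch, and send back. The field names below are illustrative assumptions, not the official CoAPI schema.

```python
import json

# Hypothetical conversation context; the CoAPI accepts any JSON object.
context = {
    "user": {"id": "u-42", "name": "Ana", "rights": ["billing"]},
    "preferences": {"language": "en"},
    "channel": "web-chat",
}

def update_context(context, patch):
    """Shallow-merge an update into the context, as a client might do
    before writing it back through a (hypothetical) context endpoint."""
    return {**context, **patch}

updated = update_context(context, {"channel": "voice"})
payload = json.dumps(updated)  # ready to send as the request body
```

Because the context is plain JSON, any system that can produce or consume JSON can share it, which is exactly what hand-offs between bots require.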


Each conversation is a sequence of events, the most common ones being user_said and bot_said. The CoAPI provides endpoints for posting generic events, and shorthands for posting text messages. This approach enables a clean integration of other types of events such as user silent, dtmf_data, user navigation, or handoff, which can be used directly in colang flows.
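
A generic event is easy to model: a type plus an arbitrary payload, with text messages as a thin shorthand on top. The field names below (`timestamp`, `text`, `digits`, `target`) are assumptions for illustration, not the official event schema.

```python
import time

def make_event(event_type, **data):
    """Generic CoAPI-style event: a type plus arbitrary payload fields."""
    return {"type": event_type, "timestamp": time.time(), **data}

def user_said(text):
    # Shorthand mirroring the text-message convenience endpoint.
    return make_event("user_said", text=text)

# A conversation as a sequence of mixed event types.
events = [
    user_said("I'd like to book a slot"),
    make_event("dtmf_data", digits="1234"),
    make_event("handoff", target="human_agent"),
]
```

Because every event shares the same envelope, non-textual signals such as DTMF input or a handoff travel through the same endpoint and can be matched directly in Colang flows.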

Wrapping Up

The CoAPI is in private beta. If you have access to the Playground, you automatically have access to the API as well. The complete API documentation can be found here.

Integrating conversational AI components, regardless of the technology they've been built with, should be easy. The CoAPI has been designed to make it possible to integrate Colang models into existing systems. Allowing conversational components to be reused across multiple platforms and use cases will lead to greater efficiency in building conversational systems and a higher-quality customer experience.

In the next article, we will look at a proof of concept integration of a colang model into an existing conversational AI platform using the CoAPI.