Building Condé Nast’s Bot Future

Jeff Israel
Published in Chatbots Magazine · 3 min read · Feb 24, 2017

How Condé Nast is getting its feet wet with the next wave in user experience: conversational interfaces and chatbots. This post was originally published on Condé Nast’s Engineering Blog.

Austin, Texas is home to an enthusiastic, growing, and active bot-building community. Companies such as Conversable, Message.io, and Howdy.ai are a few examples of this presence. Austin was also home to the first Talkabot conference in September 2016. Condé Nast sees bots as a new, interactive content delivery mechanism for many of its brands. Seeking to collaborate with a thriving tech community and innovate in a growing space, building a bot became the first project for Condé Nast’s new Austin-based Partnerships team.

The first two questions to answer were what experience to provide and what platform to use. Image filters in Snapchat and Instagram have proved to be very popular experiences. Users respond to and enjoy the augmented reality the filters provide. At the same time, an opportunity arose to work with a technology leader in the face and skin mapping arena. The combination of these elements formed the idea of letting a user try on iconic makeup looks with a bot: send us a selfie, and we will apply a look to it and send back the result to share.

*Trying on a look with the Beauty Lenses prototype*

As for platform choice, we chose Facebook Messenger. Messenger is the clear leader in chat platform usage, with 900 million monthly active users. Messenger also boasts a rich set of APIs, making it the most interactive chat platform for experimentation. Content shared from Messenger to Facebook timelines can reach an even wider audience.

With these decisions made, it was time to start building.

One of the goals of the project, dubbed *Beauty Lenses*, was to create an SDK to ease future bot development. Node.js is our stack of choice, and the number of applicable SDKs available when we started this project was small. The evolving nature of the space makes it difficult to determine which modules will survive. Thus, we opted to write our own event-driven SDK on top of Messenger’s APIs. The application itself ends up with a number of event subscribers for the different message types it handles.

This is in stark contrast to the bare-bones approach we saw other libraries taking. Oftentimes they only provided the equivalent of a `message` event. Even the Hello World sample Node implementation had more logic than the SDKs out there. The SDK we have developed also provides a rich session store to give context about the user.

Dispatchers emit the session from the cache along with the event, so the application itself can add data. We relied on object destructuring in Node 6 to keep the syntax terse while giving event listeners all the necessary data. One way we use this to a user’s benefit in *Beauty Lenses* is by keeping track of the selfies they send, in the form of S3 object keys. A user can send one selfie and try many looks in succession.

Each platform has its own version of structured content display. For our Messenger implementation, we created simple, reusable objects to reduce code clutter.

We refrained from this for a long time to avoid hiding things in wrapper libraries. For example, we did not want to lose clarity around the structure of messages. During development, we decided that the added layer of abstraction was worthwhile. We extended this concept to Messenger’s templates as well.
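A sketch of what such reusable wrappers might look like; the class names and shapes below follow Messenger’s Send API message format, but they are an illustration rather than the SDK’s actual classes:

```javascript
// Hypothetical wrapper classes that serialize to the JSON the
// Messenger Send API expects for each message type.
class Text {
  constructor(text) {
    this.text = text;
  }
}

class Image {
  constructor(url) {
    this.attachment = { type: 'image', payload: { url } };
  }
}

// Wrapper for Messenger's generic template attachment.
class Generic {
  constructor(elements) {
    this.attachment = {
      type: 'template',
      payload: { template_type: 'generic', elements }
    };
  }
}

// The calling code stays readable: intent first, JSON details hidden.
const msg = new Image('https://example.com/look.jpg');
```

The trade-off mentioned above is visible here: the raw message structure is hidden inside the class, but call sites shrink to a single expressive line.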

The next parts of this series will dive deeper into the SDK, the data the bot generates, and lessons learned. For example, many best practice discussions assert that bots should always respond to greetings and help requests. To embrace this best practice, we extended our SDK to emit specific greeting and help events.

You might be wondering, “How can I get my hands on this SDK you keep talking about?” We are working toward a 1.0 release in the very near future. However, the GitHub repo for launch-vehicle-fbm is open to the public for anyone who wants to take a look now.
