News Labs + IRFS + CPS-Vivo Hackathon


We’ve just held a two-day internal hackathon with the Internet Research and Future Services (IRFS) and Vivo-CPS teams, with support from BBC Connected Studio. Here’s what we built, what we learned and where we go from here.

The Challenge

The main focus of the hack was to think about how a range of different audience experiences might be created from the same set of content.

Our team was additionally interested in exploring what data structures and models might be needed to allow content to be reused across different contexts, as well as ways that multimedia content could be combined and delivered responsively. We have a long-standing interest in conversational user experiences, which we hoped Labbers could integrate into their projects, if so inspired.

The Winners

Most Likely to Appear on a Roadmap: Team Rule Breakers

News Labber Dave Bevan worked with members of the CPS team and a UX designer to integrate Elvis (the BBC’s image repository) with Vivo, one of the BBC’s online content production systems. The team’s objective was to solve the problem of connecting cloud-based and non-cloud-based BBC systems while maintaining security.

In the current workflow, a journalist looking to add an image to a Vivo post would have to access Elvis from outside of the CMS, search for an image, download it and re-upload it to Vivo after manually retyping all the metadata, an especially inefficient process for live pages. Team Rule Breakers’ prototype experimented with using Amazon’s Simple Queue Service (SQS) to send messages back and forth over the firewall that separates Reith (the BBC’s corporate network) and Vivo on AWS. Building the prototype involved making changes to the Vivo user interface, adding a new API, and building a new component to run on Reith that can consume SQS messages, make calls to Elvis, and return JSON-encoded data to Vivo. Besides a live demo, they also designed a brand-new UI for Elvis-in-Vivo.
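To give a flavour of the approach, here is a minimal sketch of what the Reith-side bridge component might look like, assuming a pair of SQS queues for requests and responses and a hypothetical Elvis search endpoint (the queue URLs and the Elvis URL below are placeholders, not the real services):

```python
# Hypothetical sketch of the Reith-side bridge: poll an SQS "requests" queue,
# run the image search against Elvis, and post the JSON results back on a
# "responses" queue that Vivo consumes on the AWS side.
import json

import boto3
import requests

sqs = boto3.client("sqs", region_name="eu-west-1")

REQUEST_QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/elvis-requests"    # placeholder
RESPONSE_QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/elvis-responses"  # placeholder
ELVIS_SEARCH_URL = "https://elvis.internal.example/api/search"  # hypothetical Elvis endpoint


def poll_once():
    """Handle one batch of image-search requests coming over the firewall."""
    resp = sqs.receive_message(QueueUrl=REQUEST_QUEUE_URL, MaxNumberOfMessages=5, WaitTimeSeconds=10)
    for message in resp.get("Messages", []):
        request = json.loads(message["Body"])  # e.g. {"query": "houses of parliament", "requestId": "abc"}
        results = requests.get(ELVIS_SEARCH_URL, params={"q": request["query"]}).json()
        sqs.send_message(
            QueueUrl=RESPONSE_QUEUE_URL,
            MessageBody=json.dumps({"requestId": request["requestId"], "results": results}),
        )
        sqs.delete_message(QueueUrl=REQUEST_QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])


if __name__ == "__main__":
    while True:
        poll_once()
```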

Best Collaboration: Saturday News Edition

News Labbers Rob Cochran, Alli Shultes and Gar Thomas worked with members of the Vivo-CPS team to produce a “weekly edition” of all content published by the BBC. The edition is navigable by tag selection and designed to solve the problem of content discovery, where visitors to the BBC might not know where to look to get beyond the day’s top stories. By enriching data from the CPS API, the team was able to generate a sorted list of articles published over seven days, with associated links and images.

Saturday news
A prototype of a weekly roundup of content, categorised by tag.
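As a rough illustration, the assembly step might look something like the sketch below; the CPS endpoint and field names are assumptions made purely for the example:

```python
# Rough sketch of building the weekly edition: fetch the last seven days of
# articles from a hypothetical CPS API endpoint, group them by tag, and sort
# each group newest-first so it reads like a roundup.
from collections import defaultdict
from datetime import datetime, timedelta

import requests

CPS_API_URL = "https://cps.api.example/articles"  # hypothetical endpoint


def build_weekly_edition():
    since = (datetime.utcnow() - timedelta(days=7)).isoformat()
    articles = requests.get(CPS_API_URL, params={"publishedAfter": since}).json()["items"]

    edition = defaultdict(list)
    for article in articles:
        for tag in article.get("tags", []):
            edition[tag].append({
                "headline": article["headline"],
                "url": article["url"],
                "image": article.get("image"),
                "published": article["published"],
            })

    for tag in edition:
        edition[tag].sort(key=lambda a: a["published"], reverse=True)
    return edition
```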

Surprise Us!: Team West-17

News Labber Lei He worked with members of Vivo-CPS and R&D to prototype a Google Home Action that exposes Vivo content suitable for voice. The prototype works by extracting intents and entities from the user’s spoken query and retrieving the relevant content through the Vivo API. Users can ask for the latest news and hear a list of recently published headlines, or ask for the latest on a specific entity (“What’s the latest on Donald Trump?”). The Google Home Action reads the relevant material out to the user.
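A minimal sketch of the fulfilment webhook behind such an Action might look like this, assuming a Dialogflow-style request body and a hypothetical Vivo search endpoint:

```python
# Sketch of a voice fulfilment webhook: work out what the user asked for,
# query a hypothetical Vivo search endpoint, and return headlines as text
# for the assistant to read aloud.
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)
VIVO_API_URL = "https://vivo.api.example/posts"  # hypothetical endpoint


@app.route("/webhook", methods=["POST"])
def webhook():
    query = request.get_json()["queryResult"]
    intent = query["intent"]["displayName"]
    entity = query.get("parameters", {}).get("topic")  # e.g. "Donald Trump"

    params = {"q": entity} if intent == "latest_on_topic" and entity else {"latest": "true"}
    posts = requests.get(VIVO_API_URL, params=params).json()["items"][:3]

    headlines = ". ".join(post["headline"] for post in posts)
    return jsonify({"fulfillmentText": headlines or "I couldn't find anything on that just now."})
```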

Team West-17 was also designated the overall winner of the hackathon for its work.

All Projects

Team CPS

What if journalists didn’t have to manually suggest relevant tags for their articles before publishing? Team CPS prototyped automated tag suggestion as part of the standard production process.

This was done as an integration between Vivo and Starfruit, an R&D API that suggests tags for articles. Starfruit has been trained on a corpus of BBC News stories and their tags. The recommendation engine built by Team CPS takes the post title and body text, returns the five most relevant tags based on Starfruit’s suggestions, and attaches them to the draft story. Bonus feature: if Starfruit fails, Vivo skips the automatic tagging without blocking publication of the post.
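As an illustration, the suggestion step might be as simple as the sketch below; the Starfruit endpoint and payload shape are assumptions, and the fallback mirrors the behaviour described above:

```python
# Hedged sketch of tag suggestion via a hypothetical Starfruit endpoint.
# If the call fails, publication carries on without automatic tags.
import requests

STARFRUIT_URL = "https://starfruit.api.example/suggest"  # hypothetical endpoint


def suggest_tags(title, body, limit=5):
    try:
        response = requests.post(STARFRUIT_URL, json={"title": title, "body": body}, timeout=2)
        response.raise_for_status()
        suggestions = response.json()["tags"]
    except requests.RequestException:
        return []  # Starfruit is down: publish without automatic tags
    return suggestions[:limit]


def attach_tags(draft):
    # Merge suggested tags with any the journalist added, dropping duplicates.
    draft["tags"] = list(dict.fromkeys(draft.get("tags", []) + suggest_tags(draft["title"], draft["body"])))
    return draft
```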

Team MadameCurtis

MadameCurtis experimented with new ways of delivering video content from BBC Rewind and Window on the Newsroom (WON). By soliciting “thumbs up/thumbs down” feedback on a video, the team’s project algorithmically adjusts the content it shows the user next.

Suggested content is generated by running video files through the BBC’s Redux annotator and BBC News Labs’ Transcriptor tool. Videos that pass through this process are typically pre-packaged with set start and end points. MadameCurtis instead created a “halfway video”, where viewers start at a specified entry point but can then help “make up” the next part of the story based on their interests.

The algorithm feeding the new format took 50,000 news articles and used machine learning to determine which tags typically occur together. By starting users on a random tag and then recommending onward journeys drawn from related tags, it helps deliver a personalised video experience.
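A simplified sketch of that co-occurrence step, with the article fields assumed for illustration, might look like this:

```python
# Simplified sketch of the co-occurrence step: count how often tags appear
# together across the article corpus, then recommend the tags most strongly
# associated with the one the viewer started on.
from collections import Counter, defaultdict
from itertools import combinations


def build_cooccurrence(articles):
    """articles: iterable of dicts with a 'tags' list (assumed field name)."""
    cooccurrence = defaultdict(Counter)
    for article in articles:
        for a, b in combinations(set(article["tags"]), 2):
            cooccurrence[a][b] += 1
            cooccurrence[b][a] += 1
    return cooccurrence


def recommend_next(cooccurrence, current_tag, n=3):
    """Tags that most often appear alongside the current one drive the onward journey."""
    return [tag for tag, _ in cooccurrence[current_tag].most_common(n)]
```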

Team Atomiserranger

News Labber Rachel Wilson experimented with ways that Vivo posts could be pulled together to form an article in a non-linear way. Currently, Vivo streams consist of elements stitched together in reverse chronological order, with no way to capture relationships between posts beyond publishing time. Rachel put together a Neo4j database to visualise the different ways a journalist might arrange posts into a more engaging news story.

Atomiserranger
Atomiserranger visualises relationships between Vivo elements.
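Using the official neo4j Python driver, storing an editorially chosen link between two posts might look roughly like the sketch below; the relationship types and node properties are illustrative assumptions:

```python
# Sketch of recording a relationship between two Vivo posts in Neo4j so the
# graph can be explored in ways other than reverse chronological order.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))


def link_posts(post_id, related_id, relationship="FOLLOWS_ON_FROM"):
    # Relationship types cannot be passed as Cypher parameters, so the label
    # is interpolated here; it should come from a fixed, trusted set.
    query = (
        "MERGE (a:Post {id: $post_id}) "
        "MERGE (b:Post {id: $related_id}) "
        f"MERGE (a)-[:{relationship}]->(b)"
    )
    with driver.session() as session:
        session.run(query, post_id=post_id, related_id=related_id)
```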

Rachel received an honourable mention in the “Surprise Us!” category for her work.

Team Pajaro

Team Pajaro built a Freebird-Vivo integration that allows journalists to search for relevant external content and reformat it as a post on BBC live pages. Journalists search for external news surfaced by an editorial algorithm, then select returned stories; the prototype assembles each post automatically, populating the author, title and summary fields, adding information about the publisher and linking out to the original content.
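For illustration, mapping an externally sourced story into a Vivo post might be as simple as the sketch below; the field names on both sides are assumptions:

```python
# Sketch of turning an externally sourced story into a Vivo-style post,
# carrying attribution and a link back to the original content.
def external_story_to_post(story):
    return {
        "title": story["title"],
        "summary": story["summary"],
        "author": story.get("author"),
        "attribution": {
            "publisher": story["publisher"],
            "url": story["url"],  # link out to the original content
        },
    }
```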

What Next?

Many of the prototypes developed during the two-day hack are both feasible projects and potentially useful for audiences and journalists at the BBC. We’re exploring how to move forward with the fantastic ideas generated, and we’ll post updates here.

