Interactivity in live sport and music broadcasting could be the next game-changing technology upgrade.
A guest post by Adrian Pennington
Stripping a programme of its constituent elements and allowing viewers to compose their own version might sound like a bizarre thing to do, but technology is close to making this a reality.
Live content already comprises separate clean feeds of video, audio and graphics (a still image, a caption) before they are ‘baked in’ to the signal on broadcast. Object-based broadcasting (OBB) simply extracts more original raw elements, treats them as data and lets a user’s web-based device adapt the broadcast according to context, such as screen size or viewer preference.
“The internet works by chopping things up, sending them over a network and reassembling them based on audience preference or device context,” explains BBC R&D head of operations Jon Page. “OBB is the idea of making media work like the internet.”
With the 4K Ultra HD chain largely solved, R&D teams have pinpointed interactivity of the live experience as the next game-changing tech upgrade.
BT Sport, Sky Sports and the BBC are all investigating OBB. BBC chief technology officer Matthew Postgate has championed the idea, calling it “profound” and “little understood”.
“It’s about moving the whole industry away from thinking of video and audio as hermetically sealed, and towards a place where we are no longer broadcasters but datacasters,” he says.
Larger on-screen graphics for the visually impaired, or sign-language presenters in place of regular presenters to assist the hard of hearing, are two examples intended to improve accessibility. The BBC has demonstrated this with an object-based deconstruction of a weather forecast. Recordings of presenters delivering the forecast in front of a green screen were aligned with a data stream containing weather icons, animation data and subtitles, then delivered as a package for rendering on the client device.
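The weather demonstration can be pictured as a simple client-side composition step. The sketch below is purely illustrative (every field name and file name is invented, not taken from the BBC's actual system): the client picks the presenter object and graphics scale to suit the viewer, then composes them with the accompanying data stream.

```python
# Hypothetical sketch of client-side assembly of an object-based forecast.
# All names (tracks, fields) are invented for illustration.

presenter_tracks = {
    "standard": "presenter_greenscreen.mp4",
    "signed":   "presenter_signed_greenscreen.mp4",  # sign-language presenter
}

# The data stream delivered alongside the video: icons, timings, subtitles.
data_stream = [
    {"t": 0.0,  "icon": "sun",   "caption": "Dry start in the south"},
    {"t": 12.5, "icon": "cloud", "caption": "Cloud building from the west"},
]

def render_forecast(accessibility="standard", large_graphics=False):
    """Choose objects to suit the viewer, then compose them on the device."""
    scale = 1.5 if large_graphics else 1.0  # larger icons for the visually impaired
    return {
        "video": presenter_tracks[accessibility],
        "overlays": [
            {"at": item["t"], "icon": item["icon"],
             "scale": scale, "caption": item["caption"]}
            for item in data_stream
        ],
    }

composition = render_forecast(accessibility="signed", large_graphics=True)
```

The point of the sketch is that the broadcaster sends objects once; each device renders a different final programme from the same package.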
OBB is likely to be commercialised initially in second-screen experiences. “The process of streaming what’s on the living room TV is broken,” argues Axonista chief technology officer Daragh Ward. “Audiences expect to interact with it.”
Axonista offers a content management system and a series of software templates that it says makes it easier for producers to deploy an OBB workflow instead of building one from scratch. Initially, this is based around extracting graphics from the live signal.
Its solution has been built into apps for shopping channel QVC, where the ‘buy now’ TV button becomes a touchscreen option on a smartphone; and at The QYou, an online curator of video clips that uses the technology to add interactivity to data about its content.
The idea could attract sport and music producers in particular. Makers of live music shows might want to overlay interactive information about performances for the second screen. Sport fans might want to select different leaderboards, heat maps or track positions over the live pictures. This works particularly well for data-intensive sports like Formula One or MotoGP, where members of the same family might want to check different aspects of the race in progress — a scenario that BT Sport is actively investigating.
In entertainment or daytime shows streamed to a second-screen app, an OBB workflow would enable onscreen tweets to be clicked on, replied to, or favourited. “The production workflow is unchanged and it means audiences can fully participate in a social media conversation without leaving the show itself,” says Ward.
Another idea is to make the scrolling ticker of news or finance channels interactive. “Instead of waiting for a headline to scroll around so you can read it again, you can click and jump straight to it,” he says. Since news is essentially a playlist of items, video content could also be rendered on-demand by way of the news menu.
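If a news channel really is a playlist of item objects, the interactive ticker follows almost for free. A minimal sketch (the playlist structure is assumed, not taken from any real system): a clicked headline resolves directly to its on-demand video object instead of waiting to scroll round again.

```python
# Hypothetical sketch: a news channel modelled as a playlist of item objects,
# so a ticker headline can link straight to the relevant video on demand.

playlist = [
    {"id": "n1", "headline": "Markets rally on rate decision", "video": "markets.mp4"},
    {"id": "n2", "headline": "Storm warning for the north",    "video": "storm.mp4"},
    {"id": "n3", "headline": "Cup final preview",              "video": "final.mp4"},
]

def ticker_headlines():
    """The scrolling ticker is just a view over the playlist objects."""
    return [item["headline"] for item in playlist]

def jump_to(headline):
    """Resolve a clicked headline to its on-demand video object."""
    for item in playlist:
        if item["headline"] == headline:
            return item["video"]
    return None
```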
This type of application still leaves the rump of content ‘baked in’, but offers a taste of OBB’s potential.
“All TV will be like this in future,” predicts Ward. “As TV sets gain gesture capability and force-feedback control, it allows new types of interactivity to be brought into the living room.”
The audio element of OBB is more advanced. Uefa is to trial object-based audio during live Euro 2016 matches from France this summer, using the Dolby Atmos audio system. It will be generated within the Telegenic outside broadcast facility, which is handling Uefa’s host 4K production, and services featuring it are likely to be introduced to consumers as part of a pay-TV operator’s 4K/UHD package.
Dolby senior product marketing manager Rob France says broadcasters are keen to give subscribers more choice — for example, of commentary from a neutral or team/fan perspective, in a different language, or a feed from a referee’s mic.
“Object-based audio brings more immersiveness for sports content, such as the sound of the PA and crowd, and also delivers greater personalisation by giving consumers more choice,” he adds.
The BBC’s ambitions are wider. Since making its first public demonstration of OBB during the 2014 Commonwealth Games, under the project name IP Studio, it has conducted numerous spin-off experiments. These range from online video instructions for kids on how to create a 3D chicken out of cardboard, to working with BBC News Labs to demonstrate how journalists can use ‘linked data’ to build stories.
The BBC believes object-based media is essential to harness the full potential of the IP-based broadcasting system to which the whole industry is migrating. It says the aim is not just to produce traditional content better or cheaper, but to pave the way for genuinely new experiences.
Object-based media can include a frame of video, or a line from a script. When a programme is structured around story arcs, even a ‘theme’ can be treated as an object. Each object is automatically assigned an identifier and a timestamp as soon as it is captured or created.
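The identifier-and-timestamp idea can be sketched in a few lines. This is a generic illustration of the principle (not the BBC's actual data model): whatever the object is — a frame, a script line, a theme — it is stamped at creation so it can be referenced, linked and recombined later.

```python
import time
import uuid
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    """Hypothetical sketch of an OBB object: any element gets a unique
    identifier and a timestamp the moment it is captured or created."""
    kind: str        # e.g. "frame", "script_line", "theme"
    payload: object  # the raw element itself
    object_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    created_at: float = field(default_factory=time.time)

frame = MediaObject("frame", b"<raw frame bytes>")
line = MediaObject("script_line", "Good evening and welcome.")
```

Because every element is individually addressable, a client can ask for "the signed presenter track" or "the theme objects for this story arc" rather than a monolithic signal.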
“The idea is to see if new types of content are possible and to minimise the incremental effort to get more content produced,” explains Page. “Instead of laboriously creating multiple versions of content, as we do now, an object-based production might be able to output more content more efficiently.”
BBC R&D’s Squeezebox, for example, enables users to adjust the duration of a news story using a slider control. The application aims to assist producers wanting to re-edit content rapidly to a different length, or to iterate multiple durations from a single edit. “Trying to do that with fully bespoke editing would be impractical,” says Page.
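One plausible way a Squeezebox-style slider could work — offered here as a guess at the general technique, with invented segment names and priorities, not a description of the BBC's actual algorithm — is to tag each story segment with a duration and an editorial priority, then let the target length select which segments survive the cut.

```python
# Illustrative sketch of variable-duration editing: segment names,
# durations and priorities are invented for the example.

segments = [
    {"name": "headline",   "secs": 10, "priority": 1},  # must-keep
    {"name": "interview",  "secs": 40, "priority": 3},
    {"name": "background", "secs": 25, "priority": 2},
    {"name": "analysis",   "secs": 30, "priority": 4},  # first to drop
]

def cut_to_length(target_secs):
    """Keep the highest-priority segments that fit the requested duration."""
    chosen, total = [], 0
    for seg in sorted(segments, key=lambda s: s["priority"]):
        if total + seg["secs"] <= target_secs:
            chosen.append(seg["name"])
            total += seg["secs"]
    return chosen, total
```

Dragging the slider simply re-runs the selection, which is why iterating many durations from a single edit becomes cheap compared with bespoke re-editing.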
In object-based broadcasting, he adds, even broadcast equipment can be treated as an object. “A camera is a thing, an archive store is a thing, so is a vision mixer, and they are all connected over IP,” he says. “IP Studio orchestrates the network so that real-time collections of objects work as a media production environment.”
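Treating equipment as objects can be pictured as a device registry the orchestrator wires into chains. The sketch below is a loose illustration of that idea only — device types and addresses are invented, and it makes no claim about how IP Studio actually works.

```python
# Hypothetical sketch: production kit as addressable objects on an IP network.
# All device names and addresses are invented.

devices = {
    "cam1":    {"type": "camera",       "addr": "10.0.0.11"},
    "mixer":   {"type": "vision_mixer", "addr": "10.0.0.20"},
    "archive": {"type": "store",        "addr": "10.0.0.30"},
}

def build_chain(*names):
    """Return an ordered signal chain of device addresses,
    as an orchestrator might when wiring a live production."""
    return [devices[n]["addr"] for n in names]

chain = build_chain("cam1", "mixer", "archive")
```

The appeal is that a production environment becomes a configuration over a network rather than a fixed wiring of dedicated hardware.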
Multi-day, multi-site events like Glastonbury or the Olympics are some of the clearest applications for a workable IP Studio, offering chances for greater coverage and viewer customisation than was previously practical.
The next large-scale public trial of IP Studio will take place around the Edinburgh Festival in August. “The aim is to link the delivery side of IP Studio with the audience experience,” says Page.
Coming down the track is virtual reality, in which each viewer interacts with the media spatially, from their own position. This creates all sorts of challenges, but BBC R&D has been working on it since developing 3D virtual tracking system Piero in 2004, running through its work stitching together 360-degree video, and now into OBB.
“Either we need to produce multiple versions of the same content, which is expensive, or we capture an object once and work out how to render it,” says Page. “Ultimately, we need to change the production methodology. OBB as an ecosystem has barely begun.”
Adrian Pennington is a journalist and editor specialising in the creation, business and technology of moving image media. This article was originally published in Broadcast magazine.