Prevent Spoilers with Stream Sync

Monterosa / Interaction SDK offers a method to synchronise content with a video stream, making it a valuable tool to prevent spoilers.

This ensures that content is revealed at the right moment within the video stream. It works by adjusting the SDK's internal timing so that all published elements are synchronised with the video stream's content.

Video stream absolute time

To achieve seamless synchronisation between video and Studio content, the video stream must carry the absolute (wall-clock) time at which it was recorded somewhere in its metadata. While the SDK doesn't handle the streaming protocol directly, it offers a method to set and maintain the current delay.

The most common protocols are detailed below with guidance.

HLS (HTTP Live Streaming)

HLS uses media playlists that contain segments of the video at different quality levels. To add absolute time to HLS streams, you can include an EXT-X-PROGRAM-DATE-TIME tag in the media playlist. This tag specifies the date and time of the first segment in the playlist; subsequent segments have timestamps relative to this initial time.
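For example, a media playlist carrying a program date might look like the following (segment names and durations are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-PROGRAM-DATE-TIME:2023-07-31T12:00:00.000Z
#EXTINF:6.000,
segment0.ts
#EXTINF:6.000,
segment1.ts
```

Here the player can infer that segment1.ts starts six seconds after the declared program date.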

In Ant Media Server this can be achieved by adding the settings.hlsflags=+program_date_time setting to the configuration.

DASH (Dynamic Adaptive Streaming over HTTP)

DASH uses MPD (Media Presentation Description) files to describe the video's structure and characteristics. You can add absolute time information to DASH streams by including "availabilityStartTime" in the MPD file. This attribute sets the availability start time for the media segments.


<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" availabilityStartTime="2023-07-31T12:00:00.000Z">

RTMP (Real-Time Messaging Protocol)

RTMP is a streaming protocol used for live streaming and on-demand video. Unlike HLS and DASH, RTMP does not inherently include absolute time metadata. However, you can embed time information in the video content itself or use external methods like cue points or custom metadata to convey absolute time.

WebRTC (Web Real-Time Communication)

WebRTC is mainly used for real-time communication, such as video conferencing, rather than traditional streaming. It doesn't inherently include absolute time information for video streams. However, you can add a custom timestamp as part of the video data payload or use signalling mechanisms to synchronise timestamps between sender and receiver.

Synchronising content

To synchronise content, InteractKit provides the setDelay() function with support for different interfaces:

  1. Setting delay as the difference between the current time and the video timestamp.

  2. Setting delay using an event's timecode.

  3. Setting delay as an absolute time.

Below is an example of how to use the setDelay() function with different options:

// Import necessary modules
import { configure } from '@monterosa-sdk/core';
import { setDelay, getProject, getEvent } from '@monterosa-sdk/interact-kit';

// Configure the SDK with the appropriate host and project details
configure({ host: '...', project: '...' });

// Get the project instance
const project = getProject();

// Option 1: Set a delay of 2 minutes
setDelay(project, 120);

// Option 2: Set delay to match an event's timecode of 01:30
const event = getEvent('...');
setDelay(project, event, 90);

// Option 3: Set delay to an absolute time 1 minute in the past
const time = new Date(Date.now() - 60_000);
setDelay(project, time);

In this example, you import the required modules and configure the SDK with the appropriate host and project details. Then, you obtain the project instance using getProject(). The setDelay() function is then demonstrated with three different options:

  1. Option 1 sets a delay of 2 minutes using the value 120.

  2. Option 2 sets the delay to match an event's timecode of 01:30 (90 seconds).

  3. Option 3 sets the delay using an absolute time, which is 1 minute in the past, obtained by creating a Date object with a timestamp calculated as the current time minus 60,000 milliseconds.

After the setDelay() function is called with the chosen delay option, Studio events and elements will automatically update their states based on the set delay. This means that all published elements, interactions, and triggers within the Studio will be precisely aligned with the corresponding moments in the video stream. Users will experience a seamless and cohesive presentation, with content appearing, changing, or disappearing exactly as intended at the specified times.

By utilising the setDelay() function with whichever of these interfaces fits their timing data, developers can keep content in their applications aligned with the stream, providing users with a cohesive and engaging experience.
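As a concrete illustration of how a delay value can be derived from stream metadata, the helper below (a hypothetical utility, not part of the SDK) converts an absolute timestamp, such as the value of an HLS EXT-X-PROGRAM-DATE-TIME tag, into a delay in seconds suitable for Option 1:

```javascript
// Hypothetical helper: compute the delay (in seconds) between real time
// and a stream's absolute timestamp, e.g. an EXT-X-PROGRAM-DATE-TIME value.
// `nowMs` defaults to the current wall-clock time.
function computeDelaySeconds(programDateTime, nowMs = Date.now()) {
  const streamMs = Date.parse(programDateTime);
  if (Number.isNaN(streamMs)) {
    throw new Error(`Invalid timestamp: ${programDateTime}`);
  }
  // Clamp to zero so a clock skew never produces a negative delay.
  return Math.max(0, (nowMs - streamMs) / 1000);
}

// Example: the current segment was recorded two minutes ago,
// so the viewer is watching 120 seconds behind real time.
const delay = computeDelaySeconds(
  '2023-07-31T12:00:00.000Z',
  Date.parse('2023-07-31T12:02:00.000Z')
); // 120
```

The resulting value can then be passed to the SDK as in Option 1 above, i.e. setDelay(project, delay).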

Listening to the delay change

Any part of your application that needs to respond to changes in the video streaming timecode and synchronisation can do so by using the onDelayChanged() function. This function allows you to listen for changes in the delay and perform actions accordingly.

// Import necessary modules
import { configure } from '@monterosa-sdk/core';
import { onDelayChanged, getProject } from '@monterosa-sdk/interact-kit';

// Configure the SDK with the appropriate host and project details
configure({ host: '...', project: '...' });

// Get the project instance
const project = getProject();

// Listen for changes in the delay and log the new delay value in seconds
onDelayChanged(project, (delay) => console.log(`New delay: ${delay} seconds`));

In this example, after configuring the SDK with the appropriate host and project details, you get the project instance using getProject(). Then, by calling onDelayChanged() and passing the project instance along with a callback function, you can monitor any changes to the delay.

Whenever the delay changes, the callback function is triggered, and it logs the new delay value in seconds. You can customise the callback function to perform any desired actions based on the updated delay, such as updating UI elements, adjusting the timing of interactive content, or triggering specific events within your application.