From e78bc2532bc251d65b0a00f6754b98050bfeb2b5 Mon Sep 17 00:00:00 2001 From: Santiago-Souto Date: Wed, 14 Aug 2024 17:41:55 +0000 Subject: [PATCH] =?UTF-8?q?Deploying=20to=20docs=20from=20@=20millicast/mi?= =?UTF-8?q?llicast-sdk@ff6d9f5c650c8b6c5403de6a8637f52cd8e8dc82=20?= =?UTF-8?q?=F0=9F=9A=80?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BaseWebRTC.html | 2 +- PeerConnection.html | 2 +- PeerConnection.js.html | 18 +++++++++--------- PeerConnectionStats.js.html | 30 ++++++++++++++++++++---------- Publish.html | 2 +- Publish.js.html | 2 +- View.html | 4 ++-- View.js.html | 16 ++++++++-------- data/search.json | 2 +- global.html | 2 +- utils_BaseWebRTC.js.html | 3 +++ 11 files changed, 48 insertions(+), 35 deletions(-) diff --git a/BaseWebRTC.html b/BaseWebRTC.html index 38f74ab2..66de76c4 100644 --- a/BaseWebRTC.html +++ b/BaseWebRTC.html @@ -1,3 +1,3 @@ Class: BaseWebRTC
On this page

BaseWebRTC

Base class with common peer connection actions and the reconnect mechanism shared by Publisher and Viewer instances.

Constructor

new BaseWebRTC(streamName, tokenGenerator, loggerInstance, autoReconnect)

Parameters:
NameTypeDescription
streamNameString

Deprecated: Millicast existing stream name. Use tokenGenerator instead.

tokenGeneratortokenGeneratorCallback

Callback function executed when a new token is needed.

loggerInstanceObject

Logger instance from the extended classes.

autoReconnectBoolean

Enable auto reconnect.

Extends

  • EventEmitter

Methods

getRTCPeerConnection() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

isActive() → {Boolean}

Get if the current connection is active.

Returns:
  • True if connected, false if not.
Type: 
Boolean

(async) reconnect(dataopt)

Reconnects to last broadcast.

Parameters:
NameTypeAttributesDescription
dataObject<optional>

This object contains the error property. It may be expanded to contain more information in the future.

Properties
NameTypeDescription
errorString

The value sent in the first reconnect event within the error key of the payload

setReconnect()

Sets reconnection if autoReconnect is enabled.

stop()

Stops connection.

Events

reconnect

Emits with every reconnection attempt made when an active stream stopped unexpectedly.

Type:
  • Object
Properties
NameTypeDescription
timeoutNumber

Next retry interval in milliseconds.

errorError

Error object with cause of failure. Possible errors are:

  • Signaling error: wsConnectionError if there was an error in the WebSocket connection.
  • Connection state change: RTCPeerConnectionState disconnected if there was an error in the RTCPeerConnection.
  • Attempting to reconnect if the reconnect was triggered externally.
  • Or any internal error thrown by either the Publish.connect or View.connect methods.

\ No newline at end of file +
On this page

BaseWebRTC

Base class with common peer connection actions and the reconnect mechanism shared by Publisher and Viewer instances.

Constructor

new BaseWebRTC(streamName, tokenGenerator, loggerInstance, autoReconnect)

Parameters:
NameTypeDescription
streamNameString

Deprecated: Millicast existing stream name. Use tokenGenerator instead.

tokenGeneratortokenGeneratorCallback

Callback function executed when a new token is needed.

loggerInstanceObject

Logger instance from the extended classes.

autoReconnectBoolean

Enable auto reconnect.
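
As a rough illustration of the tokenGenerator parameter above, here is a minimal viewer-side sketch; the named import form and the Director.getSubscriber option names (streamName, streamAccountId) are assumptions that should be checked against the Director module documentation:

import { Director, View } from '@millicast/sdk' // named imports assumed; adjust to how you load the SDK

// Placeholder identifiers; replace with your own values.
const streamName = 'myStreamName'
const streamAccountId = 'myAccountId'

// Callback executed whenever the SDK needs a fresh connection path and token.
const tokenGenerator = () => Director.getSubscriber({ streamName, streamAccountId })

// View extends BaseWebRTC, so the same constructor arguments apply.
const millicastView = new View(streamName, tokenGenerator, null, true)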

Extends

  • EventEmitter

Methods

getRTCPeerConnection() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

isActive() → {Boolean}

Get if the current connection is active.

Returns:
  • True if connected, false if not.
Type: 
Boolean

(async) reconnect(dataopt)

Reconnects to last broadcast.

Parameters:
NameTypeAttributesDescription
dataObject<optional>

This object contains the error property. It may be expanded to contain more information in the future.

Properties
NameTypeDescription
errorString

The value sent in the first reconnect event within the error key of the payload

setReconnect()

Sets reconnection if autoReconnect is enabled.

stop()

Stops connection.

Events

reconnect

Emits with every reconnection attempt made when an active stream stopped unexpectedly.

Type:
  • Object
Properties
NameTypeDescription
timeoutNumber

Next retry interval in milliseconds.

errorError

Error object with cause of failure. Possible errors are:

  • Signaling error: wsConnectionError if there was an error in the WebSocket connection.
  • Connection state change: RTCPeerConnectionState disconnected if there was an error in the RTCPeerConnection.
  • Attempting to reconnect if the reconnect was triggered externally.
  • Or any internal error thrown by either the Publish.connect or View.connect methods.
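
As a minimal sketch of consuming this event (here, instance stands for any Publish or View object, both of which inherit it from BaseWebRTC):

// `instance` is any Publish or View object created elsewhere.
instance.on('reconnect', ({ timeout, error }) => {
  // timeout: next retry interval in milliseconds; error: cause of the failure.
  console.warn(`Reconnection attempt in ${timeout} ms:`, error.message)
})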

\ No newline at end of file diff --git a/PeerConnection.html b/PeerConnection.html index af0851f6..d653b603 100644 --- a/PeerConnection.html +++ b/PeerConnection.html @@ -1,6 +1,6 @@ Class: PeerConnection
On this page

PeerConnection

Manages WebRTC connection and SDP information between peers.

Constructor

new PeerConnection()

Example
const peerConnection = new PeerConnection()

Extends

  • EventEmitter

Methods

(async) addRemoteTrack(media, streams) → {Promise.<RTCRtpTransceiver>}

Add remote receiving track.

Parameters:
NameTypeDescription
mediaString

Media kind ('audio' | 'video').

streamsArray.<MediaStream>

Streams the track will belong to.

Returns:

Promise that will be resolved when the RTCRtpTransceiver is assigned an mid value.

Type: 
Promise.<RTCRtpTransceiver>

(async) closeRTCPeer()

Close RTC peer connection.

(async) createRTCPeer(config, modeopt)

Instantiates a new RTCPeerConnection.

Parameters:
NameTypeAttributesDefaultDescription
configRTCConfiguration

Peer configuration.

Properties
NameTypeAttributesDefaultDescription
autoInitStatsBoolean<optional>
true

True to initialize statistics monitoring of the RTCPeerConnection accessed via Logger.get(), false to opt-out.

statsIntervalMsNumber<optional>
1000

The interval, in milliseconds, at which the SDK will return WebRTC stats to the consuming application.

modeString<optional>
"Viewer"

Type of connection to create, either 'Viewer' or 'Publisher'.

(async) getRTCLocalSDP(options) → {Promise.<String>}

Get the SDP modified depending on the options. Optionally set the SDP information on the local peer.

Parameters:
NameTypeDescription
optionsObject
Properties
NameTypeDescription
stereoBoolean

True to modify the SDP to support stereo. Otherwise False.

dtxBoolean

True to modify the SDP to support DTX in the Opus codec. Otherwise False.

mediaStreamMediaStream | Array.<MediaStreamTrack>

MediaStream to offer in a stream. This object must have 1 audio track and 1 video track, or at least one of them. Alternatively, you can provide both tracks in an array.

codecVideoCodec

Selected codec to support simulcast.

simulcastBoolean

True to modify the SDP to support simulcast. Only available in Chromium-based browsers and with the H.264 or VP8 video codecs.

scalabilityModeString

Selected scalability mode. You can get the available capabilities using the PeerConnection.getCapabilities method. Only available in Google Chrome.

absCaptureTimeBoolean

True to modify the SDP to support the absolute capture time header extension. Otherwise False.

dependencyDescriptorBoolean

True to modify the SDP to support the AOM dependency descriptor header extension. Otherwise False.

disableAudioBoolean

True to disable audio support.

disableVideoBoolean

True to disable video support.

setSDPToPeerBoolean

True to set the SDP to the local peer.

Returns:

Promise object which represents the SDP information of the created offer.

Type: 
Promise.<String>

getRTCPeer() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

getRTCPeerStatus() → (nullable) {RTCPeerConnectionState}

Get peer connection state.

Returns:

Object which represents the peer connection state.

Type: 
RTCPeerConnectionState

getTracks() → {Array.<MediaStreamTrack>}

Get sender tracks

Returns:

An array with all tracks in sender peer.

Type: 
Array.<MediaStreamTrack>

initStats()

Initialize the statistics monitoring of the RTCPeerConnection.

The stats event will be emitted every second.

Fires:
  • PeerConnection#event:stats
Examples
peerConnection.initStats()
import Publish from '@millicast/sdk'
+    
On this page

PeerConnection

Manages WebRTC connection and SDP information between peers.

Constructor

new PeerConnection()

Example
const peerConnection = new PeerConnection()

Extends

  • EventEmitter

Methods

(async) addRemoteTrack(media, streams) → {Promise.<RTCRtpTransceiver>}

Add remote receiving track.

Parameters:
NameTypeDescription
mediaString

Media kind ('audio' | 'video').

streamsArray.<MediaStream>

Streams the track will belong to.

Returns:

Promise that will be resolved when the RTCRtpTransceiver is assigned an mid value.

Type: 
Promise.<RTCRtpTransceiver>
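
As a brief usage sketch (the MediaStream created here is only a placeholder container for the incoming track):

// Ask the peer to receive an extra video track and attach it to a new MediaStream.
const mediaStream = new MediaStream()
const transceiver = await peerConnection.addRemoteTrack('video', [mediaStream])
console.log('Transceiver mid assigned:', transceiver.mid)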

(async) closeRTCPeer()

Close RTC peer connection.

(async) createRTCPeer(config, modeopt)

Instantiates a new RTCPeerConnection.

Parameters:
NameTypeAttributesDefaultDescription
configRTCConfiguration

Peer configuration.

Properties
NameTypeAttributesDefaultDescription
autoInitStatsBoolean<optional>
true

True to initialize statistics monitoring of the RTCPeerConnection accessed via Logger.get(), false to opt-out.

statsIntervalMsNumber<optional>
1000

The interval, in milliseconds, at which the SDK will return WebRTC stats to the consuming application.

modeString<optional>
"Viewer"

Type of connection to create, either 'Viewer' or 'Publisher'.
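
A possible call, shown only as a sketch (iceServers is a standard RTCConfiguration field and the STUN URL is illustrative):

await peerConnection.createRTCPeer({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  autoInitStats: true,   // start statistics monitoring automatically (the default)
  statsIntervalMs: 2000  // report WebRTC stats every 2 seconds instead of the default 1000 ms
}, 'Publisher')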

(async) getRTCLocalSDP(options) → {Promise.<String>}

Get the SDP modified depending on the options. Optionally set the SDP information on the local peer.

Parameters:
NameTypeDescription
optionsObject
Properties
NameTypeDescription
stereoBoolean

True to modify the SDP to support stereo. Otherwise False.

dtxBoolean

True to modify the SDP to support DTX in the Opus codec. Otherwise False.

mediaStreamMediaStream | Array.<MediaStreamTrack>

MediaStream to offer in a stream. This object must have 1 audio track and 1 video track, or at least one of them. Alternatively, you can provide both tracks in an array.

codecVideoCodec

Selected codec to support simulcast.

simulcastBoolean

True to modify the SDP to support simulcast. Only available in Chromium-based browsers and with the H.264 or VP8 video codecs.

scalabilityModeString

Selected scalability mode. You can get the available capabilities using the PeerConnection.getCapabilities method. Only available in Google Chrome.

absCaptureTimeBoolean

True to modify the SDP to support the absolute capture time header extension. Otherwise False.

dependencyDescriptorBoolean

True to modify the SDP to support the AOM dependency descriptor header extension. Otherwise False.

disableAudioBoolean

True to disable audio support.

disableVideoBoolean

True to disable video support.

setSDPToPeerBoolean

True to set the SDP to the local peer.

Returns:

Promise object which represents the SDP information of the created offer.

Type: 
Promise.<String>
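
A hedged publisher-side sketch (mediaStream comes from getUserMedia here, and the codec value assumes the VideoCodec enum uses lowercase names such as 'h264'):

const mediaStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true })
const localSdp = await peerConnection.getRTCLocalSDP({
  mediaStream,        // tracks to offer
  stereo: true,       // keep stereo Opus audio
  simulcast: true,    // Chromium-based browsers only, with H.264 or VP8
  codec: 'h264',      // assumed VideoCodec value
  setSDPToPeer: true  // also apply the offer as the local description
})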

getRTCPeer() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

getRTCPeerStatus() → (nullable) {RTCPeerConnectionState}

Get peer connection state.

Returns:

Object which represents the peer connection state.

Type: 
RTCPeerConnectionState

getTracks() → {Array.<MediaStreamTrack>}

Get sender tracks

Returns:

An array with all tracks in sender peer.

Type: 
Array.<MediaStreamTrack>

initStats()

Initialize the statistics monitoring of the RTCPeerConnection.

The stats event will be emitted every second.

Fires:
  • PeerConnection#event:stats
Examples
peerConnection.initStats()
import Publish from '@millicast/sdk'
 
 //Initialize and connect your Publisher
 const millicastPublish = new Publish(streamName, tokenGenerator)
diff --git a/PeerConnection.js.html b/PeerConnection.js.html
index 920c484f..4bd8cd56 100644
--- a/PeerConnection.js.html
+++ b/PeerConnection.js.html
@@ -167,7 +167,7 @@
   }
 
   /**
-   * Add remote receving track.
+   * Add remote receiving track.
    * @param {String} media - Media kind ('audio' | 'video').
    * @param {Array<MediaStream>} streams - Streams the track will belong to.
    * @return {Promise<RTCRtpTransceiver>} Promise that will be resolved when the RTCRtpTransceiver is assigned an mid value.
@@ -194,7 +194,7 @@
    */
   updateBandwidthRestriction (sdp, bitrate) {
     if (this.mode === ConnectionType.Viewer) {
-      logger.error('Viewer attempting to udpate bitrate, this is not allowed')
+      logger.error('Viewer attempting to update bitrate, this is not allowed')
       throw new Error('It is not possible for a viewer to update the bitrate.')
     }
 
@@ -210,7 +210,7 @@
    */
   async updateBitrate (bitrate = 0) {
     if (this.mode === ConnectionType.Viewer) {
-      logger.error('Viewer attempting to udpate bitrate, this is not allowed')
+      logger.error('Viewer attempting to update bitrate, this is not allowed')
       throw new Error('It is not possible for a viewer to update the bitrate.')
     }
     if (!this.peer) {
@@ -223,7 +223,7 @@
     await this.peer.setLocalDescription(this.sessionDescription)
     const sdp = this.updateBandwidthRestriction(this.peer.remoteDescription.sdp, bitrate)
     await this.setRTCRemoteSDP(sdp)
-    logger.info('Bitrate restirctions updated: ', `${bitrate > 0 ? bitrate : 'unlimited'} kbps`)
+    logger.info('Bitrate restrictions updated: ', `${bitrate > 0 ? bitrate : 'unlimited'} kbps`)
   }
 
   /**
@@ -276,9 +276,9 @@
    */
   static getCapabilities (kind) {
     const browserData = new UserAgent()
-    const browserCapabilites = RTCRtpSender.getCapabilities(kind)
+    const browserCapabilities = RTCRtpSender.getCapabilities(kind)
 
-    if (browserCapabilites) {
+    if (browserCapabilities) {
       const codecs = {}
       let regex = new RegExp(`^video/(${Object.values(VideoCodec).join('|')})x?$`, 'i')
 
@@ -290,7 +290,7 @@
         }
       }
 
-      for (const codec of browserCapabilites.codecs) {
+      for (const codec of browserCapabilities.codecs) {
         const matches = codec.mimeType.match(regex)
         if (matches) {
           const codecName = matches[1].toLowerCase()
@@ -306,10 +306,10 @@
         }
       }
 
-      browserCapabilites.codecs = Object.keys(codecs).map((key) => { return { codec: key, ...codecs[key] } })
+      browserCapabilities.codecs = Object.keys(codecs).map((key) => { return { codec: key, ...codecs[key] } })
     }
 
-    return browserCapabilites
+    return browserCapabilities
   }
 
   /**
diff --git a/PeerConnectionStats.js.html b/PeerConnectionStats.js.html
index 16049dd4..06620a23 100644
--- a/PeerConnectionStats.js.html
+++ b/PeerConnectionStats.js.html
@@ -3,7 +3,7 @@
     
On this page

PeerConnectionStats.js

import EventEmitter from 'events'
 import Logger from './Logger'
 import Diagnostics from './utils/Diagnostics'
-import WebRTCStats from '@dolbyio/webrtc-stats'
+import { WebRTCStats } from '@dolbyio/webrtc-stats'
 
 const logger = Logger.get('PeerConnectionStats')
 
@@ -48,7 +48,8 @@
  * @property {Number} totalPacketsLost - Total packets lost.
  * @property {Number} packetsLostRatioPerSecond - Total packet lost ratio per second.
  * @property {Number} packetsLostDeltaPerSecond - Total packet lost delta per second.
- * @property {Number} bitrate - Current bitrate in bits per second.
+ * @property {Number} bitrate - Current bitrate in Bytes per second.
+ * @property {Number} bitrateBitsPerSecond - Current bitrate in bits per second.
  * @property {Number} packetRate - The rate at which packets are being received, measured in packets per second.
  * @property {Number} jitterBufferDelay - Total delay in seconds currently experienced by the jitter buffer.
  * @property {Number} jitterBufferEmittedCount - Total number of packets emitted from the jitter buffer.
@@ -64,7 +65,8 @@
  * @property {String} [qualityLimitationReason] - If it's video report, indicate the reason why the media quality in the stream is currently being reduced by the codec during encoding, or none if no quality reduction is being performed.
  * @property {Number} timestamp - Timestamp of report.
  * @property {Number} totalBytesSent - Total bytes sent indicates the total number of payload bytes that hve been sent so far on the connection described by the candidate pair.
- * @property {Number} bitrate - Current bitrate in bits per second.
+ * @property {Number} bitrate - Current bitrate in Bytes per second.
+ * @property {Number} bitrateBitsPerSecond - Current bitrate in bits per second.
  * @property {Number} bytesSentDelta - Change in the number of bytes sent since the last report.
  * @property {Number} totalPacketsSent - Total number of packets sent.
  * @property {Number} packetsSentDelta - Change in the number of packets sent since the last report.
@@ -94,26 +96,34 @@
   const statsObject = {
     ...filteredStats,
     audio: {
-      inbounds: webRTCStats.input.audio.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, ...rest }) => ({
+      inbounds: webRTCStats.input.audio.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, bitrate, ...rest }) => ({
         packetsLostRatioPerSecond,
         packetsLostDeltaPerSecond,
+        bitrateBitsPerSecond: bitrate * 8,
+        bitrate,
         ...rest
       })),
-      outbounds: webRTCStats.output.audio.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, ...rest }) => ({
+      outbounds: webRTCStats.output.audio.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, bitrate, ...rest }) => ({
         packetsLostRatioPerSecond,
         packetsLostDeltaPerSecond,
+        bitrateBitsPerSecond: bitrate * 8,
+        bitrate,
         ...rest
       }))
     },
     video: {
-      inbounds: webRTCStats.input.video.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, ...rest }) => ({
+      inbounds: webRTCStats.input.video.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, bitrate, ...rest }) => ({
         packetsLostRatioPerSecond,
         packetsLostDeltaPerSecond,
+        bitrateBitsPerSecond: bitrate * 8,
+        bitrate,
         ...rest
       })),
-      outbounds: webRTCStats.output.video.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, ...rest }) => ({
+      outbounds: webRTCStats.output.video.map(({ packetLossRatio: packetsLostRatioPerSecond, packetLossDelta: packetsLostDeltaPerSecond, bitrate, ...rest }) => ({
         packetsLostRatioPerSecond,
         packetsLostDeltaPerSecond,
+        bitrateBitsPerSecond: bitrate * 8,
+        bitrate,
         ...rest
       }))
     },
@@ -155,9 +165,9 @@
       })
 
       this.collection.on('stats', (stats) => {
-        const parsedSats = parseWebRTCStats(stats)
-        Diagnostics.addStats(parsedSats)
-        this.emit(peerConnectionStatsEvents.stats, parsedSats)
+        const parsedStats = parseWebRTCStats(stats)
+        Diagnostics.addStats(parsedStats)
+        this.emit(peerConnectionStatsEvents.stats, parsedStats)
       })
       this.collection.start()
       this.initialized = true
diff --git a/Publish.html b/Publish.html
index f6daf251..f74b8891 100644
--- a/Publish.html
+++ b/Publish.html
@@ -23,4 +23,4 @@
  await millicastPublish.connect(broadcastOptions)
 } catch (e) {
  console.log('Connection failed, handle error', e)
-}

getRTCPeerConnection() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

isActive() → {Boolean}

Get if the current connection is active.

Inherited From
Returns:
  • True if connected, false if not.
Type: 
Boolean

(async) reconnect(dataopt)

Reconnects to last broadcast.

Parameters:
NameTypeAttributesDescription
dataObject<optional>

This object contains the error property. It may be expanded to contain more information in the future.

Properties
NameTypeDescription
errorString

The value sent in the first reconnect event within the error key of the payload

(async) record()

Initialize recording in an active stream and change the current record option.

sendMetadata(message, uuidopt)

Send SEI user unregistered data as part of the frame being streamed. Only available for H.264 codec.

Parameters:
NameTypeAttributesDefaultDescription
messageString | Object

The data to be sent as SEI user unregistered data.

uuidString<optional>
"d40e38ea-d419-4c62-94ed-20ac37b4e4fa"

String in UUID format as hexadecimal digits (XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX).

setReconnect()

Sets reconnection if autoReconnect is enabled.

stop()

Stops connection.

(async) unrecord()

Finalize recording in an active stream and change the current record option.

Events

reconnect

Emits with every reconnection attempt made when an active stream stopped unexpectedly.

Type:
  • Object
Properties
NameTypeDescription
timeoutNumber

Next retry interval in milliseconds.

errorError

Error object with cause of failure. Possible errors are:

  • Signaling error: wsConnectionError if there was an error in the WebSocket connection.
  • Connection state change: RTCPeerConnectionState disconnected if there was an error in the RTCPeerConnection.
  • Attempting to reconnect if the reconnect was triggered externally.
  • Or any internal error thrown by either the Publish.connect or View.connect methods.

\ No newline at end of file +}

getRTCPeerConnection() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

isActive() → {Boolean}

Get if the current connection is active.

Inherited From
Returns:
  • True if connected, false if not.
Type: 
Boolean

(async) reconnect(dataopt)

Reconnects to last broadcast.

Parameters:
NameTypeAttributesDescription
dataObject<optional>

This object contains the error property. It may be expanded to contain more information in the future.

Properties
NameTypeDescription
errorString

The value sent in the first reconnect event within the error key of the payload

(async) record()

Initialize recording in an active stream and change the current record option.

sendMetadata(message, uuidopt)

Send SEI user unregistered data as part of the frame being streamed. Only available for H.264 codec.

Parameters:
NameTypeAttributesDefaultDescription
messageString | Object

The data to be sent as SEI user unregistered data.

uuidString<optional>
"d40e38ea-d419-4c62-94ed-20ac37b4e4fa"

String in UUID format as hexadecimal digits (XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX).
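
For example (a sketch; millicastPublish is the Publish instance connected earlier in this page's example):

// Send a JSON payload as SEI user unregistered data on the frames being streamed (H.264 only).
millicastPublish.sendMetadata({ scene: 'goal', timestamp: Date.now() })

// Optionally pass an explicit UUID (shown here with the documented default value).
millicastPublish.sendMetadata('free-form text', 'd40e38ea-d419-4c62-94ed-20ac37b4e4fa')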

setReconnect()

Sets reconnection if autoReconnect is enabled.

stop()

Stops connection.

(async) unrecord()

Finalize recording in an active stream and change the current record option.
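
A minimal sketch of toggling recording on an active stream (error handling omitted):

// Start recording the active broadcast, then stop it later.
await millicastPublish.record()
// ... some time later ...
await millicastPublish.unrecord()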

Events

reconnect

Emits with every reconnection attempt made when an active stream stopped unexpectedly.

Type:
  • Object
Properties
NameTypeDescription
timeoutNumber

Next retry interval in milliseconds.

errorError

Error object with cause of failure. Possible errors are:

  • Signaling error: wsConnectionError if there was an error in the WebSocket connection.
  • Connection state change: RTCPeerConnectionState disconnected if there was an error in the RTCPeerConnection.
  • Attempting to reconnect if the reconnect was triggered externally.
  • Or any internal error thrown by either the Publish.connect or View.connect methods.

\ No newline at end of file diff --git a/Publish.js.html b/Publish.js.html index 1fdaa42f..29dc5f81 100644 --- a/Publish.js.html +++ b/Publish.js.html @@ -314,7 +314,7 @@ if (this.options?.metadata && this.worker) { this.worker.postMessage({ action: 'metadata-sei-user-data-unregistered', - uuid: uuid, + uuid, payload: message }) } else { diff --git a/View.html b/View.html index 1fb07178..6dddee6b 100644 --- a/View.html +++ b/View.html @@ -1,6 +1,6 @@ Class: View
On this page

View

Manages connection with a secure WebSocket path to signal the Millicast server and establishes a WebRTC connection to view a live stream.

Before you can view an active broadcast, you will need:

  • A connection path that you can get from the Director module or from your own implementation.

Constructor

new View(streamName, tokenGenerator, mediaElementopt, autoReconnectopt)

Parameters:
NameTypeAttributesDefaultDescription
streamNameString

Deprecated: Millicast existing stream name.

tokenGeneratortokenGeneratorCallback

Callback function executed when a new token is needed.

mediaElementHTMLMediaElement<optional>
null

Target HTML media element to mount stream.

autoReconnectBoolean<optional>
true

Enable auto reconnect to stream.

Extends

  • BaseWebRTC

Methods

(async) addRemoteTrack(media, streams) → {Promise.<RTCRtpTransceiver>}

Add remote receiving track.

Parameters:
NameTypeDescription
mediaString

Media kind ('audio' | 'video').

streamsArray.<MediaStream>

Streams the track will belong to.

Returns:

Promise that will be resolved when the RTCRtpTransceiver is assigned an mid value.

Type: 
Promise.<RTCRtpTransceiver>

(async) connect(optionsopt) → {Promise.<void>}

Connects to an active stream as subscriber.

In the example, addStreamToYourVideoTag and getYourSubscriberConnectionPath are your own implementations.

Parameters:
NameTypeAttributesDescription
optionsObject<optional>

General subscriber options.

Properties
NameTypeAttributesDefaultDescription
dtxBoolean<optional>
false

True to modify the SDP to support DTX in the Opus codec. Otherwise False.

absCaptureTimeBoolean<optional>
false

True to modify the SDP to support the absolute capture time header extension. Otherwise False.

metadataBoolean<optional>
false

Enable metadata extraction if the stream is compatible.

disableVideoBoolean<optional>
false

Disable receiving the video stream.

disableAudioBoolean<optional>
false

Disable receiving the audio stream.

multiplexedAudioTracksNumber<optional>

Number of audio tracks on which to receive VAD multiplexed audio for secondary sources.

pinnedSourceIdString<optional>

Id of the main source that will be received by the default MediaStream.

excludedSourceIdsArray.<String><optional>

Do not receive media from these source ids.

eventsArray.<String><optional>

Override which events will be delivered by the server (any of "active" | "inactive" | "vad" | "layers" | "viewercount" | "updated").

peerConfigRTCConfiguration<optional>

Options to configure the new RTCPeerConnection.

layerLayerInfo<optional>

Select the simulcast encoding layer and SVC layers for the main video track; leave empty for automatic layer selection based on bandwidth estimation.

forcePlayoutDelayObject<optional>
false

Ask the server to use the playout delay header extension.

Properties
NameTypeAttributesDescription
minNumber<optional>

Set minimum playout delay value.

maxNumber<optional>

Set maximum playout delay value.

Returns:

Promise object which resolves when the connection was successfully established.

Type: 
Promise.<void>
Examples
await millicastView.connect(options)
import View from '@millicast/sdk'
+    
On this page

View

Manages connection with a secure WebSocket path to signal the Millicast server and establishes a WebRTC connection to view a live stream.

Before you can view an active broadcast, you will need:

  • A connection path that you can get from the Director module or from your own implementation.

Constructor

new View(streamName, tokenGenerator, mediaElementopt, autoReconnectopt)

Parameters:
NameTypeAttributesDefaultDescription
streamNameString

Deprecated: Millicast existing stream name.

tokenGeneratortokenGeneratorCallback

Callback function executed when a new token is needed.

mediaElementHTMLMediaElement<optional>
null

Target HTML media element to mount stream.

autoReconnectBoolean<optional>
true

Enable auto reconnect to stream.

Extends

  • BaseWebRTC

Methods

(async) addRemoteTrack(media, streams) → {Promise.<RTCRtpTransceiver>}

Add remote receiving track.

Parameters:
NameTypeDescription
mediaString

Media kind ('audio' | 'video').

streamsArray.<MediaStream>

Streams the track will belong to.

Returns:

Promise that will be resolved when the RTCRtpTransceiver is assigned an mid value.

Type: 
Promise.<RTCRtpTransceiver>

(async) connect(optionsopt) → {Promise.<void>}

Connects to an active stream as subscriber.

In the example, addStreamToYourVideoTag and getYourSubscriberConnectionPath are your own implementations.

Parameters:
NameTypeAttributesDescription
optionsObject<optional>

General subscriber options.

Properties
NameTypeAttributesDefaultDescription
dtxBoolean<optional>
false

True to modify the SDP to support DTX in the Opus codec. Otherwise False.

absCaptureTimeBoolean<optional>
false

True to modify the SDP to support the absolute capture time header extension. Otherwise False.

metadataBoolean<optional>
false

Enable metadata extraction if the stream is compatible.

disableVideoBoolean<optional>
false

Disable receiving the video stream.

disableAudioBoolean<optional>
false

Disable receiving the audio stream.

multiplexedAudioTracksNumber<optional>

Number of audio tracks on which to receive VAD multiplexed audio for secondary sources.

pinnedSourceIdString<optional>

Id of the main source that will be received by the default MediaStream.

excludedSourceIdsArray.<String><optional>

Do not receive media from these source ids.

eventsArray.<String><optional>

Override which events will be delivered by the server (any of "active" | "inactive" | "vad" | "layers" | "viewercount" | "updated").

peerConfigRTCConfiguration<optional>

Options to configure the new RTCPeerConnection.

layerLayerInfo<optional>

Select the simulcast encoding layer and SVC layers for the main video track; leave empty for automatic layer selection based on bandwidth estimation.

forcePlayoutDelayObject<optional>
false

Ask the server to use the playout delay header extension.

Properties
NameTypeAttributesDescription
minNumber<optional>

Set minimum playout delay value.

maxNumber<optional>

Set maximum playout delay value.

Returns:

Promise object which resolves when the connection was successfully established.

Type: 
Promise.<void>
Examples
await millicastView.connect(options)
import View from '@millicast/sdk'
 
 // Create media element
 const videoElement = document.createElement("video")
@@ -37,4 +37,4 @@
  await millicastView.connect()
 } catch (e) {
  console.log('Connection failed, handle error', e)
-}

getRTCPeerConnection() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

isActive() → {Boolean}

Get if the current connection is active.

Inherited From
Returns:
  • True if connected, false if not.
Type: 
Boolean

(async) project(sourceId, mapping)

Start projecting source in selected media ids.

Parameters:
NameTypeDescription
sourceIdString

Selected source id.

mappingArray.<Object>

Mapping of the source track ids to the receiver mids

Properties
NameTypeAttributesDescription
trackIdString<optional>

Track id from the source (received on the "active" event); if not set, the media kind will be used instead.

mediaString<optional>

Track kind of the source ('audio' | 'video'); if not set, the trackId will be used instead.

mediaIdString<optional>

mid value of the RTP receiver in which the media is going to be projected. If no mediaId is defined, the first track from the main media stream with the same media type as the input source track will be used.

layerLayerInfo<optional>

Select the simulcast encoding layer and SVC layers; only applicable to video tracks.

promoteBoolean<optional>

To remove all existing limitations from the source, such as restricted bitrate or resolution, set this to true.

(async) reconnect(dataopt)

Reconnects to last broadcast.

Parameters:
NameTypeAttributesDescription
dataObject<optional>

This object contains the error property. It may be expanded to contain more information in the future.

Properties
NameTypeDescription
errorString

The value sent in the first reconnect event within the error key of the payload

Inherited From

(async) select(layer)

Select the simulcast encoding layer and SVC layers for the main video track.

Parameters:
NameTypeDescription
layerLayerInfo

Leave empty for automatic layer selection based on bandwidth estimation.

setReconnect()

Sets reconnection if autoReconnect is enabled.

stop()

Stops connection.

(async) unproject(mediaIds)

Stop projecting attached source in selected media ids.

Parameters:
NameTypeDescription
mediaIdsArray.<String>

mid value of the receivers that are going to be detached.

Events

reconnect

Emits with every reconnection attempt made when an active stream stopped unexpectedly.

Type:
  • Object
Properties
NameTypeDescription
timeoutNumber

Next retry interval in milliseconds.

errorError

Error object with cause of failure. Possible errors are:

  • Signaling error: wsConnectionError if there was an error in the WebSocket connection.
  • Connection state change: RTCPeerConnectionState disconnected if there was an error in the RTCPeerConnection.
  • Attempting to reconnect if the reconnect was triggered externally.
  • Or any internal error thrown by either the Publish.connect or View.connect methods.

\ No newline at end of file +}

getRTCPeerConnection() → {RTCPeerConnection}

Get current RTC peer connection.

Returns:

Object which represents the RTCPeerConnection.

Type: 
RTCPeerConnection

isActive() → {Boolean}

Get if the current connection is active.

Inherited From
Returns:
  • True if connected, false if not.
Type: 
Boolean

(async) project(sourceId, mapping)

Start projecting source in selected media ids.

Parameters:
NameTypeDescription
sourceIdString

Selected source id.

mappingArray.<Object>

Mapping of the source track ids to the receiver mids

Properties
NameTypeAttributesDescription
trackIdString<optional>

Track id from the source (received on the "active" event); if not set, the media kind will be used instead.

mediaString<optional>

Track kind of the source ('audio' | 'video'); if not set, the trackId will be used instead.

mediaIdString<optional>

mid value of the RTP receiver in which the media is going to be projected. If no mediaId is defined, the first track from the main media stream with the same media type as the input source track will be used.

layerLayerInfo<optional>

Select the simulcast encoding layer and SVC layers; only applicable to video tracks.

promoteBoolean<optional>

To remove all existing limitations from the source, such as restricted bitrate or resolution, set this to true.
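
An illustrative sketch (the source id 'mySecondCamera' is a placeholder; real source ids arrive in the "active" broadcast event, and the mid comes from a transceiver created with addRemoteTrack):

// Route the video of a secondary source into a transceiver created beforehand.
const transceiver = await millicastView.addRemoteTrack('video', [new MediaStream()])
await millicastView.project('mySecondCamera', [{
  media: 'video',
  mediaId: transceiver.mid
}])

// Later, detach that receiver again (see unproject below).
await millicastView.unproject([transceiver.mid])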

(async) reconnect(dataopt)

Reconnects to last broadcast.

Parameters:
NameTypeAttributesDescription
dataObject<optional>

This object contains the error property. It may be expanded to contain more information in the future.

Properties
NameTypeDescription
errorString

The value sent in the first reconnect event within the error key of the payload

Inherited From

(async) select(layer)

Select the simulcast encoding layer and SVC layers for the main video track.

Parameters:
NameTypeDescription
layerLayerInfo

Leave empty for automatic layer selection based on bandwidth estimation.
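
For instance (a sketch only; the encodingId field is assumed from typical LayerInfo payloads and should be checked against the LayerInfo typedef):

// Force a specific simulcast encoding for the main video track.
await millicastView.select({ encodingId: '2' })

// Return to automatic layer selection based on bandwidth estimation.
await millicastView.select()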

setReconnect()

Sets reconnection if autoReconnect is enabled.

stop()

Stops connection.

(async) unproject(mediaIds)

Stop projecting attached source in selected media ids.

Parameters:
NameTypeDescription
mediaIdsArray.<String>

mid value of the receivers that are going to be detached.

Events

reconnect

Emits with every reconnection attempt made when an active stream stopped unexpectedly.

Type:
  • Object
Properties
NameTypeDescription
timeoutNumber

Next retry interval in milliseconds.

errorError

Error object with cause of failure. Possible errors are:

  • Signaling error: wsConnectionError if there was an error in the WebSocket connection.
  • Connection state change: RTCPeerConnectionState disconnected if there was an error in the RTCPeerConnection.
  • Attempting to reconnect if the reconnect was triggered externally.
  • Or any internal error thrown by either the Publish.connect or View.connect methods.

\ No newline at end of file diff --git a/View.js.html b/View.js.html index d0f1eb67..b91dce0c 100644 --- a/View.js.html +++ b/View.js.html @@ -146,7 +146,7 @@ } /** - * Add remote receving track. + * Add remote receiving track. * @param {String} media - Media kind ('audio' | 'video'). * @param {Array<MediaStream>} streams - Streams the track will belong to. * @return {Promise<RTCRtpTransceiver>} Promise that will be resolved when the RTCRtpTransceiver is assigned an mid value. @@ -289,10 +289,11 @@ const metadata = event.data.metadata metadata.mid = event.data.mid metadata.track = this.tracksMidValues[event.data.mid] - - const uuid = metadata.uuid - metadata.uuid = uuid.reduce((str, byte) => str + byte.toString(16).padStart(2, '0'), '') - metadata.uuid = metadata.uuid.replace(/(.{8})(.{4})(.{4})(.{4})(.{12})/, '$1-$2-$3-$4-$5') + if (metadata.uuid) { + const uuid = metadata.uuid + metadata.uuid = uuid.reduce((str, byte) => str + byte.toString(16).padStart(2, '0'), '') + metadata.uuid = metadata.uuid.replace(/(.{8})(.{4})(.{4})(.{4})(.{12})/, '$1-$2-$3-$4-$5') + } if (metadata.timecode) { metadata.timecode = new Date(decoder.decode(metadata.timecode)) } @@ -306,10 +307,9 @@ logger.info('The content could not be converted to JSON, returning raw bytes instead') } } - - // for backwards compatibility, emit the old event as well - this.emit('onMetadata', metadata) this.emit('metadata', metadata) + // FIXME : Remove in v0.3.0 + this.emit('onMetadata', metadata) } } diff --git a/data/search.json b/data/search.json index a56cd765..2178a3f0 100644 --- a/data/search.json +++ b/data/search.json @@ -1 +1 @@ -{"list":[{"title":"AudioCodec","link":"AudioCodec","description":"

Enum of Millicast supported Audio codecs

"},{"title":"BaseWebRTC","link":"BaseWebRTC"},{"title":"BaseWebRTC#event:reconnect","link":"reconnect","description":"

Emits with every reconnection attempt made when an active stream\nstopped unexpectedly.

"},{"title":"BaseWebRTC#getRTCPeerConnection","link":"getRTCPeerConnection","description":"

Get current RTC peer connection.

"},{"title":"BaseWebRTC#isActive","link":"isActive","description":"

Get if the current connection is active.

"},{"title":"BaseWebRTC#reconnect","link":"reconnect","description":"

Reconnects to last broadcast.

"},{"title":"BaseWebRTC#setReconnect","link":"setReconnect","description":"

Sets reconnection if autoReconnect is enabled.

"},{"title":"BaseWebRTC#stop","link":"stop","description":"

Stops connection.

"},{"title":"ConnectionStats","link":"ConnectionStats"},{"title":"DirectorPublisherOptions","link":"DirectorPublisherOptions"},{"title":"DirectorSubscriberOptions","link":"DirectorSubscriberOptions"},{"title":"FrameMetaData","link":"FrameMetaData","description":"

Metadata of the Encoded Frame

"},{"title":"InboundStats","link":"InboundStats"},{"title":"LayerInfo","link":"LayerInfo"},{"title":"LayerInfo","link":"LayerInfo"},{"title":"LogLevel","link":"LogLevel"},{"title":"MillicastCapability","link":"MillicastCapability"},{"title":"MillicastDirectorResponse","link":"MillicastDirectorResponse"},{"title":"MillicastDirectorResponse","link":"MillicastDirectorResponse"},{"title":"OutboundStats","link":"OutboundStats"},{"title":"PeerConnection","link":"PeerConnection"},{"title":"PeerConnection#addRemoteTrack","link":"addRemoteTrack","description":"

Add remote receiving track.

"},{"title":"PeerConnection#closeRTCPeer","link":"closeRTCPeer","description":"

Close RTC peer connection.

"},{"title":"PeerConnection#createRTCPeer","link":"createRTCPeer","description":"

Instance new RTCPeerConnection.

"},{"title":"PeerConnection#event:connectionStateChange","link":"connectionStateChange","description":"

Peer connection state change. Could be new, connecting, connected, disconnected, failed or closed.

"},{"title":"PeerConnection#event:track","link":"track","description":"

New track event.

"},{"title":"PeerConnection#getRTCLocalSDP","link":"getRTCLocalSDP","description":"

Get the SDP modified depending the options. Optionally set the SDP information to local peer.

"},{"title":"PeerConnection#getRTCPeer","link":"getRTCPeer","description":"

Get current RTC peer connection.

"},{"title":"PeerConnection#getRTCPeerStatus","link":"getRTCPeerStatus","description":"

Get peer connection state.

"},{"title":"PeerConnection#getTracks","link":"getTracks","description":"

Get sender tracks

"},{"title":"PeerConnection#initStats","link":"initStats","description":"

Initialize the statistics monitoring of the RTCPeerConnection.

\n

It will be emitted every second.

"},{"title":"PeerConnection#replaceTrack","link":"replaceTrack","description":"

Replace current audio or video track that is being broadcasted.

"},{"title":"PeerConnection#setRTCRemoteSDP","link":"setRTCRemoteSDP","description":"

Set SDP information to remote peer.

"},{"title":"PeerConnection#stopStats","link":"stopStats","description":"

Stops the monitoring of RTCPeerConnection statistics.

"},{"title":"PeerConnection#updateBandwidthRestriction","link":"updateBandwidthRestriction","description":"

Update remote SDP information to restrict bandwidth.

"},{"title":"PeerConnection#updateBitrate","link":"updateBitrate","description":"

Set SDP information to remote peer with bandwidth restriction.

"},{"title":"PeerConnection.getCapabilities","link":"getCapabilities","description":"

Gets user's browser media capabilities compared with Millicast Media Server support.

"},{"title":"PeerConnectionStats#init","link":"init","description":"

Initialize the statistics monitoring of the RTCPeerConnection.

"},{"title":"PeerConnectionStats#parseStats","link":"parseStats","description":"

Parse incoming RTCPeerConnection stats.

"},{"title":"PeerConnectionStats#stop","link":"stop","description":"

Stops the monitoring of RTCPeerConnection statistics.

"},{"title":"Publish","link":"Publish"},{"title":"Publish#connect","link":"connect","description":"

Starts broadcast to an existing stream name.

\n

In the example, getYourMediaStream and getYourPublisherConnection is your own implementation.

"},{"title":"Publish#record","link":"record","description":"

Initialize recording in an active stream and change the current record option.

"},{"title":"Publish#sendMetadata","link":"sendMetadata","description":"

Send SEI user unregistered data as part of the frame being streamed. Only available for H.264 codec.

"},{"title":"Publish#unrecord","link":"unrecord","description":"

Finalize recording in an active stream and change the current record option.

"},{"title":"SEIPicTimingTimeCode","link":"SEIPicTimingTimeCode","description":"

SEI Pic timing time code

"},{"title":"SEIUserUnregisteredData","link":"SEIUserUnregisteredData","description":"

SEI User unregistered data

"},{"title":"Signaling","link":"Signaling"},{"title":"Signaling#close","link":"close","description":"

Close WebSocket connection with Millicast server.

"},{"title":"Signaling#cmd","link":"cmd","description":"

Send command to the server.

"},{"title":"Signaling#connect","link":"connect","description":"

Starts a WebSocket connection with signaling server.

"},{"title":"Signaling#event:broadcastEvent","link":"broadcastEvent","description":"

Passthrough of available Millicast broadcast events.

\n

Active - Fires when the live stream is, or has started broadcasting.

\n

Inactive - Fires when the stream has stopped broadcasting, but is still available.

\n

Stopped - Fires when the stream has stopped for a given reason.

\n

Vad - Fires when using multiplexed tracks for audio.

\n

Layers - Fires when there is an update of the state of the layers in a stream (when broadcasting with simulcast).

\n

Migrate - Fires when the server is having problems, is shutting down or when viewers need to move for load balancing purposes.

\n

Viewercount - Fires when the viewer count changes.

\n

Updated - when an active stream's tracks are updated

\n

More information here: {@link https://docs.dolby.io/streaming-apis/docs/web#broadcast-events}

"},{"title":"Signaling#event:wsConnectionClose","link":"wsConnectionClose","description":"

WebSocket connection with signaling server was successfully closed.

"},{"title":"Signaling#event:wsConnectionError","link":"wsConnectionError","description":"

WebSocket connection failed with signaling server.\nReturns url of WebSocket

"},{"title":"Signaling#event:wsConnectionSuccess","link":"wsConnectionSuccess","description":"

WebSocket connection was successfully established with signaling server.

"},{"title":"Signaling#publish","link":"publish","description":"

Establish WebRTC connection with Millicast Server as Publisher role.

"},{"title":"Signaling#subscribe","link":"subscribe","description":"

Establish WebRTC connection with Millicast Server as Subscriber role.

"},{"title":"SignalingPublishOptions","link":"SignalingPublishOptions"},{"title":"SignalingSubscribeOptions","link":"SignalingSubscribeOptions"},{"title":"TrackReport","link":"TrackReport"},{"title":"VideoCodec","link":"VideoCodec","description":"

Enum of Millicast supported Video codecs

"},{"title":"View","link":"View"},{"title":"View#addRemoteTrack","link":"addRemoteTrack","description":"

Add remote receiving track.

"},{"title":"View#connect","link":"connect","description":"

Connects to an active stream as subscriber.

\n

In the example, addStreamToYourVideoTag and getYourSubscriberConnectionPath is your own implementation.

"},{"title":"View#project","link":"project","description":"

Start projecting source in selected media ids.

"},{"title":"View#select","link":"select","description":"

Select the simulcast encoding layer and svc layers for the main video track

"},{"title":"View#unproject","link":"unproject","description":"

Stop projecting attached source in selected media ids.

"},{"title":"addPeerEvents","link":"addPeerEvents","description":"

Emits peer events.

"},{"title":"extractH26xMetadata","link":"extractH26xMetadata","description":"

Extract user unregistered metadata from H26x Encoded Frame

"},{"title":"loggerHandler","link":"loggerHandler","description":"

Callback which handles log messages.

"},{"title":"module:Director","link":"Director","description":"

Simplify API calls to find the best server and region to publish and subscribe to.\nFor security reasons all calls will return a JWT token for authentication including the required\nsocket path to connect with.

\n

You will need your own Publishing token and Stream name, please refer to Managing Your Tokens.

"},{"title":"module:Director~getEndpoint","link":"getEndpoint","description":"

Get current Director API endpoint where requests will be sent. Default endpoint is 'https://director.millicast.com'.

"},{"title":"module:Director~getLiveDomain","link":"getLiveDomain","description":"

Get current Websocket Live domain.\nBy default is empty which corresponds to not parse the Director response.

"},{"title":"module:Director~getPublisher","link":"getPublisher","description":"

Get publisher connection data.

"},{"title":"module:Director~getSubscriber","link":"getSubscriber","description":"

Get subscriber connection data.

"},{"title":"module:Director~setEndpoint","link":"setEndpoint","description":"

Set Director API endpoint where requests will be sent.

"},{"title":"module:Director~setLiveDomain","link":"setLiveDomain","description":"

Set Websocket Live domain from Director API response.\nIf it is set to empty, it will not parse the response.

"},{"title":"module:Logger","link":"Logger","description":"

Manages all log messages from SDK modules, you can use this logger to add your custom\nmessages and set your custom log handlers to forward all messages to your own monitoring\nsystem.

\n

By default all loggers are set in level OFF (Logger.OFF), and there are available\nthe following log levels.

\n

This module is based on js-logger you can refer\nto its documentation or following our examples.

"},{"title":"module:Logger~DEBUG","link":"DEBUG","description":"

Logger.DEBUG

"},{"title":"module:Logger~ERROR","link":"ERROR","description":"

Logger.ERROR

"},{"title":"module:Logger~INFO","link":"INFO","description":"

Logger.INFO

"},{"title":"module:Logger~OFF","link":"OFF","description":"

Logger.OFF

"},{"title":"module:Logger~TIME","link":"TIME","description":"

Logger.TIME

"},{"title":"module:Logger~TRACE","link":"TRACE","description":"

Logger.TRACE

"},{"title":"module:Logger~VERSION","link":"VERSION","description":"

Returns the current SDK version.

"},{"title":"module:Logger~WARN","link":"WARN","description":"

Logger.WARN

"},{"title":"module:Logger~diagnose","link":"diagnose","description":"

Returns diagnostics information about the connection and environment, formatted according to the specified parameters.

"},{"title":"module:Logger~get","link":"get","description":"

Gets or creates a named logger. Named loggers are used to group log messages\nthat refers to a common context.

"},{"title":"module:Logger~getHistory","link":"getHistory","description":"

Get all logs generated during a session.\nAll logs are recollected besides the log level selected by the user.

"},{"title":"module:Logger~getHistoryMaxSize","link":"getHistoryMaxSize","description":"

Get the maximum count of logs preserved during a session.

"},{"title":"module:Logger~getLevel","link":"getLevel","description":"

Get global current logger level.\nAlso you can get the level of any particular logger.

"},{"title":"module:Logger~setHandler","link":"setHandler","description":"

Add your custom log handler to Logger at the specified level.

"},{"title":"module:Logger~setHistoryMaxSize","link":"setHistoryMaxSize","description":"

Set the maximum count of logs to preserve during a session.\nBy default it is set to 10000.

"},{"title":"module:Logger~setLevel","link":"setLevel","description":"

Set log level to all loggers.

"},{"title":"module:SdpParser","link":"SdpParser","description":"

Simplify SDP parser.

"},{"title":"module:SdpParser~adaptCodecName","link":"adaptCodecName","description":"

Replace codec name of a SDP.

"},{"title":"module:SdpParser~getAvailableHeaderExtensionIdRange","link":"getAvailableHeaderExtensionIdRange","description":"

Gets all available header extension IDs of the current Session Description.

"},{"title":"module:SdpParser~getAvailablePayloadTypeRange","link":"getAvailablePayloadTypeRange","description":"

Gets all available payload type IDs of the current Session Description.

"},{"title":"module:SdpParser~removeSdpLine","link":"removeSdpLine","description":"

Remove SDP line.

"},{"title":"module:SdpParser~renegotiate","link":"renegotiate","description":"

Renegotiate remote sdp based on previous description.\nThis function will fill missing m-lines cloning on the remote description by cloning the codec and extensions already negotiated for that media

"},{"title":"module:SdpParser~setAbsoluteCaptureTime","link":"setAbsoluteCaptureTime","description":"

Mangle SDP for adding absolute capture time header extension.

"},{"title":"module:SdpParser~setDTX","link":"setDTX","description":"

Set DTX (Discontinuous Transmission) to the connection. Advanced configuration of the opus audio codec that allows for a large reduction in the audio traffic. For example, when a participant is silent, the audio packets won't be transmitted.

"},{"title":"module:SdpParser~setDependencyDescriptor","link":"setDependencyDescriptor","description":"

Mangle SDP for adding dependency descriptor header extension.

"},{"title":"module:SdpParser~setMultiopus","link":"setMultiopus","description":"

Parse SDP for support multiopus.\nOnly available in Google Chrome.

"},{"title":"module:SdpParser~setSimulcast","link":"setSimulcast","description":"

Parse SDP for support simulcast.\nOnly available in Chromium based browsers.

"},{"title":"module:SdpParser~setStereo","link":"setStereo","description":"

Parse SDP for support stereo.

"},{"title":"module:SdpParser~setVideoBitrate","link":"setVideoBitrate","description":"

Parse SDP for desired bitrate.

"},{"title":"module:SdpParser~updateMissingVideoExtensions","link":"updateMissingVideoExtensions","description":"

Adds missing extensions of each video section in the localDescription

"},{"title":"parseWebRTCStats","link":"parseWebRTCStats","description":"

Parses incoming WebRTC statistics\nThis method takes statistical data from @dolbyio/webrtc-stats and transforms it into\na structured format compatible with previous versions.

"},{"title":"tokenGeneratorCallback","link":"tokenGeneratorCallback","description":"

Callback invoked when a new connection path is needed.

"}]} \ No newline at end of file +{"list":[{"title":"AudioCodec","link":"AudioCodec","description":"

Enum of Millicast supported Audio codecs

"},{"title":"BaseWebRTC","link":"BaseWebRTC"},{"title":"BaseWebRTC#event:reconnect","link":"reconnect","description":"

Emits with every reconnection attempt made when an active stream\nstopped unexpectedly.

"},{"title":"BaseWebRTC#getRTCPeerConnection","link":"getRTCPeerConnection","description":"

Get current RTC peer connection.

"},{"title":"BaseWebRTC#isActive","link":"isActive","description":"

Get if the current connection is active.

"},{"title":"BaseWebRTC#reconnect","link":"reconnect","description":"

Reconnects to last broadcast.

"},{"title":"BaseWebRTC#setReconnect","link":"setReconnect","description":"

Sets reconnection if autoReconnect is enabled.

"},{"title":"BaseWebRTC#stop","link":"stop","description":"

Stops connection.

"},{"title":"ConnectionStats","link":"ConnectionStats"},{"title":"DirectorPublisherOptions","link":"DirectorPublisherOptions"},{"title":"DirectorSubscriberOptions","link":"DirectorSubscriberOptions"},{"title":"FrameMetaData","link":"FrameMetaData","description":"

Metadata of the Encoded Frame

"},{"title":"InboundStats","link":"InboundStats"},{"title":"LayerInfo","link":"LayerInfo"},{"title":"LayerInfo","link":"LayerInfo"},{"title":"LogLevel","link":"LogLevel"},{"title":"MillicastCapability","link":"MillicastCapability"},{"title":"MillicastDirectorResponse","link":"MillicastDirectorResponse"},{"title":"MillicastDirectorResponse","link":"MillicastDirectorResponse"},{"title":"OutboundStats","link":"OutboundStats"},{"title":"PeerConnection","link":"PeerConnection"},{"title":"PeerConnection#addRemoteTrack","link":"addRemoteTrack","description":"

Add remote receiving track.

"},{"title":"PeerConnection#closeRTCPeer","link":"closeRTCPeer","description":"

Close RTC peer connection.

"},{"title":"PeerConnection#createRTCPeer","link":"createRTCPeer","description":"

Instance new RTCPeerConnection.

"},{"title":"PeerConnection#event:connectionStateChange","link":"connectionStateChange","description":"

Peer connection state change. Could be new, connecting, connected, disconnected, failed or closed.

"},{"title":"PeerConnection#event:track","link":"track","description":"

New track event.

"},{"title":"PeerConnection#getRTCLocalSDP","link":"getRTCLocalSDP","description":"

Get the SDP modified depending the options. Optionally set the SDP information to local peer.

"},{"title":"PeerConnection#getRTCPeer","link":"getRTCPeer","description":"

Get current RTC peer connection.

"},{"title":"PeerConnection#getRTCPeerStatus","link":"getRTCPeerStatus","description":"

Get peer connection state.

"},{"title":"PeerConnection#getTracks","link":"getTracks","description":"

Get sender tracks

"},{"title":"PeerConnection#initStats","link":"initStats","description":"

Initialize the statistics monitoring of the RTCPeerConnection.

\n

It will be emitted every second.

"},{"title":"PeerConnection#replaceTrack","link":"replaceTrack","description":"

Replace current audio or video track that is being broadcasted.

"},{"title":"PeerConnection#setRTCRemoteSDP","link":"setRTCRemoteSDP","description":"

Set SDP information to remote peer.

"},{"title":"PeerConnection#stopStats","link":"stopStats","description":"

Stops the monitoring of RTCPeerConnection statistics.

"},{"title":"PeerConnection#updateBandwidthRestriction","link":"updateBandwidthRestriction","description":"

Update remote SDP information to restrict bandwidth.

"},{"title":"PeerConnection#updateBitrate","link":"updateBitrate","description":"

Set SDP information to remote peer with bandwidth restriction.

"},{"title":"PeerConnection.getCapabilities","link":"getCapabilities","description":"

Gets user's browser media capabilities compared with Millicast Media Server support.

"},{"title":"PeerConnectionStats#init","link":"init","description":"

Initialize the statistics monitoring of the RTCPeerConnection.

"},{"title":"PeerConnectionStats#parseStats","link":"parseStats","description":"

Parse incoming RTCPeerConnection stats.

"},{"title":"PeerConnectionStats#stop","link":"stop","description":"

Stops the monitoring of RTCPeerConnection statistics.

"},{"title":"Publish","link":"Publish"},{"title":"Publish#connect","link":"connect","description":"

Starts broadcast to an existing stream name.

\n

In the example, getYourMediaStream and getYourPublisherConnection are your own implementations.

"},{"title":"Publish#record","link":"record","description":"

Initialize recording in an active stream and change the current record option.

"},{"title":"Publish#sendMetadata","link":"sendMetadata","description":"

Send SEI user unregistered data as part of the frame being streamed. Only available for H.264 codec.

"},{"title":"Publish#unrecord","link":"unrecord","description":"

Finalize recording in an active stream and change the current record option.

"},{"title":"SEIPicTimingTimeCode","link":"SEIPicTimingTimeCode","description":"

SEI Pic timing time code

"},{"title":"SEIUserUnregisteredData","link":"SEIUserUnregisteredData","description":"

SEI User unregistered data

"},{"title":"Signaling","link":"Signaling"},{"title":"Signaling#close","link":"close","description":"

Close WebSocket connection with Millicast server.

"},{"title":"Signaling#cmd","link":"cmd","description":"

Send command to the server.

"},{"title":"Signaling#connect","link":"connect","description":"

Starts a WebSocket connection with signaling server.

"},{"title":"Signaling#event:broadcastEvent","link":"broadcastEvent","description":"

Passthrough of available Millicast broadcast events.

\n

Active - Fires when the live stream is, or has started, broadcasting.

\n

Inactive - Fires when the stream has stopped broadcasting, but is still available.

\n

Stopped - Fires when the stream has stopped for a given reason.

\n

Vad - Fires when using multiplexed tracks for audio.

\n

Layers - Fires when there is an update of the state of the layers in a stream (when broadcasting with simulcast).

\n

Migrate - Fires when the server is having problems, is shutting down or when viewers need to move for load balancing purposes.

\n

Viewercount - Fires when the viewer count changes.

\n

Updated - Fires when an active stream's tracks are updated.

\n

More information here: {@link https://docs.dolby.io/streaming-apis/docs/web#broadcast-events}

"},{"title":"Signaling#event:wsConnectionClose","link":"wsConnectionClose","description":"

WebSocket connection with signaling server was successfully closed.

"},{"title":"Signaling#event:wsConnectionError","link":"wsConnectionError","description":"

WebSocket connection with the signaling server failed.\nReturns the URL of the WebSocket.

"},{"title":"Signaling#event:wsConnectionSuccess","link":"wsConnectionSuccess","description":"

WebSocket connection was successfully established with signaling server.

"},{"title":"Signaling#publish","link":"publish","description":"

Establish WebRTC connection with Millicast Server as Publisher role.

"},{"title":"Signaling#subscribe","link":"subscribe","description":"

Establish WebRTC connection with Millicast Server as Subscriber role.

"},{"title":"SignalingPublishOptions","link":"SignalingPublishOptions"},{"title":"SignalingSubscribeOptions","link":"SignalingSubscribeOptions"},{"title":"TrackReport","link":"TrackReport"},{"title":"VideoCodec","link":"VideoCodec","description":"

Enum of Millicast supported Video codecs

"},{"title":"View","link":"View"},{"title":"View#addRemoteTrack","link":"addRemoteTrack","description":"

Add remote receiving track.

"},{"title":"View#connect","link":"connect","description":"

Connects to an active stream as subscriber.

\n

In the example, addStreamToYourVideoTag and getYourSubscriberConnectionPath are your own implementations.

"},{"title":"View#project","link":"project","description":"

Start projecting source in selected media ids.

"},{"title":"View#select","link":"select","description":"

Select the simulcast encoding layer and SVC layers for the main video track.

"},{"title":"View#unproject","link":"unproject","description":"

Stop projecting attached source in selected media ids.

"},{"title":"addPeerEvents","link":"addPeerEvents","description":"

Emits peer events.

"},{"title":"extractH26xMetadata","link":"extractH26xMetadata","description":"

Extract user unregistered metadata from H26x Encoded Frame

"},{"title":"loggerHandler","link":"loggerHandler","description":"

Callback which handles log messages.

"},{"title":"module:Director","link":"Director","description":"

Simplify API calls to find the best server and region to publish and subscribe to.\nFor security reasons all calls will return a JWT token for authentication including the required\nsocket path to connect with.

\n

You will need your own Publishing token and Stream name; please refer to Managing Your Tokens.

"},{"title":"module:Director~getEndpoint","link":"getEndpoint","description":"

Get current Director API endpoint where requests will be sent. Default endpoint is 'https://director.millicast.com'.

"},{"title":"module:Director~getLiveDomain","link":"getLiveDomain","description":"

Get current Websocket Live domain.\nBy default it is empty, which means the Director response is not parsed.

"},{"title":"module:Director~getPublisher","link":"getPublisher","description":"

Get publisher connection data.

"},{"title":"module:Director~getSubscriber","link":"getSubscriber","description":"

Get subscriber connection data.

"},{"title":"module:Director~setEndpoint","link":"setEndpoint","description":"

Set Director API endpoint where requests will be sent.

"},{"title":"module:Director~setLiveDomain","link":"setLiveDomain","description":"

Set Websocket Live domain from the Director API response.\nIf it is set to empty, the response will not be parsed.

"},{"title":"module:Logger","link":"Logger","description":"

Manages all log messages from SDK modules. You can use this logger to add your custom\nmessages and set your custom log handlers to forward all messages to your own monitoring\nsystem.

\n

By default all loggers are set to level OFF (Logger.OFF), and the following\nlog levels are available.

\n

This module is based on js-logger; you can refer\nto its documentation or follow our examples.

"},{"title":"module:Logger~DEBUG","link":"DEBUG","description":"

Logger.DEBUG

"},{"title":"module:Logger~ERROR","link":"ERROR","description":"

Logger.ERROR

"},{"title":"module:Logger~INFO","link":"INFO","description":"

Logger.INFO

"},{"title":"module:Logger~OFF","link":"OFF","description":"

Logger.OFF

"},{"title":"module:Logger~TIME","link":"TIME","description":"

Logger.TIME

"},{"title":"module:Logger~TRACE","link":"TRACE","description":"

Logger.TRACE

"},{"title":"module:Logger~VERSION","link":"VERSION","description":"

Returns the current SDK version.

"},{"title":"module:Logger~WARN","link":"WARN","description":"

Logger.WARN

"},{"title":"module:Logger~diagnose","link":"diagnose","description":"

Returns diagnostics information about the connection and environment, formatted according to the specified parameters.

"},{"title":"module:Logger~get","link":"get","description":"

Gets or creates a named logger. Named loggers are used to group log messages\nthat refer to a common context.

"},{"title":"module:Logger~getHistory","link":"getHistory","description":"

Get all logs generated during a session.\nAll logs are collected regardless of the log level selected by the user.

"},{"title":"module:Logger~getHistoryMaxSize","link":"getHistoryMaxSize","description":"

Get the maximum count of logs preserved during a session.

"},{"title":"module:Logger~getLevel","link":"getLevel","description":"

Get the current global logger level.\nYou can also get the level of any particular logger.

"},{"title":"module:Logger~setHandler","link":"setHandler","description":"

Add your custom log handler to Logger at the specified level.

"},{"title":"module:Logger~setHistoryMaxSize","link":"setHistoryMaxSize","description":"

Set the maximum count of logs to preserve during a session.\nBy default it is set to 10000.

"},{"title":"module:Logger~setLevel","link":"setLevel","description":"

Set log level to all loggers.

"},{"title":"module:SdpParser","link":"SdpParser","description":"

Simplified SDP parser.

"},{"title":"module:SdpParser~adaptCodecName","link":"adaptCodecName","description":"

Replace the codec name of an SDP.

"},{"title":"module:SdpParser~getAvailableHeaderExtensionIdRange","link":"getAvailableHeaderExtensionIdRange","description":"

Gets all available header extension IDs of the current Session Description.

"},{"title":"module:SdpParser~getAvailablePayloadTypeRange","link":"getAvailablePayloadTypeRange","description":"

Gets all available payload type IDs of the current Session Description.

"},{"title":"module:SdpParser~removeSdpLine","link":"removeSdpLine","description":"

Remove SDP line.

"},{"title":"module:SdpParser~renegotiate","link":"renegotiate","description":"

Renegotiate the remote SDP based on the previous description.\nThis function fills missing m-lines in the remote description by cloning the codec and extensions already negotiated for that media.

"},{"title":"module:SdpParser~setAbsoluteCaptureTime","link":"setAbsoluteCaptureTime","description":"

Mangle SDP for adding absolute capture time header extension.

"},{"title":"module:SdpParser~setDTX","link":"setDTX","description":"

Set DTX (Discontinuous Transmission) to the connection. Advanced configuration of the opus audio codec that allows for a large reduction in the audio traffic. For example, when a participant is silent, the audio packets won't be transmitted.

"},{"title":"module:SdpParser~setDependencyDescriptor","link":"setDependencyDescriptor","description":"

Mangle SDP for adding dependency descriptor header extension.

"},{"title":"module:SdpParser~setMultiopus","link":"setMultiopus","description":"

Parse SDP to support multiopus.\nOnly available in Google Chrome.

"},{"title":"module:SdpParser~setSimulcast","link":"setSimulcast","description":"

Parse SDP to support simulcast.\nOnly available in Chromium-based browsers.

"},{"title":"module:SdpParser~setStereo","link":"setStereo","description":"

Parse SDP to support stereo.

"},{"title":"module:SdpParser~setVideoBitrate","link":"setVideoBitrate","description":"

Parse SDP for desired bitrate.

"},{"title":"module:SdpParser~updateMissingVideoExtensions","link":"updateMissingVideoExtensions","description":"

Adds missing extensions to each video section in the localDescription.

"},{"title":"parseWebRTCStats","link":"parseWebRTCStats","description":"

Parses incoming WebRTC statistics.\nThis method takes statistical data from @dolbyio/webrtc-stats and transforms it into\na structured format compatible with previous versions.

"},{"title":"tokenGeneratorCallback","link":"tokenGeneratorCallback","description":"

Callback invoked when a new connection path is needed.

"}]} \ No newline at end of file diff --git a/global.html b/global.html index 55b42cf1..f2a21ee8 100644 --- a/global.html +++ b/global.html @@ -1,3 +1,3 @@ Global
On this page

Members

(constant) AudioCodec :String

Enum of Millicast supported Audio codecs

Type:
  • String
Properties
NameTypeDescription
OPUSString
MULTIOPUSString

(constant) VideoCodec :String

Enum of Millicast supported Video codecs

Type:
  • String
Properties
NameTypeDescription
VP8String
VP9String
H264String
AV1String
H265String

Only available in Safari

Methods

addPeerEvents(instanceClass, peer)

Emits peer events.

Parameters:
NameTypeDescription
instanceClassPeerConnection

PeerConnection instance.

peerRTCPeerConnection

Peer instance.

extractH26xMetadata(encodedFrame, codec) → {FrameMetaData}

Extract user unregistered metadata from H26x Encoded Frame

Parameters:
NameTypeDescription
encodedFrameRTCEncodedFrame
codec'H264' | 'H265'
Returns:
Type: 
FrameMetaData

parseWebRTCStats(webRTCStats)

Parses incoming WebRTC statistics This method takes statistical data from @dolbyio/webrtc-stats and transforms it into a structured format compatible with previous versions.

Parameters:
NameTypeDescription
webRTCStatsObject

The statistics object containing various WebRTC stats

Type Definitions

ConnectionStats

Type:
  • Object
Properties
NameTypeDescription
rawRTCStatsReport

All RTCPeerConnection stats without parsing. Reference https://developer.mozilla.org/en-US/docs/Web/API/RTCStatsReport.

audioTrackReport

Parsed audio information.

videoTrackReport

Parsed video information.

availableOutgoingBitrateNumber

The available outbound capacity of the network connection. The higher the value, the more bandwidth you can assume is available for outgoing data. The value is reported in bits per second.

This value comes from the nominated candidate-pair.

totalRoundTripTimeNumber

Total round trip time is the total time in seconds that has elapsed between sending STUN requests and receiving the responses.

This value comes from the nominated candidate-pair.

currentRoundTripTimeNumber

Current round trip time indicate the number of seconds it takes for data to be sent by this peer to the remote peer and back over the connection described by this pair of ICE candidates.

This value comes from the nominated candidate-pair.

candidateTypeRTCIceCandidateType

Local candidate type from the nominated candidate-pair which indicates the type of ICE candidate the object represents.

DirectorPublisherOptions

Type:
  • Object
Properties
NameTypeAttributesDescription
tokenString

Millicast Publishing Token.

streamNameString

Millicast Stream Name.

streamType"WebRtc" | "Rtmp"<optional>

Millicast Stream Type.

DirectorSubscriberOptions

Type:
  • Object
Properties
NameTypeAttributesDescription
streamNameString

Millicast publisher Stream Name.

streamAccountIdString

Millicast Account ID.

subscriberTokenString<optional>

Token to subscribe to secure streams. If you are subscribing to an unsecure stream, you can omit this param.

FrameMetaData

Metadata of the Encoded Frame

Type:
  • object
Properties
NameTypeAttributesDescription
timestampnumber

the time at which frame sampling started, value is a positive integer containing the sampling instant of the first byte in this frame, in microseconds

seiUserUnregisteredDataArrayArray.<SEIUserUnregisteredData>

the SEI user unregistered data array

seiPicTimingTimeCodeArrayArray.<SEIPicTimingTimeCode><optional>

the SEI pic timing time codes

InboundStats

Type:
  • Object
Properties
NameTypeAttributesDescription
idString

inbound-rtp Id.

jitterNumber

Current Jitter measured in seconds.

mimeTypeString<optional>

Mime type if related report had codec report associated.

framesPerSecondNumber<optional>

Current framerate if it's video report.

frameHeightNumber<optional>

Current frame height if it's video report.

frameWidthNumber<optional>

Current frame width if it's video report.

keyFramesDecodedNumber<optional>

Total number of key frames that have been decoded if it's video report.

framesDecodedNumber<optional>

Total number of frames that have been decoded if it's video report.

framesDroppedNumber<optional>

Total number of frames that have been dropped if it's video report.

framesReceivedNumber<optional>

Total number of frames that have been received if it's video report.

timestampNumber

Timestamp of report.

totalBytesReceivedNumber

Total bytes received is an integer value which indicates the total number of bytes received so far from this synchronization source.

totalPacketsReceivedNumber

Total packets received indicates the total number of packets of any kind that have been received on the connection described by the pair of candidates.

totalPacketsLostNumber

Total packets lost.

packetsLostRatioPerSecondNumber

Total packet lost ratio per second.

packetsLostDeltaPerSecondNumber

Total packet lost delta per second.

bitrateNumber

Current bitrate in bits per second.

packetRateNumber

The rate at which packets are being received, measured in packets per second.

jitterBufferDelayNumber

Total delay in seconds currently experienced by the jitter buffer.

jitterBufferEmittedCountNumber

Total number of packets emitted from the jitter buffer.

LayerInfo

Type:
  • Object
Properties
NameTypeDescription
encodingIdString

rid value of the simulcast encoding of the track (default: automatic selection)

spatialLayerIdNumber

The spatial layer id to send to the outgoing stream (default: max layer available)

temporalLayerIdNumber

The temporaral layer id to send to the outgoing stream (default: max layer available)

maxSpatialLayerIdNumber

Max spatial layer id (default: unlimited)

maxTemporalLayerIdNumber

Max temporal layer id (default: unlimited)

LayerInfo

Type:
  • Object
Properties
NameTypeDescription
encodingIdString

rid value of the simulcast encoding of the track (default: automatic selection)

spatialLayerIdNumber

The spatial layer id to send to the outgoing stream (default: max layer available)

temporalLayerIdNumber

The temporaral layer id to send to the outgoing stream (default: max layer available)

maxSpatialLayerIdNumber

Max spatial layer id (default: unlimited)

maxTemporalLayerIdNumber

Max temporal layer id (default: unlimited)

LogLevel

Type:
  • Object
Properties
NameTypeDescription
valueNumber

The numerical representation of the level.

nameString

Human readable name of the log level.

MillicastCapability

Type:
  • Object
Properties
NameTypeDescription
codecsArray.<Object>
Properties
NameTypeAttributesDescription
codecString

Audio or video codec name.

mimeTypeString

Audio or video codec mime type.

scalabilityModesArray.<String><optional>

In case of SVC support, a list of scalability modes supported.

channelsNumber<optional>

Only for audio, the number of audio channels supported.

headerExtensionsArray.<RTCRtpHeaderExtensionCapability>

An array specifying the URI of the header extension, as described in RFC 5285.

MillicastDirectorResponse

Type:
  • Object
Properties
NameTypeDescription
urlsArray.<String>

WebSocket available URLs.

jwtString

Access token for signaling initialization.

iceServersArray.<RTCIceServer>

Object which represents a list of Ice servers.

MillicastDirectorResponse

Type:
  • Object
Properties
NameTypeDescription
urlsArray.<String>

WebSocket available URLs.

jwtString

Access token for signaling initialization.

iceServersArray.<RTCIceServer>

Object which represents a list of Ice servers.

OutboundStats

Type:
  • Object
Properties
NameTypeAttributesDescription
idString

outbound-rtp Id.

mimeTypeString<optional>

Mime type if related report had codec report associated.

framesPerSecondNumber<optional>

Current framerate if it's video report.

frameHeightNumber<optional>

Current frame height if it's video report.

frameWidthNumber<optional>

Current frame width if it's video report.

qualityLimitationReasonString<optional>

If it's video report, indicate the reason why the media quality in the stream is currently being reduced by the codec during encoding, or none if no quality reduction is being performed.

timestampNumber

Timestamp of report.

totalBytesSentNumber

Total bytes sent indicates the total number of payload bytes that hve been sent so far on the connection described by the candidate pair.

bitrateNumber

Current bitrate in bits per second.

bytesSentDeltaNumber

Change in the number of bytes sent since the last report.

totalPacketsSentNumber

Total number of packets sent.

packetsSentDeltaNumber

Change in the number of packets sent since the last report.

packetRateNumber

Rate at which packets are being sent, measured in packets per second.

targetBitrateNumber

The target bitrate for the encoder, in bits per second.

retransmittedPacketsSentNumber

Total number of retransmitted packets sent.

retransmittedPacketsSentDeltaNumber

Change in the number of retransmitted packets sent since the last report.

retransmittedBytesSentNumber

Total number of bytes that have been retransmitted.

retransmittedBytesSentDeltaNumber

Change in the number of retransmitted bytes sent since the last report.

framesSentNumber

Total number of frames sent (applicable for video).

qualityLimitationDurationsObject<optional>

Durations in seconds for which the quality of the media has been limited by the codec, categorized by the limitation reasons such as bandwidth, CPU, or other factors.

SEIPicTimingTimeCode

SEI Pic timing time code

Type:
  • object
Properties
NameTypeDescription
secondsnumber
minutesnumber
hoursnumber
n_framesnumber
time_offsetnumber

SEIUserUnregisteredData

SEI User unregistered data

Type:
  • object
Properties
NameTypeDescription
uuidstring

the UUID of the SEI user unregistered data

dataUint8Array

the binary content of the SEI user unregistered data

SignalingPublishOptions

Type:
  • Object
Properties
NameTypeAttributesDefaultDescription
codecVideoCodec<optional>
"h264"

Codec for publish stream.

recordBoolean<optional>

Enable stream recording. If record is not provided, use default Token configuration. Only available in Tokens with recording enabled.

sourceIdString<optional>

Source unique id. Only available in Tokens with multisource enabled.*

eventsArray.<String>

Override which events will be delivered by the server ("active" | "inactive").

SignalingSubscribeOptions

Type:
  • Object
Properties
NameTypeDescription
vadString

Enable VAD multiplexing for secondary sources.

pinnedSourceIdString

Id of the main source that will be received by the default MediaStream.

excludedSourceIdsArray.<String>

Do not receive media from the these source ids.

eventsArray.<String>

Override which events will be delivered by the server ("active" | "inactive" | "vad" | "layers" | "updated").

layerLayerInfo

Select the simulcast encoding layer and svc layers for the main video track, leave empty for automatic layer selection based on bandwidth estimation.

TrackReport

Type:
  • Object
Properties
NameTypeDescription
inboundsArray.<InboundStats>

Parsed information of each inbound-rtp.

outboundsArray.<OutboundStats>

Parsed information of each outbound-rtp.

loggerHandler(messages, context)

Callback which handles log messages.

Parameters:
NameTypeDescription
messagesArray.<any>

Arguments object with the supplied log messages.

contextObject
Properties
NameTypeAttributesDescription
levelLogLevel

The currrent log level.

nameString<nullable>

The optional current logger name.

tokenGeneratorCallback() → {Promise.<MillicastDirectorResponse>}

Callback invoke when a new connection path is needed.

Returns:

Promise object which represents the result of getting the new connection path.

You can use your own token generator or use the Director available methods.

Type: 
Promise.<MillicastDirectorResponse>
\ No newline at end of file +
On this page

Members

(constant) AudioCodec :String

Enum of Millicast supported Audio codecs

Type:
  • String
Properties
NameTypeDescription
OPUSString
MULTIOPUSString

(constant) VideoCodec :String

Enum of Millicast supported Video codecs

Type:
  • String
Properties
NameTypeDescription
VP8String
VP9String
H264String
AV1String
H265String

Only available in Safari

Methods

addPeerEvents(instanceClass, peer)

Emits peer events.

Parameters:
NameTypeDescription
instanceClassPeerConnection

PeerConnection instance.

peerRTCPeerConnection

Peer instance.

extractH26xMetadata(encodedFrame, codec) → {FrameMetaData}

Extract user unregistered metadata from H26x Encoded Frame

Parameters:
NameTypeDescription
encodedFrameRTCEncodedFrame
codec'H264' | 'H265'
Returns:
Type: 
FrameMetaData
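
For illustration, a minimal sketch of reading this metadata on the receiving side. The insertable-streams plumbing (receiver.createEncodedStreams(), Chromium only) and the assumption that extractH26xMetadata is exported by the SDK bundle are not guaranteed by this page; only the extractH26xMetadata(encodedFrame, codec) signature and the FrameMetaData fields above are.

  // Hypothetical receiver-side hook; 'receiver' is an RTCRtpReceiver obtained elsewhere.
  const { readable, writable } = receiver.createEncodedStreams()
  const seiLogger = new TransformStream({
    transform (encodedFrame, controller) {
      // extractH26xMetadata(encodedFrame, codec) returns the FrameMetaData documented here
      const metadata = extractH26xMetadata(encodedFrame, 'H264')
      for (const sei of metadata.seiUserUnregisteredDataArray) {
        // sei.uuid is a string, sei.data a Uint8Array (see SEIUserUnregisteredData)
        console.log('SEI', sei.uuid, sei.data.byteLength, 'bytes at', metadata.timestamp)
      }
      controller.enqueue(encodedFrame)
    }
  })
  readable.pipeThrough(seiLogger).pipeTo(writable)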

parseWebRTCStats(webRTCStats)

Parses incoming WebRTC statistics. This method takes statistical data from @dolbyio/webrtc-stats and transforms it into a structured format compatible with previous versions.

Parameters:
NameTypeDescription
webRTCStatsObject

The statistics object containing various WebRTC stats

Type Definitions

ConnectionStats

Type:
  • Object
Properties
NameTypeDescription
rawRTCStatsReport

All RTCPeerConnection stats without parsing. Reference https://developer.mozilla.org/en-US/docs/Web/API/RTCStatsReport.

audioTrackReport

Parsed audio information.

videoTrackReport

Parsed video information.

availableOutgoingBitrateNumber

The available outbound capacity of the network connection. The higher the value, the more bandwidth you can assume is available for outgoing data. The value is reported in bits per second.

This value comes from the nominated candidate-pair.

totalRoundTripTimeNumber

Total round trip time is the total time in seconds that has elapsed between sending STUN requests and receiving the responses.

This value comes from the nominated candidate-pair.

currentRoundTripTimeNumber

Current round trip time indicates the number of seconds it takes for data to be sent by this peer to the remote peer and back over the connection described by this pair of ICE candidates.

This value comes from the nominated candidate-pair.

candidateTypeRTCIceCandidateType

Local candidate type from the nominated candidate-pair which indicates the type of ICE candidate the object represents.
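
As a usage sketch (not part of the generated reference), these fields could be read from the per-second stats emission described for PeerConnection#initStats; the instance name webRTCPeer and the 'stats' event name are assumptions about your own wiring.

  // webRTCPeer stands for a PeerConnection instance from this SDK.
  webRTCPeer.initStats()
  webRTCPeer.on('stats', (stats) => {
    // ConnectionStats fields documented above; all come from the nominated candidate-pair.
    console.log('available outgoing bitrate (bps):', stats.availableOutgoingBitrate)
    console.log('current RTT (s):', stats.currentRoundTripTime)
    console.log('local candidate type:', stats.candidateType)
  })
  // Call webRTCPeer.stopStats() when monitoring is no longer needed.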

DirectorPublisherOptions

Type:
  • Object
Properties
NameTypeAttributesDescription
tokenString

Millicast Publishing Token.

streamNameString

Millicast Stream Name.

streamType"WebRtc" | "Rtmp"<optional>

Millicast Stream Type.
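
As an illustration only, this is the options shape the Director module's getPublisher helper expects; the '@millicast/sdk' import path, the object-argument form of getPublisher, and the token and stream values are assumptions.

  import { Director } from '@millicast/sdk'

  // Placeholder credentials; resolves to a MillicastDirectorResponse (urls, jwt, iceServers).
  const publisherConnection = await Director.getPublisher({
    token: 'my-publishing-token',
    streamName: 'my-stream',
    streamType: 'WebRtc' // optional
  })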

DirectorSubscriberOptions

Type:
  • Object
Properties
NameTypeAttributesDescription
streamNameString

Millicast publisher Stream Name.

streamAccountIdString

Millicast Account ID.

subscriberTokenString<optional>

Token to subscribe to secure streams. If you are subscribing to an unsecured stream, you can omit this param.

FrameMetaData

Metadata of the Encoded Frame

Type:
  • object
Properties
NameTypeAttributesDescription
timestampnumber

the time at which frame sampling started; the value is a positive integer containing the sampling instant of the first byte in this frame, in microseconds

seiUserUnregisteredDataArrayArray.<SEIUserUnregisteredData>

the SEI user unregistered data array

seiPicTimingTimeCodeArrayArray.<SEIPicTimingTimeCode><optional>

the SEI pic timing time codes

InboundStats

Type:
  • Object
Properties
NameTypeAttributesDescription
idString

inbound-rtp Id.

jitterNumber

Current Jitter measured in seconds.

mimeTypeString<optional>

Mime type if related report had codec report associated.

framesPerSecondNumber<optional>

Current framerate if it's video report.

frameHeightNumber<optional>

Current frame height if it's video report.

frameWidthNumber<optional>

Current frame width if it's video report.

keyFramesDecodedNumber<optional>

Total number of key frames that have been decoded if it's video report.

framesDecodedNumber<optional>

Total number of frames that have been decoded if it's video report.

framesDroppedNumber<optional>

Total number of frames that have been dropped if it's video report.

framesReceivedNumber<optional>

Total number of frames that have been received if it's video report.

timestampNumber

Timestamp of report.

totalBytesReceivedNumber

Total bytes received is an integer value which indicates the total number of bytes received so far from this synchronization source.

totalPacketsReceivedNumber

Total packets received indicates the total number of packets of any kind that have been received on the connection described by the pair of candidates.

totalPacketsLostNumber

Total packets lost.

packetsLostRatioPerSecondNumber

Total packet lost ratio per second.

packetsLostDeltaPerSecondNumber

Total packet lost delta per second.

bitrateNumber

Current bitrate in bytes per second.

bitrateBitsPerSecondNumber

Current bitrate in bits per second.

packetRateNumber

The rate at which packets are being received, measured in packets per second.

jitterBufferDelayNumber

Total delay in seconds currently experienced by the jitter buffer.

jitterBufferEmittedCountNumber

Total number of packets emitted from the jitter buffer.
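
To make the two bitrate fields concrete, a small hedged sketch; it assumes stats is the ConnectionStats object from the earlier example and that bitrate (bytes per second) and bitrateBitsPerSecond describe the same measurement in different units.

  for (const inbound of stats.video.inbounds) {
    // bitrate is in bytes per second, bitrateBitsPerSecond in bits per second,
    // so the two values should differ by roughly a factor of eight.
    console.log(`${inbound.id}: ${inbound.bitrateBitsPerSecond} bps (${inbound.bitrate} bytes/s)`)
  }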

LayerInfo

Type:
  • Object
Properties
NameTypeDescription
encodingIdString

rid value of the simulcast encoding of the track (default: automatic selection)

spatialLayerIdNumber

The spatial layer id to send to the outgoing stream (default: max layer available)

temporalLayerIdNumber

The temporal layer id to send to the outgoing stream (default: max layer available)

maxSpatialLayerIdNumber

Max spatial layer id (default: unlimited)

maxTemporalLayerIdNumber

Max temporal layer id (default: unlimited)
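
A hedged example of passing a LayerInfo object to View#select ("Select the simulcast encoding layer and SVC layers for the main video track"); the viewer instance and the specific rid and layer values are assumptions.

  // 'viewer' is a View instance connected elsewhere.
  await viewer.select({
    encodingId: '1', // rid of the desired simulcast encoding
    temporalLayerId: 1, // cap the temporal layer
    maxSpatialLayerId: 2 // cap the spatial layer
  })
  // Unset fields fall back to the documented defaults (automatic selection / max layer).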

LogLevel

Type:
  • Object
Properties
NameTypeDescription
valueNumber

The numerical representation of the level.

nameString

Human readable name of the log level.

MillicastCapability

Type:
  • Object
Properties
NameTypeDescription
codecsArray.<Object>
Properties
NameTypeAttributesDescription
codecString

Audio or video codec name.

mimeTypeString

Audio or video codec mime type.

scalabilityModesArray.<String><optional>

In case of SVC support, a list of scalability modes supported.

channelsNumber<optional>

Only for audio, the number of audio channels supported.

headerExtensionsArray.<RTCRtpHeaderExtensionCapability>

An array specifying the URI of the header extension, as described in RFC 5285.
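
For context, a sketch of inspecting a MillicastCapability; the static PeerConnection.getCapabilities call is described elsewhere in this reference ("Gets user's browser media capabilities compared with Millicast Media Server support"), but the 'video' kind argument and the import path are assumptions.

  import { PeerConnection } from '@millicast/sdk'

  // Assumed signature: getCapabilities(kind) returning the MillicastCapability shape above.
  const capabilities = PeerConnection.getCapabilities('video')
  for (const { codec, mimeType, scalabilityModes } of capabilities.codecs) {
    console.log(codec, mimeType, scalabilityModes ?? [])
  }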

MillicastDirectorResponse

Type:
  • Object
Properties
NameTypeDescription
urlsArray.<String>

WebSocket available URLs.

jwtString

Access token for signaling initialization.

iceServersArray.<RTCIceServer>

Object which represents a list of Ice servers.

OutboundStats

Type:
  • Object
Properties
NameTypeAttributesDescription
idString

outbound-rtp Id.

mimeTypeString<optional>

Mime type if related report had codec report associated.

framesPerSecondNumber<optional>

Current framerate if it's video report.

frameHeightNumber<optional>

Current frame height if it's video report.

frameWidthNumber<optional>

Current frame width if it's video report.

qualityLimitationReasonString<optional>

If it's a video report, indicates the reason why the media quality in the stream is currently being reduced by the codec during encoding, or none if no quality reduction is being performed.

timestampNumber

Timestamp of report.

totalBytesSentNumber

Total bytes sent indicates the total number of payload bytes that have been sent so far on the connection described by the candidate pair.

bitrateNumber

Current bitrate in bytes per second.

bitrateBitsPerSecondNumber

Current bitrate in bits per second.

bytesSentDeltaNumber

Change in the number of bytes sent since the last report.

totalPacketsSentNumber

Total number of packets sent.

packetsSentDeltaNumber

Change in the number of packets sent since the last report.

packetRateNumber

Rate at which packets are being sent, measured in packets per second.

targetBitrateNumber

The target bitrate for the encoder, in bits per second.

retransmittedPacketsSentNumber

Total number of retransmitted packets sent.

retransmittedPacketsSentDeltaNumber

Change in the number of retransmitted packets sent since the last report.

retransmittedBytesSentNumber

Total number of bytes that have been retransmitted.

retransmittedBytesSentDeltaNumber

Change in the number of retransmitted bytes sent since the last report.

framesSentNumber

Total number of frames sent (applicable for video).

qualityLimitationDurationsObject<optional>

Durations in seconds for which the quality of the media has been limited by the codec, categorized by the limitation reasons such as bandwidth, CPU, or other factors.

SEIPicTimingTimeCode

SEI Pic timing time code

Type:
  • object
Properties
NameTypeDescription
secondsnumber
minutesnumber
hoursnumber
n_framesnumber
time_offsetnumber

SEIUserUnregisteredData

SEI User unregistered data

Type:
  • object
Properties
NameTypeDescription
uuidstring

the UUID of the SEI user unregistered data

dataUint8Array

the binary content of the SEI user unregistered data

SignalingPublishOptions

Type:
  • Object
Properties
NameTypeAttributesDefaultDescription
codecVideoCodec<optional>
"h264"

Codec for publish stream.

recordBoolean<optional>

Enable stream recording. If record is not provided, use default Token configuration. Only available in Tokens with recording enabled.

sourceIdString<optional>

Source unique id. Only available in Tokens with multisource enabled.

eventsArray.<String>

Override which events will be delivered by the server ("active" | "inactive").

SignalingSubscribeOptions

Type:
  • Object
Properties
NameTypeDescription
vadString

Enable VAD multiplexing for secondary sources.

pinnedSourceIdString

Id of the main source that will be received by the default MediaStream.

excludedSourceIdsArray.<String>

Do not receive media from these source ids.

eventsArray.<String>

Override which events will be delivered by the server ("active" | "inactive" | "vad" | "layers" | "updated").

layerLayerInfo

Select the simulcast encoding layer and SVC layers for the main video track; leave empty for automatic layer selection based on bandwidth estimation.
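
To illustrate the shape only, a SignalingSubscribeOptions literal with placeholder source ids; whether your application passes it to Signaling#subscribe directly or lets View build it is an integration detail not covered here.

  const subscribeOptions = {
    pinnedSourceId: 'main-camera', // placeholder id
    excludedSourceIds: ['backstage'], // placeholder id
    events: ['active', 'inactive', 'layers'],
    layer: { encodingId: '0' } // LayerInfo; omit for automatic selection
  }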

TrackReport

Type:
  • Object
Properties
NameTypeDescription
inboundsArray.<InboundStats>

Parsed information of each inbound-rtp.

outboundsArray.<OutboundStats>

Parsed information of each outbound-rtp.

loggerHandler(messages, context)

Callback which handles log messages.

Parameters:
NameTypeDescription
messagesArray.<any>

Arguments object with the supplied log messages.

contextObject
Properties
NameTypeAttributesDescription
levelLogLevel

The current log level.

nameString<nullable>

The optional current logger name.
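
A hedged example of a loggerHandler wired up through Logger.setHandler (described in the Logger module as "Add your custom log handler to Logger at the specified level"); forwarding to the console and the Logger.INFO level are illustrative choices, not requirements.

  import { Logger } from '@millicast/sdk'

  // Matches the loggerHandler(messages, context) signature documented above.
  Logger.setHandler((messages, context) => {
    const prefix = context.name ? `[${context.name}]` : '[millicast]'
    console.log(prefix, context.level.name, ...messages)
  }, Logger.INFO)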

tokenGeneratorCallback() → {Promise.<MillicastDirectorResponse>}

Callback invoked when a new connection path is needed.

Returns:

Promise object which represents the result of getting the new connection path.

You can use your own token generator or use the Director available methods.

Type: 
Promise.<MillicastDirectorResponse>
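
As suggested above, one way to satisfy this callback is with the Director module; the stream name, account id, and the View wiring below are placeholders and assumptions, not part of this reference.

  import { Director, View } from '@millicast/sdk'

  // Resolves to a MillicastDirectorResponse, matching the documented return type.
  const tokenGenerator = () => Director.getSubscriber({
    streamName: 'my-stream',
    streamAccountId: 'my-account-id'
  })

  // Hypothetical wiring; check the View constructor for its exact parameter list.
  const viewer = new View(undefined, tokenGenerator)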
\ No newline at end of file diff --git a/utils_BaseWebRTC.js.html b/utils_BaseWebRTC.js.html index e176d795..416dbfbf 100644 --- a/utils_BaseWebRTC.js.html +++ b/utils_BaseWebRTC.js.html @@ -100,6 +100,9 @@ this.webRTCPeer.on(webRTCEvents.connectionStateChange, (state) => { Diagnostics.setConnectionState(state) + if (state === 'connected') { + Diagnostics.setConnectionTime(new Date()) + } if ((state === 'failed' || (state === 'disconnected' && this.alreadyDisconnected)) && this.firstReconnection) { this.firstReconnection = false this.reconnect({ error: new Error('Connection state change: RTCPeerConnectionState disconnected') })