
The master Audio/Video controller instance, available as the singleton game.webrtc.

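A minimal usage sketch, assuming a loaded Foundry VTT world: the singleton can be read from any client-side script once the ready hook has fired. The mode and broadcasting members referenced here are documented below.

```js
// Access the AVMaster singleton after the game is ready.
Hooks.once("ready", () => {
  const av = game.webrtc;  // the AVMaster instance described on this page
  console.log(`AV mode: ${av.mode} | broadcasting: ${av.broadcasting}`);
});
```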

Hierarchy

  • AVMaster

Constructors

  • new AVMaster(settings: AVSettings): AVMaster

    Parameters

    • settings: AVSettings

      The Audio/Video settings to use

    Returns AVMaster

Properties

settings: AVSettings

config: AVConfig

client: AVClient

The Audio/Video client instance

broadcasting: boolean

A flag to track whether the current user is actively broadcasting their microphone.

_connected: boolean

Flag to determine if we are connected to the signalling server or not. This is required for synchronization between connection and reconnection attempts.

_speakingData: { speaking: boolean; volumeHistories: any[] }

Type declaration

  • speaking: boolean
  • volumeHistories: any[]
_pttMuteTimeout: number

Accessors

  • get mode(): any
  • Returns any
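
As a sketch, the returned value can be compared against the AVSettings.AV_MODES constants (DISABLED, AUDIO, VIDEO, AUDIO_VIDEO); treating those constant names as available in the current Foundry VTT build is an assumption here.

```js
// Branch on the active audio/video mode (constant names assumed, see above).
const mode = game.webrtc.mode;
if ( mode === AVSettings.AV_MODES.DISABLED ) console.log("A/V is disabled.");
else if ( mode === AVSettings.AV_MODES.AUDIO ) console.log("Audio-only mode.");
else console.log("Video broadcasting is enabled.");
```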

Methods

  • connect(): Promise<boolean>
  • Connect to the Audio/Video client.

    Returns Promise<boolean>

    Was the connection attempt successful?
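
A hedged sketch of a manual connection attempt, run from an async context such as a script macro; ui.notifications is the core notification helper.

```js
// Attempt to connect to the configured A/V client and report failure.
const connected = await game.webrtc.connect();
if ( !connected ) ui.notifications.warn("Audio/Video connection attempt failed.");
```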

  • disconnect(): Promise<boolean>
  • Disconnect from the Audio/Video client.

    Returns Promise<boolean>

    Was an existing connection terminated?

  • reestablish(): Promise<void>
  • Callback actions to take when the user becomes disconnected from the server.

    Returns Promise<void>

  • canUserBroadcastAudio(userId: string): boolean
  • A user can broadcast audio if the AV mode is compatible and if they are allowed to broadcast.

    Parameters

    • userId: string

    Returns boolean

  • canUserShareAudio(userId: string): boolean
  • A user can share audio if they are allowed to broadcast and if they have not muted themselves or been blocked.

    Parameters

    • userId: string

    Returns boolean

  • canUserBroadcastVideo(userId: string): boolean
  • A user can broadcast video if the AV mode is compatible and if they are allowed to broadcast.

    Parameters

    • userId: string

    Returns boolean

  • canUserShareVideo(userId: string): boolean
  • A user can share video if they are allowed to broadcast and if they have not hidden themselves or been blocked.

    Parameters

    • userId: string

    Returns boolean
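
The four permission checks above can be combined to decide, per user, whether audio or video should currently be received. A sketch over the active users, assuming a ready world:

```js
// Log, for each active user, whether their audio and video may be shared.
for ( const user of game.users.filter(u => u.active) ) {
  const audio = game.webrtc.canUserBroadcastAudio(user.id) && game.webrtc.canUserShareAudio(user.id);
  const video = game.webrtc.canUserBroadcastVideo(user.id) && game.webrtc.canUserShareVideo(user.id);
  console.log(`${user.name}: audio=${audio} video=${video}`);
}
```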

  • broadcast(intent: boolean): any
  • Trigger a change in the audio broadcasting state when using a push-to-talk workflow.

    Parameters

    • intent: boolean

      The user's intent to broadcast. Whether an actual broadcast occurs will depend on whether or not the user has muted their audio feed.

    Returns any
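
A sketch of a push-to-talk style toggle; the two-second timer stands in for a key release and is illustrative only.

```js
// Signal intent to broadcast while "held", then release.
game.webrtc.broadcast(true);                            // key pressed
setTimeout(() => game.webrtc.broadcast(false), 2000);   // key released
```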

  • activateVoiceDetection(stream: MediaStream, ms: number): void
  • Activate voice detection tracking for a userId on a provided MediaStream. Currently only a MediaStream is supported because MediaStreamTrack processing is not yet supported cross-browser.

    Parameters

    • stream: MediaStream

      The MediaStream which corresponds to that User

    • ms: number

    Returns void

  • deactivateVoiceDetection(): void
  • Actions which the orchestration layer should take when a peer user disconnects from the audio/video service.

    Returns void
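
A hedged sketch pairing activation with deactivation on a locally captured microphone stream, run from an async context. getUserMedia is the standard browser media API; omitting the undocumented ms parameter is an assumption that it is optional.

```js
// Start voice detection on a fresh microphone capture, then tear it down.
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
game.webrtc.activateVoiceDetection(stream);  // ms argument omitted (assumed optional)

// ...later, when the stream is no longer needed:
game.webrtc.deactivateVoiceDetection();
stream.getTracks().forEach(t => t.stop());
```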

  • _resetSpeakingHistory(): void
  • Resets the speaking history of a user. If the user was considered speaking, mark them as not speaking.

    Returns void

  • _onPTTStart(context: KeyboardEventContext): boolean
  • Handle activation of a push-to-talk key or button.

    Parameters

    • context: KeyboardEventContext

      The context data of the event

    Returns boolean

  • _onPTTEnd(context: KeyboardEventContext): boolean
  • Handle deactivation of a push-to-talk key or button.

    Parameters

    • context: KeyboardEventContext

      The context data of the event

    Returns boolean
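
These handlers are normally wired up by the core keybinding system. As an illustration, a hypothetical module keybinding could forward its own press and release events to them; the namespace and action name below are invented for the example, and keybindings must be registered during the init hook.

```js
// Hypothetical extra push-to-talk key registered by a module.
Hooks.once("init", () => {
  game.keybindings.register("my-module", "pttAlternate", {
    name: "Push to Talk (alternate key)",
    editable: [{ key: "KeyT" }],
    onDown: context => game.webrtc._onPTTStart(context),
    onUp: context => game.webrtc._onPTTEnd(context)
  });
});
```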

  • render(): any
  • Returns any

  • onRender(): void
  • Render the audio/video streams to the CameraViews UI. Assign each connected user to the correct video frame element.

    Returns void

  • onSettingsChanged(changed: any): Promise<boolean>
  • Respond to changes which occur to AV Settings. Changes are handled in descending order of impact.

    Parameters

    • changed: any

      The object of changed AV settings

    Returns Promise<boolean>

  • debug(message: any): void
  • Parameters

    • message: any

    Returns void

  • _initialize(): void
  • Initialize the local broadcast state.

    Returns void

  • _initializeUserVoiceDetection(mode: string): void
  • Set up audio level listeners to handle voice activation detection workflow.

    Parameters

    • mode: string

      The currently selected voice broadcasting mode

    Returns void

  • _onAudioLevel(dbLevel: number): any
  • Periodic notification of user audio level

    This function uses the audio level (in dB) of the audio stream to determine if the user is speaking or not and notifies the UI of such changes.

    The User is considered speaking if they are above the decibel threshold in any of the history values. This marks them as speaking as soon as they have a high enough volume, and marks them as not speaking only after they have dropped below the threshold in all of the history values (the last 4 volume samples, i.e. 200 ms).

    There may be more optimal ways to do this, for example by taking into account whether the user was already considered to be speaking, in order to eliminate short bursts of audio (such as coughing).

    Parameters

    • dbLevel: number

      The audio level in decibels of the user within the last 50ms

    Returns any