Medialooks Help Center
WebRTC 2.0
Files and network streams playback
- How to play YouTube videos with custom audio and video quality settings in MFormats or MPlatform SDK
- HDR and 10bit video playback
- MFReader Statistics: Monitoring video playback performance
- MFile statistics
- Playback control with MFReader object
- Playback with MPlatform SDK
Files encoding and network streaming
- SRT streaming
- Audio and video codecs
- MPTS recording and streaming
- How to Use Named Pipes
- RIST streaming
- Icecast streaming
Live sources
- Multiple screens recording
- Window Graphics Capture engine
- MLive/MFLive statistics
- Initialization of a device for input
- External audio for live sources
- Screen Capture engine
Output video to a device
- Hardware Internal and External Keying feature
- MRenderer statistics
- Initialization of a device for output
- Convert video format on device output
- MRenderer object properties
- MFRenderer object properties
Playlist features
- MItem object properties
- Basic Playlist management
- Break items in playlist
- MPlaylist statistics
- Simple playlist emulation with MFormats SDK
- Scheduling of playlist items
Working with audio
- ASIO-based devices support
- Audio filters
- Audio loudness normalization (EBU R128, ITU BS.1770)
- RMS, VU and LUFS audio meters
- MFSignalingDTMF object properties
- Audio stretching and pitching
Preview of sources
- Fullscreen preview
- WPF Preview Configuration
- Configuring preview
- Multiple preview
- Choose sound card for audio preview
- MPreview object properties
HTML5 Overlay plugin
- Basic usage of HTML5 plugin
- HTML5 plugin as a live source
- HTML5 plugin properties
- HTML5 Plugin: handling events
Character Generator
- Character Generator (CG) preoutput configuration
- CG Scheduling
- Configuring CG properties with code
- Ticker items
- Text items
- Performing the L-shaped items overlay
WebRTC
- How to communicate with WebRTC signaling server
- WebRTC event list
- SFU (Selective Forwarding Unit)
- Sharing custom information between Publisher and Receiver
- Medialooks WebRTC Q&A
- Environment: signaling, STUN and TURN servers
NDI
- Use a specific network adapter
- How to connect to a specific NDI stream?
- Processing metadata with NDI
- Tally light signals in NDI
- Input and output video with NDI
- NDI properties
Mixing of different sources
- Mixer OnBoard
- Transitions between MMixer streams
- Using mp://link as a video source in MFLive/MLive objects
- MMixer statistics
- Transition between scenes
- Mix audio from one source with video from another
Operations with frames
- Usage of OpenCV with MFormats SDK
- Getting frames from a third-party SDK
- Generation of new frames from WPF UserControl
- Transparency processing
- Zooming a video with mouse move
- Frame rotation
Common information
- How to select GPU for decoding, processing and encoding
- MPlatform SDK release notes
- MFormats SDK release notes
- How to use Skype source in Medialooks SDK
- Timecode processing
- How to protect licenses
Extensions
- MFShader Plugin
- Chroma Key 2. A new chroma key implementation.
- ST 2110 devices management. AJA.
- Blackmagic Design SMPTE ST 2110: How to configure
- Advanced Chroma Key 2 settings
- ST 2110 devices management. Deltacast.
Closed Captions and SCTE triggers
- Transmitting custom CC608 through ANC packets
- SCTE-104 List of available Multiple Operation Messages and XML
- SCTE-104 List of available Single Operation Messages and XML
- Enabling Closed Captions for Device Input and Output in Video SDK
- Basic Closed Captions information
- SCTE-35 and SCTE-104 triggers
DirectShow
Supported hardware
- Blackmagic Design. Integration libraries for backward compatibility.
- Quick guide about PCIe slots, lanes and speed
- Magewell: supported models
- Blackmagic Design: supported models
- Blackmagic Design devices properties
- AJA Video Systems: supported models
FAQ
- REST API and Video SDK
- Adding GPL libx264/265 codecs to MFormats and MPlatform SDKs
- How to solve a problem with the Design view of samples?
- Export content with alpha
- How to schedule encoding?
- How to create a debug log?