You can overlay one live video source with another that is used as a mask, producing a mix of the two live sources. This is useful when you need to combine a live source used as a background (RGB video) with another live source that carries transparency (an alpha channel).
In effect, it is the reverse of a keying output: you combine the fill and key inputs back into a single image.
Note that both sources must have the same video format for proper mixing, i.e. the same frame rate and the same resolution.
In the general case, to mix 2 different sources you would use an MMixer object (in MPlatform) or the MFOverlay method (in MFormats). But for live sources in the MPlatform SDK, and for any sources in the MFormats SDK, there is a dedicated feature: the "mask" device type.
MPlatform SDK
The "mask" feature works only for the MLive object. You specify the required input source with the DeviceSet method:
myLive.DeviceSet("mask", deviceName, ""); // you should use a name of a device as deviceName, like "UltraStudio Express"
To specify an input line, use the "mask::line-in" device type:
myLive.DeviceSet("mask::line-in", lineName, ""); // you should use a name of a device as lineName, like "HDMI Video & HDMI Audio"
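Putting the two calls together, a typical configuration might look like the sketch below. This is only an illustration: the device and line names are examples, and the "video" / "video::line-in" keys used for the main (fill) input are an assumption based on the usual MPlatform device configuration, not something stated in this article — the rest of your pipeline (preview, output) stays unchanged.

```csharp
// Create a live object and select the main (fill) live source.
// NOTE: the "video" / "video::line-in" keys and all names here are assumed examples.
MLiveClass myLive = new MLiveClass();
myLive.DeviceSet("video", "UltraStudio Express", "");
myLive.DeviceSet("video::line-in", "HDMI Video & HDMI Audio", "");

// Select the second live source that will be used as the mask (key).
myLive.DeviceSet("mask", "UltraStudio Express", "");
myLive.DeviceSet("mask::line-in", "HDMI Video & HDMI Audio", "");
```

Both sources are then mixed by the MLive object itself; remember that they have to share the same resolution and frame rate.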
MFormats SDK
Because MFormats is frame-based, any source can be used as a "mask" for mixing. The idea is to grab a frame from the main source, grab a frame from the mask source, and then attach the "mask" frame to the main frame:
MFFrame mainFrame;
myMainSource.SourceFrameGet(-1, out mainFrame, "");
MFFrame maskFrame;
myMaskSource.SourceFrameGet(-1, out maskFrame, "");
// attach the mask frame to the main one
mainFrame.MFObjSet("mask", maskFrame);
// process the mainFrame further
myPreview.ReceiverFramePut(mainFrame, -1, "");
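In a continuous pipeline these steps run once per frame. The loop below is a sketch under stated assumptions: myMainSource and myMaskSource are two already-configured MFormats source objects, myPreview is an already-enabled preview, and the isRunning flag and the COM-release calls are illustrative additions, not part of this article.

```csharp
// Per-frame mixing loop (sketch; object names are placeholders).
while (isRunning)
{
    MFFrame mainFrame;
    myMainSource.SourceFrameGet(-1, out mainFrame, "");   // fill (background, RGB)

    MFFrame maskFrame;
    myMaskSource.SourceFrameGet(-1, out maskFrame, "");   // key (alpha)

    // Attach the mask frame to the main frame; the mixing is applied
    // when the frame is consumed downstream.
    mainFrame.MFObjSet("mask", maskFrame);

    // Hand the mixed frame to the next element in the chain.
    myPreview.ReceiverFramePut(mainFrame, -1, "");

    // Release the COM references so frames are not leaked (assumed cleanup,
    // using System.Runtime.InteropServices.Marshal).
    Marshal.ReleaseComObject(maskFrame);
    Marshal.ReleaseComObject(mainFrame);
}
```

Since both frames are grabbed independently, this also works for non-live sources (files, network streams), as long as the two frames share the same resolution and frame rate.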