This article describes how to use the MWCapture SDK to splice video from multiple channels into a single frame while capturing.
1. The MWCapture SDK method for splicing multiple channels during capture:
- Allocate frame buffer memory according to the spliced image resolution and the number of channels;
- Run a frame-sync judgment thread to keep the video channels synchronized;
- Use MWCaptureVideoFrameToVirtualAddressEx() to capture the video frames of the different channels into their target rectangles in the buffer (see the buffer-layout sketch after this list);
- Use MWRegulateDeviceTime() to calibrate the clocks of all channels so that their video stays synchronized;
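As a concrete illustration of the first and third points above, the sketch below allocates one BGRA frame buffer for a 2x2 splice of four 1920x1080 channels and computes the target rectangle of each channel; each rectangle is what would later be passed as the pRectDest argument of MWCaptureVideoFrameToVirtualAddressEx() for that channel. The channel count, resolutions, 2x2 layout, and pixel format are illustrative assumptions, not values prescribed by the SDK.

    // Sketch: buffer layout for a 2x2 splice of four 1080p channels (assumed sizes).
    #include <windows.h>   // RECT
    #include <cstdint>
    #include <vector>

    int main()
    {
        const int kChannels   = 4;      // number of capture channels (assumption)
        const int kChannelW   = 1920;   // per-channel width  (assumption)
        const int kChannelH   = 1080;   // per-channel height (assumption)
        const int kSpliceW    = kChannelW * 2;  // spliced image: 2 columns
        const int kSpliceH    = kChannelH * 2;  // spliced image: 2 rows
        const int kBytesPerPx = 4;              // BGRA (assumption)

        // One frame buffer sized for the full spliced resolution.
        const size_t cbStride = (size_t)kSpliceW * kBytesPerPx;
        const size_t cbFrame  = cbStride * kSpliceH;
        std::vector<uint8_t> spliceBuffer(cbFrame);

        // Target rectangle of each channel inside the spliced frame; each one
        // would be passed as pRectDest when capturing that channel with
        // MWCaptureVideoFrameToVirtualAddressEx().
        std::vector<RECT> destRects(kChannels);
        for (int i = 0; i < kChannels; ++i) {
            const int col = i % 2;
            const int row = i / 2;
            destRects[i].left   = col * kChannelW;
            destRects[i].top    = row * kChannelH;
            destRects[i].right  = destRects[i].left + kChannelW;
            destRects[i].bottom = destRects[i].top  + kChannelH;
        }
        return 0;
    }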
2. The method to synchronize multiple channels during capture:
- In addition to the capture thread, run a frame-sync judgment thread;
- Check the status of each channel and use MWSetDeviceTime() to synchronize the clocks of the channels that have input signals;
- Compare the frame info across the channels; if a channel's frame time is within the allowed range, set that channel's mask bit to 1 (see the judgment sketch after this list);
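The judgment step itself can be isolated as below: the function takes the latest frame time of each channel (in 100 ns device-time units, with channel 0 acting as the reference channel A) and builds the mask. The function name, the tolerance parameter, and the way the frame times are obtained (for example from the SDK's frame info in the capture code) are assumptions for illustration only.

    // Sketch: frame-sync judgment that produces a channel mask (hypothetical helper).
    #include <cstdint>
    #include <cstdlib>
    #include <vector>

    // llFrameTimes[i] is the latest frame time of channel i in 100 ns units;
    // channel 0 is the reference (channel A). A bit in the returned mask is
    // set to 1 when that channel's frame time lies within llTolerance of the
    // reference, i.e. the channel is considered synchronized.
    uint32_t BuildSyncMask(const std::vector<long long>& llFrameTimes,
                           long long llTolerance /* e.g. half a frame period */)
    {
        uint32_t dwMask = 0;
        if (llFrameTimes.empty())
            return dwMask;

        const long long llRef = llFrameTimes[0];
        for (size_t i = 0; i < llFrameTimes.size(); ++i) {
            if (llabs(llFrameTimes[i] - llRef) <= llTolerance)
                dwMask |= (1u << i);
        }
        return dwMask;
    }

    // The capture thread then only captures from channels whose bit is set:
    //   if (dwMask & (1u << i)) { /* capture and splice channel i */ }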
In summary:
- The buffer occupies a relatively large block of memory, and within it each channel has its own target rectangle (specified by pRectDest). When capturing and splicing the multiple videos, use MWCaptureVideoFrameToVirtualAddressEx() to write the captured frames of each channel to its position in the buffer.
- To synchronize the frames, take the frame time of channel A as the reference and define a tolerance range. If the frame times of the other channels fall within that range, set their channel mask bits to 1; the capture thread then captures the synchronized frames according to the mask value.
- To keep the multiple channels synchronized, use MWRegulateDeviceTime() to calibrate the clock values periodically, so that the clocks of the different channels stay aligned (see the calibration sketch after this list).
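A minimal sketch of the periodic calibration is shown below, assuming already-opened channel handles and the MWGetDeviceTime()/MWRegulateDeviceTime() calls from the SDK's LibMWCapture header (the header path and the helper name are assumptions); channel 0 again serves as the clock reference. MWRegulateDeviceTime() is used here for the periodic calibration, while MWSetDeviceTime() handles the initial synchronization described in the steps above.

    // Sketch: periodic clock calibration across channels (handles assumed open).
    #include "LibMWCapture/MWCapture.h"   // header path may differ per SDK version
    #include <vector>

    // Read the reference channel's device time and regulate the clocks of the
    // other channels toward it. Call this periodically (for example once per
    // second) from a worker thread.
    void CalibrateChannelClocks(const std::vector<HCHANNEL>& channels)
    {
        if (channels.empty())
            return;

        LONGLONG llRefTime = 0;
        if (MWGetDeviceTime(channels[0], &llRefTime) != MW_SUCCEEDED)
            return;

        for (size_t i = 1; i < channels.size(); ++i)
            MWRegulateDeviceTime(channels[i], llRefTime);
    }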
The Examples\Applications\XICaptureQuad demo contains a detailed implementation for reference.