Allow to render effects only on stream #1535
Conversation
I understand that this is a feature you want, but I don't quite understand what kind of experience you want to provide to the end-users. Therefore, I would like to start by discussing this.
Thank you for your understanding so far. Regarding the first requirement, I think the request can be met if we have access to the unprocessed capture data from the devices. From your description, it seems you only want the overlay display to be applied to the stream. I suppose it would be better if effects like sepia or monochrome could also be applied. What do you think? As for the second requirement, aspect fit might also be a solution. It seems that by preparing two raw CMSampleBuffers and two MTHKViews, the intention could be realized. Of course, the video may look slightly different, but I don't think the end-user would feel any discomfort.
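For reference, applying an effect such as monochrome is done with a `VideoEffect` subclass in HaishinKit. Below is a minimal sketch modeled on the project's documented examples; the `execute(_:info:)` signature and `registerVideoEffect(_:)` call follow those examples but may vary between versions, so treat this as an illustration rather than exact API.

```swift
import CoreImage
import CoreMedia
import HaishinKit

// A minimal monochrome effect, modeled on HaishinKit's VideoEffect examples.
final class MonochromeEffect: VideoEffect {
    private let filter = CIFilter(name: "CIColorMonochrome")

    override func execute(_ image: CIImage, info: CMSampleBuffer?) -> CIImage {
        guard let filter else { return image }
        filter.setValue(image, forKey: kCIInputImageKey)
        filter.setValue(CIColor(red: 0.75, green: 0.75, blue: 0.75), forKey: kCIInputColorKey)
        filter.setValue(1.0, forKey: kCIInputIntensityKey)
        return filter.outputImage ?? image
    }
}

// Registering it on the stream applies it to the published video:
// _ = stream.registerVideoEffect(MonochromeEffect())
```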
Force-pushed from 767eb37 to 3f3b2e0.
The idea of being able to apply effects to the unprocessed buffer sounds great. It isn't a requirement in our case, so I initially chose a simple approach without two effects arrays. Being able to apply effects is a powerful feature, and having more effect settings should be great for HaishinKit users, in my opinion. On a performance note, it looks like having two roots won't create much overhead, and two roots will make it easier to apply separate effects to both video buffers. I would appreciate your input on this feature so that I can add any necessary changes to the PR. Alternatively, please feel free to take ownership of this feature and implement it in the way you think is most effective. For the second requirement, we will try the suggested approach. Rendering the second camera twice seems too expensive, even though it would look better.
I have a general understanding of what you want to achieve. Thank you. I'd like to return to the discussion about the PR. The unprocessed CMSampleBuffer can be obtained in the following delegate:

```swift
public protocol IOStreamDelegate: AnyObject {
    /// Tells the receiver a video buffer is incoming.
    func stream(_ stream: IOStream, track: UInt8, didInput buffer: CMSampleBuffer)
}
```

As we are discussing in this GitHub thread, I believe we can solve this issue using the CMSampleBuffer, but do you anticipate any problems?
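As a sketch of the idea under discussion, a delegate implementation could forward the unprocessed buffer into a second MTHKView. The `rawView` property is hypothetical, and the `enqueue(_:)` call assumes the IOStreamView method referenced later in this thread; other delegate requirements are omitted.

```swift
import CoreMedia
import HaishinKit

// Sketch: forward the unprocessed buffers into a second view.
final class RawBufferForwarder: IOStreamDelegate {
    // Hypothetical property holding the extra, effect-free preview view.
    weak var rawView: MTHKView?

    func stream(_ stream: IOStream, track: UInt8, didInput buffer: CMSampleBuffer) {
        // Assumes IOStreamView exposes enqueue(_:), as referenced in this thread.
        rawView?.enqueue(buffer)
    }

    // Remaining IOStreamDelegate requirements omitted for brevity.
}
```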
It might not be enough to use the delegate method and enqueue buffers into the IOStreamView directly, without attaching the stream to the view.
Unfortunately, this feature is not available in the current version. When the view is not attached, startRunning() is triggered when you start publishing, which is suitable for audio streaming. I think it would be helpful to have APIs like IOStream.startCapture(), stopCapture(), and isCapturing().
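A sketch of how those proposed APIs might be used once available; the names come from the suggestion above and are hypothetical, not part of the current release:

```swift
// Hypothetical APIs proposed above; not available in the current release.
if !stream.isCapturing() {
    // Start camera/microphone capture without attaching a view
    // and without publishing.
    stream.startCapture()
}
// ... raw CMSampleBuffers now arrive via IOStreamDelegate
//     and can be enqueued into an IOStreamView directly ...
stream.stopCapture()
```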
Yes.
I will try to implement the suggested solution and reopen the PR if changes are required. Thanks for the help!
Description & motivation
This PR follows #1530 and allows splitting effects rendering so that effects are shown only on the stream and hidden on the preview. This enables things like rendering a UIView as an effect. The main reason I implemented the split in ScreenObject was to avoid having two screen objects and a separate root in the Screen. I'm looking forward to feedback before I polish this implementation.

Type of change
Screenshots:
Preview
Stream