Input support for macOS #32
I frequently got `CannotDoInCurrentContext` when the render slice sizes don't match, i.e. if the input device has been set up with a 512-frame slice buffer, then `AudioUnitRender` will only render 512 frames.
Ah, that's such a dumb error message to use for that case - so unhelpful 😅 Pretty sure the frames per slice is set correctly, though. I did a little digging, and the sample rate is my best guess so far.
Does the input module actually attach an input device to the audio unit?
Line 95:

```objc
#if !TARGET_OS_IPHONE
NSAssert(!(self.outputEnabled && self.inputEnabled), @"Can only have both input and output enabled on iOS");
#endif
```

should probably change to something like:

```objc
#if !TARGET_OS_IPHONE
NSAssert(self.outputEnabled != self.inputEnabled, @"Must have either output or input enabled (but not both)");
#endif
```

Having both input and output enabled on the desktop didn't work for me. It's likely not possible, as the input will be running in its own thread.
Currently, there are issues with receiving audio input on macOS, resulting in `CannotDoInCurrentContext` errors when rendering. I suspect this is related to sample rate, and that an AudioConverter will be required when the input unit's rate doesn't match the output unit's. This is only a theory, though.