Changing byteArray to SoundSource #34
Maybe create a Sample and then do sample.writeWavBytes(byteArray).
@kenfehling I don't have a Sample. I want to create a SoundSource from the byteArray that I recorded from the microphone.
Oh right, I had that backwards: writeWavBytes(byteArray) writes data into a byteArray. Maybe something like this: const sound:Sound = new Sound(); I'm not sure what the format should be; the default is "float", but you might have signed/unsigned integers or something in the byte array.
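Building on that suggestion, a minimal sketch of loading raw PCM bytes into a flash.media.Sound with loadPCMFromByteArray (Flash Player 11+). This assumes the byteArray holds mono 32-bit float samples recorded at 11 kHz, as the thread describes; getRecordedBytes is a hypothetical placeholder for however you obtained the recording.

```actionscript
import flash.media.Sound;
import flash.utils.ByteArray;

// Hypothetical helper standing in for the recorded microphone data.
var byteArray:ByteArray = getRecordedBytes();
byteArray.position = 0;

var sound:Sound = new Sound();
// Each mono float sample is 4 bytes, so the frame count is length / 4.
var sampleFrames:int = byteArray.length / 4;
// format "float", stereo = false, sample rate 11025 Hz to match the recording.
sound.loadPCMFromByteArray(byteArray, sampleFrames, "float", false, 11025);
sound.play(); // sanity check: the raw recording should be audible on its own
```

If the playback sounds like noise or plays at the wrong speed, the format, channel count, or sample rate argument doesn't match what's actually in the byteArray.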
Okay, I was able to convert the byteArray to a Sound and play it. But when I make a SoundSource from that Sound, it does not play. var recordedSound:Sound = new Sound();
Does the Sound passed to SoundSource need to be in a specific format?
Yes, give it an AudioDescriptor to specify the sample rate and mono/stereo.
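For example, a sketch of pairing the Sound with an AudioDescriptor. The package paths, constants, and constructor signature below are my reading of standingwave3 and may differ in your checkout, so verify them against the library source.

```actionscript
// Assumed standingwave3 package paths -- check your copy of the library.
import com.noteflight.standingwave3.elements.AudioDescriptor;
import com.noteflight.standingwave3.sources.SoundSource;

// Describe the recording: 11 kHz, one channel, matching the microphone capture.
var descriptor:AudioDescriptor =
    new AudioDescriptor(AudioDescriptor.RATE_11025, AudioDescriptor.CHANNELS_MONO);

// Wrap the Sound so the engine knows how to interpret its samples.
var source:SoundSource = new SoundSource(recordedSound, descriptor);
```

If the descriptor's rate or channel count doesn't match the actual data, the engine will read the samples at the wrong speed or interleaving, which typically produces silence or garbage.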
Thanks for the quick reply. I still can't play the sound when it is passed through SoundSource; using the above code I got an error. After passing through the StandardizeFilter, no audio played in the player. In the source code, if the rate is 44 kHz and the channel count is 2 (stereo), it does not need to pass through StandardizeFilter. So should I change the recording to 44 kHz?
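Recording at the engine's native rate is straightforward to try: flash.media.Microphone accepts rates of 5, 8, 11, 22, or 44 kHz via its rate property. A minimal sketch, assuming you can reconfigure the microphone before recording; note the microphone is still mono, so a channel conversion may remain.

```actionscript
import flash.media.Microphone;

// Capture at 44 kHz so no sample-rate conversion is needed downstream.
var mic:Microphone = Microphone.getMicrophone();
mic.rate = 44;            // in kHz; valid values are 5, 8, 11, 22, 44
mic.setSilenceLevel(0);   // record everything, no silence gating
```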
Here is the code: soundBytes.position = 0;
I am trying to pass the microphone data through the filters, but I don't know how to convert the byteArray to a SoundSource to pass it through the filters. I am recording the audio at 11 kHz. Please help!
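For reference, the recording side described above can be sketched as follows: the microphone delivers mono 32-bit float samples through SampleDataEvent.SAMPLE_DATA events, which are appended to a ByteArray (soundBytes here, matching the snippet earlier in the thread).

```actionscript
import flash.events.SampleDataEvent;
import flash.media.Microphone;
import flash.utils.ByteArray;

var soundBytes:ByteArray = new ByteArray();
var mic:Microphone = Microphone.getMicrophone();
mic.rate = 11; // 11 kHz, as in the question

// Subscribing to SAMPLE_DATA starts the capture.
mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);

function onMicData(event:SampleDataEvent):void
{
    // event.data holds mono float samples; copy them into our buffer.
    while (event.data.bytesAvailable)
    {
        soundBytes.writeFloat(event.data.readFloat());
    }
}
```

After recording stops (removeEventListener), soundBytes holds the raw float PCM that the rest of the thread is trying to turn into a SoundSource.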