I am trying to adapt a sketch I made with Minim for real-time playback so that it writes a WAV file offline (not in real time) instead. I have come across examples that use the JavaSound API to write a WAV from an array, generating a sine table byte by byte, but I would like to use a unit-generator-based sound library like Minim or JSyn to generate the audio data. Since those libraries are mainly used for real-time synthesis to an audio device, I am confused about how this would be accomplished.
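For reference, this is roughly the byte-by-byte JavaSound approach I have seen in those examples (my own rough sketch, with an assumed 440 Hz tone, duration, and output filename); what I would like is to replace the hand-rolled sine loop with audio produced by unit generators:

```java
import javax.sound.sampled.*;
import java.io.ByteArrayInputStream;
import java.io.File;

public class OfflineSineWav {
    public static void main(String[] args) throws Exception {
        float sampleRate = 44100f;
        double freq = 440.0;          // assumed test tone
        int seconds = 2;              // assumed duration
        int numSamples = (int) (seconds * sampleRate);

        // 16-bit mono: two bytes per sample, little-endian
        byte[] data = new byte[numSamples * 2];
        for (int i = 0; i < numSamples; i++) {
            double sample = Math.sin(2 * Math.PI * freq * i / sampleRate);
            short s = (short) (sample * Short.MAX_VALUE);
            data[2 * i]     = (byte) (s & 0xff);
            data[2 * i + 1] = (byte) ((s >> 8) & 0xff);
        }

        // Wrap the raw bytes in an AudioInputStream and let JavaSound
        // write the WAV header and data for us.
        AudioFormat fmt = new AudioFormat(sampleRate, 16, 1, true, false);
        try (AudioInputStream ais = new AudioInputStream(
                new ByteArrayInputStream(data), fmt, numSamples)) {
            AudioSystem.write(ais, AudioFileFormat.Type.WAVE, new File("sine.wav"));
        }
    }
}
```

The part I want to swap out is the inner loop that computes `sample`; everything after that (packing bytes, writing via `AudioSystem.write`) seems like it should stay the same regardless of where the samples come from.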
One thing that sticks out in my mind is that in Minim (I am less familiar with JSyn), chunks of data consisting of multiple samples are, I assume, written to the output stream on each draw() loop frame. I am just generally unsure how to capture those frames to a buffer in order to write a file, or better, to a stream that Java can use to write the file piece by piece.
Sorry if this is vague; I am a bit lost regarding the general architecture of these libraries and don't understand which bits and pieces are needed to get unit generators writing data in a non-real-time application.
Thanks in advance for any help. I can provide some of the code I have worked on if that would clarify things.