I skimmed the Unity API and didn't see any functionality for positioning a music source in 3D space. I suspect there is a hacky way to achieve spatial sound through macros that adjust volume and panning based on the listener's position and orientation in the game world, but ideally this would be something that could be specified through the API.
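To make the idea concrete, here is the kind of thing I'm imagining on the Unity side. The `ApplyVolumeMacro`/`ApplyPanMacro` calls are just placeholders for whatever hooks the Reactional API might actually expose, not real API:

```csharp
using UnityEngine;

// Rough sketch of the "fake spatialization" idea: compute volume and pan
// from the listener's position/orientation relative to a virtual source,
// then feed those values into whatever macro hooks the music system exposes.
// ApplyVolumeMacro / ApplyPanMacro are hypothetical stand-ins, not Reactional API.
public class FakeSpatialMacroDriver : MonoBehaviour
{
    public Transform listener;        // usually the camera / AudioListener
    public Transform virtualSource;   // where the instrument "lives" in the room
    public float maxDistance = 20f;   // beyond this the instrument is silent

    void Update()
    {
        Vector3 toSource = virtualSource.position - listener.position;

        // Simple linear distance attenuation in [0, 1].
        float volume = Mathf.Clamp01(1f - toSource.magnitude / maxDistance);

        // Pan from -1 (left) to +1 (right) based on the listener's facing.
        float pan = Vector3.Dot(listener.right, toSource.normalized);

        ApplyVolumeMacro(volume);
        ApplyPanMacro(pan);
    }

    void ApplyVolumeMacro(float value) { /* hook into the music system here */ }
    void ApplyPanMacro(float value)    { /* hook into the music system here */ }
}
```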
Here is my specific use case. I'm making a puzzle game that revolves around magical portal doors and infinite, looping spaces. Each room contains an instrument, and if all doors out of the room are closed, the player only hears that instrument's track. If doors are open, the instruments in the connected rooms can also be heard. I plan to add mechanics for door thresholds that adjust the player's relative scale and dilate the flow of time, and Reactional looks like it will be a godsend for expressing those mechanics in the music.
It’s not the end of the world if the music is disembodied, but ideally I would like to be able to position individual instruments in world space. I’d even like to position multiple sources for the same instrument to support looping spaces, though for performance reasons I suspect this is better accomplished by some kind of echo effect.
Currently, we do not support spatialized audio per instrument, as all our instruments use the same Unity Audio Source. However, we do plan to implement spatialized audio and instrument separation in future updates—though this is further down the roadmap.
That said, as you mentioned, it is possible to fake it. Here are some examples:
Option 1: Using the Composer Tool
If you’ve composed your own theme, you could:
Apply energy, density, or low-pass filter macros to individual instruments to simulate doors opening or closing. (Note: This won’t provide directional audio if you move around in a room.)
Extract pitch data from an instrument in your theme using sink and lanes.
Route it to a single-note WAV file, adjusting the pitch and triggering it based on the selected instrument (see the sketch after the disclaimer below).
Disclaimer: This method is a bit hacky, but it would allow spatialized audio by playing the note from a separate GameObject. The trade-off is losing the original instrument’s sound from the composed music.
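For illustration, here is a minimal, unofficial sketch of that last step in Unity. The note/velocity events and the base note value are assumptions; you would wire them up to whatever your sink and lane setup actually outputs:

```csharp
using UnityEngine;

// Sketch of the spatialized note playback described above: a GameObject in the
// room holds a 3D AudioSource, and whenever the extracted pitch data fires we
// play the single-note WAV at the corresponding pitch. The pitch/trigger data
// itself would come from the sink and lane setup; that part is omitted here.
[RequireComponent(typeof(AudioSource))]
public class SpatialNotePlayer : MonoBehaviour
{
    public AudioClip noteClip;          // single-note WAV, recorded at a known base note
    public int baseMidiNote = 60;       // MIDI note the WAV was recorded at (middle C assumed)

    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;       // fully 3D so Unity handles distance and panning
    }

    // Call this from wherever the extracted note events arrive.
    public void PlayNote(int midiNote, float velocity)
    {
        // Shift pitch by the semitone difference from the base note.
        source.pitch = Mathf.Pow(2f, (midiNote - baseMidiNote) / 12f);
        source.PlayOneShot(noteClip, Mathf.Clamp01(velocity));
    }
}
```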
If you want to have a chat about it, join our Discord and ping me there with your question again: Reactional Discord
I’m glad to hear that spatialized audio is on the roadmap. My game is one to two years away from release, so I can certainly work with the system as it is now and implement spatialized audio when that feature becomes available.
It would be useful for us if you posted a feature request here as a thread describing what you think is needed in terms of spatialized audio: is it only the instruments, or how else would you envision it working?