RESound: Interactive Sound Rendering for Dynamic Virtual Environments

About 15 years ago, researchers noticed that rendering virtual scenes with ray tracing has a lot in common with how sound propagates through an environment. Light rays travel through open spaces, hit objects, and then reflect, refract, and scatter. Sound waves follow many of the same principles (with the addition of diffraction, since audible wavelengths are long enough to bend around obstacles).

What if you used the same ray-casting methods to simulate sound traveling through an environment? Instead of applying canned effects to make something sound like it's in a tiled bathroom or a big orchestra hall, you could simulate the acoustics directly, reducing artist time. Simply play the dry sound and let the algorithm figure out how it should sound in the scene.
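To make the idea concrete, here is a minimal sketch of ray-based acoustics, not RESound's actual algorithm (which also handles diffraction and dynamic scenes). It assumes a simple 2D rectangular room with made-up dimensions, shoots rays from a source, bounces them off the walls, and records an echo whenever a ray passes near the listener, producing a rough impulse response of (delay, amplitude) taps.

```python
import math

# Assumed scene parameters (purely illustrative).
SPEED_OF_SOUND = 343.0           # m/s
ROOM_W, ROOM_H = 10.0, 6.0       # room size in meters
ABSORPTION = 0.3                 # fraction of energy lost per wall bounce
SOURCE = (2.0, 3.0)
LISTENER = (7.0, 3.0)
LISTENER_RADIUS = 0.5            # listener modeled as a small disk

def trace_ray(angle, max_bounces=4):
    """Follow one ray; return (path_length, energy) for each listener hit."""
    x, y = SOURCE
    dx, dy = math.cos(angle), math.sin(angle)
    energy, travelled, hits = 1.0, 0.0, []
    for _ in range(max_bounces + 1):
        # Distance along the ray to the nearest wall.
        tx = (ROOM_W - x) / dx if dx > 0 else (x / -dx if dx < 0 else math.inf)
        ty = (ROOM_H - y) / dy if dy > 0 else (y / -dy if dy < 0 else math.inf)
        t_wall = min(tx, ty)
        # Closest approach to the listener on this segment.
        lx, ly = LISTENER[0] - x, LISTENER[1] - y
        t_near = lx * dx + ly * dy
        if 0 < t_near < t_wall:
            miss = math.hypot(lx - t_near * dx, ly - t_near * dy)
            if miss < LISTENER_RADIUS:
                hits.append((travelled + t_near, energy))
        # Advance to the wall and reflect specularly off it.
        x, y = x + dx * t_wall, y + dy * t_wall
        travelled += t_wall
        if tx < ty:
            dx = -dx
        else:
            dy = -dy
        energy *= (1.0 - ABSORPTION)
    return hits

def impulse_response(n_rays=2000):
    """Collect (delay_seconds, amplitude) echo taps from many rays."""
    taps = []
    for i in range(n_rays):
        angle = 2 * math.pi * i / n_rays
        for dist, energy in trace_ray(angle):
            # Amplitude falls off with distance; energy with each bounce.
            taps.append((dist / SPEED_OF_SOUND, energy / max(dist, 1e-6)))
    return sorted(taps)

taps = impulse_response()
print(f"{len(taps)} echoes; first arrival at {taps[0][0] * 1000:.1f} ms")
```

Convolving a dry sound with the resulting impulse response would then "place" it in the room; real systems do exactly this convolution, just with far more accurate path finding.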

I'm not sure what other research has happened since. It was too computationally expensive for real time back then, but it was a cool idea, and maybe we have the compute for it with today's GPUs.

Paper: https://gamma.cs.unc.edu/Sound/RESound
