Augmented reality (AR) has been around for a while, and by now we have mobile applications, cool visualization interfaces for scientists, and consumer entertainment products like Sony's recent EyePet that take advantage of this technology.
I have been thinking lately about some of the conclusions and future applications of AR included in the project [PDF in Spanish] I did for my BSc graduation, and now I am looking again at some of the innovative results that can be achieved with libraries like ARToolKit or reacTIVision. For instance: theater, concerts or live shows. What about having in your room a 3D performance that is taking place somewhere else at that very moment? OK, we would still need to capture what is happening on stage (visually) with something like LIDAR and then stream a 3D approximation of it, but... what about the audio? We can make a proper recording of the performance (even in stereo or multichannel), though I have noticed that when I am, for example, watching a concert, the impact of the experience often depends on where I am standing and on the reactions of the people around me, who are also sound sources.
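To make that placement idea concrete, here is a minimal sketch (my own toy example, not part of any of the libraries mentioned) of how a positional mix could work: every sound source around the listener, including the crowd, gets a propagation delay and a 1/r attenuation based on its distance to the listener, so moving the listener changes the mix.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature
SAMPLE_RATE = 44100     # Hz

def mix_point_sources(sources, listener_pos, n_samples):
    """Naive positional mix under a free-field point-source model.

    Each source is delayed by its distance to the listener and attenuated
    by 1/r. `sources` is a list of (position, signal) pairs, where position
    is a 2D coordinate in meters and signal a mono sample array.
    """
    out = np.zeros(n_samples)
    listener = np.asarray(listener_pos, dtype=float)
    for pos, signal in sources:
        r = np.linalg.norm(np.asarray(pos, dtype=float) - listener)
        r = max(r, 0.1)  # avoid blow-up if the listener sits on a source
        delay = int(round(r / SPEED_OF_SOUND * SAMPLE_RATE))
        gain = 1.0 / r
        end = min(n_samples, delay + len(signal))
        if end > delay:
            out[delay:end] += gain * signal[: end - delay]
    return out
```

A stage source two meters away and a cheering neighbor half a meter away would then arrive with different delays and very different levels, which is exactly the part a plain stereo recording of the stage throws away.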
How did they do it for the Holodeck?
I have heard of some related work on this topic, but I just discovered the official website of the EU-funded 2020 3D Media project, which has its home base in Barcelona:
http://www.20203dmedia.eu/
In some way, it looks like the Wave Field Synthesis technique could solve the problem...
http://en.wikipedia.org/wiki/Wave_field_synthesis
But anyway, today it is still hard to fit a speaker array (>8 units) in a regular room.
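The core of the WFS idea can be sketched in a few lines. This is only a simplified illustration under my own assumptions (a small linear array, a virtual point source behind it, the common 2.5D-style 1/sqrt(r) gain taper), not a production driving-function implementation: each speaker re-emits the source signal delayed in proportion to its distance from the virtual source, so the superposed wavefronts approximate the curved wavefront that source would have produced.

```python
import numpy as np

C = 343.0   # speed of sound, m/s
FS = 44100  # sample rate, Hz

def wfs_driving_delays(source_pos, speaker_positions):
    """Per-speaker delay (in samples) and gain for a virtual point source.

    Speakers farther from the virtual source fire later and quieter, which
    reconstructs the source's spherical wavefront in front of the array.
    """
    src = np.asarray(source_pos, dtype=float)
    delays, gains = [], []
    for spk in speaker_positions:
        r = np.linalg.norm(src - np.asarray(spk, dtype=float))
        delays.append(int(round(r / C * FS)))       # delay in samples
        gains.append(1.0 / np.sqrt(max(r, 0.1)))    # simplified amplitude taper
    return delays, gains

# e.g. an 8-unit linear array along the x axis, 20 cm spacing,
# with a virtual source 2 m behind it
speakers = [(i * 0.2, 0.0) for i in range(8)]
delays, gains = wfs_driving_delays((0.7, -2.0), speakers)
```

The speakers nearest the virtual source's projection onto the array get the shortest delays and highest gains, which is what bends the combined wavefront. It also makes the practical objection obvious: the effect only works across a listening area if you have enough closely spaced units, hence the ">8 in a regular room" problem.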