It’s probably been 20 years, and I’m guessing somebody could do something way more accurate with modern sensors, but I remember seeing a demo where someone used the Wii sensor bar and LEDs attached to glasses to do head tracking on a window projection like that. The effect was highly convincing for what it was.
The evolution of this is how they do ‘virtual production’, LED walls and tracked camera using realtime rendered scenes in Unreal Engine. You could make a relatively cheap version using something like Valve Lighthouses and a projector.
I played with a similar setup using the Xbox 360 Kinect. Very fun tech!
https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment
Why does a window projection need head tracking?
To change the perspective as you move around.
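The usual way to do this is an off-axis (asymmetric) frustum: treat the physical screen as a fixed window in space and recompute the frustum bounds from the tracked eye position every frame. Here's a minimal sketch, assuming the screen lies in the z=0 plane centered at the origin and the eye is at positive z; the function name and units are made up for illustration:

```python
def off_axis_frustum(eye, half_w, half_h, near):
    """Frustum bounds (left, right, bottom, top) at the near plane
    for a screen of size 2*half_w x 2*half_h in the z=0 plane,
    viewed from eye = (ex, ey, ez) with ez > 0.

    The returned values would feed something like glFrustum, along
    with a translation moving the eye to the origin.
    """
    ex, ey, ez = eye
    # Similar triangles: project the screen edges, as seen from the
    # eye, onto the near plane.
    scale = near / ez
    left = (-half_w - ex) * scale
    right = (half_w - ex) * scale
    bottom = (-half_h - ey) * scale
    top = (half_h - ey) * scale
    return left, right, bottom, top

# Eye centered: symmetric frustum, like a normal camera.
print(off_axis_frustum((0.0, 0.0, 1.0), 1.0, 1.0, 0.5))
# Eye shifted right: frustum skews, so the rendered scene appears
# fixed behind the "window" as the viewer moves.
print(off_axis_frustum((0.5, 0.0, 1.0), 1.0, 1.0, 0.5))
```

As the head moves, only these four bounds (and the eye translation) change; the screen rectangle itself never moves, which is what sells the window illusion.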
The only issue is that it only works for one person; with multiple viewers the effect would break, since the perspective can only be correct for a single tracked head.