BTW, the OG demo from Johnny Lee is from 2007: https://www.youtube.com/watch?v=Jd3-eiid-Uw
Yeah, this came out during the last few weeks of my time in high school. It is a major reason I got into computer science and became a programmer. Good times!
Wow, thanks for sharing, that really takes me back. I was so hyped when I saw this as a kid; my dad and I made a mount for my glasses with two IR LEDs and a battery. I remember being super impressed with the effect.
I also went to Maplin (the UK RadioShack) and bought some infrared LEDs to hack together something to achieve the same effect. In the end I just taped the Wii Sensor Bar to my glasses!
That was a blast from the past! Many, including me, surely still have those Wii sensors/controllers around. Fun times!
Came here to say the same. I remember playing around with this back in the day, but using two candles instead of the sensor bar. Yes, it works. No, it’s not a good idea to hold candles that close to your hair.
I tried implementing this with face detection-based head tracking after that demo (or maybe before; I can't remember). I got it working but the effect was very underwhelming. It looks great in that video, but it kind of sucks in real life.
I think the problem is that in real life you have an enormous number of other visual cues telling you that you're not really seeing something 3D: focus, stereoscopy (not for me though, sadly), the fact that you know you're looking at the screen, inevitable lag from cameras, etc.
I can't view the videos because of their stupid cookie screen, but I wouldn't be too excited about this. The camera lag in particular is probably impossible to solve.
Thanks for posting, I was sure I recalled something like this from a long time ago. I also built myself a FreeTrack headset (https://en.wikipedia.org/wiki/FreeTrack) around the same time to play the Arma / Operation Flashpoint games, using IR LEDs attached to a hat that my webcam would track.
Hi HN, I'm Sten, one of the creators.
We built this because we wanted 3D experiences without needing a VR headset. The approach: use your webcam to track where you're looking, and adjust the 3D perspective to match.
Demo: https://portality.io/dragoncourtyard/ (Allow camera, move your head left/right)
It creates motion parallax - the same depth cue your brain uses when you look around a real object. Feels like looking through a window instead of at a flat screen.
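The window illusion described here is usually achieved by rendering with an asymmetric ("off-axis") frustum whose apex follows the tracked head, so the screen edges stay pinned while the perspective shifts with the viewer. A minimal sketch of that geometry, with illustrative names and units (this is not the project's actual code) — the screen is modeled as the z = 0 plane with half-extents (w, h) in meters, and the eye sits at (ex, ey, ez) in front of it:

```python
def off_axis_frustum(ex, ey, ez, w, h, near=0.1):
    """Compute asymmetric frustum bounds for head-coupled perspective.

    Projects the four screen edges onto the near plane as seen from an
    eye at (ex, ey, ez), ez > 0, with the screen occupying
    [-w, w] x [-h, h] on the z = 0 plane.
    """
    scale = near / ez
    left   = (-w - ex) * scale
    right  = ( w - ex) * scale
    bottom = (-h - ey) * scale
    top    = ( h - ey) * scale
    return left, right, bottom, top

# Eye centered 60 cm from a 60x40 cm screen: symmetric frustum.
centered = off_axis_frustum(0.0, 0.0, 0.6, 0.3, 0.2)
# Eye moved 15 cm right: the frustum skews, revealing more of the
# scene's left side while the screen rectangle stays fixed.
shifted = off_axis_frustum(0.15, 0.0, 0.6, 0.3, 0.2)
```

Feeding left/right/bottom/top into a glFrustum-style projection each frame is what keeps the virtual window aligned with the physical screen as the head moves.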
Known limitations:
- Only works for one viewer at a time
- Needs decent lighting
- Currently WebGL only
We're still figuring out where this is genuinely useful vs just a novelty. Gaming seems promising, also exploring education and product visualization.
Happy to answer questions!
It was definitely useful and appreciated on the "New" Nintendo 3DS XL, which also used a camera to track your eye movements and adjust the divergence accordingly. I hate the fact that Nintendo abandoned this technology because experiencing Ocarina of Time and Star Fox 64 in 3D was world-changing to me.
I'd say I'm not the only one who misses this technology in games, because a used New 3DS XL costs at least $200 on eBay right now, which is more than what I paid new.
I always thought 3D would combine really nicely with ray-traced graphics full of bright colors and reflections, similar to all those ray tracing demos with dozens of glossy marbles.
FYI, Samsung recently released a monitor with similar technology, I believe.
The technology is still alive and well in some genres, particularly flight sims. One common free solution is to run OpenTrack with the included neural net webcam tracker, which plugs into TrackIR-enabled apps and works like a charm.
VR has taken over this market. Get a VR headset; you won't be disappointed.
I don't know if you designed it for a specific monitor, but here's some feedback. I tried using it on my M1 Mac.
First, there is no loading indicator and it takes too long to start, so I thought it was broken a few times before I realized I had to wait longer.
Second, although it was clearly tracking my head and moving the camera, it did not make me feel like I was looking through a window into a 3D scene behind the monitor.
These kinds of demos have been around before. I don't know why some work better than others.
some others:
https://discourse.threejs.org/t/parallax-effect-using-face-t... https://www.anxious-bored.com/blog/2018/2/25/theparallaxview...
I can confirm that it works decently well with a sunny roof window in the background, which is normally enough for people to complain that my face is too dark.
8-year-old me, who instinctively tried to look beyond the display's field of view during intense gaming sessions, would appreciate this feature very much. My belief is that if it shifted the POV to a lesser degree than in the demo, people generally wouldn't notice but would still subconsciously register it as a more immersive experience.
I'm also glad that the web version doesn't try to cook my laptop - good work.
The obvious use case would be to replace the clunky head tracking systems which are often used in simulator games.
Systems like trackir, which require dedicated hardware.
You can do this today with OpenTrack: https://github.com/opentrack/opentrack
Also, TrackIR is just an IR webcam, IR LEDs, and a hat with reflectors. You can DIY the exact same setup easily with OpenTrack, but OpenTrack also has a neural net webcam-only tracker which is, AFAIK, pretty much state of the art. At any rate it works incredibly robustly.
Actually, I have already used it to implement the same idea as the post, with the added feature of anaglyph (red/blue) glasses 3D. The way I did it, I put an entire lightfield into a texture and rendered it with a shader. Then I just piped the output of OpenTrack directly into the shader and Bob's your proverbial uncle. The latency isn't quite up to VR standard (the old term for this is "fishtank VR"), but it's still quite convincing if you don't move your head too fast.
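A lightfield-in-a-texture renderer like the one described here typically maps the tracked head offset onto one of an N×M grid of pre-rendered views stored in an atlas. A hypothetical sketch of that mapping — the names, grid size, and head range are all illustrative, not taken from the commenter's shader:

```python
def lightfield_cell(head_x, head_y, grid=(8, 8), head_range=0.3):
    """Map a head offset in meters (clamped to +/- head_range) to a
    cell index in an N x M atlas of pre-rendered lightfield views."""
    nx, ny = grid
    # Normalize the offset to [0, 1] across the tracked range, clamping
    # at the edges so the view stops changing when the head leaves it.
    u = min(max((head_x / head_range + 1) / 2, 0.0), 1.0)
    v = min(max((head_y / head_range + 1) / 2, 0.0), 1.0)
    return min(int(u * nx), nx - 1), min(int(v * ny), ny - 1)
```

In a real shader the same index arithmetic would select a tile's UV offset inside the atlas texture; for anaglyph output you would evaluate it twice, offsetting the head position by half the interocular distance per eye.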
TrackIR is just a camera with an infrared LED.
Definitely laggy, but it works even under low lighting conditions and with a camera that isn't facing straight forward.
Very cool!
I can see this quite useful for educational demonstrations of physics situations and mechanical systems (complex gearing, etc.). Also maybe for product simulations/demonstrations in the design phase — take output from CAD files and make a nice little 3D demo.
Maybe have an "inertia(?)" setting that makes it keep turning when you move far enough off center, as if you were continuing to walk around it.
The single-viewer limitation seems obvious and fundamental, and maybe a bit problematic for the above use cases, such as when showing something to a small group of people. One key may be to take steps to ensure it robustly locks onto and follows only one face/set of eyes. It would be jarring to have it locking onto different faces as conditions or positions subtly change.
The demo gives just a blank screen on Android Firefox, Kiwi and Chrome.
It works for me, just needs a time to load.
Ah, thanks! Got it now on a better connection. A loading indicator would ease confusion.
How long, and on what internet connection? I'm at 1 min and counting on 50 Mbit. But maybe it doesn't work on Ubuntu 24 + Firefox? It should be WebGL capable, though.
OK, it took around 2 min for me to load; then it works.
Devtools say 40.23 MB / 13 MB transferred. Even on 1000Mbps it needed a moment. Works for me on Ubuntu 24 and Firefox.
> This content is hosted by a third party provider that does not allow video views without acceptance of Targeting Cookies. Please set your cookie preferences for Targeting Cookies to yes if you wish to view videos from these providers.
OK, that's a no then.
It's YouTube. This is super common, just worded in a weird way.
It's also incorrect. They could just use youtube-nocookie.com instead if tracking cookies are disabled. https://support.google.com/youtube/answer/171780?hl=en#zippy...
From my experience this only ever looks good in a recorded video, because we're used to assuming that the real object captured in the video is perceivable in stereoscopic form. But it's not. The real use of head and view tracking is as a controller, not to improve immersion.
But does it actually make stuff look like you can reach out and touch it like the 3D movies at theme parks? I don't see how it would. This just seems like parallax.
You can do this with a Kinect for head tracking. People do it with homemade pinball machines, using a TV as the table and a Kinect to track your head, so it looks like the table extends in 3D down into the TV.
But I don't think it creates the full 3D effect with things looking like they are coming way out of the screen and like a tangible thing you can reach out and touch.
How is this different from the many available head tracking devices? I used to play Flight Simulator with head tracking[1]. It was great, although it did consume a few CPU cycles. IR trackers are more efficient.
[1] - https://www.youtube.com/watch?v=P07nIcczles (actually, this one was using a paper tracker because the face tracker had a big impact on fps)
I remember this being done for a DSiWare game released ~2010 - I couldn't find much footage apart from this quick clip from a trailer.
https://youtu.be/4zZfsyHEcZA?si=BE2I991zEVxPEt9F&t=57
It seemed to have different names depending on the region (Looksley's Line Up, Tales in a Box: Hidden Shapes in Perspective). I recall it working very well at the time.
Interestingly, for this parallax 3D effect to work, the head tracking needs to basically move "backwards" from typical head tracking, since it needs to keep the focal point the same, if I'm understanding correctly. Any time I've tried this out it's fun, and it would likely be most useful for something like a 3D painting you hang on your wall.
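The "keep the focal point the same" intuition can be made concrete with similar triangles: the renderer moves the virtual camera with the head while pinning the screen plane, so a scene point behind the screen slides across the screen surface as you move, and deeper points slide further — which is exactly the parallax depth cue. A hedged sketch with illustrative names (distances in meters, screen at z = 0, eye in front at z > 0, scene points behind at z < 0):

```python
def screen_x(eye_x, eye_z, point_x, point_z):
    """Horizontal position where the line from the eye to a scene point
    crosses the screen plane z = 0 (similar triangles)."""
    t = eye_z / (eye_z - point_z)
    return eye_x + t * (point_x - eye_x)

# With the eye centered 60 cm out, a point straight ahead projects to
# the screen center. Move the head 15 cm right: a point 30 cm behind
# the screen slides 5 cm across the screen surface, while a point
# 120 cm deep slides 10 cm - deeper objects shift more.
```

Note the direction: the on-screen image of a behind-the-screen point moves the same way as the head, which is why naive "sim-style" tracking that rotates the view feels backwards for this effect.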
Related idea, but not the same, might be my iOS and Android app that uses your phone's AR for head tracking and then sends that data to your PC for smooth sim game head tracking. https://smoothtrack.app
Really fancy idea and cool project! It's a bit stuttery, probably due to the 30 fps of a webcam, but it works. It feels slightly weird somehow, though, probably from the lag.
Update: I just tried to open the site again and now it's gone - it leads to some kind of shop?
Update 2: oh, use the link in the comment for the demo: https://portality.io/dragoncourtyard/
Here's a WebGL PoC that was tested with mobile & desktop browsers: https://www.webgma.co.il/Articles/window-3d-tracking/en/
source code at https://github.com/guyromm/Window
IMHO this should be useful for driving/flight sims, giving the player the ability to lean inside the vehicle and change their viewpoint on the surroundings.
Reminds me of the Amazon Fire Phone, which featured similar tech prominently: "Dynamic Perspective".
https://www.youtube.com/watch?v=6trOg2IK2Zg
Does this really fool you if you have two eyes? I haven't been able to try it or watch the videos (they're behind a cookie warning I can't get past).
If the latency is low enough to fool you, it makes the screen look like a window. It's much more limited than a headset, but especially with a natural scene, it does get the illusion across.
But you're still behind the window.
The drawback is that it only works if you constantly move your head or device.
Is this similar to Sony's spatial 3D display?
I seem to recall one of the pinball emulators having this as a plug-in years ago.