KinectFusion enables a user holding and moving a standard Kinect camera to rapidly create detailed 3D reconstructions of an indoor scene. Only the depth data from Kinect is used to track the 3D pose of the sensor and reconstruct geometrically precise 3D models of the physical scene in real-time. Uses of the core system for low-cost handheld scanning, geometry-aware augmented reality, and physics-based interactions are shown. Novel extensions to the core GPU pipeline demonstrate object segmentation and user interaction directly in front of the sensor, without degrading camera tracking or reconstruction. These extensions are used to enable real-time multi-touch interactions anywhere, allowing any planar or non-planar reconstructed physical surface to be appropriated for touch.
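The abstract glosses over how raw depth frames become a model: in the published system the scene is stored as a truncated signed distance function (TSDF) volume on the GPU, and each incoming depth frame is fused in with a weighted running average once the camera pose is known. The NumPy sketch below illustrates that integration step on the CPU under simplifying assumptions; the function name, the volume being anchored at the origin, and all parameters are illustrative choices, not the paper's implementation (which runs per-voxel on the GPU and tracks the camera with ICP).

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, K, pose, voxel_size, trunc):
    """Fuse one depth frame into a TSDF volume via a weighted running average.

    tsdf, weights : (N, N, N) float arrays, updated in place
    depth         : (H, W) depth map in metres (0 = no measurement)
    K             : 3x3 camera intrinsics; pose : 4x4 camera-to-world transform
    """
    # World-space centres of all voxels (volume anchored at the origin)
    idx = np.indices(tsdf.shape).reshape(3, -1).T
    pts_world = (idx + 0.5) * voxel_size
    # Move voxel centres into the camera frame: p_cam = R^T (p_world - t)
    R, t = pose[:3, :3], pose[:3, 3]
    pts_cam = (pts_world - t) @ R
    z = pts_cam[:, 2]
    valid = z > 1e-6
    z_safe = np.where(valid, z, 1.0)
    # Pinhole projection into pixel coordinates
    u = np.round(K[0, 0] * pts_cam[:, 0] / z_safe + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts_cam[:, 1] / z_safe + K[1, 2]).astype(int)
    h, w = depth.shape
    valid &= (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]
    # Signed distance along the viewing ray, truncated near the surface;
    # voxels far behind the measured surface are left untouched
    raw = d - z
    valid &= (d > 0) & (raw > -trunc)
    sdf = np.clip(raw, -trunc, trunc)
    ft, fw = tsdf.ravel(), weights.ravel()
    i = np.flatnonzero(valid)
    ft[i] = (ft[i] * fw[i] + sdf[i]) / (fw[i] + 1.0)
    fw[i] += 1.0
```

Repeating this for every frame is what averages away the Kinect's per-frame depth noise; the surface is then extracted as the zero crossing of the volume (via raycasting in the real system).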
It's a little shocking to think about the impact that Microsoft's Kinect camera has had on the gaming industry at large, let alone the 3D modeling industry. To better appreciate what's happening here, we'd actually encourage you to hop back and have a gander at our hands-on with PrimeSense's raw motion sensing hardware from GDC -- for those who've forgotten, that very hardware was finally outed as the guts behind what consumers simply know as "Kinect."
The Kinect took 3D sensing to the mainstream, and moreover, allowed researchers to pick up a commodity product and go absolutely nuts. Turns out, that's precisely what a smattering of highly intelligent blokes in the UK have done, having built a new method for reconstructing 3D scenes (read: real life) in real-time by using a simple Xbox peripheral.
The actual technobabble ran deep -- not shocking given the academic nature of the conference -- but the demos shown were nothing short of jaw-dropping. There's no question that this methodology could be used to spark the next generation of gaming interaction and augmented reality, taking a user's surroundings and making them a live part of the experience. Moreover, game design could be significantly impacted, with live scenes able to be acted out and stored in real-time rather than having to build something frame by frame within an application.
According to the presenter, the tech that's been created here can "extract surface geometry in real-time," right down to the millimeter level. Have a peek at the links below if you're interested in diving deeper -- don't be shocked if you can't find the exit, though.
Microsoft's KinectFusion research project offers real-time 3D reconstruction, wild AR possibilities
KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera