Microsoft’s patent technology creates a 3D model just by scanning an object

Microsoft has just been awarded a patent for the step preceding 3D printing: turning a scan of a physical object into a 3D-printable digital model.

The patent revolves around a static depth camera that scans everyday objects from various angles and generates 3D models on your PC. If you wonder how in the world that can be patented – after all, there are quite a few scanners out there already – the crucial difference is that Microsoft’s approach works without placing objects on revolving plates or requiring perfect shots every time.

Image credit: Microsoft

Instead, the idea is that you simply hold an object in your hands, rotate it a few times and the software does the rest. The turntable is completely removed from the scanning process. The patent also covers the technology’s ability to distinguish between the object, the user’s fingers and the surroundings. The result is a set of high-resolution images with prominent surface patterns and good color quality.
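The patent itself doesn’t disclose the exact algorithm, but the idea of separating the hand-held object from fingers and background can be illustrated with a toy depth-image segmentation sketch. Everything here – the function name, the depth thresholds, and the separate skin/finger mask – is an assumption for illustration, not Microsoft’s actual method:

```python
import numpy as np

def segment_object(depth, near=0.3, far=0.8, skin_mask=None):
    """Keep only pixels in a plausible hand-held depth range (meters),
    optionally dropping pixels a separate classifier flagged as skin.
    Illustrative only; thresholds and the skin mask are assumptions."""
    mask = (depth > near) & (depth < far)   # discard far background and near noise
    if skin_mask is not None:
        mask &= ~skin_mask                  # remove the user's fingers
    return mask

# Toy 2x3 "depth frame": the object sits at ~0.5 m, background at 2 m
depth = np.array([[2.0, 0.5, 0.5],
                  [2.0, 0.5, 2.0]])
fingers = np.array([[False, False, True],
                    [False, False, False]])

mask = segment_object(depth, skin_mask=fingers)
print(int(mask.sum()))  # 2 object pixels survive
```

A real pipeline would fuse many such masked depth frames from different hand poses into a single mesh; the masking step is what lets the turntable be dropped.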

Watch the Patent Yogi’s video below on how it all works. He even points out how the technology might be useful for making 3D maps.

In the video, Microsoft uses a static depth camera to scan an everyday object – in this case a model cat – and then reconstructs it on the computer as a three-dimensional model.

Computer users can then modify the model — an awesome application for illustrators and those working in animation — and even make a real-life replica on a 3D printer. 

