If the Virtual Zapato Fits, Wear It! (GPU-Accelerated Augmented Reality)

This week’s Spotlight is on Néstor Gómez, CEO of Artefacto Estudio in Mexico City.

Artefacto Estudio is a developer of interactive applications and games. The company’s projects include a real-time virtual shoe fitting kiosk that allows people to “try on” shoes using augmented reality powered by Microsoft Kinect and GPU computing (see the video).

NVIDIA: Néstor, tell us a bit about Artefacto Estudio.
Néstor: Artefacto is an independent development studio. We integrate solutions using cutting-edge technologies like Microsoft Kinect, Oculus Rift and Leap Motion.

NVIDIA: How did you become involved in the shoe industry?
Néstor: An ad agency, Kempertrautmann, was seeking a technology partner to work on a prototype for a virtual shoe fitting exhibit for Goertz, the German shoe company.

NVIDIA: Tell us about the prototype you created for Goertz.
Néstor: The goal was to create a real-time tracking system that could follow the position and orientation of a user’s feet and render realistic shoes on top of a full HD live video feed. We worked on the prototype for a year and a half, and it was used as the foundation for a Goertz marketing campaign.

Artefacto Estudio’s Virtual Shoe Fitting demo.

NVIDIA: What happened next?
Néstor: After the campaign was released, we were contacted by companies around the world who were interested in using the system for various purposes. Eventually that road led us to Delcam-Crispin, a leader in design and manufacturing software for the footwear industry. We now have a distribution deal with them and we are working on an end-user version of “VSF,” our virtual shoe fitting system.

NVIDIA: What role does GPU computing play in your work?
Néstor: VSF utilizes three Kinect sensors: two for tracking and one for the user interface. We need to process all that data at 30 frames per second (fps). Initially we tried to use only the CPU, but it was not fast enough for real-time performance. Now we achieve our goal by running image filters and computational geometry algorithms on the GPU.
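
The interview does not go into the specifics of Artefacto’s pipeline, but a common way to keep several sensors fed at 30 fps is to give each Kinect its own CUDA stream so host-to-device copies and filter kernels from different cameras can overlap. The sketch below is a minimal illustration of that pattern; the filterFrame kernel, the buffer layout and the frame size are assumptions for this example, not VSF code.

```cuda
// Minimal sketch: one CUDA stream per Kinect so copies and filter kernels overlap.
// filterFrame, the frame size, and the buffer layout are illustrative assumptions,
// not Artefacto Estudio's actual VSF pipeline.
#include <cuda_runtime.h>
#include <cstdint>

__global__ void filterFrame(const uint16_t* in, uint16_t* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i];                  // placeholder for a real image filter
}

int main()
{
    const int kSensors = 3;                     // two tracking Kinects + one for the UI
    const int kPixels  = 640 * 480;             // Kinect v1 depth resolution
    cudaStream_t stream[kSensors];
    uint16_t *h_in[kSensors], *d_in[kSensors], *d_out[kSensors];

    for (int s = 0; s < kSensors; ++s) {
        cudaStreamCreate(&stream[s]);
        cudaMallocHost(&h_in[s], kPixels * sizeof(uint16_t));  // pinned memory for async copies
        cudaMalloc(&d_in[s],  kPixels * sizeof(uint16_t));
        cudaMalloc(&d_out[s], kPixels * sizeof(uint16_t));
    }

    // Per frame (every ~33 ms at 30 fps): enqueue each sensor's work on its own stream
    // so the three cameras' uploads and kernels can overlap on the GPU.
    for (int s = 0; s < kSensors; ++s) {
        cudaMemcpyAsync(d_in[s], h_in[s], kPixels * sizeof(uint16_t),
                        cudaMemcpyHostToDevice, stream[s]);
        filterFrame<<<(kPixels + 255) / 256, 256, 0, stream[s]>>>(d_in[s], d_out[s], kPixels);
    }
    cudaDeviceSynchronize();                    // wait for all three sensors this frame

    for (int s = 0; s < kSensors; ++s) {
        cudaStreamDestroy(stream[s]);
        cudaFreeHost(h_in[s]);
        cudaFree(d_in[s]);
        cudaFree(d_out[s]);
    }
    return 0;
}
```

With pinned host buffers and asynchronous copies, one sensor’s upload can overlap with another sensor’s kernel, which helps keep the whole pipeline inside the roughly 33 ms frame budget.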

NVIDIA: In what ways do you leverage CUDA?
Néstor: Our approach is to try to keep all the CUDA processors busy most of the time. For the images, we split each frame into smaller portions and distribute them among the available processors. For the voxel algorithms we use a similar approach.
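
As a rough sketch of that tiling strategy, a 2D kernel launch splits an image into fixed-size blocks of pixels that the hardware schedules across all available multiprocessors. The clampDepth kernel, the 16×16 tile size and the depth cutoff below are illustrative choices, not the actual VSF filters.

```cuda
// Sketch of splitting an image into 16x16-pixel tiles, one thread per pixel.
// The clampDepth kernel and the dimensions are illustrative assumptions.
#include <cuda_runtime.h>
#include <cstdint>

__global__ void clampDepth(uint16_t* img, int width, int height, uint16_t maxVal)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;      // guard the image border
    int i = y * width + x;
    if (img[i] > maxVal) img[i] = 0;            // zero out depths beyond the working range
}

void launchClamp(uint16_t* d_img, int width, int height)
{
    dim3 block(16, 16);                                  // 256 threads per tile
    dim3 grid((width  + block.x - 1) / block.x,          // enough tiles to cover the image,
              (height + block.y - 1) / block.y);         // even if not a multiple of 16
    clampDepth<<<grid, block>>>(d_img, width, height, 1500 /* mm, arbitrary example */);
}
```

The guard against out-of-range x and y lets the same launch configuration cover images whose dimensions are not multiples of the tile size.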

Some of the image processing algorithms we are using are erosion, background removal, Gaussian blur and connected component labeling. Other algorithms include finding the minimum and maximum, matrix multiplication, table lookup, building a voxel grid and 3D connected component labeling.
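
To give a concrete sense of one of those filters, here is a hedged sketch of a 3×3 binary erosion pass over a foreground mask, with one thread per output pixel; the kernel name and mask format are assumptions rather than Artefacto’s implementation.

```cuda
// Sketch: 3x3 binary erosion of a foreground mask (255 = foreground, 0 = background).
// Illustrative only; not Artefacto Estudio's production filter.
#include <cuda_runtime.h>
#include <cstdint>

__global__ void erode3x3(const uint8_t* in, uint8_t* out, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // A pixel stays foreground only if its entire 3x3 neighborhood is foreground.
    uint8_t result = 255;
    for (int dy = -1; dy <= 1; ++dy) {
        for (int dx = -1; dx <= 1; ++dx) {
            int nx = min(max(x + dx, 0), width  - 1);    // clamp neighbors at the borders
            int ny = min(max(y + dy, 0), height - 1);
            if (in[ny * width + nx] == 0) result = 0;
        }
    }
    out[y * width + x] = result;
}
```

Erosion of this kind is typically run after background removal to strip single-pixel noise from the mask before connected component labeling.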

NVIDIA: Why is CUDA important to you as a developer?
Néstor: We take advantage of having thousands of CUDA cores processing information at the same time to reach real-time performance.

CUDA has very good profiling support, which allows us to find bottlenecks easily. One of our favorite tools is NVIDIA Nsight Visual Studio Edition, which enables us to debug our parallel code and profile the execution times of every part of the tracking algorithm. It’s an excellent tool for debugging and optimization and it integrates well with Visual Studio.

NVIDIA: What’s the “next big thing” for your company?
Néstor: We would like to deliver a complete augmented reality-based retail experience where you can try on virtual clothes. The clothes will look very realistic and the textures will respond naturally to your body movement.

Read more GPU Computing Spotlights.
