If you work for the 3D digital animation industry perhaps you've seen some news with this title:
SoftEther's sensor-laden QUMA robot demonstrates poses, intimidates your acting coach (video)
Still, I did not want to leave my doubts unresolved. Since I like 3D and don't swallow everything the Internet serves up, even from a popular or well-regarded site, I did some research, because I know of Virtual Reality technologies quite similar to this one.
The article and the device sound quite interesting, but the topic needs some clarification.
The article embeds a video, which is purely a demonstration.
The device's official website (http://quma.jp/en/quma/) shows its true features, and it is clear that this is not a wireless device.
It is powered by a USB port and detected as an HID (Human Interface Device). It would be interesting to know how each axis of movement is delivered from the device to the computer. Even if it needs no drivers beyond being detected as an HID, some glue code will still be required: a plug-in, a DLL, or a script (MEL for Maya, MaxScript for 3ds Max, Python for Blender, and so on for each 3D animation package) to assign the incoming data to every axis of every joint of the armature (bone system) or biped.
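To make the point concrete, here is a minimal sketch of the kind of glue such a script would have to provide. The report layout, joint names, and scaling are pure assumptions for illustration; QUMA's actual HID report format is not documented here.

```python
import struct

# Hypothetical layout: each joint reports 3 signed 16-bit values
# (X, Y, Z rotation in hundredths of a degree). This is an assumed
# format, not QUMA's real protocol.
JOINT_NAMES = ["neck", "l_shoulder", "r_shoulder"]  # illustrative subset

def parse_report(report: bytes):
    """Unpack one hypothetical HID report into per-joint Euler angles (degrees)."""
    angles = {}
    for i, name in enumerate(JOINT_NAMES):
        x, y, z = struct.unpack_from("<hhh", report, i * 6)
        angles[name] = (x / 100.0, y / 100.0, z / 100.0)
    return angles

# Example report: neck rotated 45.00 degrees about X, shoulders at rest.
report = struct.pack("<hhhhhhhhh", 4500, 0, 0, 0, 0, 0, 0, 0, 0)
print(parse_report(report)["neck"])  # (45.0, 0.0, 0.0)
```

A real plug-in would then write these angles to bone rotation channels, e.g. via `bpy` in Blender or a MEL `setAttr` loop in Maya, which is exactly the per-package work the vendor's "no drivers" claim glosses over.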
I wish they offered a complete solution. Somebody once lent us a Virtual Reality Glove (or Data Glove) from the company 5DT (http://www.5dt.com), which works by measuring the light travelling inside an optical fibre: the light is attenuated when the fibre bends with the movement of your finger joints. We were told that it works, and yes, the demonstration files run, but feeding data from the glove into a hand rig in Maya is another matter. It required a MEL script working directly against the device's libraries. I don't know whether the person in charge of that experiment ever finished his thesis, but he left behind the rigged model, ready to be driven by input data, and part of his research. At least I never saw it working with the data glove.
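Since the glove's raw readings are just light intensities, any script driving a rig first has to calibrate them into joint angles. A minimal sketch of that step, assuming a simple linear model between a captured "finger open" and "finger closed" reading (the names and the model are illustrative assumptions, not the 5DT SDK's actual API):

```python
def reading_to_angle(raw, raw_open, raw_closed, angle_max=90.0):
    """Map a raw light-intensity reading to a joint angle in degrees.

    raw_open and raw_closed are per-sensor readings captured with the
    finger fully extended and fully bent. Linear interpolation between
    them is an assumption; real SDKs may apply their own curve.
    """
    t = (raw - raw_open) / (raw_closed - raw_open)
    t = min(max(t, 0.0), 1.0)  # clamp to the calibrated range
    return t * angle_max

# A reading halfway between the calibration extremes maps to 45 degrees.
print(reading_to_angle(600, 200, 1000))  # 45.0
```

The remaining work, which is what the thesis project apparently got stuck on, is wiring such angles into the rotation channels of each finger joint in Maya.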
On the other hand, the title is pure publicity: the device is not a robot, since it has no artificial intelligence of any kind (http://capek.misto.cz/english/robot.html), although it is described as a "human-shaped puppet". Properly speaking, it is a Digital Input Device or Human Interface Device (DID or HID).
Last but not least, a similar though more rudimentary device was built for the 1993 film Jurassic Park by Craig Hayes, who designed an armature with 74 sensors, 4 cables each, to motion-capture the dinosaur's joints. Using this device, the dinosaur's movement was performed by Phil Tippett of Tippett Studio, who, in coordination with George Lucas's Industrial Light & Magic, gave life to the dinosaurs (http://es.scribd.com/doc/50531322/45/Digital-Armatures, http://graphics.pixar.com/library/DinoInputDevice/, http://kvcasestudy.blogspot.com/2009/09/fleshing-out-part-b.html).