I Am The Hero
Tools: Vicon Blade 1.7.1, UDK 3, Open Sound Control
Platform: PC
Project Length: 4 weeks
Team Size: 4
Designed as a proof of concept, I Am The Hero was pitched as a game where you don a mo-cap suit, slide on the Oculus Rift, and fight boss monsters.
We used the House of Moves facility on the FIEA campus to capture a performer live, broadcasting the data from Blade via the Blade SDK using Open Sound Control to a DLL bound into UDK. The broadcast data consisted of the raw positions of the hips, elbows, and knees. On reading the data, UDK moved a humanoid mesh, with grabbers attached to those joints, ragdoll-style across the field, replicating the performer's movements in the mo-cap studio.
The process: Mo-Cap -> Blade -> UDP broadcast to another computer -> UDK -> Oculus Rift
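The UDP leg of that pipeline carries Open Sound Control messages. A minimal Python sketch of the wire format, assuming one OSC message per joint with a `/hips`-style address and three float32 coordinates (the actual addresses and layout Blade broadcasts may differ):

```python
import struct

def _pad(raw: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    raw += b"\x00"
    return raw + b"\x00" * (-len(raw) % 4)

def encode_joint(address: str, x: float, y: float, z: float) -> bytes:
    """Pack one joint position as an OSC message: address, ',fff' type tags, three big-endian floats."""
    return _pad(address.encode("ascii")) + _pad(b",fff") + struct.pack(">fff", x, y, z)

def _read_string(data: bytes, offset: int):
    # Read a null-terminated OSC string, then skip to the next 4-byte boundary.
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    return text, offset + (-offset % 4)

def decode_joint(data: bytes):
    """Unpack an OSC message carrying three float32 coordinates."""
    address, offset = _read_string(data, 0)
    tags, offset = _read_string(data, offset)
    if tags != ",fff":
        raise ValueError("expected three float args, got %r" % tags)
    return address, struct.unpack_from(">fff", data, offset)
```

On the receiving side, the DLL would apply `decode_joint` to each datagram and hand the positions to UDK.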
Unfortunately, the Oculus Rift was delayed on the same day as the pitch for I Am The Hero, dooming the project to obscurity.
We also ran into reception problems: the Blade SDK had trouble creating and disconnecting from a Blade data stream. It would crash on disconnect, taking UDK down with it. Once we established that the Blade SDK couldn't close and then reopen a stream, we had to run an in-between Open Sound Control process to capture and re-broadcast the data.
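That in-between process boils down to a UDP relay: hold one long-lived connection to the Blade stream and forward each datagram on, so UDK never opens or closes a Blade connection itself. A minimal sketch of the forwarding step (socket setup and the run loop are assumed to live around it):

```python
import socket

def relay_once(listen_sock: socket.socket, send_sock: socket.socket,
               dest: tuple, bufsize: int = 2048) -> bytes:
    """Receive one datagram from the Blade stream and re-broadcast it to dest.

    Keeping this middleman alive means the fragile Blade connection is
    opened exactly once; UDK only ever talks to the relay.
    """
    data, _ = listen_sock.recvfrom(bufsize)
    send_sock.sendto(data, dest)
    return data
```

A real relay would call this in a loop for as long as the capture session runs.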
This was where we ran out of time. I'm still working on it, but Grapple and my Maya FBX Tool have my current attention.
Our small team consisted of two programmers (myself included), a designer, and a concept artist, whose work is presented here.
My Tasks:
- Move a humanoid mesh in UDK with data similar to Blade's (we could either use a ragdoll with grabbers on the joints or rotate the bones; I chose grabbers as it was easier to implement at the time)
- Create a DLL for UDK to read the Blade data
- Broadcast data from Blade in real time (this was easy once we learned of the SDK; otherwise it would have meant packet sniffing and transcoding)
- Receive data from the Blade broadcast (this was to be done in a DLL for UDK)
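For the receiving task, one detail matters for live mo-cap: the mesh should track the newest pose, not chew through a backlog of stale datagrams. A sketch of that idea, assuming the DLL-equivalent drains its socket each frame and keeps only the last packet:

```python
import socket

def latest_datagram(sock: socket.socket, bufsize: int = 2048):
    """Drain all queued datagrams and return only the newest (None if empty).

    Dropping older packets keeps the in-game mesh in sync with the
    performer instead of replaying a growing queue of old poses.
    """
    sock.setblocking(False)
    newest = None
    while True:
        try:
            newest, _ = sock.recvfrom(bufsize)
        except BlockingIOError:
            return newest
```

Called once per UDK tick, this always yields the freshest joint data available.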