DIGM Workshops: Week #1 Overview


This series of blog posts will be completed weekly for two of my graduate workshop classes, DIGM620 and DIGM599i. Both of these courses have the same goal: to produce a 10-week project that aids in the development of my thesis, a virtual reality animation of the titanosaur Dreadnoughtus schrani.

Week #1 Goals: 

  • Meet with production team to discuss overall roles for Fall term.
  • Develop a 10-week project that fulfills the scope of this class.
  • Begin production work for the aforementioned project.
  • Meet with Nick J to determine what thesis work will count towards his independent study.
  • Anatomy / committee meeting with Dave Mauriello.
  • Write the script!
  • Figure out what exactly I’m allowed to post on this blog.

Week #1 Notes:
I spent the majority of this week meeting with professors, team members, and potential committee members to figure out what's going on this quarter. Simply put, "Project Dreadnought" is a huge thesis project — in terms of both mass and project scope. I've spent the past 6 months working with a small team of artists to model assets, test simulation and rigging pipelines, and reconstruct the skeleton of Dreadnoughtus via 3D scanning, photogrammetry, and ZBrush sculpting. Simultaneously, my adviser Nick J and I have been working to develop a pipeline for easily (and reliably) rendering spherical, stereoscopic video content optimized for mobile VR.

There are a lot of plates spinning, and this project has the potential to blow out of scope quickly. That's why I'm pouring every available resource into production. I'm taking a Digital Media Workshop class while simultaneously doing an independent study / production class with Nick J, in an effort to double my time. Essentially, I'll spend the next 10 weeks managing two large chunks of my thesis, overseeing two other artists, and trying to keep my head from rolling off my shoulders. This blog is where I get to document all that!

DIGM 620 Overview:
After spending the entire summer modeling the bones of Dreadnoughtus and articulating the skeleton, now I get to rig it. I'll be using my time in this class to coordinate the creation of the Dreadnoughtus rig and manage the other artists on my team: Emma Folwer (Senior Project / Paleontology), Brandon Percia (Sophomore / Animation), and Zack Thomas (Former Co-Op / Animation). Dave Mauriello is advising on this portion of my thesis, and this week I met with him to discuss the best methods for rigging a titanosaur skeleton and simulating muscles. I'll also be meeting with my committee member Dr. Ken Lacovara once per month to make sure the reconstruction stays scientifically accurate. While I'm working on the skeleton rig, Emma and Brandon will model and texture the animal's musculature, following a softbody muscle simulation pipeline I established in my Spring DIGM 540 class. By the end of this quarter, we aim to have a rigged muscle system ready for skin simulation tests.

DIGM 599i Overview:
This class serves as a 10-week production study in 360-degree stereoscopic pipeline design. My work will include shooting spherical photo / video background plates, converting monoscopic photo spheres to stereo, researching render / stitching options, and seamlessly compositing CG animation for cinematic virtual reality. I'll also be testing the differences between an HMD VR workflow and spherical video optimized for dome projection. Finally, I'll use this time to finish pre-production for my thesis: reworking my proposal and developing a script, story-"circles," and layout tests.
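As a very rough sketch of what the mono-to-stereo conversion step above could look like: the crudest approach is to duplicate the equirectangular panorama and apply a small horizontal shift between the two eyes, then stack them in the top/bottom layout common for stereo 360 video. This is a hypothetical `mono_to_fake_stereo` helper of my own naming, and it assumes all scene content sits at a single depth — a real conversion pipeline would derive per-pixel disparity from a depth map.

```python
import numpy as np

def mono_to_fake_stereo(equirect: np.ndarray, disparity_px: int = 8) -> np.ndarray:
    """Crude mono-to-stereo approximation for an equirectangular panorama.

    Shifts the panorama horizontally by +/- half the disparity to fake an
    interocular offset, then stacks the two views top/bottom. Assumes all
    scene content sits at one fixed depth; this produces no real parallax.
    """
    half = disparity_px // 2
    left = np.roll(equirect, -half, axis=1)       # left-eye view
    right = np.roll(equirect, half, axis=1)       # right-eye view
    return np.concatenate([left, right], axis=0)  # top/bottom stereo frame

# Example: a tiny 4x8 grayscale "panorama"
pano = np.arange(32, dtype=np.uint8).reshape(4, 8)
stereo = mono_to_fake_stereo(pano, disparity_px=2)
```

The constant-disparity trick only fools the eye for distant backgrounds; anything close to the camera needs depth-aware reprojection, which is exactly the kind of question this production study is meant to answer.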

On VR, Scale Visualization, and Really Big Dinosaurs


Image via markwitton-com.blogspot.co.uk/2014/09/hey-dreadnoughtus-not-so-close.html

At the behest of my advising faculty, this blog has been reincarnated as a hub for my thesis research and production updates. Don’t mistake my summer hiatus for a lapse in productivity — I’ve been crunching away at my paleoart reconstruction work, but I’m still awaiting information about what I’m allowed to share publicly. Rest assured, cool things are still in the works!

A brief recap: For my Master’s thesis, I’ve been working on an animated reconstruction of the titanosaur Dreadnoughtus schrani, one of the largest creatures to ever walk the earth. This project began as a digital paleoart animation, but has evolved to suit a medium that is rapidly gaining traction in contemporary digital media: Virtual Reality.

While researching scale within the context of conventional cinematography — achieved through camera angles, frame composition, careful lens choices, and selective editing — it occurred to me that this language is absent in the emerging medium of spherical cinema. As VR device ownership rises, interactive spherical videos are becoming more accessible than ever. Facebook now allows users to share 360-degree videos, which use a smartphone's accelerometer to control the POV within a dome-shaped video scene. 360-degree YouTube videos can also be viewed through the lens of mobile VR devices like Google Cardboard, bringing an unprecedented level of accessible immersion to an audience of social media users.

When the viewer controls the camera, the director loses control over scale relationships on a shot-by-shot basis. This not only affects how big something LOOKS, but how big something FEELS — an important distinction in any narrative medium. Preserving accurate scale representation is especially important for paleoart, which strives to portray extinct animals with the highest degree of accuracy possible (within the limits of scientific literature). When your subject weighs upward of 60 tons, scale visualization becomes an essential factor in this portrayal. Dreadnoughtus could tip the scales against an entire herd of elephants. Most people have never even seen an elephant in person — how can you picture such an animal in your mind's eye? Creating an educational Dreadnoughtus film for Virtual Reality could facilitate the public's understanding of the animal, a hope supported by studies suggesting that immersive media technology can influence an individual's ability to remember information in the physical world.

When communicating scale in virtual reality, film language is replaced by the sense of "Presence" induced by spatially immersive environments that blur the boundaries between "real" and "digital." This evokes the same visceral feeling one gets when looking up at a mounted skeleton in a museum, and can be achieved through careful environment design and narrative cues. My thesis now takes place across three immersive environments, each with a differently created background plate: photographic, videographic, and computer-generated. Whether the complexity of the environment has any effect on sauropod scale visualization is yet to be determined. (Storyboards […storycircles?] to come in the near future!)

Presence in realtime media implies a "feedback loop" between the player and the experience, which makes the player feel like they affect their surroundings. The limitations of cinematic VR restrict interactivity, but generally allow for much higher rendering quality. For some purposes (architectural visualization, for example), realtime rendering engines can achieve photorealism that rivals pre-rendered CG. It's easier to render "things" than "creatures," though, and the Dreadnoughtus rig will include softbody muscle simulations that are much harder to achieve with a realtime engine. Rendering this project as an immersive video also dramatically extends its reach, allowing it to be shared via Facebook, YouTube, and mobile VR, and (potentially) educational programs like Google Expeditions.

I'll be updating this blog weekly with all updates I'm able to share publicly. Stay tuned!