
Week 4 & 5: Load Test and Refinements

  • annaekgoodwin
  • Feb 6



Completed and To-Dos from Week 3

After three weeks of building, refining, and world-making in Unreal Engine, it was time to see whether our vision could actually survive contact with the XR stage's technical constraints.


Violetta, our model, standing in each of our three environments

Applying Feedback, Tightening the Machine

Coming off our Week 3 presentation, we had a clear mandate: continue pushing into the monochromatic strangeness, ensure every environment was optimized for real-time rendering, and make absolutely certain our hero product, the GENTLE MONSTER X GEMINI smart glasses, could hold up under stage lighting and camera scrutiny.

The team knocked out an impressive list from the previous week's to-do column: Lab lighting, Salt Flats materials, Kinetic Stage optimization - all locked and loaded. Sheng's rigged insect robots were animated and ready. The CG shot lighting had been tweaked to perfection. Pre-production logistics were buttoned up: wardrobe acquired, hair and makeup confirmed, shot lists finalized, pre-vis cuts polished.

The Load Test: Where Theory Meets LED Wall

Danci became the undisputed hero of the day. While the rest of us watched frame rates, render quality, and the integration of our live model with varying degrees of panic, Danci dove into live optimization mode - adjusting shadows, tweaking materials, massaging lighting values, color-grading on the fly.

It's one thing to build a beautiful virtual environment. It's another thing entirely to make that environment run smoothly at 60fps while being captured by live cameras under practical lights with real human talent moving through the space. Danci made it happen, troubleshooting in real-time and finding the sweet spot between visual fidelity and technical performance.
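For readers curious what "live optimization mode" looks like in practice, much of this kind of on-stage tuning in Unreal happens through console variables. The snippet below is purely illustrative - these are standard Unreal Engine console commands, but the specific values are hypothetical, not our actual show settings:

```
; Check where the frame time is going (game, draw, GPU)
stat unit
stat fps

; Trade a little resolution for frame rate on the LED wall
r.ScreenPercentage 85

; Dial shadow cost down without killing the look
sg.ShadowQuality 2
r.Shadow.MaxResolution 1024

; Cap the frame rate to match the wall/camera sync
t.MaxFPS 60
```

The balancing act Danci was doing is exactly this: nudging values like these one at a time while watching both the frame-time stats and the image on camera, until the scene holds a stable frame rate without visibly cheapening the picture.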

The materials work she'd done throughout the week paid off in spades. The scenes held up. The monochromatic palette actually worked better on the LED wall than we'd anticipated, the stark contrasts and minimal color information giving the real-time rendering engine less to struggle with.

Hero Product, Final Polish

While Danci was saving us on the stage, there was still work happening back in the virtual realm. Bua imported Sheng's latest iteration of the GEMINI glasses model into our pre-vis Unreal level, and I spent time refining the lighting setup to really showcase the product's details, paying attention to the way light catches on the lenses, the subtle metallic finish on the frames, the precision of the hinges and arms.

These weren't just beauty shots for the sake of it. We needed proof that our hero product could be a star, that the glasses themselves justified the entire avant-garde apparatus we'd built around them. Every lighting tweak was a test: Does this make you want these glasses? Does this communicate innovation? Does this feel unmistakably GENTLE MONSTER?

Momentum Into the Final Sprint

The load test success gave us something invaluable going into the actual shoot: confidence. Not the naive kind that assumes everything will be fine, but the earned kind that comes from knowing your team can solve problems in real-time when things inevitably go sideways - because they WILL go sideways on shoot day...

We'd proven the environments could handle the XR stage. We'd proven our workflow could adapt on the fly. We'd proven that our weird, monochromatic, robot-filled vision wasn't just conceptually compelling - it was technically achievable.

One more week until we put talent in front of cameras, merge virtual and physical, and find out if all this planning, building, and optimizing translates into the 30-second spectacle we've been chasing since Week 1. I am SO PROUD of this team.

