WEEK COMMENCING 27TH APRIL

The lockdown continues, but judging by the rise in enquiries we are receiving here at MXTreality for projects that benefit from our particular brand of immersive environment excellence, the return to work is on the minds of many, whatever shape that return takes. This was our week.

Toby – Leading From Home

I’m often staggered by the work our team gets through and even more so under the current circumstances, which are challenging for any business, let alone one like ours that relies on big computing power, immersive technology and close collaboration.

We have so much to be excited about, given our recently finished projects, and we’re still attracting new business, which is obviously great news.

Reflecting on the pipeline right now, we have the following projects due in the next three months:

  • Animal behaviour simulation;
  • Empathy in customer service;
  • Hazard awareness on the roads;
  • Immersive Technology familiarisation – fun activities to assist with the use of VR;
  • Vehicle safety checks;
  • Immersive audio-only experience;
  • Not-so-secret project on driving simulation;
  • Virtual event hosting;
  • A game for playing at events;
  • Iterative multi-user training simulation for managing hazardous incidents.

Alongside these time-intensive projects we try to ensure our team have the time and space to conduct their own research and development, so we have various background projects ongoing that we hope will bear fruit in existing and forthcoming projects.

If you read these diaries regularly you’ll have seen, as an example, the work Cat’s been doing with the Varjo XR-1 headset, but the whole team continuously reviews our use of tools and the opportunities available to us from adopting new ones. Keeps us all on our toes for sure.

Josh – Programme Manager

This week has been a little busy. We kicked off a virtual event project, which in the immediate term is a response to COVID-19, allowing our client to showcase their latest products and services and communicate information to their customers, employees and supply chain, all while adhering to distancing guidelines.

I think it’s quite possible that once work returns to relative normality, virtual events might remain a compelling alternative to physical shows. I am not alone in recognising the exciting benefits of this approach: greater accessibility, lower emissions and more flexibility.

That said, it’s quite challenging from a technical point of view; we’ve scoped the project and are currently prototyping ideas. We have also discussed a brief for a smart motorway driving game – the word ‘game’ is usually banned from our office in favour of ‘solution’, so this one will be a nice change.

Most interesting, perhaps, was a brief for what might become a programme of incident management solutions, which could prove quite transformative for the client’s business.

Scenario 3 of our customer experience solution has kicked off and the environment, which looks incredible, is almost complete – hopefully Slava will share some of it.

Looking forward to next week, when I fully expect the pace of new projects will necessitate Toby and me being drafted in as developers. Looking forward to MXTreality’s first board game…

Cat – Lead Programmer

This week I’ve been getting to grips with the chroma-keying capability of the Varjo XR-1 headset, as well as starting to add some control capability to our virtual traffic officer vehicle.

Chroma-keying is a technique commonly used in the visual effects industry to isolate elements of film, in order to do things like replace the background, or paint out an actor to be replaced with a computer-generated model – but more on that in my blog on the subject.
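To give a flavour of the idea, here’s a minimal, hypothetical sketch of chroma-keying on the CPU in Unity C#: any pixel close enough to the key colour is made transparent so other imagery can show through behind it. The class name and tolerance value are invented for illustration and this isn’t the XR-1’s actual pipeline, which keys the live video feed per frame.

using UnityEngine;

// Hypothetical sketch: pixels near the key colour (classically green) become
// transparent so whatever sits behind the image can show through.
public static class ChromaKey
{
    public static void Apply(Texture2D image, Color keyColour, float tolerance = 0.15f)
    {
        Color[] pixels = image.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
        {
            float distance = Mathf.Abs(pixels[i].r - keyColour.r)
                           + Mathf.Abs(pixels[i].g - keyColour.g)
                           + Mathf.Abs(pixels[i].b - keyColour.b);
            if (distance < tolerance)
                pixels[i].a = 0f;   // key this pixel out
        }
        image.SetPixels(pixels);
        image.Apply();
    }
}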

The latter half of the week has primarily been spent on implementing the ‘driving’ part of the simulator, at least from a software perspective.

We’re not really in a position to set up equipment in an actual vehicle at present, so pulling control information from a real car’s controls will have to wait until the end of quarantine.

To accommodate this, I’ve designed a generic input layout to control the car from a variety of inputs. So far we’ve got support for a keyboard and gamepad, and we’re thinking about getting our hands on some racing peripherals to get as close to an actual car as possible from a desk setup.

With any luck, it should just be a case of linking some sensors on a car to Unity and the rest of the control code will just work.
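As a rough, hypothetical sketch of what such an input abstraction can look like in Unity C# (the interface and class names below are invented for illustration, not our actual code), the driving logic only ever reads three abstract values, so a keyboard, a gamepad, racing peripherals or real car sensors can all plug in behind the same interface:

using UnityEngine;

// Hypothetical sketch: a common interface so the driving code never cares
// where steering, throttle and brake values come from.
public interface IVehicleInput
{
    float Steering { get; }   // -1 (full left) .. 1 (full right)
    float Throttle { get; }   //  0 .. 1
    float Brake    { get; }   //  0 .. 1
}

// Desk setup: keyboard or gamepad via Unity's input axes.
public class DesktopVehicleInput : MonoBehaviour, IVehicleInput
{
    public float Steering => Input.GetAxis("Horizontal");
    public float Throttle => Mathf.Max(0f, Input.GetAxis("Vertical"));
    public float Brake    => Mathf.Max(0f, -Input.GetAxis("Vertical"));
}

// A future real-car input could read values from sensors fitted to an actual
// vehicle and expose the same three properties; the rest of the control
// code would not need to change.
public class VehicleController : MonoBehaviour
{
    private IVehicleInput input;

    void Awake() => input = GetComponent<IVehicleInput>();

    void FixedUpdate()
    {
        // Feed the abstract values into whatever moves the car.
        Debug.Log($"steer {input.Steering:F2} throttle {input.Throttle:F2} brake {input.Brake:F2}");
    }
}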

Along with the control system, I started building a physics-based rig to accurately model the vehicle’s movement. For performance reasons, our current traffic systems are ‘on-rails’, meaning the vehicles follow a set path at a set speed without modelling physical effects like tyre slip, inertia or rolling resistance.

By modelling the vehicle’s wheels, suspension, mass and drivetrain we end up with a result that’s a lot more realistic. All that’s left to do now is to tune it – to take a look at the real vehicle we’re representing and put together a simulation of its engine, gearbox and real-life centre of mass.
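For anyone curious what that looks like in practice, here is a minimal, hypothetical Unity C# sketch using the engine’s built-in WheelCollider, which simulates suspension and tyre friction; the torque, steering and centre-of-mass numbers are placeholders, and replacing them with the real vehicle’s values is exactly the tuning step described above.

using UnityEngine;

// Hypothetical sketch of a physics-based vehicle rig. Each WheelCollider
// models suspension and tyre friction; the Rigidbody carries the mass.
[RequireComponent(typeof(Rigidbody))]
public class SimpleVehiclePhysics : MonoBehaviour
{
    public WheelCollider frontLeft, frontRight, rearLeft, rearRight;
    public float maxMotorTorque = 1500f;                             // placeholder, tuned per vehicle
    public float maxSteerAngle = 30f;                                // placeholder
    public Vector3 centreOfMassOffset = new Vector3(0f, -0.5f, 0f);  // placeholder

    private Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        // Lowering the centre of mass keeps the simulated vehicle stable;
        // matching it to the real vehicle is part of the tuning work.
        body.centerOfMass += centreOfMassOffset;
    }

    void FixedUpdate()
    {
        float steer = Input.GetAxis("Horizontal") * maxSteerAngle;
        float torque = Input.GetAxis("Vertical") * maxMotorTorque;

        frontLeft.steerAngle = steer;
        frontRight.steerAngle = steer;
        rearLeft.motorTorque = torque;    // rear-wheel drive for simplicity
        rearRight.motorTorque = torque;
    }
}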

I’ve always been a huge fan of racing games, particularly realistic ones such as Dirt Rally and the Forza series, so I’m really looking forward to this challenge! I think the hardest bit will be resisting adding support for novelty horns.

Sergio – Programmer

Last week started with a fresh new project that aims to provide customers, partners and the public with a set of experiences and information to discover during an exhibition event with Highways England.

This was a great opportunity for us to test our previous progress and the ongoing development of our technical workflow. Most of the week was spent on pre-production with our project manager and on a prototype build, to see if we are going in the right direction.

Our prototype was broken into two parts: an initial block out and a final block out.

The initial block out gives artists the freedom to lay out placeholders in the scene as simple primitive cubes that pinpoint where the final art will go.

As a programmer, I would use those blocks to develop and test the logic around them; if my code works with the primitives, then it will also work with the final art.
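As a rough, hypothetical illustration of that decoupling (the component and class names below are invented for this sketch, not the project’s actual code), the gameplay logic targets a small marker component rather than any particular mesh, so swapping a grey cube for the final model changes nothing in the code:

using UnityEngine;

// Hypothetical sketch: logic talks to this marker component, which sits on a
// primitive cube during the initial block out and on the final art later.
public class ExhibitStand : MonoBehaviour
{
    public string exhibitName = "Placeholder exhibit";

    public void Present()
    {
        // Real project logic would trigger UI, audio, animation, etc.
        Debug.Log($"Presenting {exhibitName} at {transform.position}");
    }
}

// Example consumer: finds every exhibit in the scene, regardless of whether
// its visual is still a placeholder primitive or the final geometry.
public class ExhibitionManager : MonoBehaviour
{
    void Start()
    {
        foreach (ExhibitStand stand in FindObjectsOfType<ExhibitStand>())
        {
            stand.Present();
        }
    }
}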

The final block out allows us to improve quality with more complex geometry imported straight from the modelling software into the engine, using our new workflow tool. Any changes are reflected seamlessly between the two.

Our next stage is production, where we will take our prototype and use it as a placeholder for the final assets and performant code.

Slava – Lead 3D Artist

This week I was working on new environments for the next scenario of our customer experience project, which involves a vulnerable person walking along the motorway.

We decided the scene will take place on a straight piece of road in the early morning. To make the environment I used a new skybox made at sunrise. I converted the motorway section and surrounding terrain we used for our driving simulation, changing the terrain texture to make it more suitable for static scenes.

I placed some trees and added a hedgerow, which helps to explain why the person cannot easily be seen from the motorway. I also left the lamppost lights on and adjusted the scene lighting and the position of the virtual sun.

Stefano – 3D Artist

This week I mostly researched rigging, to solve issues caused by poor-quality online assets, and invested some time in finding and reproducing bugs.

In preparation for another motion capture session, and with the challenge of lockdown, I will need to act and record all the scenes by myself, apart from the voiceover recordings. It’s another good opportunity to prove the Xsens suit has been a good investment – this time in extreme conditions.

While waiting for the voice clips, I recorded some scenes that are independent of the voice, like walk cycles, various moods, idle clips, gestures, etc.

Other than that, I started setting up the scenes in Maya to be ready to receive the motion capture files for transferring to the character, and solved some skeleton retargeting issues that were causing small inconsistencies in the character’s position.

The week finishes with work on improving the character’s facial expressiveness, to enhance communication and the overall experience.

Kyung-Min – 3D Generalist

On a brand new project! Taking a break from modelling and texturing objects, I have returned to developing the artists’ pipeline to maximise our productivity in future.

Developing a system that allows us to undertake technical art reviews and approve art before it’s ever placed in Unity will provide many advantages and prevent a lot of the issues and wasted time of the past.

Our project will be free of clutter and unused files, making it faster to navigate; more importantly, merge requests will be faster and project sizes smaller – a real win-win.

In line with this advancement, I tackled the first internal art quality assurance (QA) review and merge request. This was a small step, but one that will allow artists to take control of merging, rather than leaving it to the programmers and increasing their workload unduly.

My main battle this week was fighting with Stingray shaders and finding a possible workaround to get them into our pipeline.

As the week drew to a close, this battle got me thinking about converting our project to the Universal Render Pipeline (URP). While it is still being developed and has many issues, it is intended as the future solution, so it makes sense to try it now for valuable feedback – and it may also remove the need for my workarounds.

With approval from the team to make this leap I’m extremely excited for what the next week will bring!

Jordi Caballol – Programmer

This week we started with a new scenario for the Traffic Officer VR project for Highways England. The scenario is about stopping a man who is walking on the hard shoulder of the motorway and getting the police to remove him.

So this week I’ve been setting up the scene, a key part of which is the transition between three well-defined scenarios; a simple sketch of how that switching can be structured follows the list below.

  • The first scenario is inside the traffic officer vehicle following the man; here we choose between three options for how to stop him.

For this setting we needed to be moving, and we needed to switch all of that movement off when we left it, so it was the most technically demanding of the three.

  • The second scenario is when we are talking to the man on the hard shoulder, the user’s objective being to convince him to move behind the barrier. This one runs like any regular scenario.
  • The third one is almost identical to the second, but takes place behind the barrier; here the user must convince the man that they should call the police to pick him up.
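Here is the simple sketch promised above – a hypothetical Unity C# state machine (all names invented for illustration, not the project’s actual code) whose key job is to make sure the vehicle movement used in the first setting is cleanly switched off whenever we leave it:

using UnityEngine;

// Hypothetical sketch of the three-setting flow.
public enum ScenarioSetting { FollowingInVehicle, OnHardShoulder, BehindBarrier }

public class ScenarioController : MonoBehaviour
{
    // The script that moves the traffic officer vehicle while following the man.
    public MonoBehaviour vehicleMovement;

    public ScenarioSetting current = ScenarioSetting.FollowingInVehicle;

    public void TransitionTo(ScenarioSetting next)
    {
        // Movement only exists while we are following the man in the vehicle;
        // every other setting runs like a regular static scenario.
        vehicleMovement.enabled = (next == ScenarioSetting.FollowingInVehicle);
        current = next;
        Debug.Log($"Scenario setting changed to {next}");
    }
}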

Aside from setting all this up, I also worked on polishing our ‘Eyes and Ears’ project, which mainly involved improving the outline that appears around hazards. Now we don’t outline the whole object, only the part that’s visible (i.e. not occluded by any other object).