Our new teamies have settled in nicely, which is just as well; we have a number of new projects on the go, and we’re gearing up to repaint and refurbish our offices at the weekend – not a virtual reality project, but a reality project with dirty hands. This was our week.
Toby – leader, mentor, friend, and all-round wonderful human being
I spent the start of the week at a client’s office in Bristol thrashing out a strategy for supporting various events they have planned for the year. While I was there, and equally importantly, I’d been asked by the team to lay my hands on a Traffic Officer jacket – an empty one, I hasten to add!
What for? Well mostly, apparently, to accurately record the sound that the jacket makes when you’re sitting in close proximity to the person wearing it! Immersion is all about the detail.
Essentially, if you’re going to accurately model an environment, especially one familiar to the user, you rather make a rod for your own back: if one detail is slightly off, it can ruin the immersion you’ve spent so much time building.
As well as lining up the weekend’s refurbishment works, we also have some client work going on regarding our Innovation Centre; much more on that over the coming weeks, but this week we got dates in the diary to install the tracking system and the sound…! Everyone is VERY excited about this one.
Cat – Lead Programmer
It’s been a bit of a varied week for me. As part of learning about Unity’s Package Manager, I’ve set up a package registry so the company has somewhere to publish its packages.
By using packages to build our projects we can reduce duplicate code and also speed up compile times thanks to Unity’s Assembly Definitions, but there’s a trade-off in that it becomes harder to change the code in those packages quickly.
If code in the packages needs to be updated, it requires a new version to be put on the registry, which can cause issues for the project using the package, or for any other packages that depend on it. This phenomenon is infamously known as ‘dependency hell’ in software development, so I’m trying to be careful and do my best to avoid it at MXT!
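For readers unfamiliar with Unity's setup, a project consumes a private registry by declaring it as a scoped registry in its `Packages/manifest.json`. The registry URL and package names below are purely illustrative, not our real ones:

```json
{
  "scopedRegistries": [
    {
      "name": "Company Registry",
      "url": "https://packages.example.com",
      "scopes": [ "com.example" ]
    }
  ],
  "dependencies": {
    "com.example.dialogue": "1.0.0"
  }
}
```

Every project pins an exact version like this, which is exactly why updating shared code means publishing a new version and bumping it everywhere it’s used.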
I’ve also built the dialogue tree for the second scenario of Traffic Officer VR, and made some accompanying upgrades to the dialogue system. As the trees get bigger, we need to update our tools and processes to keep up.
By adding a label system, as well as the option to display vital information about a dialogue node while actually playing through the scenario, I’ve removed a lot of the guesswork needed to debug it. I’ve also created formatting options for the conversation log at the end, to make the messages that actually get displayed a little more flexible.
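To give a flavour of the idea (a toy sketch with made-up node and label names, not our actual Unity implementation), a labelled dialogue node might carry a debug summary that can be shown while playing through the scenario:

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One node in a dialogue tree, with a label for debugging overlays."""
    node_id: str
    text: str
    label: str = ""                                  # e.g. "opening", "instruction"
    children: list = field(default_factory=list)     # follow-on dialogue options

    def debug_info(self) -> str:
        # Vital information displayed in-scenario to remove debugging guesswork.
        return f"[{self.label or 'unlabelled'}] {self.node_id}: {self.text[:30]}"

root = DialogueNode("greet", "Good evening, is everyone okay?", label="opening")
root.children.append(DialogueNode("stay", "Please stay in the vehicle.", label="instruction"))
print(root.debug_info())
```

The same labels can then drive the formatting options for the end-of-scenario conversation log.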
Lastly, I’ve been doing some preliminary research ahead of receiving the Varjo XR-1 headset, which we’ll use to help build our driving simulator.
There are three key components to the project: the XR-1 headset, the simulation software itself, and some sort of sensor apparatus to measure the state of the vehicle’s controls. Getting them to work together will be a test of our undoubted talents, but a great learning experience.
Josh – Programme Manager
My week was made up of script writing and project administration, which might not be as exciting as some tasks, but is critical to keeping us all on track. The script writing required me to envisage the challenges faced by a disabled motorist, whose vehicle has broken down in a live lane, with appropriate input on this crucial topic.
Work on the Test and Innovation Centre has resumed. I compiled a backlog (list) of items and obtained quotes from suppliers. Similarly, a concept/prototype for a mixed reality driving simulator has been ‘okayed’, so we’ve been in the early stages of planning.
The most important work of the week is probably the least exciting: prioritising and organising this week’s sprint for the Customer Experience project, along with the usual stakeholder engagement.
As always, we’ll make our deadline, but we’re a few hours behind schedule. The key for next week will be preserving a good feature or two that may not make this release but will deliver impact for the customer in a future iteration.
Stefano – 3D Artist
This week started with a fun motion capture session with our new Xsens suit. Working with a young actor and a theatre director was interesting because, as I noticed during the session, different takes were full of bigger or smaller meaningful differences in the acting itself.
We recorded several takes of every scene so we could choose the best one, with the voice recorded simultaneously but on a separate device.
Then came the hard part, as I started pairing files and synchronizing the movements with the sounds – the ‘clap’ idea mentioned in the last blog post turned out to be very useful and time-saving. After reviewing all the takes, we saved the best ones and began using them in our scenes.
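The ‘clap’ trick can be sketched in a few lines. This toy example uses made-up sample values rather than real audio or mocap data: find the loudest spike in each recording and use the difference to line the tracks up.

```python
def clap_offset(track_a, track_b):
    """Return how many samples track_b must shift to align with track_a."""
    # The clap shows up as the largest-magnitude spike in each recording.
    peak_a = max(range(len(track_a)), key=lambda i: abs(track_a[i]))
    peak_b = max(range(len(track_b)), key=lambda i: abs(track_b[i]))
    return peak_a - peak_b

audio  = [0.0, 0.1, 0.9, 0.1, 0.0, 0.0]   # clap spike at sample 2
motion = [0.0, 0.0, 0.0, 0.1, 1.0, 0.1]   # clap spike at sample 4
offset = clap_offset(audio, motion)        # -> -2: motion track starts two samples late
```

Once every pair shares one unmistakable event, synchronisation becomes a mechanical shift instead of eyeballing two timelines.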
In the beginning, the process of aligning the start and end of every animation to make them fit with the ‘idle’ one was very time consuming, but I resolved the problem by writing a script that automates the process.
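The idea behind such a script can be illustrated with a simplified sketch. The real pipeline works on full skeletal animations inside our 3D tools; here each frame is reduced to a single pose value, and the first and last few frames are blended toward the idle pose:

```python
def align_to_idle(frames, idle_pose, blend=5):
    """Blend a clip's first and last `blend` frames toward the idle pose."""
    out = list(frames)
    n = len(out)
    for i in range(min(blend, n)):
        t = i / blend                        # 0.0 at the clip edge, rising inward
        out[i] = idle_pose * (1 - t) + out[i] * t          # ease in from idle
        j = n - 1 - i
        out[j] = idle_pose * (1 - t) + out[j] * t          # ease out to idle
    return out

clip = [0.0, 0.4, 0.9, 1.0, 0.9, 0.4, 0.2]
aligned = align_to_idle(clip, idle_pose=0.0, blend=2)
```

Running this over every clip at once is what turned a very time-consuming manual job into a single automated pass.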
The small space of the car in which the animations take place makes the work more challenging, but the results so far are very satisfying. Next week will be all about perfecting animations, adding finger movements and integration within the VR app.
Sergio – Programmer
This week I have been helping stitch features and art together in our ongoing projects – particularly our new traffic officer experience, which has an amazing new rain effect written and designed by our most recently joined team members, Jordi and Kyung-Min.
I also integrated the scene adaptation and terrain modifications made by Slava, and our NPC animations made by Stefano with the help of our new motion capture approach, as discussed in previous blogs.
In parallel to the merges, I have been working to enhance our current road traffic system, which now includes additional vehicles and can adapt to the new shader-based, infinitely generated road made by Cat.
We have not yet fully transitioned to auto-piloted smart vehicles, but with the new improvements, each vehicle will spawn with a driver and a random set of passengers to make it more realistic – the more detail, the better the immersive quality of the experience.
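The spawning idea is simple enough to sketch. This is an illustrative toy version with hypothetical model names, not the project's actual Unity code:

```python
import random

# Hypothetical placeholder model names for the sketch.
DRIVER_MODELS = ["driver_a", "driver_b"]
PASSENGER_MODELS = ["adult", "child", "teen"]

def spawn_vehicle(rng):
    """Each vehicle always gets a driver, plus a random set of passengers."""
    occupants = ["driver:" + rng.choice(DRIVER_MODELS)]
    for _ in range(rng.randint(0, 3)):          # up to three random passengers
        occupants.append("passenger:" + rng.choice(PASSENGER_MODELS))
    return occupants

rng = random.Random(42)                          # seeded for reproducibility
vehicle = spawn_vehicle(rng)
```

Varying the occupants per vehicle means no two cars in the traffic stream look identical, which is exactly the kind of small detail that supports immersion.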
Slava – Lead 3D Artist
This week I have been working on the environment for our new scenario, which required rainy conditions. I made a new terrain with dirty and wet materials, adding trees and other objects to make it more realistic. The overall environment looks slightly hostile, which, with the rain effects, should create an uncomfortable feeling. I also adjusted the colour and intensity of the ambient light and sunlight to make it dark and gloomy. Finally, our team merged my scene with the rain effect and a dedicated water-flow shader to create rainy British weather.
Kyung-Min – 3D Generalist
I spent a lot of this week working with Jordi, our newest team member, as we made great progress creating a solution to the performance problems we saw with the default rain solutions. We created the MXT shader to handle the rain flow on objects, which greatly optimised our rain system.
The final build comprises three particle emitters – one producing sheets of rain and a high-density rainfall around the play area – supplemented by the MXT rain shader, which handles water flow and reflections through an array of maps, independently controlled via world-space coordinates.
And this week I also realised that the longer I’m here at MXTreality, the more I love it.
Jordi – Programmer
This week’s work has been all about getting the shaders ready for the artists to build the new scenarios. I’ve created an effect of water flowing on the surface of objects under the rain, for when we require harsh weather conditions. The effect makes the object brighter, simulating it being wet, and adds some distortion that emulates the movement of water flowing over it.
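The two ingredients of the wet look – brightening and distortion – can be shown with a small sketch. This is not the shader itself (which runs on the GPU in shader code); it is a toy per-pixel version in Python, with made-up constants:

```python
import math

def wet_surface(base_color, uv, time, wetness=0.8):
    """Sketch of a 'wet' look: brighten the base colour and jiggle the
    sample position with a small time-varying ripple (the distortion)."""
    ripple = 0.02 * math.sin(40.0 * uv[1] + 3.0 * time)   # flowing-water wobble
    distorted_uv = (uv[0] + ripple, uv[1])
    brightened = tuple(min(1.0, c * (1.0 + 0.4 * wetness)) for c in base_color)
    return brightened, distorted_uv

color, duv = wet_surface((0.5, 0.5, 0.5), (0.25, 0.75), time=0.0)
```

In the real shader the same two steps happen per fragment, with the ripple driven by the flow maps rather than a plain sine wave.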
Also, the Eyes and Ears experience has been updated to use exactly the same materials as the regular scenarios, while achieving a much better look, with good shadowing and a lot more detail on the surfaces.
This is part of an effort to standardize all the materials we use in the different scenarios, so once we have a material ready to use, we can deploy it in any new scenario, knowing it will work perfectly first time.