A busy week, with a move to larger premises and a lot of development work in preparation for future, exciting projects. We really are looking forward as a business, but recognise that whilst many talk of immersive technology as the future, we know it’s the now. This was our week.
Toby – Delivery Driver, Mask Provider, Hand Sanitiser Sprayer
In amongst the “Working on the business” stuff and generally trying to get a grip on what we have going on, we’re also moving offices.
Fortunately we’re only moving to the opposite side of The Courtyard in the same location, but we need more space, so, having just repainted and refurbished our existing office space, off we go regardless…!
In line with the COVID-19 guidelines we’ve been handing out masks, spraying everyone liberally with hand sanitiser, and I’ve been transporting equipment from point to point.
Client-wise we’ve been further testing out cloud streaming and our 2D driving game, alongside lots of work on Augmented Reality and the use of hidden QR codes.
Fun fun.
Josh – Programme Manager
Scenario 5 – Customer Experience VR
By this point the project is practically developing itself; however, I prepped the dialogue tree for the team and sourced voice-over artists. Slava has made a lot of progress too, on some of the art assets.
Endpoint Management – The innovation centre and COVID-19 have caused a big increase in the number of machines we need to manage and maintain; doing so manually would be inefficient, and attempting to do it remotely would be almost impossible.
I’ve been trialling endpoint management software for a fair part of the week. It gives us the ability to “centrally discover, provision, deploy, update and troubleshoot” devices, all from the comfort of my bedroom.
We’re just about good to go: without spending a pound on IT consultancy fees, we’ve effectively got enterprise-grade control of all our IT services.
Incident Management – We’ve now got an understanding of the requirements for all seven scenarios, and backlog planning will begin next week.
Cat Flynn – Lead Programmer
I started the week continuing my work on establishing a toolchain for mobile projects. Ultimately we will need to support both Android and iOS, but the two have different pipeline requirements because of how each platform works.
In Android’s case, APK files can be built and distributed to devices much as we share builds for desktop solutions, although the build process is more complex because it goes through Android’s Gradle build system.
For iOS, the build and distribution process is significantly more involved. A Mac is required to perform the build at all, and distribution to test devices requires setting up a developer account with Apple and working with Xcode to build and publish apps.
While actually getting builds onto devices is relatively automatic after that point, I am expecting that preparing the pipeline stages between Unity and Xcode will be less than straightforward, as I have not worked with automating iOS build processes before.
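To make the contrast concrete, here is a rough sketch of how both pipelines might be driven from a script, assuming Unity’s documented batch-mode flags and Apple’s xcodebuild; the paths, the BuildScript method names and the signing configuration are placeholders rather than our actual setup.

```python
# Hedged sketch only: paths and the BuildScript.* entry points are hypothetical.
import subprocess

UNITY = "/Applications/Unity/Hub/Editor/2019.4.1f1/Unity.app/Contents/MacOS/Unity"

# Android: one Unity batch-mode call. Unity drives Gradle internally and
# emits an APK we can share much like a desktop build.
subprocess.run([
    UNITY, "-batchmode", "-nographics", "-quit",
    "-projectPath", "/path/to/project",
    "-buildTarget", "Android",
    "-executeMethod", "BuildScript.BuildAndroid",
], check=True)

# iOS: Unity only exports an Xcode project; a second, Mac-only stage
# archives and signs it with xcodebuild.
subprocess.run([
    UNITY, "-batchmode", "-nographics", "-quit",
    "-projectPath", "/path/to/project",
    "-buildTarget", "iOS",
    "-executeMethod", "BuildScript.BuildIOS",
], check=True)
subprocess.run([
    "xcodebuild", "-project", "Builds/iOS/Unity-iPhone.xcodeproj",
    "-scheme", "Unity-iPhone", "-configuration", "Release",
    "-archivePath", "Builds/iOS/app.xcarchive", "archive",
], check=True)
```

The Android stage is self-contained, whereas the iOS stage hands off to Apple’s tooling, which is exactly where I expect the automation work to be.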
I did some work on streaming solutions for remote clients using Amazon AppStream 2.0. After running some experiments internally to smoke out any obvious issues with latency or quality, I prepared a fleet so we could demo our capabilities to clients.
As far as we could tell, the app we were demonstrating was actually a worst-case scenario since every pixel on-screen changed every frame. AppStream tries to optimise bandwidth by only sending pixels that change – a fairly common compression technique when dealing with streaming video.
This works very well for productivity apps such as Word since only the line with the cursor on it is updated, but any 3D game with a movable camera will suffer.
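As a toy illustration of that trade-off, assuming NumPy, the sketch below compares two frames tile by tile and yields only the tiles that changed; real streaming encoders work on compressed tiles and motion estimation rather than raw pixel comparisons.

```python
import numpy as np

TILE = 32  # compare the screen in 32x32 pixel tiles

def changed_tiles(prev: np.ndarray, curr: np.ndarray):
    """Yield (x, y, tile) for every tile that differs between two frames."""
    height, width = curr.shape[:2]
    for y in range(0, height, TILE):
        for x in range(0, width, TILE):
            before = prev[y:y + TILE, x:x + TILE]
            after = curr[y:y + TILE, x:x + TILE]
            if not np.array_equal(before, after):
                yield x, y, after  # only these tiles need re-sending

# In a Word-style app almost no tiles change frame to frame; in a 3D game
# with a moving camera nearly every tile changes, so the saving disappears.
```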
Lastly, I continued working on the paper build for our ‘Animals on the Network’ solution. We’ve been working on building reusable, modular components to use in our solutions and this is the first opportunity I’ve had to work with some components Jordi has developed over the past few weeks.
This approach is helping us develop a consistent feel across our solutions while getting them up and running more quickly. However, it’s apparent that many of these components have only one member of the team who truly understands how they work, which needs addressing as we hone our process.
Sergio – Programmer
After returning from a short holiday this week, I started working on Scenario 5 of our long-running Traffic Officer VR project. It was good to work on it again, as it was one of the first projects I worked on after joining MXTreality, and it was especially good to see all the progress the team has achieved.
Because its core building blocks and mechanics were already implemented in the first scenarios, the work this time is minimal on the programming side.
As usual, I started by setting up the scene with placeholder logic to be filled in by the final art and content (dialogue text, voice and animations). The most complex part of it all was probably setting up the dialogue tree, which requires a lot of attention.
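For a flavour of what that involves, here is a minimal dialogue-tree sketch in plain Python; in the project itself this lives in Unity, and the speakers, lines and replies below are invented placeholders, not the scenario’s real script.

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    speaker: str                                  # which character delivers the line
    line: str                                     # placeholder text until the VO arrives
    choices: dict = field(default_factory=dict)   # player reply -> next node

# Invented placeholder content, purely for illustration.
ending = DialogueNode("Officer", "Help is on its way; stay behind the barrier.")
root = DialogueNode(
    "Customer", "My car has broken down on the hard shoulder!",
    choices={
        "Reassure them": DialogueNode("Officer", "You did the right thing calling us.",
                                      {"Continue": ending}),
        "Ask for details": DialogueNode("Officer", "Which junction are you nearest?",
                                        {"Continue": ending}),
    },
)

def walk(node, depth=0):
    """Depth-first traversal: handy for checking every branch reaches an ending."""
    print("  " * depth + f"{node.speaker}: {node.line}")
    for reply, nxt in node.choices.items():
        print("  " * depth + f"[{reply}]")
        walk(nxt, depth + 1)

walk(root)
```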
One of the biggest differences from the other scenarios is that this time around we are using two customer NPCs (non-player characters), which will react and speak at the same time.
After meeting with our animator Stefano, we agreed on the right approach, based on the requirements of this scenario. Next week I will implement the animations and audio, along with the rest of the art.
Stefano – 3D Artist
This week I started preparing the two new characters for the next scene while awaiting the recordings from the voiceover artist. I’ve been studying ways to improve accuracy in the movement of the facial muscles and how bones behave, to make our animations more realistic.

After this I worked out how to tackle this new scenario, which has more interactions between characters than usual. Discussing and testing this scenario with our programmers, we found a perfect solution, so now it’s time to record the mocap and get the work done.
I also started an interesting study of QR codes and their capabilities. I found that they can be distorted to a surprising degree and still remain readable, for example by changing the colours or warping the shapes.
With the right tools it’s also possible to personalise the appearance of a code so that it resembles an uploaded image: the code mimics the shapes in the picture, within the limits of the QR code’s “pixel resolution”.
These techniques should, with a lot of creativity, allow us to camouflage and conceal them within an image, which was my main objective from the beginning. Nice!
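This resilience comes from the Reed-Solomon error correction built into the QR format: at the highest level (H), roughly 30% of a code’s modules can be damaged or covered and it will still scan. As a minimal sketch, assuming the open-source Python libraries qrcode and Pillow and a placeholder logo file, a recoloured code with an image pasted over its centre might look like this.

```python
import qrcode
from PIL import Image

# Level H error correction tolerates roughly 30% module loss.
qr = qrcode.QRCode(
    error_correction=qrcode.constants.ERROR_CORRECT_H,
    box_size=10,
    border=4,
)
qr.add_data("https://example.com")
qr.make(fit=True)

# Recolour the code: scanners need sufficient contrast, not black-on-white.
img = qr.make_image(fill_color="darkgreen", back_color="ivory").convert("RGB")

# Paste a placeholder logo over the centre; error correction absorbs the loss.
logo = Image.open("logo.png").resize((img.size[0] // 4, img.size[1] // 4))
position = ((img.size[0] - logo.size[0]) // 2, (img.size[1] - logo.size[1]) // 2)
img.paste(logo, position)
img.save("camouflaged_qr.png")
```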
Slava – Lead 3D Artist
This week we started a new scenario for one of our current VR projects, which required me to create a new environment and optimise a new vehicle for it.
This new scenario takes place in the evening, near a city, on a motorway with bridges. To create the scene, I reconfigured various assets from previous scenarios and the simulator, then changed the vegetation and added new fencing, while also adjusting the lighting and skybox for evening conditions.
Today we have five different realistic scenes for five different scenarios. They differ in their landscapes, vegetation, weather conditions and time of day.
Diversity is a very important part of realism in virtual reality experiences. As in real life, things must not only look photo-realistic; they must also avoid being too repetitive and should appear in different environments and lighting conditions.
A balanced combination of all these factors, including diversity, contributes to the realism of the training experience and makes it truly immersive. By contrast, realistic-looking but repetitive, dull environments create the impression of an obviously artificial situation, which does not help achieve the training goals.
Another task undertaken this week was the preparation of a new vehicle, an estate car for a big family. I followed our now-standard routine of reducing the polygon count, baking normal maps and applying materials in Unity.
Kyung-Min – 3D Generalist
Having moved up to Unity’s High Definition Render Pipeline (HDRP), it’s been great working with higher-quality visuals again. With the first of our mini VR experiences approved, the focus switched to building the modular environment set-pieces that will accommodate fast iteration on any future game concepts.
For the environments, the modular set of choice was a series of rocks with different silhouettes that could be rearranged into any form or structure. The first environment design was based on the Hallelujah Mountains from Avatar, with huge floating segments, and was also influenced by games like Ico, Shadow of the Colossus, and Richie’s Plank Experience.
Having tested the ray-tracing capabilities of HDRP in a previous R&D project, I had already seen photorealistic cars and shrubs rendered, but for this project we decided to head in another direction for the art.
Discussing the project with Slava, our lead artist, we both felt a more anime-like, stylised aesthetic was the look we wanted. Working in a new style has been both challenging and refreshing, and a thoroughly enjoyable experience.
A big part of this change and upgrade came with Shader Graph. After learning the basics and comparing it to other similar systems, I have left deeper study for later and focused on the environment.
Even after my brief play with Shader Graph, I can definitely see the potential for artists to make a bigger impact on the production pipeline, with the creation of complex shaders and effects no longer needing to be handed off entirely to Jordi.
Next week we will witness the ‘rocks’ of my labour as we approach the cliff tops in VR. While a simple experience, the segments built will be reused to create any number of experiences, with modularity and customisation being key.
Jordi Caballol – Programmer
This week I’ve been working with the Varjo XR-1 mixed reality headset on the driving simulator, and I have returned to my particular fight with VR mirrors, finding limitations of various sorts.
But finally we have realistic mirrors in our car!
Now that this part of the simulator is ready, next week I will start getting the physical part ready: a gaming steering wheel and pedal set, and a chroma-key screen to mask out everything else, so we can really feel like we’re sitting inside a real car.