Despite a loosening of the lockdown restrictions, the MXTreality team continues to be productive from locations across the capital as the rest of the business world assesses next steps in the return to the workplace. Happily for us, Immersive Technology will play a leading role. This was our week.
Toby – Leading From Home
This is becoming a very busy period for us with a number of solutions due in June. At the same time I’m having conversations with potential customers all of whom are giving serious thought to the adoption of immersive technologies.
The ongoing COVID-19 situation seems to have prompted consideration of alternative ways of working across various traditional workstreams.
One of the hardest-hit sectors comprises businesses creating events and conferences, both obvious candidates for online and virtual delivery. There’s all sorts we’re working on and have been asked to discuss, from the delivery of corporate videos to the presentation of new product lines.
Training and development, particularly in mission critical roles, needs to continue even in lockdown and, with residential and classroom training postponed, Immersive Technology offers a cost-effective and efficient alternative delivery method.
In the meantime we’re continuing to develop our website, and the team all seem to be well. Although I’m working in the office again, the team understand they can take the lead on deciding when they feel safe enough to return.
Right now the reality is that I can’t ensure people can sit and work 2m apart from one another, and I certainly can’t guarantee their safety on public transport, but they’re working so well that there is no need to enforce changes either.
Josh – Programme Manager
This was the final week of our sprint for Customer Service VR, which wraps up the ‘vulnerable customer’ scenario. The animation of the customer was crucial, and I’m really happy with how it came out. Using the motion capture suit not only saves us time but provides a level of realism that’s difficult to achieve conventionally.
The Events project is coming along really nicely, with the vast majority finished this week. The driving game is going similarly well and progressing faster than we thought. It’s a function of how modular some of our work is that we can take a feature from one project and incorporate it into quite different ones (technically) with little fuss.
Elsewhere, I’ve been discussing our workflow with the developers and integrating Gitlab with Jira. I’m pretty pleased with the results so far, so we will be applying it to all our projects from next week.
Cat – Lead Programmer
My time has been split this week between continuing work on the traffic simulation and developing a new workflow with Josh.
Here at MXTreality, we think a lot about how we approach our work, with the aim of being more efficient with our time and reducing the effort it takes to organise ourselves across lots of projects.
Ultimately, we’d like to be in a position where our tools can tell us what we need to do without our needing to ask each other, and where we can see who is working on what and when. This has become especially important as we’ve transitioned to working from home, since we’re no longer within shoulder-tapping distance.
We’ve been working hard to ensure important information is available in a reasonable, expected place, using Discord for its communication features, but it’s easy for important information to get lost in the sea of messages.
So instead, we’re turning to our more formal tools, Jira and Gitlab, to store information. By looking through the steps of how we work, we can identify what information is relevant at the time and where we should expect to find it.
In most cases it’s Jira, but for more technical processes, such as code review, it makes sense to leave it on Gitlab where it’s directly linked to the code in question.
By then integrating these tools we can link issues between them: from a pending merge request on Gitlab, we can click an automatically generated link to find the related issue on Jira, making it easy to tell what a piece of work is supposed to accomplish and which part of the solution requirements it fulfils.
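For what it’s worth, the cross-linking in GitLab’s Jira integration is driven by Jira issue keys mentioned in commit messages and merge requests. A sketch of the convention, using a made-up issue key `MXT-142` and a throwaway repository:

```shell
# Throwaway repo just to show the convention (MXT-142 is a made-up Jira key).
git init -q jira-link-demo && cd jira-link-demo
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "MXT-142: add facial blend shapes for vulnerable customer"
git log --oneline -1   # the commit message carries the Jira issue key
```

With the integration enabled, GitLab spots the key and renders it as a link to the corresponding Jira issue; mentioning the key in a merge request title or description links the MR the same way.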
Hopefully as the team starts to make a habit of using these integrations, we’ll find we’re asking fewer questions and wasting less work, increasing our capacity for how much we can get done – which given the growing interest in our solutions, is becoming increasingly important!
The traffic simulation is proving to be an interesting problem as I get used to working with the C# Job System. Generally my experience has been with object-oriented programming, so this data-oriented approach still feels a bit weird, but it’s clear to see how this paradigm leads to faster code.
Working on the traffic’s vision and AI, I have ensured that the cars can not only see other vehicles in their vicinity but also make decisions and act on the information they receive.
However, because I was dealing with actual data instead of references to data, I ended up introducing a number of bugs: I tried to access data that I expected to have been updated, forgetting that I was working with a copy of the data instead.
This led to cars not being able to see each other, or thinking they were accelerating without actually changing speed. It was all very confusing, but I’m getting the hang of it and making good progress nonetheless.
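This copy-versus-reference pitfall can be sketched in plain C#, outside the Job System. The `VehicleState` struct below is a made-up stand-in for the simulation’s per-car data; the point is simply that C# structs are value types, so passing one into a method hands over a copy:

```csharp
using System;

// Made-up stand-in for per-car simulation data.
// Structs in C# are value types: assigning or passing one copies it.
struct VehicleState
{
    public float Speed;
}

class Program
{
    // Receives a COPY of the struct - the caller's data is never touched.
    static void AccelerateCopy(VehicleState state)
    {
        state.Speed += 5f; // only the local copy changes
    }

    // Receives a reference, so the caller's data really is updated.
    static void AccelerateRef(ref VehicleState state)
    {
        state.Speed += 5f;
    }

    static void Main()
    {
        var car = new VehicleState { Speed = 10f };

        AccelerateCopy(car);
        Console.WriteLine(car.Speed); // still 10 - the "thinks it accelerated" bug

        AccelerateRef(ref car);
        Console.WriteLine(car.Speed); // 15 - the update actually lands
    }
}
```

In the Job System the same thing happens when a job reads elements out of a `NativeArray` into local struct variables: mutating the local copy does nothing until it is written back.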
Next week, I hope to have cars changing lanes, throwing a player into the mix and integrating the system with all the gorgeous graphics Jordi’s been working on – check back for an update!
Sergio – Programmer
For some time, our team has been reviewing the idea of improving the MXTreality UK website. I was glad to hear that last week we got the green light to start planning and designing our first iteration of it.
Most of the week was spent evaluating the front-end tools we would require to accomplish our goals. After some extra research into web libraries, I will soon start sketching the bare bones of our first home page, but it will be a while before you see the finished product.
Aside from the website, I had a chance to explore our cloud services provider, AWS, and set up the domain and certificate for our new MXTreality Events project on the web. I was pleasantly surprised by how well Unity supports the WebGL platform, running smoothly, as expected, at a locked 60 frames per second.
Slava – Lead 3D Artist
This week I finished optimising the Highways England traffic officer vehicle, and I’m planning lower LODs so this car can also be used in the ambient traffic.
Another completed task was the creation of a talking animation for the vulnerable person in our new customer experience scenario, which required more than twenty different animations.
As they were quite emotional, we decided to use facial animation in addition to simple movement of the lips. I set up the character with additional emotional blend shapes and added one more track of animation for each dialogue.
The last task I started this week was the creation of an additional segment of motorway with an emergency refuge area. This new model will include a specific area with orange tarmac and new markings, along with an SOS phone box and different signs.
Stefano – 3D Artist
This week I finally confirmed that work with the Xsens suit can be managed successfully by a single person – me! The results are really satisfying.
After uploading all the motion capture animations into Unity, I came back to animating my animals, starting by completely remaking the deer rig and then animating the deer. I learned that deer have a really strange way of hissing and barking – it’s been a fun task.
I worked on the audio clip to extract the various deer sounds, removing background noise and stray sounds – this is the attention to detail our clients have come to expect. After this I started the same process for another regular trespasser on the highways network – cows.
This, together with studying a way to get better head and facial animation for all our characters, will be my main task for the coming week.
Kyung-Min – 3D Generalist
With the events spaces project coming to a close, I spent a lot of time creating objects to populate the final scene builds. This week was frustrating: with so much to do, we could have done without problems in the practical use of our new work pipeline, but those problems have provided priceless data to improve our workflow.
The background work we did on real-time testing of our new art pipeline was significant. We made a lot of progress, finding loopholes and issues in the new pipeline and running additional tests to see how it would have reacted had we worked in a different way.
All the issues we found with the new pipeline could be traced back to structural problems early on, allowing us to document how we could work better in future.
The only real problem I found was the deprecation of the Stingray shader, which required a lot of time spent on a workaround, even though it’s not a current issue. Knowing the deprecation would eventually make it one, though, I felt it unfair to leave the problem with no workable solution.
The ability to work from a simple block-out at the start of the project, one that carries through to the final build, is amazing. Planning and knowing in advance takes precedence, and it also stops the team from straying from the original goal of the project.
Essentially we hot-swap in all the final elements when needed, providing a bridge between our 3D programs and Unity.
The issues we suffered were not the fault of our new pipeline, but rather a little human error. As I had never tried this method on a live project, I ran many tests that provided great feedback but didn’t push forward the specific project we were on.
Our biggest issue came not with the system failing, but with the new pipeline being TOO efficient – it remembered connections in the hierarchy of the project, beyond what we thought was possible. This showed the power of the new pipeline, but confirmed our need to be vigilant about how well we secure and structure our planning stages.
Jordi Caballol – Programmer
This week I’ve continued working mainly on the game, except for some fixes in Scenario 3 of the Traffic Officer VR project.
I’ve finished adapting the systems of the road and the terrain to the HDRP, getting some really nice-looking motorways, as you can see. Also, some new systems have been added that allow us to control the speed limits in the motorway using gantry signs.
In addition to the road, some work has been done on the cars. There is a very simple car controller that allows us to start interacting with the game and to see how everything responds, including adjusting speed, changing lanes and controlling the cars’ lights.