The first project I worked on during my internship was called “Smoke Filled Room”. It was designed to simulate the training exercises used by the fire services when teaching fire officers to enter, navigate and work in a smoke-filled room. The project existed prior to my involvement and, as far as I understand, was originally made in Unity. However, it was quickly realized that, given the project’s reliance on realistic and immersive particle effects, Unreal Engine was the better fit. Originally made in UE 4.07, the project at this stage was no more than a proof of concept: a simple room built from stock Unreal assets, with a basic exponential height fog creating an obscuring cloud.
When I started on the project, we were initially instructed simply to upgrade it to Unreal version 4.12 and to port the existing Oculus demo onto the (then newer) VIVE system. However, what management didn’t realize was that the project folder we were to work from had been misplaced, making this task impossible. At this stage the other member of my team and I proposed recreating it. I had intermediate experience with the engine and Ben (my team member) had none; however, the project was designed as a learning experience, and as such we were happy to take the challenge on.
The project was carried out primarily in Blueprints, with small C++ sections for the more complex functionality. Blueprints’ visual form meant we could get up to speed with the engine quickly, and because Blueprint nodes map closely onto the engine’s C++ API, working with them also made the later C++ work much easier.
There were three main tasks for this project: to implement a robust VR experience with an emphasis on immersion; to create a realistic but optimized particle system for handling the smoke; and to create a space in which a user could quickly and easily design a room to match the area where their VR system was set up.
For the first task we considered which senses we could engage to make the experience immersive. Haptic feedback would be required: through talking to fire officers it was established that touch, with both hands and feet, is paramount to the navigation experience. Through research and experimentation we found we could provide haptic feedback with the VIVE controllers, but the feet would be trickier. We considered vibrating motors attached to the feet, as well as building the virtual space to match a physical space in which we could create borders matching the walls of the room, thereby allowing actual navigation with the feet. This area of the project was put on hold and we moved forward.

Next, we wanted a visual representation of the user’s hands, hands being the number one tool for navigating the space. Because this was a complex area of investigation, one member of the team worked on it solely. Since objects are a physical obstacle in a smoke-filled room, we also decided to look into tracking any object we placed in the VR space, so that a virtual representation could be shown in software for the user to physically interact with. Initially, a Leap Motion sensor was used to track both the hands and the objects placed in the space. However, the Leap’s limited tracking range made tracking objects difficult and impractical, so the team member working on this area decided a VICON tracking system was the better fit and pursued that avenue of research.
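The controller haptics were driven by proximity: the closer a virtual hand came to a surface, the stronger the vibration. The sketch below is an engine-agnostic illustration of that idea (the function name and the linear fall-off are my own assumptions, not engine API); in the engine, the resulting amplitude would be fed to the controller’s haptic output each frame.

```cpp
#include <cassert>
#include <cmath>

// Map the distance between a virtual hand and the nearest obstacle to a
// vibration amplitude in [0, 1]: full strength on contact, fading to
// nothing once the hand is more than MaxRange metres away.
double HapticAmplitude(double DistanceToSurface, double MaxRange = 0.10)
{
    if (DistanceToSurface <= 0.0)      // hand touching or inside the surface
        return 1.0;                    // full-strength vibration
    if (DistanceToSurface >= MaxRange) // out of range: no feedback
        return 0.0;
    return 1.0 - DistanceToSurface / MaxRange; // linear fall-off in between
}
```

A ramped amplitude like this, rather than a simple on/off buzz, lets the user sense that a surface is nearby before they actually reach it.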
The second task was to decide what form the “room” would take. Since the challenge of a smoke-filled room lies in the unknown, it was decided that the contents of the room should be programmatically random, and that by default the space should match the user’s VIVE play area (a natural fit with our earlier solution for navigation feedback). However, we were keen to give users control over the size of the room should they wish.
The third was to decide the form of the smoke field itself. This was trickier, with conflicting requirements coming from management. The decision was between having a room that filled with smoke over time, allowing for varying stages of a filled room, or having it simply filled with smoke from the beginning. It was decided that we should give the user the choice, and ultimately this became a slider directly controlling the stage to which the smoke has developed. We also discussed how to build the particle cloud itself, considering layers of smoke that would create a thick shroud close to the user while removing smoke where it was not required.
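As a rough illustration of the slider idea: a single value in [0, 1] can drive both how far the smoke layer has banked down from the ceiling and how dense it is. The names and the linear mappings below are illustrative assumptions, not the project’s exact tuning.

```cpp
#include <cassert>

// One slider value s in [0, 1] describes how far the smoke has developed.
// Smoke banks down from the ceiling, so the underside of the layer drops
// towards the floor as s grows, while the density value handed to the
// fog/particle system rises with it.
struct SmokeState {
    double LayerBottom; // height of the underside of the smoke layer (m)
    double Density;     // density driven into the fog/particle system
};

SmokeState SmokeForSlider(double s, double CeilingHeight, double MaxDensity)
{
    return { CeilingHeight * (1.0 - s), // s = 0: layer still at the ceiling
             MaxDensity * s };          // s = 1: room fully filled
}
```

Tying both quantities to one value keeps the trainer-facing control simple while still covering the whole range from “smoke just forming” to “fully filled room”.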
We worked on this project for about a month, during which time we developed a working core of the project. I spent the majority of my time working with the I/O of the VIVE, handling its data and creating a space based on it. I also introduced haptic feedback into the simulation, aiming for a realistic, or at the very least immersive, experience for the user. This involved programmatically creating a space from the calculated room size and filling it with objects drawn from an archive.
The VIVE sensors can return both the player’s height and the play area the user has assigned for their VIVE.
Together with the in-house 3D modeler, we established a collection of room objects and obstacles. The system would distribute these objects throughout the space defined by the VIVE. This placement was random, and therefore altered the challenge of the simulation on each run-through. The objects were normal household items: chairs, tables, lamps, etc. However, we also added a human body, allowing a scenario’s focus to shift from navigation to rescuing someone who has collapsed.
We were able to achieve a realistic simulation, allowing the user to feel their way around the room (at least with their hands) and giving them a tool they would use in a real scenario: a thermal imaging camera. We also simulated the visuals of crouching to look under the smoke, a common technique used by the fire service and, as such, a vital part of the simulation.
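A toy model of why crouching helps, purely as an illustration (not the renderer’s actual math): sight distance is clear below the smoke layer, collapses once the eyes rise into it, and blends over a narrow band at the boundary.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// How far the user can see given their eye height and the height of the
// underside of the smoke layer. All parameters are illustrative defaults.
double VisibleDistance(double EyeHeight, double LayerBottom,
                       double Band = 0.2,    // blend zone below the layer (m)
                       double Clear = 10.0,  // sight distance in clear air (m)
                       double InSmoke = 0.3) // sight distance inside smoke (m)
{
    // t = 0 when the eyes are well under the layer, 1 once inside it.
    double t = std::clamp((EyeHeight - (LayerBottom - Band)) / Band, 0.0, 1.0);
    return Clear + t * (InSmoke - Clear);
}
```

With a smoke layer banked down to around head height, crouching drops the eyes back into clear air, which is exactly the effect the simulation needed to reproduce.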
This was my first full project in Unreal Engine, and although I only worked on it for a month, it has been used as a demonstration piece for the capabilities of the company at many outreach events and client interactions. There have also been tests with members of the fire service, and given the positive feedback we have had from them, there is every chance it will be adopted in the future. The project was carried out using agile software development, with daily scrums, weekly meetings and “Trello” as our sprint board. We used Git for source control, and had regular client meetings with a Chief of Humber Fire and Rescue Services, as well as the head of Humber Fire and Rescue Solutions.