
Testing the boundaries of virtual reality within ship support

John Newell, MBE, CEng, FIMarEST (a), Simon Luck (b)

(a) BAE Systems, Head of QEC Support; (b) BMT Defence Services, Head of Information Systems
*Corresponding author. Email: john.newell2@baesystems.com

Synopsis

This paper looks to expand on the INEC 2016 paper ‘The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers’, presented by Ross Basketter, Craig Birchmore and Abbi Fisher of BAE Systems in May 2016, by presenting the work undertaken over the last year by BMT to provide an innovative 3D walkthrough training system for the Queen Elizabeth Class (QEC) Aircraft Carriers, supporting the BAE Systems Maritime Services’ Industrial Suitably Qualified and Experienced Personnel (SQEP) Training Programme.

The team at BMT has utilised its next-generation training solution, ENGAGE™, to develop a 3D walkthrough training system that supports the teams working closely with the QEC Aircraft Carriers in Portsmouth. ENGAGE is BMT Defence Services’ immersive real-time training suite, providing connected scenario and familiarisation training in an intuitive and safe environment. This training solution helps to ensure that the personnel supporting the QEC Aircraft Carriers can navigate the platforms safely whilst carrying out their duties.

This innovative 3D solution also has wider utility as these two ships come into service: it forms a key part of the ongoing induction process for new joiners, is a useful tool that, once the ships are operational, can be forwarded in advance to embarking forces from both the UK and other nations, and is a necessary precursor for any international industrial workforce working on these ships worldwide. The technology may also provide the facility for remote support, with engineers ashore virtually walking round the platform with maintainers at sea, thousands of miles apart, connecting a multitude of data sources to provide the necessary insights and enabling a more efficient service. Guidance could be provided in real time directly into the headset. The low cost of the technology also enables training to be delivered onboard with a laptop, so that higher-risk/lower-frequency tasks can be rehearsed virtually and then assessed by remote SQEP prior to being carried out live.

The paper reviews the options put forward at INEC on safety training including the simulation of emergencies (smoke, etc.), escape routes, compartment familiarity and the ability to visualise engineering options such as shipping routes for the removal/replacement of large items of equipment, safe operating envelopes, Alterations and Additions (A&As) design options, detailed design (post laser scans) and HAZID work. The paper continues with possibilities for the future, aiming to stretch and test the boundaries of how we can utilise this technology not only in engineering support but also in rapid prototyping and war-gaming to enhance Combat Power at Sea.

Keywords: 3D, Virtual Reality, VR, Support, Digital thread, Augmented Reality, AR

Authors’ Biographies

(a) John Newell joined BAE Systems in October 2014 as Head of Queen Elizabeth Class Aircraft Carrier Support after a career in the Royal Navy spanning 38 years. The role is to design and implement the support solution for the ships, from a BAE Systems perspective, in time for their arrival in Portsmouth and then to support them through life.

(b) Simon Luck joined BMT Defence Services in 2004 as a Systems Analyst, responsible for providing his customers with online information management tools so they could gain insight into their data. Simon now runs the Information Systems Department, delivering core services around software development, data analytics, visualisation and cyber security. His team is responsible for the development of the ENGAGE platform which forms the basis of the immersive QEC Carrier Trainer.

1 Introduction

HMS QUEEN ELIZABETH will be based in her home port from her First Entry into Portsmouth (FEP) in 2017. Preparations for this include the development and delivery of training for the industrial staff that will provide support services, including engineering design, maintenance, defect rectification, updates and upgrades, and waterfront services. It is recognised that, to maintain the ship’s programme, Portsmouth Naval Base (PNB) support personnel must be safe to work onboard and be effective and efficient in conducting their work as early as possible following FEP, and that the majority of this training should, where practicable, be delivered before her arrival.

A study of the QEC training gap identified the need for all such personnel to undertake specific spatial awareness and safe onboard training. A needs analysis (a full Training Needs Analysis was not conducted) determined that this training would best be delivered by the use of an interactive virtual environment implemented as a 3D walkthrough tool based on a game-based simulation engine.

In his television series and book “The Brain”, broadcast in the UK in 2016, David Eagleman explains how our neural network forms as we develop from birth, with trillions of synaptic connections being made. As we grow older, some of these connections fall away, leaving stronger connections behind based on our experiences. Eagleman explains that an important area of the brain for spatial memory is the hippocampus. Neurons that are activated at the same time establish stronger connections between them, providing a unique signature of the event – this represents the memory. So when we provide the brain with repetitive experiences, such as driving round and round racing circuits in video games, the hippocampus gets busy storing any new details against these connections.

When we encounter the real-world representation of our brain’s 3D model, such as walking around part of a race track that we had played in the video game, we unlock a whole web of associations and bring the entire memory back in full clarity – aiding situational awareness.

The 3D walkthrough tool provides a ‘safe’ learning environment which enables the trainee to experience many of the key visual, spatial and aural cues that will exist within the real world of the carrier, without being subjected to the real-life risks inherent in such necessary training scenarios as fires and floods. These cues are key to enabling personnel onboard to identify and initiate the appropriate action or response, which forms much of what is involved in being ‘safe onboard’. The required responses and actions within the training environment involve visual recognition and movement through the virtual carrier to specific positions or equipment. The 3D technology provides a through-life solution, is electronically distributable and has wider industrial and MoD utility, as suggested in this paper.

The courses for which the 3D walkthrough tool has been developed include:

• Spatial Awareness and Safe Onboard training
• Training on generic QEC procedures
• Training on individual equipment
• System training
• Systems of systems training
• Systems operation
• Systems maintenance
• Training the Naval Base support teams for QEC

All industrial staff working onboard must be ready to embark and disembark, travel to and from their place of work and deal with any dangers, hazards or situations encountered in the correct, safe and efficient manner. As such they require general QEC familiarisation (experience of the key visual, spatial and aural cues), spatial awareness and the knowledge, skills and attitudes required to initiate the appropriate action or response to potential dangers, hazards and situations.

Across the range of QEC equipment and systems there is a requirement for certain personnel to be intimately familiar with the layout of the main QEC compartments (the Royal Navy uses KILL Cards for each compartment, which list all boundaries, key systems and hazards within the compartment, means of isolation, etc.), the design intent of the equipment and systems (how the system works and why it was designed in this way), and the interconnections and interfaces between them. There is also a need for a small number of engineers to understand the operation of the various systems (and their equipment) for the purposes of systems commissioning, maintenance verification and setting to work. Very specialised knowledge will be required on equipment such as gas turbines, diesels, refrigeration and air conditioning, high and low voltage systems, power generation and distribution, machinery control and a wide range of auxiliary/ancillary equipment; this knowledge will be provided via Original Equipment Manufacturer (OEM) courses.
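To make the preceding requirement concrete, the sketch below (plain Python written for this paper, not taken from the QEC toolset; the class, field names, compartment reference and example values are all illustrative assumptions) shows one way a walkthrough tool could hold per-compartment safety data of the kind carried on a KILL Card and surface the hazards when a trainee enters a compartment.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class CompartmentCard:
        # Fields loosely mirror the KILL Card content described above; the real
        # card layout and compartment numbering scheme are not reproduced here.
        compartment_id: str
        boundaries: List[str] = field(default_factory=list)
        key_systems: List[str] = field(default_factory=list)
        hazards: List[str] = field(default_factory=list)
        isolation_points: List[str] = field(default_factory=list)

    cards: Dict[str, CompartmentCard] = {
        "2H-01": CompartmentCard(
            compartment_id="2H-01",
            key_systems=["Chilled water plant", "HV switchboard"],
            hazards=["Rotating machinery", "High voltage"],
            isolation_points=["Chilled water isolation valve (illustrative)"],
        ),
    }

    def on_enter_compartment(compartment_id: str) -> None:
        # Hypothetical hook the walkthrough tool might call at a doorway.
        card = cards.get(compartment_id)
        if card:
            print(f"Hazards in {card.compartment_id}: {', '.join(card.hazards)}")

    on_enter_compartment("2H-01")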

All courses will involve the use of the 3D walkthrough tool to augment instructor-led classroom courses by providing a front door into that training. The development risk was mitigated by basing the solution on tried and tested techniques used successfully on the Astute, Trafalgar and Vanguard programmes.

A major consideration in the selection of the 3D walkthrough tool is that, once procured, it is available through life and readily distributed. As such, the solution has utility and applicability for all other trainee populations whose volumes were not considered during the initial training phase. This includes, but is not limited to, ab-initio training of engineering and project management apprentices and graduates, training of BAE Systems personnel, training of personnel from other Industrial Partners, and the training of personnel from other contractors, sub-contractors, OEMs, the MOD, the Royal Navy and the Emergency Services.

2 Commercial technology

Commercial visualisation technologies have been exploited for over a decade within the maritime sector, both in supporting the platform design process and through information systems development to support training and technical documentation tasks.

The entertainment industry is said to be worth approximately $100bn per year; Facebook alone bought the Virtual Reality (VR) technology company Oculus for $2bn in 2014. Since then, the entertainment industry has made huge investments in developing hardware and software that supports a range of activities we can make use of in the Defence industry. Thanks to this investment, we are able to provide realistic and high-performing environments to support training objectives, ensuring that the training system is designed around the needs of the trainee and the desired outcome, rather than simply developing the highest-fidelity system for its own sake. It is this human-centred design that has enabled the development of an accurate and realistic representation of the QEC on which staff can perform the range of training scenarios they are required to complete prior to embarkation.

Figure 1: ENGAGE QEC Carrier Trainer User Interface.

As investment in the technology continues, we anticipate higher-performing systems, better human factors around the integration of this technology with the rest of the training (or real) environment, and an appropriate blend of training and operational systems, so that we can maximise investment in systems that assist with training, support, operational procedures, business uses and operational planning.

3 Current use of Virtual Reality in QEC

Together with the delivery of the 3D walkthrough tool, a functional Virtual Reality (VR) demonstration, based around the QEC bridge has been developed. To avoid security issues it is representative of the bridge and not an exact replica. The demonstration uses BMT’s ENGAGE platform and has been built to work with the HTC Vive VR hardware.

The demonstration allows the user to navigate the compartment by walking freely around and by utilising a teleport function to reach areas outside of the room-scale area (the maximum floor space of an HTC Vive is 5m x 5m). User interactions are also made available.
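As an illustration of how such a teleport function can work (a generic sketch, not the ENGAGE or SteamVR implementation; all function names and values are assumptions), the pointing ray from the hand controller is intersected with the deck plane and the tracked play-area origin is re-based on the hit point, so destinations beyond the tracked floor space remain reachable without physical walking.

    import numpy as np

    def teleport_target(controller_pos, controller_dir, deck_height=0.0):
        # Intersect the pointing ray with the horizontal deck plane y = deck_height.
        p = np.asarray(controller_pos, dtype=float)
        d = np.asarray(controller_dir, dtype=float)
        if abs(d[1]) < 1e-6:
            return None                      # pointing parallel to the deck
        t = (deck_height - p[1]) / d[1]
        if t <= 0:
            return None                      # pointing away from the deck
        return p + t * d                     # world-space point on the deck

    def rebase_play_area(play_area_origin, head_pos, target):
        # Shift the play-area origin so the user ends up standing over the target,
        # leaving the floor height unchanged.
        offset = np.asarray(target, dtype=float) - np.asarray(head_pos, dtype=float)
        offset[1] = 0.0
        return np.asarray(play_area_origin, dtype=float) + offset

    target = teleport_target([0.0, 1.6, 0.0], [0.0, -0.5, 1.0])
    print(rebase_play_area([0.0, 0.0, 0.0], [0.0, 1.6, 0.0], target))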

There are a number of anticipated uses for this VR demonstration. It has been widely used to showcase what has been achieved to date in preparing the industrial workforce for the arrival of HMS QUEEN ELIZABETH. It is available for use by Communications Teams at events such as DSEI, at conferences such as EAAW, in the PNB Visitor’s Centre (Ark Royal building) and at outreach events with schools and colleges. This is currently the limit of the initial 3D and VR capability.

But it is in developing our thinking on where this rapidly advancing technology will lead us that the greatest opportunity lies. A number of areas are relatively straightforward to envisage, particularly in supporting the platform. As will be shown, it is when the tool is embellished and linked with simulation (e.g. flight simulators) that our thinking needs to really broaden. As ever, the balance of investment decisions will influence how fast the Defence industry adopts this technology, but development costs are judged to be fairly insignificant given the rapid advances driven by the gaming industry.

4 The digital thread

We make significant investment in our platform designs, from conceptualisation all the way through to detailed design and build. The designs and information databases created throughout this process are vast and support a wide range of activities that are required as the design progresses and moves to build and beyond. We have an opportunity to utilise this huge information set right from the outset in the creation of a digital twin of the physical ship that rolls off the production line at the end of the process. This is currently an ambition.

As soon as the first 3D models are created we can use these for spatial acceptance of machinery and equipment areas. As we progress and add to these 3D models we can understand how we may go about installing and removing such machinery and equipment in the most efficient manner. We will also start to gain an understanding as to how those individuals onboard will carry out their duties in the safest and most effective ways possible through task-oriented human centred design. This information can be visualised at any point in the design and build process to support the ongoing decision making process and help with the concept of ‘Designing for Support’.
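As a simple example of the sort of early spatial check described above, the sketch below (assumed, deliberately simplified geometry; the real assessment would be carried out against the 3D model) tests whether an item of equipment will pass through each opening on a proposed removal route, using the conservative assumption that the item travels end-on without tilting.

    from typing import Dict, Sequence, Tuple

    def fits_through(item_dims: Sequence[float], opening: Tuple[float, float],
                     clearance: float = 0.05) -> bool:
        # Conservative test: the item leads with its longest axis, so its two
        # smaller dimensions (plus a working clearance) must fit the opening.
        a, b, _ = sorted(item_dims)
        w, h = sorted(opening)
        return a + clearance <= w and b + clearance <= h

    # Illustrative route and equipment dimensions, in metres.
    route: Dict[str, Tuple[float, float]] = {
        "hatch A": (1.2, 1.8),
        "passageway door B": (0.9, 1.9),
    }
    pump_dims = (0.8, 1.1, 2.3)

    blocked = [name for name, opening in route.items()
               if not fits_through(pump_dims, opening)]
    print("Route clear" if not blocked else "Obstructed at: " + ", ".join(blocked))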

Such models can be used in an immersive environment to validate the human-centred design, ensuring that individuals and teams can carry out their objectives in the most effective manner. Thanks to full body-tracking technology, iterating through design versions can lead to saving valuable time and money in the longer term as we replicate complex environments virtually and test these with real subjects, such as a design for a new ship’s Operations Room as shown in Figure 2.

Figure 2: Future Operations Room Visualisation.

Maintaining the configuration of this model is vital as we move into the build, completion and maintenance of our platforms. This digital twin can be used at each stage of the platform’s life, enabling design understanding, training needs analysis, operational activities, maintenance issues and finally identifying specific locations of materials prior to scrapping. Updating the digital twin whilst operating the platform will help inform future designs as well, learning from operational usage and how this compares with our intended usage during design. We can use the digital twin as an audit trail of decisions that have been made and use the model to understand the impact of planned decisions given a variety of scenarios. An example of this might be to visibly highlight within the model where changes have been made from the original design (such as colour coding) so that ship designers can learn from experience; this might be adding additional isolation valves to chilled water systems so that individual plants can be shut down, systems drained and worked on, without affecting other plants.
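A minimal sketch of the audit-trail idea above is given below, assuming a simple change log keyed by equipment item (the class, field names and example change are illustrative, not QEC configuration data): any item whose configuration differs from the as-designed baseline can then be colour-coded in the model.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ChangeRecord:
        item_id: str
        description: str
        reason: str

    class DigitalTwinAudit:
        def __init__(self) -> None:
            self.changes: Dict[str, List[ChangeRecord]] = {}

        def record_change(self, change: ChangeRecord) -> None:
            self.changes.setdefault(change.item_id, []).append(change)

        def highlight_colour(self, item_id: str) -> str:
            # As-designed items render normally; changed items are flagged so
            # designers can see where the ship has diverged from the baseline.
            return "amber" if item_id in self.changes else "default"

    audit = DigitalTwinAudit()
    audit.record_change(ChangeRecord(
        item_id="chilled-water-plant-3",
        description="Additional isolation valve fitted",
        reason="Allow one plant to be shut down and drained without affecting others",
    ))
    print(audit.highlight_colour("chilled-water-plant-3"))   # amber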

This information can support the design of operational procedures, enabling operators to carry out planned tasks in a virtual environment to test the effectiveness and the design of these procedures and rectifying potential human interaction issues prior to implementation.

Although the use of 3D visualisation technology is already widespread in the design process of warships, VR can be used to gain a greater understanding: immersing yourself in the design allows a perception that cannot be replicated using a flat-screen monitor. Supplementing this visual representation with the other physical and functional characteristics that have been collated as part of the design process provides further insight that can be used in this immersive environment. The commercial construction industries make use of the Building Information Modelling process and data exchange formats to support the design, build and maintenance of physical assets, including fault finding using data from the physical entity within the virtual model. As we look to build and support increasingly complex platforms with fewer personnel on board, we need this information at our fingertips to provide the necessary support.

5 Support

The most obvious use of VR, which requires little investment to achieve compared to proposals later in this paper, is in the planning and execution of support in its widest sense; helping with defects, maintenance, updates and upgrades.

Utilising the digital twin and immersive environment in VR, as already described, provides the engineer with a realistic representation where freedom of movement and situational awareness can help across a range of activities. By training in the immersive environment, the maintainer gains valuable situational awareness prior to stepping onboard the platform. This type of pre-maintenance training can potentially shorten the timescales for maintenance procedures and reduce risks, as potential issues can be assessed in the training or planning phase. Maintainers can benefit from using the same training approach whilst onboard, providing scenario-specific training at the point of need, which is critical for high-risk/low-frequency tasks.

The immersive 3D environment can provide situational awareness for maintainers without the need for the physical platform to train on. It can include as much detail as is necessary for the maintainer’s specific role, ensuring they are familiar with the safety equipment as well as its location within specific compartments. The immersive VR environment has benefits over a flat-screen version in that the freedom of movement allows an easy line of sight for understanding more complex compartments and spaces; an example may be trying to identify a valve that sits above the maintainer’s head, behind several pipes and ducts.

Using this technology, maintainers can walk through complex procedures in a safe and monitored environment, ensuring that the planned procedure is executed in the most appropriate manner prior to doing it for real onboard. In these circumstances, scenarios can be made more difficult to train maintainers to carry out tasks in potentially hazardous environments without the risk of physical injury.

Figure 3: ENGAGE QEC Compartment.

Such training systems can be networked to test how maintainers and ship staff can work collectively on a task. Data captured from these exercises can be used to identify improvements in both procedures and how training is executed. Trainees can be geographically dispersed, which can make a training scenario more realistic and can reduce the costs of training delivery.
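One way such exercise data might be captured is sketched below (the event names, fields and file format are assumptions, not the ENGAGE logging mechanism): each trainee action is appended to a simple timestamped log that can later be replayed or analysed to identify improvements in procedures and training delivery.

    import json
    import time

    class ExerciseLog:
        def __init__(self, path: str) -> None:
            self.path = path

        def record(self, trainee_id: str, event: str, **detail) -> None:
            # One JSON object per line, timestamped, so logs from dispersed
            # trainees can be merged and sorted after the exercise.
            entry = {"t": time.time(), "trainee": trainee_id, "event": event, **detail}
            with open(self.path, "a") as f:
                f.write(json.dumps(entry) + "\n")

    log = ExerciseLog("exercise_log.jsonl")
    log.record("trainee-07", "entered_compartment", compartment="2H-01")
    log.record("trainee-07", "task_complete", task="isolate chilled water plant", seconds=412)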

Other tasks pertinent to the use of this technology include but are in no way limited to:

• Familiarisation with maintenance evolutions
• Location of items of equipment in specific compartments
• Maintenance visualisation
• Access difficulty visualisation
• Planning for removal of equipment
• Slinging sequences and rehearsals
• Work in way and work in wake estimations
• Evaluation of Capability Insertion options
• Drone-based survey rehearsal
• Use as a potential diagnostics tool, particularly when integrated with Enterprise software tools
• Helping to evaluate options for key defects

Coupling existing systems with a virtual environment may provide further value to staff, such as allowing them to review live data feeds from systems in context with the representative environment as though they were onboard themselves. Technical publications that have been stored against equipment or systems within the model can be reviewed within the virtual environment. Such data can also be used in augmented reality applications for maintainers onboard the platform, allowing them to review procedures whilst carrying out maintenance or potentially providing real-time guidance and animated overlays for some maintenance procedures.
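The kind of association this relies on can be sketched very simply (the equipment identifiers, publication titles and readings below are placeholders, not QEC data): each item in the model is linked to its technical publications and to its latest sensor readings, so selecting the item in the virtual or augmented view can surface both in context.

    from typing import Dict, List

    tech_pubs: Dict[str, List[str]] = {
        "diesel-generator-2": ["Operating instructions", "OEM maintenance schedule"],
    }
    latest_readings: Dict[str, Dict[str, float]] = {
        "diesel-generator-2": {"lub_oil_temp_C": 78.5, "load_kW": 1450.0},
    }

    def equipment_context(item_id: str) -> dict:
        # Bundle everything the viewer should overlay when an item is selected.
        return {
            "publications": tech_pubs.get(item_id, []),
            "readings": latest_readings.get(item_id, {}),
        }

    print(equipment_context("diesel-generator-2"))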

Remote maintenance support could use such systems to navigate around the virtual platform reviewing the real-time performance of machinery and equipment, identifying parts that may need maintenance or replacement, and potentially linking to logistics applications with a selected part number so that replacements can be ordered and ready for planned maintenance by the crew onboard. In the future, as Augmented Reality (AR) technology improves (where a computer-generated image is superimposed on a user’s view of the real world, providing a composite view), the maintainers onboard could be joined virtually by their shore-side colleagues, enabling collaborative maintenance tasks with dispersed SMEs. This is very similar to head-up displays for pilots.

Spatial capture technology can be employed to generate 3D data where this isn’t already available from existing CAD models; such is the case with some of the older platforms. Laser scanning technology can provide high density point clouds which can be processed and utilised within an immersive 3D environment. This technology can also be used to identify class differences and be used to generate platform-specific instances of virtual models so users can train on highly realistic and spatially accurate platform representations.
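A minimal sketch of one such processing step, using only NumPy (the voxel size and the random stand-in cloud are illustrative; a production pipeline would use dedicated point-cloud tooling), thins a dense laser-scan point cloud by voxel downsampling so it can be rendered interactively:

    import numpy as np

    def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
        # Keep one representative point (the centroid) per occupied voxel.
        keys = np.floor(points / voxel_size).astype(np.int64)
        _, inverse = np.unique(keys, axis=0, return_inverse=True)
        inverse = inverse.reshape(-1)
        sums = np.zeros((inverse.max() + 1, 3))
        counts = np.zeros(inverse.max() + 1)
        np.add.at(sums, inverse, points)
        np.add.at(counts, inverse, 1)
        return sums / counts[:, None]

    cloud = np.random.rand(1_000_000, 3) * 10.0      # stand-in for a compartment scan (metres)
    thinned = voxel_downsample(cloud, voxel_size=0.05)
    print(len(cloud), "->", len(thinned), "points")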

The ability to maintain the configuration of the model and data will be crucial to the success of the through-life digital twin. Specialists onboard platforms may want to use these models to mock up scenarios to help resolve emerging issues or to run through high-risk, low-frequency tasks prior to executing them for real.

Generally, support can be provided more effectively with good situational awareness and accurate information such that someone working remotely from the platform has the same visual environment as the maintainer onboard. It’s likely that this remote worker will then have more supporting data than the onboard maintainer thanks to the Building Information Modelling (BIM) data that has been collated as part of the design process (BIM is a process involving the generation and management of digital representations of physical and functional characteristics of places). Coupling the high-fidelity immersive environment with overlaid data from sensors and automated fault-finding software can give the remote worker all the necessary information they need to perform a remote support task effectively – far more so than trawling through a stack of ring-binders and comparing these to print outs of data feeds next to a large general arrangement plan!

Investment is required to gather the data available from the various systems onboard and to provide this to the analyst ashore in a suitable timeframe during the support activity. The use of machine learning and big data analytics will support the engineer in fault finding, guiding them to the area most likely to be of interest.
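As one plausible example of such analytics (the paper does not prescribe an algorithm; scikit-learn's IsolationForest and the stand-in sensor features below are this sketch's assumptions), anomalous sensor snapshots can be flagged so the shore-side engineer is guided towards the equipment most likely to be of interest.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Stand-in features per snapshot: [lub oil temperature (C), vibration (mm/s), load (kW)].
    normal_history = rng.normal([75.0, 2.0, 1400.0], [3.0, 0.3, 120.0], size=(500, 3))
    latest_snapshot = np.array([[92.0, 4.5, 1380.0]])   # a suspicious reading

    model = IsolationForest(contamination=0.01, random_state=0).fit(normal_history)
    if model.predict(latest_snapshot)[0] == -1:
        print("Snapshot flagged as anomalous - direct the engineer to this equipment")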

6 Operational procedures

With a reduced number of staff onboard, we need to exploit technological advancements with more automated systems to achieve higher availability and remote secure connectivity into these platforms and systems, while not forgetting the increased need for Cyber resilience. We must not erode the skill and competency of the maintainer. Providing good situational awareness in conjunction with data and analysis from these more advanced systems will enable fewer staff to carry out a greater number of more complex activities effectively.

Live data streams from the platform can form part of the VR immersive environment, potentially allowing accurate interaction for operational procedures so that the platform is no longer reliant on having multiple crew members to carry out certain procedures, as the augmented crew member ashore can make use of intelligent systems and real-time overlays in augmented reality.

Other tasks pertinent to the use of this technology include but are in no way limited to:

• Basic ship knowledge prior to embarkation of personnel from the Battle Staff, Fleet Air Arm, RAF, USAF, US Marine Corps (USMC) and Royal Marines (RM)
• Flight Deck operations from both a procedural and safety perspective, allowing the user to interact with a simulator
• Hangar operations from both a procedural and safety perspective, allowing the user to interact with a simulator
• Air Traffic Control procedures from both a procedural and safety perspective, allowing the user to interact with a simulator from Flyco (see Figure 4)
• Diving operations could be rehearsed in dry and safe conditions prior to wet operations. This might include blade replacement, placing blanks over underwater openings, ship bottom searches and offensive procedures against other ships
• Tug operations could be rehearsed prior to foreign visits with local tug crews involved
• Berthing procedures could be rehearsed prior to foreign visits with local teams involved
• Ship-shore connections could be rehearsed prior to foreign visits with local teams involved
• Use of different fenders and catamarans can be simulated
• Warping, slinging, anchoring and cabling evolutions can be rehearsed
• Boat transfers can be simulated
• The VR model can be used for Chemical, biological, radiological and nuclear (CBRN) scenario development, responses and testing
• Smoke clearance scenarios could be modelled if the model is coupled to a smoke movement simulator
• Damage visualisation (flood, fire, impact damage, explosive damage, etc.)

Figure 4: View from Flyco in the HMS Queen Elizabeth 3D model.

7 Business uses

While not specifically designed for the following uses, Virtual Reality may have a significant positive impact that should not be discounted:

• Business development and marketing
• Science, Technology, Engineering and Maths (STEM) engagement
• An Engagement tool
• An Investigative tool

8 Operational planning

Looking into further uses of this technology within the defence environment, BMT have created an immersive mission planning application titled Joint User Mission Planning (JUMP©). JUMP© facilitates cyber mission planning alongside typical physical mission planning, allowing the commander to understand the impact of both kinetic and non-kinetic effects (see Figures 4 and 5). The system forms part of the typical planning process, providing a deeper understanding of the situation and the risks associated with specific planned courses of action.

Figure 5: Joint User Mission Planning (JUMP)© Tablet Visualisation.

Similar immersive planning could be utilised by ship’s staff, allowing them to visualise weapon damage and firing arcs and to present ‘what-if’ scenarios with automated analysis. Specific actions could be rehearsed using the virtual environment, removing certain levels of risk and uncertainty whilst at the same time enhancing training effectiveness. The system could make use of other technological advancements, such as machine learning, allowing it to provide prompts and warnings based on inputs it receives from users and from data streams coming from the platform and other friendly assets.

Once one platform is modelled it can be inserted into a more expansive VR environment which would include a multitude of sub-surface, surface and airborne assets.

Such an environment could also be utilised to visualise real-time courses of action, representing assets such as remote Unmanned Vehicles (UXVs) or friendly forces, giving analysts a very realistic view of the environment and providing a context that was previously unavailable. It is important to emphasise that the system should be designed around the human and their role and task objective. Certain tasks may require a higher or lower fidelity of model and, with the baseline model captured from the design phase, this should be achievable as the level of fidelity is built up through the design, build and operational phases.

9 Conclusions

Although the authors’ work has been primarily focussed on QEC, this technology can be applied not only to any Naval platform or task group but also more widely across the maritime, land and air environments. The re-emergence of carrier strike capability, together with the exploitation of this type of technology, enables further integration of warfare capability through organisations such as the RAF’s Air Warfare Centre and Air Battlespace Training Centre (ABTC) (see http://www.raf.mod.uk/rafwaddington/aboutus/wwwrafmodukairbattlespacetrainingcentre.cfm), both located at RAF Waddington. The ABTC currently uses immersive tools for doctrine and concept development; extending this to include carrier strike capability would seem to be the natural next step, thus integrating the air, land and maritime environments.

Military aviation also represents the state of the art in simulation and training. The Typhoon Training Facility (TTF) at RAF Coningsby, for example, uses a blend of full-motion simulators, cockpit trainers, mock-ups and systems functional training devices to train aircrew and maintainers. Building on this type of capability with AR and VR could enable this type of training to be provided at lower cost and with enhanced adaptability in other environments.

3D, VR and AR technology can be applied to many scenarios; many of these possibilities are as yet unknown and are likely to evolve throughout the life of the platform as technology advances and the gaming industry invests. Our imaginations are the only limitation. The aim for the future must be to stretch and test the boundaries of how we can utilise this technology, not only in training, engineering support, testing operational procedures, business uses and operational planning, but also in rapid prototyping and war-gaming to enhance Combat Power at Sea.
