Land Image’Inn: the game changer
Serious games simulate events or scenarios to solve real-world problems. In addition to their applications in training and education, they play an increasing role in developing new concepts of operations and complex, high-tech systems for users. And when the users are armoured vehicle crews with the latest optronic vision and targeting systems, these games are deadly serious…
Land Image’Inn is a concept development and experimentation (CD&E) environment that provides a simulation capability in a fraction of the time it used to take. Focused primarily on Thales’s land optronics offering, it is designed to accommodate all the components of land forces systems and is a fine illustration of Thales’s open approach to innovation. Land Image’Inn is based on an easily configurable open architecture that incorporates contributions from external players with minimum effort, including the latest systems and ideas from all Thales business lines and input from partners and suppliers.
Thales first developed an Image’Inn environment in 2014 to experiment with new optronics concepts for airborne and joint tactical forces. In 2015 came a nomadic version that could be set up in-country at the premises of our industrial partners or customers to fit into the users’ operational routine. Presented at last year’s Paris Air Show and the Dubai Air Show, this “deployable” capability attracted a great deal of interest, and this year Thales is proud to unveil Land Image’Inn, a nomadic version for land forces operations.
Unveiled to the public at Eurosatory 2016, Land Image’Inn is used for fast virtual prototyping of optronic products or functional concepts with immediate in situ assessment of user experiences and feedback. It combines an advanced simulation system with a modelling toolbox and a set of simulation stations built around the same screens and displays as operational personnel use on a whole range of land vehicle or infantry missions.
It’s impractical to get users and engineers to work together in real-life conditions in a fighting vehicle — it’s noisy and cramped, and the vehicle is moving and jolting as it advances across the terrain. Land Image’Inn reproduces a realistic environment that helps everybody see how our products and their users respond in realistic situations.
The simulator typically reproduces the layout inside a military vehicle with three crew positions. The driver’s position shows a through-the-windscreen view and includes an internal display for controlling platform functions. The commander’s position may include an internal display with joystick, a panoramic local situation awareness display and the commander’s sight, while the gunner’s position would feature an internal display, a joystick and the gunner’s sight. Whatever the configuration, the display sizes and positions are the same as in a real armoured vehicle. In the future, the same virtual environment will extend the scope of the simulation to include new concepts for both vehicle crews and dismounted soldiers, and help develop a new generation of real-world vision capabilities for the deployed forces of tomorrow.
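The three-position layout described above can be pictured as a small configuration model. A minimal sketch follows; the class names, display names and dimensions are illustrative placeholders, not the actual Land Image’Inn configuration schema.

```python
from dataclasses import dataclass

@dataclass
class Display:
    """One operator display; in the lab, sizes and positions mirror the real vehicle."""
    name: str
    width_mm: int
    height_mm: int

@dataclass
class CrewStation:
    """A simulated crew position with its displays and controls."""
    role: str
    displays: list
    has_joystick: bool = False

def default_vehicle_layout():
    """Hypothetical three-position layout following the description above."""
    return [
        CrewStation("driver", [
            Display("through_windscreen_view", 400, 300),
            Display("platform_control", 260, 160)]),
        CrewStation("commander", [
            Display("internal", 260, 160),
            Display("panoramic_situation_awareness", 400, 120),
            Display("commander_sight", 260, 160)], has_joystick=True),
        CrewStation("gunner", [
            Display("internal", 260, 160),
            Display("gunner_sight", 260, 160)], has_joystick=True),
    ]
```

Capturing the layout as data rather than code is what lets the same simulator be reconfigured quickly for different vehicles or mission types.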
Observers can guide the exercises and introduce new scenarios, while users or developers demonstrate the results and different combinations of sensor feeds on the various operator displays.
At the inauguration of the nomadic Land Image’Inn in April 2016, for example, French ministry representatives had a chance to see the panoramic observation capabilities of the Antares local situational awareness system currently under development for the Scorpion armoured vehicle programme. They could visualise the operational value of using a combination of images from Antares and the Kate-LR long-range thermal sight, and assess the potential for adding new self-protection functions, such as laser warning and missile warning, to the future 360° hemispherical vision system.
Simulation systems have a long history in training military personnel, and Thales is one of the world’s leading experts in simulation technology and collaborative training services. Land Image’Inn uses ThalesView, a real-time training and simulation tool, to develop and play back complex operational scenarios with an unprecedented level of realism. ThalesView combines a direct-projection visual system offering a wide 210° x 70° field of view for an immersive experience, high-resolution geo-specific imagery provided by a state-of-the-art image generator, and an electrical motion system with six degrees of freedom.
Simulation technology has advanced and realism has improved, but where Land Image’Inn really breaks new ground is with its fine behavioural modelling toolbox, which makes it possible to prototype any optronic sensor or function with a realistic rendering of its operational performance.
Thanks to the impressive computing power that is readily available today, Thales can produce new modules and game elements almost instantaneously. Using clusters of PCs, for example, functional models can be generated much more quickly. What took days or months to develop can now be completed within minutes or hours.
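The cluster-based speed-up works because each functional model can be built independently and farmed out to parallel workers. The sketch below illustrates the fan-out pattern only, using a thread pool as a stand-in for a PC cluster and an invented response-curve "model"; none of the function names come from the actual toolchain.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def build_response_table(gain, n_points=100):
    """Toy functional-model build: precompute a saturating sensor response
    curve. Stands in for the much heavier generation jobs described above."""
    return [gain * math.tanh(x / 10.0) for x in range(n_points)]

def build_all(model_params, workers=4):
    """Build every model in parallel; on a real cluster the workers would be
    separate machines rather than threads in one process."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda p: build_response_table(*p), model_params))

tables = build_all([(1.0,), (2.5,), (0.5,)])
```

Because the jobs share no state, the wall-clock time shrinks roughly in proportion to the number of workers, which is the essence of turning days of generation into minutes.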
These models and modules include functional descriptions of equipment, including third-party products, which are used to create the bricks of the “game” and are fed into the simulation in real time. The modelling is physical as well as functional. Functional modelling is used to realistically reproduce images from different sensors, while physical modelling takes masking and intervisibility factors into account. Working with platform manufacturers and other partners, development engineers can anticipate the impact of hardware such as antennas, weapons or additional fuel tanks, for example, and reach a consensus on optimum design and integration features.
The open, modular architecture of Land Image’Inn favours a common approach to functional simulation based on sharing packages within a growing community of Thales entities, all of which benefit from the common building blocks and their associated updates. The ability to quickly simulate interactions between all existing or future land forces systems, including third party products, gives Thales a real competitive advantage.
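One way such shared building blocks can be wired together is through a plain registry that any entity's package adds its models to, so other packages can instantiate them by name alone. The registry below is a hypothetical illustration of that pattern, not the actual architecture, and the thermal-sight model is invented.

```python
SENSOR_MODELS = {}

def register(name):
    """Register a functional-model class under a shared name so other
    packages can instantiate it without importing it directly."""
    def wrap(cls):
        SENSOR_MODELS[name] = cls
        return cls
    return wrap

@register("thermal_sight")
class ThermalSightModel:
    """Invented placeholder: response grows with scene temperature."""
    def render(self, scene_temp_c):
        return max(0.0, min(1.0, (scene_temp_c + 20.0) / 80.0))

def make_model(name):
    """Instantiate any registered building block, including third-party ones."""
    return SENSOR_MODELS[name]()
```

Because registration happens at import time, a third-party supplier's package only has to be installed for its bricks to appear in the simulation alongside everyone else's.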
The finest details
Used with the ThalesView high-performance visual system, Thales’s visual databases and functional models reproduce an extremely realistic view of the outside world. With resolutions as high as 2.5 cm, these are the sharpest images ever achieved by a simulation system. This is particularly important for land forces applications, where factors like rain, sunlight and heat need to be factored into the simulation. The database takes into account weather conditions, climate, display quality and sensor resolution, and includes all the textures and reflectance characteristics of the elements in the scene at different wavelengths (MWIR, LWIR, daylight TV, etc.). The level of detail makes the rendering more realistic and more useful: sufficient to write a complete functional specification that can then be modified collaboratively to achieve the optimal trade-off.
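The band-dependent part of that rendering can be pictured as a lookup of per-material reflectance at each waveband, scaled by illumination and by atmospheric transmission. The materials, band names and numbers below are invented placeholders, not entries from the Thales database.

```python
# Invented per-band reflectance placeholders (fraction of incident energy).
REFLECTANCE = {
    "foliage": {"daylight_tv": 0.15, "mwir": 0.04, "lwir": 0.02},
    "steel":   {"daylight_tv": 0.55, "mwir": 0.60, "lwir": 0.65},
    "sand":    {"daylight_tv": 0.40, "mwir": 0.30, "lwir": 0.25},
}

def apparent_signal(material, band, illumination=1.0, atmos_transmission=1.0):
    """Greatly simplified sensor-facing signal: incident illumination scaled
    by the material's band reflectance and by atmospheric losses (rain, haze)."""
    return illumination * REFLECTANCE[material][band] * atmos_transmission
```

In this toy model, rain is simply a lower `atmos_transmission`, dimming every material in the scene; the real database adds textures, geometry and display-chain effects on top of such per-band characteristics.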
With all the different sensors, systems and data feeds, human system interfaces have become more and more complex. Future collaborative imaging and multi-sensor data fusion techniques add to that complexity, further increasing the cognitive workload on operators. Optimising and testing human system interfaces is one of the key functionalities of the Land Image’Inn fast prototyping environment and arguably the area where users most appreciate the value of the iterative approach.
Usability testing is a good illustration of the benefits of comparing different configurations or display combinations in real time to adjust a user interface, change a procedure or even adapt a deployment doctrine if needed. Working side-by-side in Land Image’Inn, engineers and customers listen to user feedback, watch their reactions, and identify both explicit and implicit needs.
During long sessions spent comparing a new sensor configuration or fine-tuning a new user interface, operational users eventually show signs of fatigue. They may not immediately admit that their concentration is flagging — indeed they may not even notice — and the ability to detect weak signals by observing users directly provides UI designers and developers with precious insights into the human factors in play in real-life situations.
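One such weak signal is a drift in response times over a session. A minimal sketch of how an observer's tooling might quantify it; the trial data and the windowing choice are invented for illustration.

```python
def reaction_trend_ms(times_ms, window=5):
    """Difference between the mean reaction time of the last and the first
    `window` trials; a clearly positive value suggests flagging concentration
    before the user notices or reports it."""
    early = sum(times_ms[:window]) / window
    late = sum(times_ms[-window:]) / window
    return late - early

# Invented session: reaction times (ms) to successive on-screen events.
session = [420, 430, 425, 435, 440, 455, 470, 490, 510, 540]
```

For this invented session the trend is +63 ms: the kind of drift a human-factors observer would flag and then corroborate by watching the user directly.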
Land Image’Inn is more than a fast prototyping platform for engineers. Above all, it’s a way to share ideas efficiently with customers and partners in order to improve and validate everyone’s ideas and reach a consensus on our product roadmaps. By working together in this realistic environment, customers have the opportunity to visualise future capabilities and assess their operational value before they are developed.
With new simulation technologies and the computing power to model functions and hardware almost instantaneously, the Image’Inn environment is a potential game changer. Customers and engineers can immediately understand each other’s real needs and constraints. Operational staff can work easily with technical teams to reproduce real-world problems or demonstrate new requirements. Marketing staff can show procurement officers the added value of a future system. And outside partners can be brought into the process to contribute their specialist expertise at any time.
This cross-fertilisation is more important than ever in a world where innovation is the key to operational superiority. It not only drives the creative process but helps to develop new solutions faster, more cost-effectively and with less risk. The new Land Image’Inn environment for land forces optimises the learning curve for users, customers, partners and development engineers themselves, quickly building mutual trust and confidence and ultimately saving everyone time and money.