What’s the Difference?
The good news for those in the simulation (or synthetic training) industry is that the gap is now closing faster than ever. Software, assets, formats, and engines are playing ever-larger roles in the creation, development, and exporting of simulator projects.
In the scenario described above, video game developers will sometimes pour tens of millions (!) of dollars into creating fun-to-drive vehicles, spectacular weapons, and rich environments complete with pattern of life, fauna, and atmospheric effects, all of it wrapped in movie-like sound effects (and music). Of course, these vehicles are far from realistic, and the environments are rarely geographically accurate; nor do they contain attributes that make them anything more than entertaining or pretty.
So how does the simulation differ? Let me explain.
In a video game, the helicopter – or ownship – is of course massively simplified. Sure, it will throttle, pitch, yaw, and roll, but that’s about all. In a helicopter simulator, the rotor blade physics, the engine power, the weight of the fuel and cargo, and thousands of other variables all come into consideration and factor into the calculations, whether the pilot is taking off, maneuvering, evading, landing, or deploying weapons. In addition, all of these actions depend on hundreds of environmental attributes (altitude, wind, temperature, etc.) as well.
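To make this concrete, here is a minimal sketch of how one such environmental dependency might be modeled. This is an illustrative assumption, not any real simulator's flight model: it uses a simple exponential-pressure approximation of the standard atmosphere to estimate air density, then scales the rotor's available thrust by that density – which is why a hot, high-altitude landing zone leaves a pilot with less power margin.

```python
import math

# Hypothetical sketch (not a real flight model): environmental attributes
# such as altitude and temperature feed into a performance calculation.

def air_density(altitude_m: float, temp_c: float) -> float:
    """Approximate air density (kg/m^3) via the ideal gas law, using a
    simple exponential pressure model (scale height ~8,500 m)."""
    pressure_pa = 101_325.0 * math.exp(-altitude_m / 8500.0)
    temp_k = temp_c + 273.15
    return pressure_pa / (287.05 * temp_k)  # 287.05 = gas constant, dry air

def max_rotor_thrust(sea_level_thrust_n: float,
                     altitude_m: float, temp_c: float) -> float:
    """Scale available thrust by the ratio of local to sea-level density."""
    rho = air_density(altitude_m, temp_c)
    rho_sl = air_density(0.0, 15.0)  # standard sea-level reference
    return sea_level_thrust_n * (rho / rho_sl)

# The 50,000 N figure is an arbitrary example, not real aircraft data.
print(max_rotor_thrust(50_000.0, 0.0, 15.0))      # sea level, standard day
print(max_rotor_thrust(50_000.0, 2_500.0, 35.0))  # hot and high: less thrust
```

A real simulator would track thousands of such couplings simultaneously; the point here is only that environment and performance are computed together rather than scripted.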
In the scenario above, a game will not calculate the actual speed of the Hellfire missile, the speed and direction of the convoy, the weight of the vehicles, or the level of damage that will occur. A simulator, by contrast, performs thousands of such calculations in real time, all while continuing to display your out-the-window (OTW), electro-optical (EO), thermal, or infrared sensor views – and this is true for everyone participating in the simulation, whether in other aircraft or on the ground. But more on that later.
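As a toy illustration of the difference between scripting an outcome and computing it, consider the simplest possible tick-by-tick engagement calculation. The speeds and distances below are assumed round numbers for illustration, not real weapon or vehicle data:

```python
# Hypothetical sketch: instead of a scripted hit, a simulator advances the
# state each tick, so time to impact emerges from the kinematics.

def ticks_to_impact(missile_speed: float, target_speed: float,
                    initial_gap_m: float, dt: float = 0.1) -> float:
    """Seconds for a tail-chase missile to close a gap on a moving target,
    stepped in fixed simulation ticks of dt seconds."""
    gap = initial_gap_m
    t = 0.0
    while gap > 0.0:
        gap -= (missile_speed - target_speed) * dt  # closure per tick
        t += dt
    return t

# Assumed values: ~425 m/s missile, 15 m/s convoy vehicle, 4 km apart.
print(ticks_to_impact(425.0, 15.0, 4000.0))  # roughly 9.8 s with these inputs
```

A real simulator layers drag, guidance, terrain masking, and damage models on top of this loop, but the principle is the same: the outcome is calculated, not predetermined.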
Setting vehicles and weapons aside, game environments are designed mainly for entertainment. In simulators, generic or geo-typical environments can be used when training skills. But in instances where mission planning, or training requires accurate true-to-life detail, such as an airport, geo-specific environments are crucial. If a building has two doors facing east in real life, then the simulation must have the same doors, in the same direction.
Taking it a step further, environments and objects can be assigned material attributes. For example, is the road surface rock, gravel, earth, or sand? Are buildings glass, wood, steel, or concrete? Or a mix of all four? By designating materials on buildings, vehicles, and vegetation, simulators take a giant leap forward in two ways:
- Materials can react to atmospheric conditions – for example, some surfaces heat up faster in the sun, or cool more slowly at night; and
- The varying temperatures and types of materials are realistically emulated in specific sensors such as infrared, radar, or night vision.
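The two points above can be sketched in a few lines of code. Everything here is an illustrative assumption – the material names, heating rates, and the crude day/night cycle are invented for the example, and real sensor simulation is far more sophisticated:

```python
from dataclasses import dataclass

# Hypothetical sketch: material attributes drive what an IR sensor "sees".
# Rate constants below are invented for illustration, not measured data.

@dataclass
class Material:
    name: str
    heat_rate: float   # deg C gained per hour of full sun
    cool_rate: float   # fraction of excess heat shed per hour after dark

def surface_temp(material: Material, air_temp_c: float,
                 sun_hours: float, night_hours: float) -> float:
    """Very simplified day/night cycle: heat in the sun, then decay
    back toward air temperature hour by hour."""
    excess = material.heat_rate * sun_hours
    for _ in range(int(night_hours)):
        excess *= (1.0 - material.cool_rate)
    return air_temp_c + excess

asphalt = Material("asphalt", heat_rate=3.0, cool_rate=0.10)  # retains heat
steel = Material("steel", heat_rate=4.0, cool_rate=0.35)      # sheds heat fast

# Hours after sunset, the asphalt road still reads warmer in an IR view
# than the steel roof, even though the steel peaked hotter during the day.
print(surface_temp(asphalt, 20.0, 8.0, 6.0))
print(surface_temp(steel, 20.0, 8.0, 6.0))
```

The design point is that the sensor image falls out of the material model: assign the right attributes once, and infrared, radar, and night-vision views all render consistently from the same underlying state.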
Many games have thermal or night-vision modes, but these sensors are approximate representations that are not physics-based like those in simulators.
Simulators require the utmost vehicle and performance realism for training purposes. Vehicles, weapons, sensors, and environments should react as realistically as possible in order to be effective training tools. It is because of this priority that the visual aspect of synthetic training has taken a back seat to physical realism.