Sensory deprivation

Most machine learning or AI applications rely on input, whether from a keyboard or, more likely, from a real-time sensor (for the purposes of this brief thought cloud I will treat cameras as real-time sensors).

It’s very easy for an algorithm to be written in a safe, warm environment without a thought for the conditions in which the sensors providing its very input have to work faultlessly.

Sensors, like everything, can break, give erroneous output, or misinterpret what they are sensing. In some situations, taking readings from two identical sensors in the same environment could flag up the first two failure modes, but not necessarily the third.
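As a loose sketch of that idea, the check below compares two redundant readings and flags disagreement beyond a tolerance. The sensor values and the tolerance are invented placeholders, not figures from any particular system:

```python
def readings_disagree(sensor_a: float, sensor_b: float, tolerance: float = 0.5) -> bool:
    """Flag a fault when two nominally identical sensors diverge.

    Catches a broken sensor or erroneous output, but NOT the case
    where both sensors misinterpret the environment identically.
    """
    return abs(sensor_a - sensor_b) > tolerance

# Example: two adjacent anemometers reading 12.1 and 17.3 m/s
if readings_disagree(12.1, 17.3):
    print("Sensor disagreement: investigate before trusting either value")
```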

Let’s use the example of two adjacent wind speed and direction sensors. Both give similar output, then both indicate the wind has dropped away to nothing. Is there a problem, or has the wind really stopped blowing? In this instance, a sudden squall of freezing rain has effectively disabled the sensors by freezing them solid. One could argue that a dramatic drop in wind speed is unlikely enough that the program should flag it and disregard the sensors. However, in the eye of a storm the wind really can fall away!

What could be done to make the algorithm more reliable? Perhaps include temperature sensor input as well? That’s not necessarily foolproof, especially if the ambient temperature is hovering around zero. Better still, take supplementary wind speed and direction readings from a site not too far away, then another a little further, and so on.
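A rough sketch of that layered cross-check might look like the following. The thresholds, the freezing-temperature window, and the remote-site readings are all illustrative assumptions, not values from a real installation:

```python
def wind_drop_plausible(local_speed: float,
                        air_temp_c: float,
                        remote_speeds: list[float]) -> bool:
    """Decide whether a sudden drop to near-zero wind is believable.

    Heuristics (all assumed for illustration):
    - If the ambient temperature is hovering around zero, icing is
      plausible, so treat a dead-calm reading with suspicion.
    - If nearby sites still report significant wind, suspect a fault.
    """
    if local_speed > 1.0:   # m/s: not actually calm, nothing to explain
        return True
    icing_risk = -2.0 <= air_temp_c <= 2.0
    neighbours_windy = any(s > 5.0 for s in remote_speeds)
    # A calm reading is suspect if icing is plausible or neighbours disagree.
    return not (icing_risk or neighbours_windy)

# Local sensor reads calm, temperature near zero, nearby sites still windy:
print(wind_drop_plausible(0.0, 0.5, [8.2, 7.9]))  # False: distrust the sensor
```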

Sensors used in the automotive environment can have a very tough time. I’m sure we’ve all sat behind a lorry in a traffic jam, one with a rear-facing camera. You have to look hard to see it, because it’s usually covered in mud!

Doubtless, use will be made of camera technology (and much else) in AI self-driving cars. Even when perfectly clean, a forward-facing camera on a winter’s day, with the sun just above the horizon and barrelling straight down the lens, will have to adopt sophisticated exposure techniques to see a pedestrian crossing the road in front of the vehicle. The pedestrian might be running across the road and there could be local rain, so even a camera/radar combination would struggle, and we can rule out infrared sensors because of the sun’s position. Aberrations and internal reflections in the lens could give false positives even without a jaywalking pedestrian.
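One defensive pattern against a single blinded or hallucinating sensor is to require agreement from more than one modality before acting. The sketch below is purely illustrative: the confidence scores and threshold are invented, and a real automotive stack would be far more sophisticated.

```python
def pedestrian_confirmed(camera_conf: float,
                         radar_conf: float,
                         min_agreeing: int = 2,
                         threshold: float = 0.6) -> bool:
    """Require at least `min_agreeing` modalities above threshold.

    Guards against a lens flare giving a camera-only false positive,
    at the cost of missing cases where rain degrades the radar too.
    """
    votes = sum(conf >= threshold for conf in (camera_conf, radar_conf))
    return votes >= min_agreeing

# Camera dazzled by low sun (flare mimics a pedestrian), radar sees nothing:
print(pedestrian_confirmed(camera_conf=0.9, radar_conf=0.1))  # False
```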

Little is more essential to a pilot than knowing whether or not a sensor is working. I would direct interested readers to a statement issued by Boeing on the AOA (angle of attack) Disagree Alert: https://boeing.mediaroom.com/news-releases-statements?item=130431

The AOA Disagree Alert should inform the pilots that the plane’s sensors may be transmitting contradictory data about the plane’s angle of attack. From my humble microlight flying days: the greater the angle, the more likely a stall. Not a great place to be, and the complete opposite of the 1988 chart-topping single by Yazz and the Plastic Population.
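In spirit, a disagree alert is the same dual-sensor cross-check as earlier, applied to the two angle-of-attack vanes. The threshold below is a placeholder for illustration only; it is not Boeing’s certified alerting logic.

```python
def aoa_disagree(left_vane_deg: float,
                 right_vane_deg: float,
                 threshold_deg: float = 10.0) -> bool:
    """Raise the alert when the two AOA vanes differ beyond a threshold.

    The threshold is an assumed value, not the real alerting logic.
    """
    return abs(left_vane_deg - right_vane_deg) > threshold_deg

# One vane stuck high while the other reads a normal climb attitude:
if aoa_disagree(24.8, 5.2):
    print("AOA DISAGREE: angle-of-attack data is suspect")
```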

Ideally, sensors should be used in a closed feedback loop: a change in a sensor’s output prompts the system to change state, and the sensor then feeds back the result of that change to the system. Critical damping is required to avoid oscillation.
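As a toy illustration of that loop and of why damping matters, the proportional controller below drives a system towards a setpoint. All the numbers are invented for the sketch: with a modest gain the state settles smoothly, while an aggressive gain overshoots and rings.

```python
def run_loop(setpoint: float, gain: float, steps: int = 20) -> list[float]:
    """Closed loop: read the 'sensor', correct, feed the result back.

    The state converges smoothly for 0 < gain < 1, overshoots and
    oscillates as the gain approaches 2, and diverges beyond it (a
    discrete analogue of under-damping).
    """
    state, history = 0.0, []
    for _ in range(steps):
        error = setpoint - state   # sensor feeds back the current state
        state += gain * error      # system changes state in response
        history.append(round(state, 3))
    return history

print(run_loop(10.0, gain=0.5)[:5])   # smooth approach: 5.0, 7.5, 8.75, ...
print(run_loop(10.0, gain=1.9)[:5])   # under-damped: overshoots and rings
```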
