How 1966 & all that advanced AI
Two hundred years after it was written, an 18th century manuscript provided the basis for a new approach to deep-sea search
Now before we begin, my apologies for jumping on the bandwagon of Saturday night's amazing Champions League final in Kiev and the building anticipation of England's trip to Russia - in hope rather than expectation of winning the World Cup for the first time in more than 50 years!
Unlike other recent blogs on the topic, this is not another sports story dressed up as a business lesson; it is instead the true story of how pre-big-data solutions and mathematics are still as relevant today as they were in 1966 (when we would have hoped, rather than expected, that the two biggest football finals of 2018 would be hosted behind the Iron Curtain).
As discussed in previous blogs, the truth is that data science is not just about CPU, GPU or big data; mathematical probability theory (MPT), which helped us to put a person on the Moon in 1969, is just as important.
A Navy scientist named John P Craven demonstrated an excellent use of MPT in early 1966 when, at the height of the Cold War, a B-52 bomber collided with its refuelling tanker over the Mediterranean, killing several crew members and scattering four hydrogen bombs over the Spanish coast near Palomares.
As I mentioned, this is not a sports story!
Three bombs were recovered on land within 24 hours, but it was a race against Soviet submarines to recover the fourth from the sea. After weeks of fruitless searching, Dr Craven hit upon a new approach to the hunt for the missing bomb, drawing on Thomas Bayes' 1740s manuscript - unpublished in his lifetime - on updating beliefs in the light of new information.
Craven asked his team to create some 'AI' of their own by placing bets on where the bomb had fallen. There were many unknowns at the beginning of the search: the bomb had two parachutes - what were the odds that one opened? That both opened? That neither did? What were the odds that it fell straight into the water? What if it fell at such-and-such an angle?
After exploring hundreds of possibilities and calculating the probability of each one, Craven's team created a 'probability map' that indicated the most promising places to search for the lost bomb - but those places lay far from where conventional search techniques said it should be. The scientists' bets indicated the bomb was nowhere near the plane's wreckage.
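To make the idea concrete, here is a minimal sketch in Python of how a handful of weighted 'bets' can be mixed into a probability map over search cells. The scenarios, grid cells and numbers are entirely hypothetical illustrations of the technique, not Craven's actual figures: each scenario carries the team's belief that it happened and spreads that belief over the cells where it would have left the bomb.

```python
# Toy 'probability map': mix weighted scenarios ("bets") into a single
# prior probability for each search cell. All numbers are made up.

# Each scenario: (belief that the scenario happened,
#                 {cell: likelihood the bomb ended up there, given the scenario})
scenarios = [
    (0.30, {(1, 1): 0.6, (1, 2): 0.4}),   # both parachutes opened, long drift
    (0.25, {(2, 3): 0.5, (3, 3): 0.5}),   # only one parachute opened
    (0.45, {(4, 0): 0.7, (4, 1): 0.3}),   # neither opened, steep entry
]

# Mix the scenarios: P(cell) = sum over scenarios of P(scenario) * P(cell | scenario)
prob_map = {}
for belief, cells in scenarios:
    for cell, likelihood in cells.items():
        prob_map[cell] = prob_map.get(cell, 0.0) + belief * likelihood

# Normalise, then list the most promising cells first.
total = sum(prob_map.values())
prob_map = {cell: p / total for cell, p in prob_map.items()}
for cell, p in sorted(prob_map.items(), key=lambda kv: kv[1], reverse=True):
    print(f"cell {cell}: {p:.3f}")
```

The highest-probability cells become the first places to search, even when - as in Craven's case - they are nowhere near the obvious crash site.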
The Navy sent down research subs to check the locations specified, but their searches came up empty. Undeterred, Craven simply had his data science team recalculate the odds based on the new search information - a bit like the machine learning approach I discussed in my previous blog on using AI to find drains and sewers.
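That recalculation is, in effect, Bayes' rule at work: if a cell is searched with some chance of detection and nothing is found, the belief that the bomb is there gets multiplied by the chance the search would have missed it, and the whole map is renormalised so that probability flows towards the unsearched cells. A minimal sketch, carrying on from the hypothetical map above:

```python
def update_after_failed_search(prob_map, searched_cell, detection_prob):
    """Bayes' rule for a fruitless search of one cell:
    P(bomb in cell | nothing found) is proportional to
    P(nothing found | bomb in cell) * P(bomb in cell)."""
    updated = {}
    for cell, prior in prob_map.items():
        # 'Nothing found' in the searched cell only happens if detection failed.
        miss_chance = (1 - detection_prob) if cell == searched_cell else 1.0
        updated[cell] = prior * miss_chance
    total = sum(updated.values())
    return {cell: p / total for cell, p in updated.items()}

# Hypothetical example: search the most promising cell with an assumed 80%
# chance of spotting the bomb if it really is there, and come up empty.
prob_map = {(4, 0): 0.315, (4, 1): 0.135, (1, 1): 0.18,
            (1, 2): 0.12, (2, 3): 0.125, (3, 3): 0.125}
prob_map = update_after_failed_search(prob_map, searched_cell=(4, 0), detection_prob=0.8)
print(prob_map)  # belief drains out of (4, 0) and shifts to the unsearched cells
```

Each empty-handed dive therefore sharpens the map rather than invalidating it, which is why Craven could keep sending the subs back with steadily better odds.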
Following several submarine dives into uncharted waters, and updated information from Spanish fishermen, the US Navy discovered the missing bomb and recovered it from a depth of 2.2km. It was right at the centre of Craven's probability map.
Fast forward 43 years: having proved itself many times in the intervening decades, Craven's unorthodox technique was called on once again for a complex deep-sea problem.
In 2009, Air France Flight 447, travelling from Rio de Janeiro to Paris, crashed into the mid-Atlantic and sank more than two miles below the surface. French investigators searched for two years without finding the main wreckage before turning to a consulting firm. The firm applied Craven's method to the entire search effort to date, assigning probabilities to events, scenarios and locations. Sure enough, the new odds pointed to a location the investigators had previously overlooked.
I am aware that the advent of advanced compute power (such as CPU and GPU) means we can process data faster than ever, letting us find the proverbial needle in a haystack, but without once-in-a-generation mathematical geniuses like Bayes and Craven, I wonder where data science would be today!
I hope that England manager Gareth Southgate has been using similar techniques to plan England's assault on the World Cup in Russia this year, but maybe I expect too much!