by Gavin Blackett, OR Society Secretary and General Manager
It is 80 years since Alan Turing first raised the concept of a universal machine and 66 years since he described the ‘imitation game’, in which a person has to decide whether written answers to questions were generated by a human or a machine. In 2015, the Alan Turing Institute (ATI) was formed as a partnership between the Engineering & Physical Sciences Research Council (EPSRC) and five universities (Cambridge, Edinburgh, Oxford, UCL and Warwick) to ‘make great leaps in data science research in order to change the world for the better’ (its mission statement). The Institute has over 150 researchers and has formed strategic partnerships with Lloyd’s Register, GCHQ, Intel and HSBC.
ATI’s director, Professor Andrew Blake, gave the 2016 Blackett Memorial Lecture at the Central Methodist Hall in Westminster, just a stone’s throw from its base in the British Library. His thought-provoking title was ‘Machines that learn: big data or explanatory models?’.
The main thrust of his talk was a common dilemma faced by modellers (depending on the circumstances, obviously) – whether to use an empirical classifier or some form of generative model (which Andrew also referred to as analysis by synthesis). Andrew used examples, including painful ones from his own background, to illustrate the struggle between the two approaches. The first examples included the Netflix challenge and face recognition software. In 2006, Netflix offered a $1m prize for an algorithm to improve its film recommendations to users (if you enjoyed Groundhog Day, you’ll love …). In the case of face recognition, the efficient, black-box approach of learning from masses of examples won out, and as we all know, for a number of years even the humblest of digital cameras has been using it to identify faces and help the camera user frame their shot.
Andrew gave us a live demonstration of the next success – image recognition. Even in the Microsoft Office suite there is software that can pull out a particular item from a complex image and insert it into (for example) a Word document. Andrew told us the strengths and weaknesses of both approaches needed to be considered, a lesson he’d learnt in his time with Microsoft working on the Kinect 3D camera project. Andrew had nailed his colours to the generative model mast, but the modelling was proving difficult. Fortunately, a tenacious young researcher demonstrated that the black-box approach could work, and the outcome is now sitting on top of TVs in many of your living rooms. Andrew also explained that there are gains to be made by combining both modes.
The field is changing fast, and Andrew highlighted the magnitude of improvements over recent years. It’s not only the technology that’s changing, though. Data protection, ethical approaches and legal issues are also having an impact. The impenetrable nature of the empirical classifier (black-box) approach can be problematic, given the increasing need to demonstrate which variables and data are key to a model’s output. In some cases, generative models are being used to try to explain how the classifier models arrive at their predictions.
Finally, Andrew gave us a brief glimpse into research on how to improve learning. Typical classifier models need many, many cases to learn from, and once they’ve learned the first thing, the same number of examples is needed for the second. Small children demonstrate a much more efficient way of learning: once they’ve seen quite a few examples of cars, very few additional examples are needed before they can identify lorries. Andrew’s talk was certainly entertaining, even if it might not have been what one or two attendees were expecting from the presumably deliberately vague title. It could only ever be a flavour of the type of research being done through the ATI. Weighing the modelling pluses and minuses of different approaches is certainly not a new concept to the O.R. world, but it was fascinating to see Andrew’s take on it.