Recently I was asked to speculate on what strides had been made in urban and transport modelling during the last 20 years and what I thought models would evolve into over the next 20. The current editorial in EPB summarises my thinking. In many senses, this was prompted by the oft-quoted sentiment that agent-based models of transport, which build on many developments of the last decades including activity time budgeting, discrete choice, and the ability of computers to handle very many objects through rapid computation, have not made the world better, but have produced performances much inferior to those of earlier, more aggregative model structures. For a while there has been the sneaking suspicion that aggregate models, with all their limits of representation, somehow generate more realistic predictions than their micro-dynamic equivalents. Of course there can be no true test, as these model types are so different. However, what is interesting is whether we can generalise in any way from the widest possible model experience: as we add more detail and attempt to explain more, all other things being equal, are we more likely to get poorer or better predictions from comparable models? The implication is that predictions get poorer, although the jury is out because the evidence has rarely been assembled. This question remains unresolved, and probably will remain so.
To an extent it might be logically plausible to show that aggregate models perform better when the strong structural constraints that determine how aggregate populations travel are easier to represent directly than to have emerge as the product of many travel decisions within micro-simulation models. But all of this would require incredibly well-defined, controlled experimentation, and given the exigencies of the very different situations in which different models are built, it may well be impossible to come to any definitive conclusion in this regard. Moreover, this debate raises the whole question of what good prediction is anyway, which is more about models and science in human affairs than about specific types of model. Yet at the end of the day, we still have to choose between different models and different predictions and learn to live with these tensions that are endemic in our field. The bigger question, I think, is whether or not our world is becoming more unpredictable, or rather more uncertain one might say, for this does and will have important implications for modelling. I have written an editorial about all this in the current edition of Environment and Planning B which you can download here.
Once in a while along comes a wonderful piece of historical research that again illustrates that in most fields, there is little new under the sun. Andrew Odlyzko’s recent paper entitled “The forgotten discovery of gravity models and the inefficiency of early railway networks” is just such a paper. In it, he shows that it was not Carey who was the first to argue that human interactions vary directly with their mass and inversely with the distances between them – an analogue of Newton’s law of gravitation – but a Belgian railway engineer, Henri-Guillaume Desart, who in 1846 (perhaps even before that date) argued that rail traffic on any new line would follow such a law. He based this on the ‘big data’ of his time, namely railway timetables, and he thus joined the debate which raged for over half a century as to whether new rail lines built point to point in straight lines, with no stations in between, would generate more traffic than would be attracted locally if stations were clustered around big cities. This is a debate that has some resonance even today with the debate in Britain about new high-speed lines such as HS2 and what stations they might connect to.
Odlyzko’s paper also notes that in 1838 a British physicist, John Herapath, suggested that this local law of spatial interaction for rail traffic in fact followed a negative exponential law, with traffic proportional to exp(-bd), where d is the distance from some source to a station. Arguably this is an earlier discovery, although it was Desart who fitted his model to data, remarkably coming up with an exponent of 2.25 on the inverse power of distance in the gravity model.
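The two deterrence functions mentioned above can be contrasted in a few lines of code. This is a minimal sketch, not anything from Odlyzko’s paper: the masses, distances and the exponential decay parameter b are illustrative values I have chosen; only the inverse-power exponent of 2.25 comes from the account of Desart’s fit.

```python
import math

def gravity_flow(mass_i, mass_j, d, beta=2.25):
    """Inverse-power gravity model: flow ~ M_i * M_j / d**beta.
    beta = 2.25 is the exponent Desart reportedly fitted."""
    return mass_i * mass_j / d ** beta

def exponential_flow(mass_i, mass_j, d, b=0.1):
    """Herapath-style negative-exponential deterrence:
    flow ~ M_i * M_j * exp(-b * d). b = 0.1 is an illustrative value."""
    return mass_i * mass_j * math.exp(-b * d)

# Doubling the distance always cuts the power-law flow by the same
# factor, 2**2.25 (about 4.76), whereas the exponential law decays
# by a factor that grows with the absolute distance travelled.
near = gravity_flow(100, 200, 10)
far = gravity_flow(100, 200, 20)
print(round(near / far, 2))  # 4.76
```

The qualitative difference is the point: the power law is scale-free in distance, while the exponential penalises each extra mile equally, which is one reason both forms have survived in spatial interaction modelling.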
Elsewhere I have recounted the tale of how the Lyons electronic computer team, much in advance of their time, cracked the shortest route problem in the early 1950s, several years before Edsger Dijkstra, who is credited as the inventor of the algorithm. You can see the video of this here, where they took on the problem of pricing freight on the British Railways network by breaking their big data into chunks of network which they needed to move in and out of store to solve the problem. In fact, somewhere in the recesses of my mind, there is also a nagging thought that someone even earlier, just after Newton’s time, first applied his gravity model to human interactions. I seem to remember this was at the time of the French Physiocrats, whose input-output model anticipated Leontief by more than 150 years when Quesnay devised his Tableau Économique. Old theories of social physics seem to go back to the beginnings of natural physics, and although we live in a time when the modern and the contemporary swamp our history, we are gradually discovering that our human wisdom in learning to apply science to human affairs goes back to the deep past.
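For readers unfamiliar with the shortest route problem the Lyons team tackled, here is a minimal sketch of the algorithm as it is usually stated today (Dijkstra’s priority-queue formulation, not the Lyons team’s out-of-store method). The toy rail network and its mileages are entirely hypothetical, chosen only to illustrate the idea.

```python
import heapq

def shortest_distances(graph, source):
    """Shortest path distances from source over a weighted graph
    given as {node: [(neighbour, weight), ...]}."""
    dist = {source: 0}
    queue = [(0, source)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist.get(u, float("inf")):
            continue  # stale entry, already settled with a shorter path
        for v, w in graph[u]:
            new_d = d + w
            if new_d < dist.get(v, float("inf")):
                dist[v] = new_d
                heapq.heappush(queue, (new_d, v))
    return dist

# Hypothetical rail network with illustrative distances in miles.
network = {
    "London": [("Rugby", 82), ("Bristol", 118)],
    "Rugby": [("London", 82), ("Manchester", 105)],
    "Bristol": [("London", 118), ("Manchester", 160)],
    "Manchester": [("Rugby", 105), ("Bristol", 160)],
}
print(shortest_distances(network, "London")["Manchester"])  # 187
```

The Lyons team faced the same problem at a scale where the whole network could not fit in memory, hence their trick of shuttling chunks of the network in and out of store.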
This edited book by Chris Brunsdon and Alex Singleton is a very nicely produced review of geocomputation. It covers many interesting new techniques, from agent-based models to new visual statistics, from crowdsourcing methods to the newer scripting languages that are becoming central to the development of contemporary spatial analysis. What is noteworthy about the book is the beautiful presentation and the visual ease with which the reader is exposed to these somewhat arcane arts of making sense of space and geography. There is a nice website with some content that the reader can download here, and at the risk of infringing my own copyright, I will share my own chapter with you, which you can download here too. It’s not in the glorious presentation of the published book, merely a PDF of the Word file, but the figures are in colour.