About Michael Batty

I chair CASA (the Centre for Advanced Spatial Analysis) at UCL, which I set up in 1995, and I am Bartlett Professor at UCL.

Theoretical Filters


Science celebrates the goal of parsimony – Occam’s razor and all that. Indeed, I wrote an editorial about this in 2010 (click here). The idea rests on the assumption that our best theories are those that strip away whatever is superfluous to our purpose, generating ideas that are as “… simple as possible but not too simple”, as Einstein reportedly said. The trouble with cities and their planning is that we have a surfeit of theories, and many seem plausible. One way of proceeding is to distil their essence, and in this editorial I focus on how we might do this in several different ways. One way is to take out of any theory that which is obvious, that which is geometrically or physically determined prior to the events involved. James S. Coleman called this the Method of Residues in his seminal 1964 text Introduction to Mathematical Sociology (New York: Free Press of Glencoe). In essence, he suggested that it is in the ‘residues’ – the residuals – that we will find enlightenment: only when the obvious has been removed will we be in a position to explore the non-obvious. Another way of progressing theory is to produce synthetic data, to look at idealised situations, and to search for enlightenment in these. Virtual realities help here, but so do speculations about future cities. There are many such suggestions, and in this editorial we show how these theoretical filters can enable us to get the best out of theory and to test which theory is best. Read on.
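As a toy illustration of Coleman’s Method of Residues (my own invention, not an example drawn from the editorial or from Coleman): fit the geometrically “obvious” part of a pattern, here a simple distance-decay baseline, and keep the residuals as the object of further study. All the numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: density falls off with distance from the centre --
# the "obvious" geometric regularity -- plus local deviations.
distance = np.linspace(1.0, 20.0, 40)                              # km from centre
observed = 100 * np.exp(-0.15 * distance) * rng.lognormal(0.0, 0.1, 40)

# Strip out the obvious: fit a log-linear distance-decay baseline.
slope, intercept = np.polyfit(distance, np.log(observed), 1)
baseline = np.exp(intercept + slope * distance)

# The "residues": what the baseline cannot explain is where, on
# Coleman's argument, the interesting non-obvious structure lives.
residues = observed - baseline
```

The point of the sketch is simply the separation: the fitted baseline encodes what was determined in advance, and anything left in `residues` is the candidate material for new theory.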

20 years of quantitative geographical thinking


In 1996, Denise Pumain set up the online journal Cybergeo. When she first proposed it, the web was in its infancy, and I remember thinking that this was a very high-risk proposal in a world where the notion that we might communicate our ideas across wide-area networks was still a novelty. In 1986, email was virtually unheard of apart from among a few geeks like ourselves, who used computers in universities that were beginning to be networked to each other as well as connecting to the rest of the world through arcane but perfectly workable, if slow, email systems such as BITNET. Fast forward 20 years, and on 26 May 2016 (see http://cybergeo2016.sciencesconf.org/) I found myself in Paris with Helen Couclelis at the 20th anniversary of Cybergeo, both of us delivering celebratory speeches on the fact that the journal had not only survived for two decades but flourished. Only in hindsight can we say that it was a model for many other journals; in one very positive sense it was in the vanguard, and traditional hard-copy journals have fast moved during this time towards the Cybergeo model.

The world of publishing is changing dramatically – read my editorial in Environment and Planning B, which covers the full impact of the online world and how we might communicate our ideas in the future.


Modelling World 2016: Complexity in Land Use Transport Models


I am giving a paper on Thursday 2nd June in London at Modelling World, talking about Complexity in Land Use Transport Interaction (LUTI) Modelling and outlining very briefly our QUANT model for the Future Cities Catapult. This model is designed to simulate employment, population, the interactions between them, and the constraints imposed on development for all of England and Wales at the scale of middle layer super output areas. There are 7,201 such areas in E&W, so the scale is pretty typical of LUTI models. The model is web-based, meaning that you can run it from anywhere, and it is designed for anyone in E&W to explore the impact of changes in employment and population, in network times and costs, and in land use constraints. Why only E&W? Well, we are working on getting Scotland into it, but the data is a little different when it comes to the journey to work and the related census geographies.

It is early days as yet and we are still very much in the experimental stage of building this – it is in fact a proof of concept, to show that we can build tools that are applicable everywhere – and so far there are very few models like this one. We are stretching the state of the art in that the model, the data and the user(s) all interact with each other across servers and clients. The data, storage and speed implications of all this are pretty immense. More on this once we develop it further. I will post the talk after I have written it but before the session on June 2nd at 16:30 at the Oval. Drill down for the content of the meeting, and here for my PDF of the talk, which may be hard to see in the meeting.
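QUANT’s internals are not described here, but LUTI models of this kind typically rest on a doubly constrained spatial interaction model that distributes journey-to-work flows between zones. A minimal sketch in Python of that generic mechanism – the zone totals, costs and beta below are invented for illustration, not QUANT’s actual data or code:

```python
import numpy as np

def doubly_constrained_flows(O, D, cost, beta, n_iter=50):
    """Doubly constrained spatial interaction model:
    T_ij = A_i * B_j * O_i * D_j * exp(-beta * c_ij),
    with balancing factors A, B found by iterative proportional
    fitting so row sums match O and column sums match D."""
    F = np.exp(-beta * cost)           # deterrence function
    A = np.ones(len(O))
    B = np.ones(len(D))
    for _ in range(n_iter):
        A = 1.0 / (F @ (B * D))        # balance origin totals
        B = 1.0 / (F.T @ (A * O))      # balance destination totals
    return (A * O)[:, None] * (B * D)[None, :] * F

# Toy example: 3 residential zones, 3 employment zones (not real MSOA data)
O = np.array([100.0, 200.0, 150.0])    # workers by origin zone
D = np.array([180.0, 120.0, 150.0])    # jobs by destination zone
cost = np.array([[1.0, 2.0, 3.0],
                 [2.0, 1.0, 2.0],
                 [3.0, 2.0, 1.0]])     # generalised travel costs
T = doubly_constrained_flows(O, D, cost, beta=0.5)
```

After balancing, each row of `T` sums to the workers available at that origin and each column to the jobs at that destination, which is what lets a model of this form respond to changes in employment, population and network costs of the kind the QUANT interface exposes.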