
Trust me, I'm a transport modeller…

In theory, statistics should help settle arguments. They ought to provide stable reference points that everyone – no matter what their politics – can agree on. But rather than defusing controversy and polarisation, it seems as if numbers are actually stoking them, says Tom van Vuren

In a recent ‘long read’ article in The Guardian*, William Davies says: ‘In theory, statistics should help settle arguments. They ought to provide stable reference points that everyone – no matter what their politics – can agree on… However in real life, not only are statistics viewed by many as untrustworthy, there appears to be something almost insulting or arrogant about them. Reducing social and economic issues to numerical aggregates and averages seems to violate some people’s sense of political decency…

The declining authority of statistics – and the experts who analyse them – is at the heart of the crisis that has become known as “post-truth” politics. And in this uncertain new world, attitudes towards quantitative expertise have become increasingly divided’.

Read ‘transport models’ for ‘statistics’ and you get a pretty accurate picture of how we as transport modellers tend to be viewed by our colleagues, clients and the public in general. And if our models and resulting forecasts are not trusted when assessing relatively straightforward road, rail or cycle schemes, what hope do we have when attempting to forecast futures affected by disruptive technology, changing travel behaviour and an increasingly uncertain economic and environmental context?

Inevitably the question can be asked whether models are of any real value in predicting what the future might be like; and alongside that question comes a rise in alternative predictions based on hunches, emotions or self-interest. How can we as modellers play a more relevant role in decision-making for long-term policy and investment in an uncertain world, if our credibility is already being questioned?

Below I set out some of the actions we might take, and changes in how we operate and communicate. These are inspired by a couple of recent articles that were spotted by James Gleave of Transport Futures; and also by the report by Transport for Quality of Life on the evaluation of road projects, commissioned by the CPRE:


1. Treat modelling as a science. Develop more than one hypothesis or scenario of what the future may look like and how disruptive technologies may operate in such futures. This should address both the uncertainty in the future in which new technologies operate, and how they might function and affect behaviour.

This has been suggested by others and was put into practice a few years back by Glenn Lyons for the Ministry of Transport in New Zealand. There is of course a danger that the range of scenarios that is developed is too limited, biased or inconsistent. Don’t get too attached to a particular scenario or set of scenarios just because it’s yours, or because it suits your cause. If you can find reasons for rejection, others will too. From these multiple workable hypotheses, select a manageable set of scenarios for testing, on a transparent basis.
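To make this concrete, here is a minimal sketch (in Python, purely illustrative) of what testing a scheme against a deliberately diverse set of scenarios might look like. The scenario names, variables and the stand-in run_model() function are all hypothetical; in practice the model run would be a full transport model, not a one-line formula.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    gdp_growth: float        # assumed annual GDP growth, per cent
    av_market_share: float   # share of trips made by automated vehicles
    fuel_cost_index: float   # fuel cost relative to today (today = 1.0)

# More than one hypothesis about the future, kept deliberately diverse
scenarios = [
    Scenario("steady_state",       gdp_growth=1.5, av_market_share=0.05, fuel_cost_index=1.0),
    Scenario("tech_disruption",    gdp_growth=2.0, av_market_share=0.40, fuel_cost_index=0.8),
    Scenario("constrained_growth", gdp_growth=0.5, av_market_share=0.10, fuel_cost_index=1.4),
]

def run_model(scenario: Scenario) -> float:
    """Stand-in for a full transport model run; returns a benefit-cost ratio."""
    # Illustrative response surface only - not a real model.
    return (1.2
            + 0.3 * scenario.gdp_growth
            - 0.5 * scenario.av_market_share
            - 0.2 * (scenario.fuel_cost_index - 1.0))

# Transparent reporting: every scenario and its result is recorded,
# not just the ones that suit a particular cause.
results = {s.name: run_model(s) for s in scenarios}
for name, bcr in results.items():
    print(f"{name:20s} BCR = {bcr:.2f}")
```

The point is not the numbers but the discipline: every scenario and its result is reported, so both the selection of scenarios and the spread of outcomes remain open to scrutiny.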

2. Don’t use model results as weapons. Encourage and embrace much more substantive debate of the results by knowledgeable proponents of all points of view. Transport for Quality of Life says in its report on the impact of road projects in England that it used the same data for its meta-analysis as Highways England’s consultants, but reached quite different conclusions. This is valuable and, if done well – avoiding overly emotive language and adhering to the other principles I set out here – it increases the knowledge and expertise of everyone involved and hopefully leads to better decisions.

The natural reaction of modellers in the face of criticism has been to become defensive rather than to engage; and to hide behind authority (in our case often linked to current guidance or ‘tried and tested’ methods). But isn’t it more important to get it right than to be right?

The current processes around modelling and the use of models in project appraisal are essentially adversarial. Recognising and admitting uncertainty in our numbers is probably as difficult as similar reporting and decision-making on medical trials. Rarely is the answer one of 100 per cent success or 100 per cent clarity. In such situations the experts involved in the trials (read for us ‘model development and application’) need to interpret and advise. Despite being challenged, or because of it, we must be able to explain why we believe something is correct and provide evidence for that belief; preferably quantified. And we must be willing to engage in the debate around the reliability of the results and their dependence on assumptions (exogenous and endogenous).

3. Be curious. In cases of new technology and its impacts, don’t rely on single model results. Whereas in the past we have often sought to create a single version of the truth (single model, single forecast, BCR as the key outcome), there is value in analysing the results of not just multiple scenarios (my first point, above), but also multiple model approaches – some simpler but more transparent, others more detailed and complex. If the scheme is successful in all or most cases, this provides more confidence. In other fields such independent confirmation is much more accepted, and actually expected. Comparative analyses and interpretations would also be helpful to inform and strengthen the debate described in point 2 above, allowing different world views to be represented.
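As a hedged illustration of what this comparative analysis could look like, the sketch below runs two stand-in model approaches of different complexity over the same scenarios and simply counts in how many model/scenario combinations the scheme still achieves a BCR above one. The functions and growth figures are invented for illustration only.

```python
# Two stand-in model approaches of different complexity; both are invented
# response surfaces, not real models.
def sketch_model(demand_growth: float) -> float:
    """Simple, transparent elasticity-style calculation returning a BCR."""
    return 1.0 + 0.4 * demand_growth

def detailed_model(demand_growth: float) -> float:
    """Stand-in for a more detailed (e.g. assignment-based) model."""
    return 0.9 + 0.5 * demand_growth

models = {"sketch": sketch_model, "detailed": detailed_model}
scenario_demand_growth = {"low": 0.2, "central": 0.6, "high": 1.0}  # illustrative

# Run every model approach over every scenario
outcomes = {
    (model_name, scenario_name): model(growth)
    for model_name, model in models.items()
    for scenario_name, growth in scenario_demand_growth.items()
}

passes = sum(1 for bcr in outcomes.values() if bcr > 1.0)
print(f"BCR > 1 in {passes} of {len(outcomes)} model/scenario combinations")
for (model_name, scenario_name), bcr in sorted(outcomes.items()):
    print(f"{model_name:9s} {scenario_name:8s} BCR = {bcr:.2f}")
```

If the scheme clears the hurdle under most combinations, that convergence is itself evidence; if it only clears it under one model and one scenario, that is worth knowing too.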

Certainly in today’s world, numerical evidence is necessary but not sufficient. It is usually not impossible to identify other qualitative and quantitative sources that can provide insight – triangulate! In one of my recent projects we assumed a hierarchy of evidence as follows:

  • Use model results where we believe that they can reflect an intervention well;
  • Where the model cannot be expected to perform as well, seek evaluation studies and experience from similar projects elsewhere in the world;
  • Where no evidence exists from studies elsewhere, seek expert opinions on what the (preferably quantified) impacts of a policy or intervention may be.

OK, this will not give the precision of a model forecast, but as Luis Willumsen argues (Better Traffic and Revenue Forecasting, Maida Vale Press, 2014), is it precision or accuracy that we seek?
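The hierarchy above can be applied quite mechanically. Here is a small, purely illustrative sketch of how an assessment might tag each intervention with the best available evidence source, falling back down the hierarchy where model results cannot be trusted; all intervention names and estimates are hypothetical.

```python
# Evidence sources ranked from most to least preferred, per the hierarchy above
EVIDENCE_HIERARCHY = ["model_result", "evaluation_study", "expert_opinion"]

# Hypothetical interventions, each tagged with whatever evidence is available
interventions = {
    "junction_upgrade":   {"model_result": "+3% corridor speed"},
    "mobility_hub":       {"evaluation_study": "+8% public transport share observed in a comparable city"},
    "av_shuttle_service": {"expert_opinion": "expert panel estimate: 2-5% reduction in car trips"},
}

def best_evidence(available: dict) -> tuple:
    """Return the highest-ranked evidence type that is actually available."""
    for source in EVIDENCE_HIERARCHY:
        if source in available:
            return source, available[source]
    return "none", "no evidence identified - flag for further work"

for name, available in interventions.items():
    source, estimate = best_evidence(available)
    print(f"{name:18s} -> {source:16s}: {estimate}")
```

Recording which rung of the hierarchy each estimate sits on keeps the triangulation honest: readers can see at a glance which impacts rest on model output and which rest on judgement.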

4. Be credible and trustworthy. Make sure all information you have used is reliable and easy to access; and written in plain language. Like many experts, we are portrayed as elitist and even arrogant; and probably with some justification. Be willing to show the totality of the results, don’t just cherry-pick those that suit your cause, or present the statistics in a biased way. Learn from the outliers, rather than ignoring them. Avoid jargon. And again, be transparent, particularly in your assumptions. Be willing to share your reasoning, your assumptions and your data. 

Some of these are easier to achieve than others. There should be no barriers to sharing our information and assumptions more freely, and in language others understand. Scenario modelling is not yet the norm, but it is certainly being considered more and more in practice. Both Transport for London and the Department for Transport have developed scenarios, although they are, as far as I know, not yet widely tested. We certainly haven’t yet fully resolved how to develop these scenarios comprehensively and consistently.


I expect our greatest challenge will be a more open debate about model results and their interpretation in decision-making. This will require humility (we don’t always get it right but are willing to learn), confidence (but we are professionals and actually pretty good at our job), respect (we don’t cook the books and adhere to the ethical codes of practice of our professional organisations) and trust (that by opening the dialogue we will improve our own and others’ expertise and understanding of the role of modelling in decision-making).

Since 2006, Modelling World has strived to be a mechanism to share best practice, present emerging techniques, challenge conventional thinking and explore innovation. Join us on June 14 and join the debate!

* https://www.theguardian.com/politics/2017/jan/19/crisis-of-statistics-big-data-democracy

Tom van Vuren is a Divisional Director at Mott MacDonald, a Visiting Professor at the University of Leeds and Chairman of Modelling World

