As a fan of the scientific method, and a celebrator of honesty in public life, looking back over the past few weeks in British politics makes me want to weep.
Let me leave aside for a second the naked ambition and frankly sociopathic behaviour of a few power-hungry Etonians. I guess we should expect that. (And anyway they still make me too angry to think straight.)
Instead, let’s just look at how, as a population, we have behaved when it comes to intelligently sharing and filtering facts and information. As so many people have said, in the run-up to the referendum the question of whether claims were founded on empirical truth or on lies didn’t seem to feature at all when people were making their decision. Even claims that had been publicly falsified by nationally recognised people and organisations made no difference.
We seem to have participated in a mass frontal lobotomy — an astounding sudden forgetting of how to think properly. (And there has been no shortage of terrified well-educated people pointing out that the last time there was a collective abandoning of intellectual principles of this magnitude was in the 1930s…)
But look at the irony. Here we are, armed with a global communications network that even our most prescient ancestors could hardly have dreamed of, and yet we have managed to increase our ignorance, and allow obvious lies and rumours to rule unchecked.
The tragedy is, that although we have this almost magical asset, the internet — which gives us the ability to instantly and effortlessly pass on data, insight and ideas to each other, and have them flow without government or corporate control from person to person — we have let it become a high-speed rumour mill. Maybe that shouldn’t come as a surprise but I can’t help thinking it’s the most amazing own goal.
I’m not sure what the best way to address this is, and it’s certainly going to take a long time for us as a population to collectively wake up, gen up, and re-learn the good old lessons of empirical free-thinking from our enlightenment predecessors.
But there is one thing that occurred to me that might just help, and that is “trust networks”. I’m not an expert on them, and there are many geniuses out there working on this right now who know much more than I do. But it strikes me that if we had had in place a strong internet-based trust network (and one that is federated — i.e. not run by a single company), we might not have circulated quite as many shoddy claims as we did.
While I was based at the Hub, Islington (a fascinating crucible of all sorts of people creating projects and startups) I had some memorable conversations about how such networks might work. (FYI: I was running Dynamic Demand at the time, and joining in conversations that would soon become Demand Logic.)
As far as I understand it, the principle is a little like Google’s PageRank — the “importance score” that Google’s robots allow to flow from page to page, through links, in order to evaluate which pages searchers are likely to find interesting.
We all have people we trust to speak on certain topics, and they have people they trust, and so on. Trust networks are ways to score information and ideas on their credibility, based on their long journey from originator to you through a complex ‘web of trust’. If a post, idea or claim comes to your attention, it would be tagged with a trust score based on who it came from and who circulated it.
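To make the analogy concrete, here is a toy sketch of how such a score might flow through an endorsement graph. Everything in it (the names, the damping factor, the blending rule) is an illustrative assumption of mine, not a description of any real trust-network protocol:

```python
def propagate_trust(endorsements, seed_trust, damping=0.85, rounds=20):
    """Spread trust through a directed 'web of trust', PageRank-style.

    endorsements: dict mapping each person to the people they vouch for.
    seed_trust:   dict of the people *you* trust directly, with scores.
    """
    # Collect everyone mentioned anywhere in the graph.
    people = set(seed_trust) | set(endorsements)
    for vouched in endorsements.values():
        people |= set(vouched)

    scores = {p: seed_trust.get(p, 0.0) for p in people}
    for _ in range(rounds):
        # Each person splits their current score among those they vouch for.
        incoming = {p: 0.0 for p in people}
        for voucher, vouched in endorsements.items():
            if vouched:
                share = scores[voucher] / len(vouched)
                for p in vouched:
                    incoming[p] += share
        # Blend propagated trust with your own fixed seed scores.
        scores = {
            p: (1 - damping) * seed_trust.get(p, 0.0) + damping * incoming[p]
            for p in people
        }
    return scores
```

Feeding it a simple chain (you trust Alice, Alice vouches for Bob, Bob vouches for Carol) yields scores that weaken with every hop away from you, which is roughly the behaviour you would want: trust should decay, not vanish, as it travels.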
In some implementations, you would not only see how likely something was to be factually correct (because credibility had been added to it by the people you trust to assess that kind of thing, and the people they trust…); you would also see an instant picture of the ethical ramifications of, say, a new policy idea, because it would carry scores accumulated from people with whom you share some fundamental ethical principles.
There are some obvious dangers. For example, there could be golden nuggets (especially novel, disruptive ideas) that get crushed too early if the network reacts wrongly. And such a system, if not cleverly designed, might strengthen ‘silos of thought’ and ethical divisions. And of course the running of such networks needs to be open, honest and free from vested interests — no small ask. But these issues can’t be beyond our wits to sort out.
I think it’s time to think seriously about open trust networks. Surely any tool that helps us out of the vicious rumour mill has got to be worth a try?
(Written a little too rapidly in one of Stoke Newington’s many wonderful cafes while recovering from a rather horrid dental appointment…)
This image shows 28 days of space temperatures in 14 office buildings (taken at random from buildings monitored by our startup, Demand Logic). Each row is a building; each box shows a day. The heights of the blueish areas show the portion of rooms in the building that were getting a little cold that day; the reddish areas show the converse, rooms getting a little too warm. Around 6.8 million temperature samples were used to create this image. It represents the daily comfort of about 15,000 people!
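For the curious, the aggregation behind a view like this can be sketched in a few lines. The comfort band and the data layout below are hypothetical illustrations of mine, not Demand Logic’s actual pipeline:

```python
from collections import defaultdict

# Hypothetical comfort band in °C; real thresholds depend on the building.
TOO_COLD, TOO_WARM = 19.0, 24.0

def daily_comfort_fractions(samples):
    """samples: iterable of (building, day, room, temperature_celsius).

    Returns {(building, day): (fraction_cold, fraction_warm)}, where a room
    counts as cold or warm if its mean temperature that day falls outside
    the comfort band.
    """
    # Sum and count samples per room per day, then take the mean.
    sums = defaultdict(lambda: [0.0, 0])
    for building, day, room, temp in samples:
        cell = sums[(building, day, room)]
        cell[0] += temp
        cell[1] += 1

    # Group room means into one list per (building, day) box.
    by_box = defaultdict(list)
    for (building, day, room), (total, n) in sums.items():
        by_box[(building, day)].append(total / n)

    return {
        box: (sum(t < TOO_COLD for t in temps) / len(temps),
              sum(t > TOO_WARM for t in temps) / len(temps))
        for box, temps in by_box.items()
    }
```

Each (building, day) pair then maps straight onto one box in the image, with the cold fraction drawn in blue and the warm fraction in red.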
It’s a nice demo of how much data we’re streaming from buildings and how many opportunities there are now for new visualisations.
I used this shot in a short talk at a recent Ashden “After Work” event and it seemed to go down well. It’s compiled from a new summary view we’re working on to give people an instant glance of their building performance. (With our nice clear headings and legends removed to add a little mystery for the talk…)
Just dug this out. I drew it a few years ago when I was getting angry about the way the media covers climate change science.
Who? What? Where? When? and Why are you telling me this?
These are the five questions that my late father, George Short, used to say must be answered in the first paragraph of any good news story. He was Training Editor for Reuters, one of the world’s largest news-gathering organisations.
(And his many wise and witty words of advice come to my mind nearly every day, even 20 years after his death…)
At Demand Logic, the clean-tech web startup we formed a few years ago, I now have the wonderful privilege of being able to dream up and lead development on new ‘user experiences’ — data-driven web pages designed to help save energy and increase performance of buildings. And it occurred to me that what we are actually striving towards is good “visual journalism”. In other words our job should be to lay before our users the ‘headline news’ about the buildings they manage.
However complex the underlying systems (did you know that a typical building depends on the smooth inter-operation of up to a thousand items of machinery?), Demand Logic’s goal is to lead the eyes of both busy managers and data-hungry engineers towards the top story — those few items of plant that are letting the side down, or that single controls tweak that could lead to six-figure annual savings. (Yes, believe it or not, we find those all the time…)
Since we formed the company, we have been building the backbone: the solid technologies capable of harvesting, storing, retrieving and processing staggering amounts of building services data. We are now streaming around 40 million values each day, and we are already providing several hundred users with novel insights into how their building systems are behaving.
But we’ve really only just started to scratch the surface in terms of visual communications. Although there’s more to do on our core systems to increase data-processing capability, we now have a huge data-crunching resource that we can use to drive some real visual creativity.
When ‘debugging’ a building to find where the wastage is coming from, you need to see several stories at once. The summary view below, released recently, gives a taste of the other kinds of data we collect and display.
My attempt at a tribute to Alan Turing in oil paint. What a great man, and surely one of Britain’s most badly-treated heroes.
I found a black-and-white photo of Turing and tried to guess the colours.
I was thinking the other day how much we owe this man. At Demand Logic, as we work on the next energy-efficiency visualisation, it’s easy to forget that not so long ago computers didn’t exist. Turing anticipated them, worked out their theoretical basis, and then dedicated much of his life to building and programming them. Incredible that he could see so far.