Big Data and Government

Check out this BIG data dump…

When Grant Shapps, the Transport Secretary, was asked by Julia Hartley-Brewer on TalkRadio whether the cost-benefit analysis would be presented to MPs later in the week, before they vote on another new-variant lockdown, Shapps retorted with a killer answer. Not only has such analysis been done, apparently, but a “huge data dump of a lot of analysis” would be delivered. This great dump, he suggested, could be pored over by MPs and, after foraging in it, they would emerge super-wise and ready to make a decision.

Shapps’ answer says it all. Government policy, regardless of what it is, can be justified by obfuscation and lots of references to the bigness of the data. He made clear that data analysis would be part of the smoke and mirrors process to make sure that MPs were no wiser at all about the rationale for locking people down from meeting, drinking, eating and shopping before Christmas.

Shapps, apparently, got his big job back in government (after being side-lined by previous administrations) because he was seen to be a good media performer. But his past, littered with get-rich-quick schemes and dodgy pyramid-selling, has been all about saying one thing and meaning something else entirely. Many of his past dealings with the media have been about justifying behaviour that was incompatible with high public office.

But being good with the media, whatever form this takes, seems to be all that matters. The Shapps approach involves knowing just enough about “data” to be able to evade what the data mean – what story they tell.

However, the arguments against the lockdown-based Covid response of the government have been made very well by scientists and very effective number-crunchers outside of government – notably Ivor Cummins, Carl Heneghan and Michael Yeadon.

Between them, they have meticulously dismantled the arguments for lockdown based on the claimed Covid threat (centred on the R number). They have argued that PCR tests produce high levels of false positives, and that the pandemic is probably over (based on reduced hospital admissions and evidence indicating T-cell immunity in a large percentage of the population). They have also made compelling cases that lockdowns just don’t work. Along the way they have used precise and relevant data – not data dumps – to provide evidence for their assertions. But the government ministers responsible for implementing policies that are increasingly seen as damaging can run away from empirical evidence and fall back on arguments that are based, frankly, on mumbo jumbo or plain obfuscation.

The cost-benefit analysis of lockdown is likely to be complex and to require considerable evidence. But the result we are seeking is an answer to this question: is continued lockdown justified if the result is massive economic destruction, huge curtailment of non-Covid treatment in hospitals, and significant damage to our civil society – even if a few more people get infected with what is, for most, a mild or symptomless disease? The answer cannot be a “huge data dump of a lot of analysis”. That’s just not good enough anymore.


Northern Ireland’s “Pandemic”

Every day people die. It’s an unfortunate fact. But people die. We’ll all die. And, when we do, our deaths – little data-points in the aggregate – will get added to the mix. Northern Ireland, like every administrative region in every part of the developed world, has a statistics agency that counts up all the deaths every day and collates them by month. And this year, 2020, is no different to any other year. Deaths have been tallied. There’s no denying the data. Or is there? 

In this, a ‘pandemic’ year, the month-by-month death data are more interesting than most. I’ve been looking at the numbers. And one month really jumps out. The month? January 2018. Why is this month particularly noteworthy? Well, in most winter months around 1,200 or 1,300 or even 1,400 people die. But in January 2018, 2,101 people died. In fact, more than 500 more people died in January 2018 than the average number of deaths for the previous five years. This is the excess deaths number. 506, to be precise.
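The excess-deaths arithmetic is simple enough to sketch. A minimal illustration follows; the prior-January figures below are invented, chosen only so that they average 1,595 – the baseline implied by the 2,101 and 506 figures above:

```python
# Excess deaths for a month: observed deaths minus the average for the
# same month over the previous five years.
def excess_deaths(observed, prior_years):
    baseline = sum(prior_years) / len(prior_years)
    return round(observed - baseline)

# January 2018 in Northern Ireland: 2,101 deaths observed.
# Illustrative prior-January counts, chosen to average 1,595.
prior_januaries = [1550, 1600, 1580, 1620, 1625]

print(excess_deaths(2101, prior_januaries))  # → 506
```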

But, we’re told, 2020 is a pandemic year. So presumably we’ve knocked that previous record out of the park? Well, no. In not one month of 2020 have we seen that number of deaths. In April we came close, with 1,933 deaths. In January this year we had far fewer deaths than in January 2018.

In 2018 it was well publicised that the health service came close to being overrun. There was a crisis. Seasonal deaths were very high. But, of course, no convoluted tests were being used to determine what people were dying of. No doubt respiratory diseases played a big part in causing deaths, especially among older people. But no new test was conceived. After all, if people are dying from chronic respiratory failure the symptoms are obvious and the diagnosis easy. No fancy tests are needed for what is normally described as winter flu – or perhaps a particularly virulent form of cold.

Now let’s focus on April 2020 again. This is the month this year that was nearly as bad as January 2018 in terms of death count. If the health service came close to being overrun in 2018, surely that was the case again in April, in a pandemic year? Well, no. Because in 2020 – in March, April and May – the chronically ill patients, mostly elderly, were sent to care homes to die, for fear of hospitals being overwhelmed. So there was no real crisis in the hospitals. And, of course, elderly people with co-morbidities aren’t eligible for the limited ICU beds (there are fewer than 100 of them in Northern Ireland). So ICU beds never reached capacity in April.

So what about the so-called “second wave” in 2020 in Northern Ireland? Well, we’re told, many hospitals are at more than 100% capacity. But in September deaths were pretty average for the time of year, at 1,384. This number isn’t significantly worse than the death number in September 2015. And yet, in 2015, most of the economy hadn’t been closed down. People weren’t on state-funded furlough. We could all still get out for a meal or a pint, and still get our hair cut. But in 2015 we didn’t have the PCR test – a test contrived to define people as sick who clearly aren’t. And one result of this PCR test is that perfectly healthy medical staff are being sent home to “self-isolate” – meaning that they can’t help with the spike in winter patients, the kind of spike that has occurred frequently in the past.

And remember, the most severe spike was in January 2018, not in any month, so far, in 2020. 


Lockdown Folly

Dr Mike Yeadon’s video, explaining the nonsensical response by the UK government (and its crazy devolved government variants) to the Covid virus, has been removed from YouTube for violating its terms. Quite why, is not explained. Dr Yeadon explains carefully, succinctly, and based on years of experience as an immunologist, why lockdowns are failing and why bad data is driving awful policy. This is a tour de force argument against lockdowns. And I’m posting it here as an homage to informed debate and freedom of speech. And as a tribute to business people whose livelihoods are being destroyed by catastrophic policy.


TECH|The New Era

I’m delighted to be working in partnership with Switch New Media on a new event timed to coincide with London Tech Week. TECH|The New Era will take place on September 8. It will be a day-long conference involving technology thought leaders, start-ups, scale-ups, commentators, analysts, writers, economists, investors, and policymakers. Our objective will be to provide a forum for some of the best, unhindered thinking on how Tech should respond and bounce back – and what technology solutions we should be building and funding right now. We will also hear from those who see real opportunities for a V-shaped recovery rather than one that looks more like an L.

We plan to run the event from early morning to early evening allowing participation from across time-zones. We hope (subject to social distancing rules in place at the time) to anchor and live stream the event from London with a rolling studio audience and studio guests throughout the day.


Not all virtual events are created equal

Just over 10 years ago I hatched an audacious plan in conjunction with Diarmaid Lynch of Switch New Media. The plan was that we’d run a tech event (with a twist). But what made it audacious was that we’d live stream it and allow live chat in real time, using Twitter. We’d have a small audience, crammed into a conference room in central London, but we’d pack the room with broadcast-quality cameras and run the thing like a TV show. Great sound, a proper set, good lighting – and we’d allow people to comment and ask questions live. Amazingly, we pulled it off.

We had a good live stream audience (in the thousands) as well as around 100 people in the room. Despite corporate firewalls most people were able to watch it live, regardless of the browser they were using (and this was an issue back then). We had media sponsors who helped promote it. And we had corporate sponsors behind us – like IBM and Microsoft – to help fund the thing. And we had compelling keynote speakers to draw people in. 

The secret to its success was that it was a proper event. It had an element of theatricality. In fact, possibly even an element of show business.  There were glitches, of course. But it worked. And since then the concept has been honed and improved.

Fast forward just over ten years and everyone is offering “Webinars” or live streamed events. In lockdown there is no alternative. But, my goodness, where is the show business? 

The business world has suddenly, apparently, embraced what we were offering a decade ago. Except, of course, they are overlooking the fact that without the show business the audiences will ultimately go away. The sheer volume of “webinars” is simply unsustainable.

All the old-world event rules still apply. There needs to be a reason to attend an online event, just as with a real in-person event. Networking and connecting come into it, certainly. But the content needs to engage as well – offline and online.

As we come out of lockdown, physical events will return, but it’s likely that travel policies will be slow to unstick. Large events will be tricky. And, in any case, there will be a reluctance to travel and mix in big groups. But I suspect that the amalgam event will return – a modest, in-studio live event (with invited guests and speakers) combined with streamed-as-live content. The whole thing professionally anchored, properly mixed, streamed from a robust platform, with crisp sound, polished continuity and slick collaboration and networking.

Because anything less than this isn’t an event – virtual or not. 


Where to now?

I’ve worked at home for around 18 years. Admittedly, I’ve had the opportunity to get out and about from time to time. I’ve travelled a lot – across the UK and internationally. But now, of course, I just work from home. Travel is not really an option when international travel is curtailed or banned.

Working from home, when working at home is the only option, will be as novel for me as it will be for the millions of others around the world who have been effectively told to go and work at home. Admittedly, I’ve got the tech and have honed some home-working processes. I’ve also got a rather lovely office. But bear in mind that most of us who might call ourselves ‘knowledge workers’ of one type or another have based our working lives on the premise that we work between events. Working is, in effect, event-focused.

Let me explain. Prior to Covid-19-imposed-home-working-exile the typical home worker worked towards a series of objectives that were, to all intents and purposes, events. In my case these events may have been one or several of the following: a meeting with a client company in location x; a workshop focused on subject y in location z; a team meeting in HQ where we discussed achievements during the last month or quarter; a conference; a briefing; a networking gig.

These tended to be physical events requiring preparation, planning, travel, logistics and people getting together face-to-face. And the arguments made in favour of this approach were well honed: relationships with people require face-to-face contact.

Now, of course, face-to-face contact, we’re told, could be fatal – to us or to our elderly relatives. So, we’ll work at home and we’ll try to replicate the physical with the virtual: remote working tools, collaborative tools, video and audio conferencing. These are the things we’ve been aware of and have used in the past, but they’ve played second fiddle to physical mixers and “getting business done” meetings.

Some are saying that the new-normal exile may result in these tech-driven solutions revolutionising how business is done. Perhaps they’ll cause businesses to run better and result in vastly reduced costs of doing business. Perhaps businesses will even reconsider whether they really need offices and HQs – and whether nations need all the infrastructure to support business travel – when business can be done so much more effectively remotely and with a distributed workforce armed to the teeth with collaborative gadgets and connectivity.

Time will tell. And I’m hoping that the Covid-19 crisis will be short-lived and that business (and nations) will bounce back quickly. But I suspect that while the home-working tools may teach us that there are other ways of doing things, and may make us ask ourselves whether jumping on a plane (when we can again) is the best way to build a relationship, ultimately we’ll revert to what is tried and trusted: people getting together and sparking ideas, doing business and having a pint down the pub.

The exile period will allow us to take stock and question our respective roles. It will allow us to be families and remind us just how important they are. It will require us to focus on the oldest and most fragile in our society and the wisdom and humour that they bring to us all.

After 9/11, many who lived and worked in New York claimed that they were chastened by the experience. Many were of the view that the city would shake off its reputation as one of America’s most ‘me-focused’ cities and would become much more community spirited. Perhaps that happened, perhaps it didn’t. Perhaps it was short-lived. But, then again, perhaps New York recalibrated in ways we’ll never know because of the pain and the collective horror. On the surface it may be back to the way it was – but just a bit better and more controlled.

At the heart of this pandemic-derived crisis I’m convinced that we’ll come out of the experience as different people. Some of us will suffer loss – the loss of a loved one, the loss of income, the loss of understanding of what we stand for. But most of us will probably just have time to take stock. Perhaps we’ll use the new-found tools and processes to get to know our co-workers and customers a bit more. Perhaps it will be the excuse we need to focus on our collective humanity and get some priorities in order. And when we emerge from the tunnel, perhaps we’ll be just a little bit better in one way or another. And our first pint in the pub, with others, will taste just so good.



Bosch Connected World in Berlin was one big show.

The IoT ecosystem attends in force, but more than ever the focus is on how artificial intelligence and machine learning are integral to next-generation products.

5G is on the verge of making possible more and more connected devices. Making sense of the data they generate is no longer possible without machine intervention.

The CEO of Bosch, Volkmar Denner, made this clear in his opening keynote. But he also made the point that better learning algorithms are required – ones that obviate the need for AI based on millions of training “cases”.

The alternative, he pointed out, is that AI will be dominated by giant monopolies rather than a rich and vibrant community of ISVs.


AI Round Tables

So what is Artificial Intelligence?

Well, simply put, it’s where machines are taught (or teach themselves) to do things that we – humans – aren’t very good at. 

Most of us aren’t good at car parking, or at finding our way in a place we’ve never been before. Few of us like really dull, repetitive tasks. Most of us aren’t very observant – it’s just the way our brains work. And because we’re busy, we’re often just not on top of stuff that we should be.

And the great thing is that computers can be taught to do an awful lot of stuff that we hate to do – and, frankly, can do it better.

Like scanning medical images to detect potential problems. Or making predictions about crop yields based on loads of data sources. Or monitoring routine things going on in our car engines. But they need to do these things to suit us and make our lives better. And we need to understand how they do it.

The United Kingdom has pioneered the use of machine learning and AI. We have some of the world’s most innovative AI companies here building solutions for better mobility, better run cities, better healthcare. But we recognise that innovation is a collaborative business.

So we want to encourage dialogue, co-operation, and partnership. It’s for that reason that we’re running a series of AI and machine learning workshops and round-tables to encourage the best AI entrepreneurs, decision-makers and funders to get together and talk. We’ll be hosting these sessions in the UK and also in UK overseas embassies in places like Berlin, Helsinki, Madrid and Lisbon.

We already have a few dates in the diary. But if you’d like to take part just get in contact with me and I’ll get you connected to the DIT’s local teams in-market.


City (R)evolution

One of the themes of this year’s Mobile World Congress in Barcelona is the ‘fourth industrial revolution’. This ‘revolution’ is the move to a much more automated society. In practice, it’s about conjoining processes via ‘the network’, and about machines making their own decisions based on some type of ‘deep learning’ or AI. The end-game of the revolution, it is proposed, is all about making society a better, easier, happier place in which to live.

The move to an automated society is rather complex. As the organisers of the Congress put it, “this theme [the fourth industrial revolution] unravels the complex web of technology trends, partnerships, business concerns and opportunities that enterprises of all kinds need to address to survive and thrive in a digital automated world, and the demands this places on city and national governments.” 

Cities and nations are slow moving ships. They are also highly complex and re-engineering them is extremely difficult. Moreover, the ‘estate’ (infrastructure and built environment) may not be suited to a rapid move to the fourth revolution.

‘Revolution’ implies some type of big bang solution. But it’s probably better to think of relatively quick (but iterative) wins that collectively create the revolution (over time).

For example, several years ago London Underground introduced the Oyster card, a new contactless ticketing system that removed the need for paper tickets. Commuters could charge up their cards online and just go use the tube. With contactless technology installed in the ticket barriers, London Underground was also able to announce, just a couple of years ago, that – in addition to Oyster cards – commuters could use contactless payment cards and mobile-phone contactless payment. This meant non-Londoners and day-trippers could use contactless payment without purchasing an Oyster card. Soon Google and other mapping vendors added live feeds from Transport for London into mobile mapping apps, meaning that passengers could avoid very busy routes (e.g. during the London 2012 Olympics). Progress, then, is iterative. But, over time, multiple actors and layers of innovation make things smarter and simpler to use, and enhance the customer experience.

In time, so-called ‘deep learning’ could augment live feeds to make suggestions about alternative travel routes based on aggregated data and data projections. Increased use of multi-functional sensors around cities could allow more information to be communicated more quickly to make the city experience better. Recently, the University of Manchester announced that it had developed a new type of graphene-based sensor that could be integrated into an RFID chip and communicate with a wide area network. Over time, graphene sensors will be able to communicate multiple types of sensor information: possibly sensing pollutants, warfare agents, even explosives, thereby protecting citizens. Or simply monitoring air quality. Sensors will abound, measuring and feeding back both quantitative and qualitative data.

Sensors, of course, are just one element of a wider ecosystem designed to augment and improve the environment and the fixed estate of the city. Over time sensors will provide the networks upon which self-driving vehicles will depend, or unmanned public transport. Sensors will also help to visualise the city with citizens, themselves, feeding into these data visualisations using pervasive mobile devices and even wearable (mobile) sensors.

These things must work together. The orchestration doesn’t necessarily require master planning or grand, centralised schemes. Rather it’s about data sharing and different specialists looking at different but interlocking problems in their own way. Collectively they tend to spark ideas, opportunities and improvements off each other. Increasingly, the fourth industrial revolution feels like evolution.

Sensors and the so-called Internet of Things (IoT) have key parts to play. But national government, too, needs to be more aware of the digital opportunities available. According to IHS Markit Technology, as of the second quarter of 2017, the United Kingdom was the country with the highest number of smart city projects (45) in Europe. This helps the UK build significant expertise – and expertise that has relevance well beyond these shores.

It’s said that smart cities, increasingly, need to focus on the three I’s: they need to be instrumented, interconnected and intelligent. Instrumentation (and data) is of little use if no-one sees it or acts upon it. Similarly, it’s tricky to allow people to make smarter decisions if the information is out of date, inaccurate or simply not available. The end-game is about making society a bit better for the people who live in it. Technology, unquestionably, can help.


Apps, Platforms and Government

This is one of a series of articles focused on the transformation of government services, produced in association with Equiniti

A few years ago, we were discussing the app economy. Apps (i.e. applications, typically, on mobile devices) were revolutionary, or so it appeared. Everyone wanted an app and tech entrepreneurs fell over themselves to get in on the act. Even government departments and local authorities rolled out apps.

But not all apps were created equal. The app market became the ultimate long-tail exemplar – people used a few apps, but most of the rest were wannabes.

The problem with many of these apps was that they weren’t joined up. Each had to make its own case for its own importance. After a while, they failed. They were deleted. They died.

There’s something allegorical about the app story. Apps continue to be important – we all use them. But apps aren’t important in themselves…they are merely windows into information. Some provide huge vistas into a vast, connected world. Some don’t.

The API economy, on the other hand, is something different.

Where many apps were standalone and insignificant, APIs provide for a joined-up world of possibilities. The API economy is as important for government as the private sector. Here’s how an article in Forbes defined APIs (and why they’re important):[1]

APIs (Application Programmer Interfaces) are the components that enable diverse platforms, apps, and systems to connect and share data with each other.  Think of APIs as a set of software modules, tools, and protocols that enable two or more platforms, systems and most commonly, applications to communicate with each other and initiate tasks or processes. APIs are essential for defining and customizing Graphical User Interfaces (GUIs) too. Cloud platform providers all have extensive APIs defined and work in close collaboration with development partners to fine-tune app performance using them.

In short, APIs allow applications to be built without the need to constantly reinvent the wheel.
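As a minimal sketch of that reuse (the record shape, field names and values below are invented for illustration, not taken from any real government API): a single JSON payload served by one API can feed any number of front ends, each rendering it without re-implementing the underlying data handling.

```python
import json

# One hypothetical API response, consumed by more than one client.
# The fields and values are invented for illustration only.
api_response = json.loads("""
{
  "citizen_id": "AB123456C",
  "services": ["vehicle-tax", "passport-renewal"],
  "verified": true
}
""")

def app_summary(record):
    # A mobile app might render the shared record as a one-line status.
    return f"{record['citizen_id']}: {len(record['services'])} active services"

def portal_flags(record):
    # A web portal might only care about the verification flag.
    return "verified" if record["verified"] else "unverified"

print(app_summary(api_response))   # → AB123456C: 2 active services
print(portal_flags(api_response))  # → verified
```

The point is that neither client needs to know how the record was assembled – the API is the single, shared contract.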

In a government context, this is very significant. APIs allow applications and user interfaces to share critical information and processes. They also mean that government can become more like a platform than a set of apps that don’t talk to each other. This makes the process of government more seamless, less annoying and much, much more efficient.

The Institute for Government (IfG) has recognised this. In its report published in June (Improving the Management of Digital Government) it pointed out how the cyber-attack that took down hospitals and doctors’ surgeries across the UK (largely because old PC operating systems hadn’t been updated) showed the fragility of government IT. It also called into question the role of the Government Digital Service. The report, while recognising that the UK was considered to have one of the most digitally developed e-governments, also laid out what more could be done.

More recently, Francis Maude, the former government minister who created the Government Digital Service, also criticised the civil service for its failure to embrace the need for greater efficiency and reform. In his speech, delivered in September 2017, he said, “imperceptibly, inch by inch, with a control dropped here or not enforced there, the old silos and departmental baronies are re-emerging, with nothing to restrain the old unreconstructed behaviours from taking hold once more.”

The Civil Service and GDS hit back. But regardless of whether the criticism is fair, it’s clear that there are rewards waiting if government can reject the departmental baronies and move towards an API-focused model.

The IfG Report defined what needed to be done:

  • GDS should create a store for Application Programming Interfaces (APIs) for the public sector that encourages reuse and supports the development of API standards.
  • The Government should urgently clarify the roles of GOV.UK Verify and the Government Gateway, to spread the benefits of secure identity verification.
  • GDS needs to manage the market for digital services more actively, by: a) configuring the Digital Marketplace for different users b) ensuring that standards are enforced with vendors, including on shared services, to save money and provide a better service for users.
  • GDS should work with the Treasury to review practices around charging for sharing data within government and the public sector, and establish principles so that incentives to share data adequately reflect the public interest.

Sharing is the watchword here. The creation of an API store for the public sector helps ensure the reusability of core information assets – meaning that complex processes can be made seamless as far as the citizen is concerned.

Many of the services provided by government require (currently) multiple systems to be accessed independently of each other. That’s why the IfG is right to highlight the importance of identity verification. Silo verification is a key reason why interoperability doesn’t work within government – and it’s also a major source of citizen frustration.

With a commitment to efficiency and reform within government we’re tantalisingly close to all-digital government service. However, the government needs to create its own API economy before that’s achieved.