On the 27th of February I was fortunate enough to be invited (okay, I bought a ticket) to the “Public Sector Data and AI Summit 24” in Westminster, which brought together government and industry specialists in this rapidly emerging field. I cannot help but contrast the tone and atmosphere in the room with that of another conference, “Building the Smarter State”, which I went to only 5 months ago. It WAS only 5 months, but it feels like a few technological ages have passed since. AI’s role in the conversation has shifted from a curiosity to be managed in good time to an absolutely vital, exciting and scary part of almost everything that government has to do in the next few years.
As always there were many excellent questions but very few easy answers. The large departments responsible for drawing up frameworks for making AI unbiased, sustainable, trustworthy, or whatever else we want it to be, acknowledged that the answer to most of the questions asked is “it depends” – it depends on the context, on who you talk to, on the degree to which you can accept imperfections, and on many other aspects that cannot be easily resolved.
Add to this that the technology experts presenting on the day had some truly wonderful tools and techniques to share, but few real-life cases where the full power of AI had been used. Westminster Council presented their lovely “Report It” service, which uses AI to make reporting potholes, littering, antisocial behaviour and the like faster and more convenient. A great start, and it threw me back to the days of FixMyStreet (remember when it was launched? I do, I am that old!), which was a precursor to much of the digital revolution in government in the early 2010s – so I can see how mighty oaks can grow from small projects like this.
Similarly, Camden Council is using geospatial data to map and show antisocial behaviour in real time. Their representative also mentioned that it’s vital to move on from thinking of data as something that can only be used to produce one-off reports for, say, policy decisions, to a space where it’s used continuously. But few government organisations have reached this phase, and many of the potential use cases involving large amounts of citizen and business data are yet to be seen. The technology is there (or can at least be bought), but the processes, skills, guidelines and governance are not. More importantly, the data is not ready. One of the speakers mentioned something that we at Transform have said for a long time: you must be driven by your data and have a clear strategy before you can do anything amazing in AI. Very few government organisations have cracked the data nut yet, and it is a necessary condition for delivering the services of the future.
Another key blocker / enabler was mentioned by the representative for Camden Council – it’s important for people to understand that their data won’t be used against them. A tricky thing to explain, especially to Brits, who tend to have less trust in government than, say, my fellow Swedes. And even trickier as, in some instances, it is indeed likely that my data could be wielded against me – possibly by a future government that I had no way of foreseeing when I shared the information in the first place! Considering this, “transparency” was a key word during the day: transparency not only of the data but also of the algorithms used in any AI consuming it and giving advice or even taking decisions. Only in this way can users feel assured that there are no built-in biases, faults, political tweaks and so on.
Another key consideration that is often talked about but seldom addressed is the sustainability of AI’s ever-expanding energy usage. Again, good engineering is required, and the difference in energy and time use between an inefficient solution and a good one was demonstrated to be 1–2 orders of magnitude, depending on the approach and underlying technology. This all served as a strong demonstration of the value of long-term planning when selecting and building AI. Pushing solutions into place without consideration for likely future requirements and usage could lead to significant costs and systems that aren’t fit for purpose, so both tactics and strategy are needed. But how can you strategise for an area moving as fast as this one? Multimodality (AI models taking multiple inputs, including text, pictures and audio, to generate insights and content) was mentioned as the next big challenge/opportunity, and (again) there were more questions than answers in the session.
All this sounds quite negative, but Transform’s take is that the best thing to do to ride this particular wave is to start with the right foundations. First, make sure your data is fit for purpose – clean, structured, safe and with the right governance and processes surrounding it. Whatever the future throws at us, this will be useful. Second, make sure you build your skills and capabilities. Westminster Council mentioned that they had no data scientists or data engineers 12 months ago, a situation that had to change. Transform often provides interim or project capacity to help organisations address their immediate data/AI needs, but at the same time we also support recruitment and upskilling processes to build in-house capability in parallel for our clients. An organisation can only manage its suppliers and take ownership of its own technology if it has the right skills in place. Again, this can be done here and now, and projects such as “Report It” or Camden’s geospatial data visualisation serve as powerful drivers of change and upskilling, making the public sector more comfortable with designing, delivering and operating AI systems in the future.
I really look forward to going to this conference again next year. I have no idea how much things will have changed by then, but I do know they will have. Massively!