When Tony Blair looked back on his time in power, he had a simple assessment of his decision to introduce the Freedom of Information Act: “You idiot.”
While the technology secretary, Peter Kyle, is a fan of the former prime minister, he may be inclined to agree with that verdict after the act was used to reveal that he had been asking ChatGPT which podcasts he should appear on. The disclosure has already caused frustration among ministers, given its possible repercussions.
Blair’s gripe was that the act risked stopping the frank discussions needed among ministers and officials. Ever since, it has become notoriously difficult to have a freedom of information (FoI) request granted, as officials exploit various legal exemptions to refuse them.
The successful use of the legislation to probe Kyle’s AI chatbot use has led some to conclude that a new precedent has been set, one that will have officials across Whitehall sweating over their recent chatbot interactions.
“It’s the first time I’ve come across that information being released, but it’s completely in line with the basic principles of the act,” said Martin Rosenbaum, a former BBC journalist and FoI consultant. “If ministers or officials are doing stuff on their phone or computers which is done for work purposes on behalf of a public authority, that is subject to FoI – whatever device they’re using. The same logic would apply to whatever prompts you type into ChatGPT, or any other AI.”
WhatsApp messages and texts – even on personal phones – can already be subject to FoI requests, thanks to a series of fiercely fought cases. In practice, however, officials have become adept at finding ways to knock back or heavily redact responses. Given that attitude, the granting of access to Kyle’s ChatGPT queries has shocked experts.
“I’m surprised the department didn’t fight it harder,” said Rosenbaum. “Some departments would have tried to resist it all the way. Obviously, it’s going to prompt a lot of other requests.”
Chris Stokel-Walker, the journalist behind the request for Kyle’s ChatGPT use, already has plans to ask for further disclosures. “My jaw dropped – I thought there was no way they’re going to give over this data,” he said. “In the time since I got the response, I have put in other requests for other generative AI interactions.”
Given this success, could the act now be used to reveal ministers’ Google searches? Rosenbaum said there was nothing in law to stop such a move. “As journalists, that is a theory we should now test,” said Stokel-Walker.
There is still plenty of scope to limit what is revealed, however. Access to Kyle’s ChatGPT queries was initially denied on the basis that some of his prompts were made in a personal capacity. It was later granted when the request was limited to prompts he made in an official capacity. But the difference between personal and official use is a grey area.
Heather Brooke, whose use of an FoI request helped break the MPs’ expenses scandal, said the distinction could be used to keep information secret. “It does give power of interpretation to people who have an interest in keeping things hidden,” she said. “With MPs’ expenses, some of the most egregious abuses were the exact things that they tried to claim were private.”
After this rare victory, Rosenbaum envisages a future in which AI could play an even bigger role in the legislation – becoming both the creator of requests and the target of them: “I’m sure FoI requesters themselves will be typing into ChatGPT – what should I request next?”