AI and data-driven technologies are opening up whole new vistas of efficiency and improvement for local government services. But they’ve also opened up a new can of worms.
And those worms are all about the sort of data that’s collected, how it ends up being used, and what that means in terms of community trust in the agencies that govern them.
Those issues are explored in a new report from Swinburne University on the use of AI-equipped garbage trucks owned by a metropolitan Melbourne council.
The researchers say they hope the report, AI governance in the smart city: a case study of garbage truck mounted machine vision for roadside maintenance, can guide other government operations as they develop smart city initiatives.
Developed in collaboration with Swinburne, Brimbank City Council, Optus and AWS, the Mobile IoT-RoadBot helps council identify roadside assets that are in need of maintenance using the 5G network.
During a two-week trial, constantly-running cameras were mounted on 11 garbage trucks, servicing a 123 km² area of Melbourne’s western suburbs.
AI was used to analyse data and alert a relevant team to damaged assets like road signs and bus shelters, eliminating the need to send council officers out in cars to do inspections.
Each truck was in service for about seven hours between 5 am and 2 pm, streaming about 5GB of data.
Researcher and co-author of the report Anthony McCosker, Professor of Media and Communication at Swinburne University and an expert in automated decision making, says the Brimbank case study is significant because it’s pushing the boundaries of what 5G was developed for.
“The technology is mainly trained to pick up sign damage, graffiti damage to bus shelters and dumped rubbish,” he told Government News.
“It picks up a high level of accuracy around those key maintenance needs.”
The technology earned fellow researcher Yong-Bin Kang a gong in the ‘Government & Public Sector Solution’ category at the VIC iAwards 2023, with judges finding the project had “brought significant advancements to the municipality of Brimbank City Council”.
Concerns about ‘scope creep’
But the benefits don’t come without potential complications.
The research team’s latest report highlights governance issues arising from technology gathering sensitive information that can be misused or used for purposes not originally intended – a tendency known as ‘scope creep’.
As well as picking up data about graffitied bus stops or rubbish dumped on median strips, the cameras can also record number plates, and photographs of residents going about their business.
“This promising development also raises concerns about managing unintended personal data, the possibility of scope creep, and new data governance needs,” the study warns.
“Scope creep could manifest in the intentional expansion or abuse of the system, leading to consequences for individual privacy and data trust. For instance, the collected data could be used for policing certain behaviours, issuing fines, or monitoring and punitive action.”
Professor McCosker says scope creep is a key issue for the development and use of AI technology – not only for the community but for council workers.
“It’s complicated because it’s often unexpected,” he told Government News.
“It’s something that comes down the track when you deploy a system like this and then suddenly realise that you can use it for other purposes, and those other purposes might not be as well received by communities that they’re acting in.
“Data collected by the garbage trucks may be repurposed for traffic violations. The trucks could also gather the type of data that could be used to monitor the truck drivers themselves, in terms of their workflow and route taken, and whether they’re driving safely.”
To manage these risks, the report proposes a governance framework which looks at how universal AI ethics principles can be adapted and translated for a local government setting.
For councils, this can mean establishing how to evaluate a program, communicate with stakeholders, and develop policies and guidelines around data management and staff training.
“We tried to bring in responsible and ethical AI principles the federal government has identified, for example accountability, privacy and transparency, and translate those principles into a practical framework for the council,” Professor McCosker said.
In practical terms that means governance around everyday data management; skills and accountability within council; risk management; and decision-making about how the model is managed.
It also includes setting the scope and purpose of the technology being used, and making sure it isn’t used outside its original purpose, he says.
The main message for councils, Professor McCosker says, is to get the governance right and undertake appropriate consultation before implementing the technology.
“Do the exploration of the ethical issues and governance needs upfront, consult with the community and put plans in place about deploying the technology safely,” he says.
“It’s about translating sound principles into actions and practice.”