The Prime Minister, Malcolm Turnbull, is supposedly a bit of a digital whiz kid, au fait with all the wonders of modern technology and its promise to make us more agile.
On his government’s digital track record – the Census disaster, losing the head of his digital transformation unit shortly after he started, and now the Centrelink problems – one has to wonder whether the Turnbull expertise is as great as promised, or whether he simply hasn’t managed to convey the message to his Cabinet.
Now the blog is neither a digital whiz nor a kid. But it doesn’t take much effort to find out what can and does go wrong with IT, and what is probably at the heart of the Centrelink tragedy – there’s something profoundly wrong with the algorithms Centrelink is using. In the blog’s case it took very little effort because its good friend, John Spitzer, who is a digital whiz, sent it a statement by the Association for Computing Machinery US Public Policy Council (USACM).
The statement says: “Computer algorithms are widely employed throughout our economy and society to make decisions that have far-reaching impacts, including their applications for education, access to credit, healthcare and employment. The ubiquity of algorithms in our everyday lives is an important reason to focus on addressing challenges associated with the design and technical aspects of algorithms and preventing bias from the outset.”
It describes what algorithms are, some of the problems they can cause and how they can go wrong, and argues that “There is also growing evidence that some algorithms and analytics can be opaque, making it impossible to determine whether their outputs may be biased or erroneous.” Well, the Centrelink versions are obviously opaque, but it is pretty easy to conclude, if you are not a Turnbull Cabinet Minister, that the outputs are both biased and erroneous.
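The blog is not privy to Centrelink’s actual code, but a minimal sketch of the income-averaging approach widely reported to be behind the debt letters shows how erroneous outputs arise. Every name and figure below is the blog’s own illustration, not Centrelink’s:

```python
# A hedged sketch of the reported income-averaging logic, not Centrelink's
# actual system: a year's income is smeared evenly across 26 fortnights and
# compared against what the person reported each fortnight.

FORTNIGHTS_PER_YEAR = 26

def averaged_fortnightly_income(annual_ato_income: float) -> float:
    """Spread a whole year's income evenly across 26 fortnights."""
    return annual_ato_income / FORTNIGHTS_PER_YEAR

def flag_discrepancies(annual_ato_income: float,
                       reported_fortnightly: list[float]) -> list[int]:
    """Flag every fortnight where the reported figure falls below the average."""
    average = averaged_fortnightly_income(annual_ato_income)
    return [i for i, income in enumerate(reported_fortnightly)
            if income < average]

# A casual worker who earned $26,000, all of it in the first half of the year,
# and correctly reported $0 while on benefits in the second half:
reported = [2000.0] * 13 + [0.0] * 13
print(flag_discrepancies(26000.0, reported))
# Every zero-income fortnight is flagged, even though nothing was misreported.
```

The averaging assumption is harmless for someone with steady wages and disastrous for anyone with lumpy, casual earnings – precisely the people most likely to need Centrelink.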
To help combat this, USACM has outlined some principles for algorithmic transparency and accountability. In broad terms the principles are: awareness of the problems and potential harms; access and redress for those adversely affected; accountability; explanations of the procedures and decisions made; data provenance, covering not only the source data but potential biases in the human processes that gathered it; auditability, so that models, data and decisions can be audited where harm is suspected; and validation and testing, to validate models and ensure they avoid generating discriminatory harm. So far it seems none of the above has been considered by the Turnbull Government over Centrelink.
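None of these principles require exotic engineering. As a purely illustrative sketch – every name and rule here is the blog’s invention, not Centrelink’s or USACM’s – auditability and data provenance can be as simple as recording, for every automated decision, what data was used, where it came from and what was decided:

```python
import json
from datetime import datetime, timezone

def audited_decision(inputs: dict, source: str, decide) -> dict:
    """Run a decision function and keep a record of exactly what it saw,
    where the data came from, and what it concluded."""
    outcome = decide(inputs)
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "data_source": source,   # provenance: which system supplied the inputs
        "inputs": inputs,        # exactly what the model saw
        "outcome": outcome,      # exactly what it decided
    }
    # In a real system this would go to an append-only audit store;
    # printing stands in for that here.
    print(json.dumps(record))
    return record

# Usage: a trivial decision rule, audited. The rule itself is illustrative.
audited_decision(
    {"reported_income": 0.0, "averaged_income": 1000.0},
    source="ATO annual income data",
    decide=lambda d: ("raise debt" if d["reported_income"] < d["averaged_income"]
                      else "no action"),
)
```

With records like these, anyone disputing a debt could be shown the data and reasoning behind it – which is the whole point of the access, redress and explanation principles.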
The human cost of the Centrelink errors is only one example of what can go wrong. Sue Halpern (New York Review of Books, 22/12/2016) reviews a number of books on algorithms and gives some vivid examples of discrimination and error. One of them, Cathy O’Neil’s Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, demonstrates that datafication often relies on proxies that bear no relationship to what they are supposed to represent. One example is the Cambridge Psychometrics Centre, which purports to use personality tests to predict employee performance but which translates Facebook likes into frequently erroneous predictions as to whether you are gay, African American, female, have a criminal record, or are “too inquisitive” to be a good employee. That one stumped the blog for a minute until it belatedly realised this would help rule out whistle-blowers or, worse, people who wanted to know what the rate of pay should be.

In their book Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy, Ariel Ezrachi and Maurice Stucke cite a case in which a website was created for the centenary of “the historically black fraternity Omega Psi Phi” (if you don’t quite understand the US fraternity system, the blog suggests downloading Revenge of the Nerds). Anyway, as Ezrachi and Stucke report: “Among the algorithm-generated ads on the website were ads for low-quality credit cards, highly criticised credit cards and ads that suggested the audience member had an arrest record.”
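Returning to O’Neil’s point about proxies: a toy version of the idea – entirely the blog’s own construction, not any real centre’s model – shows why they misfire. The predictor never observes the thing it claims to measure, only correlates of it, so anyone whose likes happen to match the pattern gets the label:

```python
# A toy proxy classifier, purely illustrative: it labels people not by
# anything they have done, but by whether their "likes" resemble the likes
# of people who previously carried the label.
LIKES_ASSOCIATED_WITH_LABEL = {"motorbikes", "energy drinks", "late-night tv"}

def proxy_label(user_likes: set[str], threshold: int = 2) -> bool:
    """Return True if enough of the user's likes overlap the proxy set."""
    return len(user_likes & LIKES_ASSOCIATED_WITH_LABEL) >= threshold

# A user who merely shares two harmless interests gets the label anyway.
print(proxy_label({"motorbikes", "energy drinks", "gardening"}))  # True
```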
Blog readers can check this process any day they use a social media site. Just look at the ads which come up on your Facebook page or alongside any internet search. Some of them might relate to a recent search or comment you have made, but many of them would prompt the thought: “what on earth made them think I would be interested in that?” Sue Halpern did look at the ads which came up and allegedly related to her ‘likes’. Her conclusion was that they were odd, to say the least. She has coeliac disease but keeps getting hit with ads for ‘cookie dough’. Why they get it so wrong is a mystery, “but I will never know (why) since the composition of Facebook’s algorithms, like Google’s and other tech companies’, is a closely guarded secret”.
Perhaps Malcolm Turnbull could take some time to have a close look at the Centrelink algorithms and fix them up – or perhaps he could just make them public.