AI in the UK welfare state: will you be abused, homeless, or commit a crime?
In the changing world of the UK government's welfare system, private software companies hold claimants' data, and a secret AI algorithm is used to predict life events.
No AI used in the writing of Minding your own MEDIA business.
Consider for a moment AI used to predict life events.
Are you excited?
Now what if someone makes a funding decision on your business based on it?
Many champion predictive AI as a useful tool for planning and preparing healthcare or housing provision.
But consider the implications for you personally of an algorithm predicting your future as a criminal or a victim of abuse.
In the UK today, the Labour government is overhauling the entire benefits system: how people claim, receive, and qualify for benefits, and how claimants are categorised, identified as high or low risk for fraudulent claims by a secret algorithm.
Alongside this, the system predicts other factors, such as the likelihood that you will commit a crime, be abused, or become homeless in the future.
Such secret algorithms may well be infiltrating decisions taken on funding for your innovation or you…