With artificial intelligence estimated to add as much as 10% to the UK’s GDP by 2030, the challenge remains to unlock the technology’s potential – and a panel of AI experts recommends doing so by placing a bet on young brains.
A new report from the AI Council, an independent committee that provides advice to the UK government on all algorithmic matters, finds that steps need to be taken from the very start of children’s education for artificial intelligence to flourish across the country. The goal, for the next ten years, should be no less ambitious than to ensure that every child leaves school with a basic sense of how AI works.
This is not only about understanding the basics of coding and ethics, but about knowing enough to be a confident user of AI products, to look out for potential risks and to engage with the opportunities that the technology presents.
“Without basic literacy in AI specifically, the UK will miss out on opportunities created by AI applications, and will be vulnerable to poor consumer and public decision-making, and the dangers of over-persuasive hype or misplaced fear,” argues the report.
AI should therefore be taught as a specialist subject in its own right, but also integrated into other subjects such as geography or history. Teachers should be given extra support, through curriculum resources and learning programs, to help them get to grips with AI concepts and to ensure that the subtleties of the topic are covered at all key stages of education.
For Bill Mitchell, director of policy at the BCS, the UK’s chartered institute for IT, the objectives set out in the report are remarkable, but they lack a clear roadmap. “The report highlights the need for every child to leave school with an understanding of AI, and I totally agree with that. But how?” Mitchell tells ZDNet.
“I would have liked more explanation on how AI is part of a broader picture. You’re talking about the best part of 10 million children a year. How are we going to ensure they understand what AI is? That’s a massive undertaking. So, it’s great to lay out the challenge – but then what?”
The UK government is not short of initiatives already that aim to kick-start digital change in education. In 2018, for example, £84 million ($114 million) was invested to create a National Centre for Computing Education (NCCE), which provides support for the teaching of computing in schools and colleges. Since its launch, the NCCE has engaged with almost 30,000 teachers across 11,500 schools in the country.
There are about half a million full-time teachers in the UK. Given the amount of teacher training required, Mitchell maintains that, although programs such as the NCCE are an encouraging start, he would be “surprised” to see every child coming out of school with an understanding of AI by 2030.
For Mitchell, the lack of digital skills across the UK population is the “biggest inhibitor” to AI being a success. But just as important as ramping up training in schools, he explains, is upskilling the existing workforce, which is already at risk of losing jobs as a result of automation.
“If we’re going to be radical about enabling this country to grab hold of the digital revolution, we should be providing every single adult with the skills and the training that are necessary, so that they reach the same level that we expect 16-year-olds to have achieved when they leave school,” says Mitchell.
Around 1.5 million jobs in England are at high risk of some of their tasks being automated in the future, according to recent statistics published by the Office for National Statistics (ONS). This represents 7.4% of roles across the country, the most at-risk of which are lower-skilled jobs.
In the face of this growing need for new skills, the government has launched various programs, such as a national re-training scheme which so far has been made available to 3,600 people. But there is still much to do: a survey carried out by Microsoft, for example, showed that two-thirds of employees in the UK feel that they do not have the appropriate digital skills to fulfil new and emerging roles in their industry. A Lloyds Bank report found that 19% of individuals in the UK couldn’t complete basic tasks such as using a web browser.
In this context, even the most innovative AI applications will struggle to be deployed and adopted in businesses. “The UK has a great research base in universities,” says Mitchell. “Unfortunately, we are still lacking in the capability of successfully adopting and adapting AI in organizations. And that’s because of the lack of skills in the global population.”
Many expect AI to bring huge benefits to the economy. The AI Council’s report highlights the role that algorithms have to play in healthcare, but also defense and climate change: it is estimated that the various applications of AI alone could save up to 4% of greenhouse gas emissions by 2030. The technology also holds promise in so-called “moonshot” endeavors, ranging from materials science to battery storage and drug discovery.
Transforming this potential into a reality will take significant effort, not only from a skills perspective, but also in ensuring that the appropriate regulatory and ethical frameworks are in place, and in providing scientists with access to high-quality data. The AI Council therefore urged the UK government to draw up a national AI strategy setting out a clear time frame and action points to position the country for success.
Other countries have already overtaken the UK in that respect: Germany has committed €3.1 billion ($3.8 billion) to a national AI strategy, while the US is investing $1 billion over the next five years to open 12 new institutes dedicated to AI research. Even within the UK, Scotland has embarked on its own AI strategy.
Strong leadership and clear objectives are, therefore, crucial to achieve the country’s AI goals. “The problem is that, historically, governments are not good at change management,” says Mitchell. “They are really good at the status quo, but data, digital, AI, are all radical new things that are revolutionary. Managing a revolution does not come naturally to government departments.”
How long it might take for the UK’s own roadmap to see the light of day remains unclear. The government has only just published a delayed and long-awaited national data strategy, and has announced that a new digital strategy is in the works. Now expected to come later this year, the digital strategy was originally due to be published in 2020.