Question for Short Debate
Asked by
To ask His Majesty’s Government what assessment they have made of the role of educational technology (ed tech) being used in schools in relation to (1) the educational outcomes, (2) the social development, and (3) the privacy of schoolchildren.
My Lords, I declare my interests, particularly that of chair of the Digital Futures Commission, which published the Blueprint for Educational Data in 2022, as chair of 5Rights Foundation and adviser to the Institute for Ethics in AI in Oxford.
School is a place of learning and an environment where children build relationships, life choices are made and futures initiated. For most children, school is compulsory, so while they are there, the school is in loco parentis. I welcome the use of technology, whether for learning or management, but it is uniquely important that it meets the school’s multiple responsibilities for the children in its care.
The debate this afternoon asks us to consider the impact of edtech on learning, privacy and the social development of children. Each could fill a debate on its own, but in touching on all three, I wish to make the point that we need standards and oversight of all.
For more than a decade, Silicon Valley, with its ecosystem of industry-financed NGOs, academics and think tanks, has promised that edtech would transform education, claiming that personalised learning would supercharge children’s achievements and learning data would empower teachers, and even that tech might in some places replace teachers or reach students who might otherwise not be taught.
Meanwhile, many teachers and academics worry that the sector has provided little evidence for these claims. A recent review by the UCL’s Centre for Education Policy found that, of 25 of the most popular maths apps for children aged five, only one had been empirically evaluated for positive impacts on maths outcomes. Half of them did not include features known to support learning, such as feedback loops, and six of the 25 contained no mathematical content at all. If the UCL finding were extrapolated across the half a million apps labelled “education apps” in the app store, 480,000 would not be evaluated, a quarter of a million would provide no learning support and 120,000 would have no educational content at all. The lack of quality standards is not restricted to apps but is widespread across all forms of edtech. Of course we should have tech in school, but it must be educationally sound.
Covid supercharged the adoption of edtech and, while we must not conflate remote learning with edtech in the classroom, the Covid moment offers two important insights. First, as forensically set out in the UNESCO publication An Ed-Tech Tragedy, the “unprecedented” dependence on technology worsened disparities and learning loss across the world—including in Kenya, Brazil, the United States and Britain. Unsurprisingly, in each country the privileged children with space, connectivity, their own device and an engaged adult had better outcomes than their peers. A more surprising finding was that, where there was no remote learning at all but children were supplied with printouts or teaching via TV or radio, the majority of students did better. The exact reasons are complex but, in short, teaching prepared by teachers for students whom they know, unmediated by the values and normative engineering practices of Silicon Valley, had better outcomes. UNESCO calls on us to ensure that the promises of edtech are supported by evidence.
Secondly, Covid embedded edtech in our schools. Sixty-four per cent of schools introduced, increased or upgraded their technology with no corresponding focus on pupil privacy. In 2021, LSE Professor Sonia Livingstone and barrister Louise Hooper for the Digital Futures Commission mapped the journey of pupil data on Google Classroom and Class Dojo. Their report showed children’s data leaking from school and homework assignments into the commercial world at eye-watering scale, readily available to advertisers and commercial players without children, parents or teachers even knowing.
It is worth noting that, in 2021, the Netherlands negotiated a contract that restricted the data that Google’s education products could share. In 2022, Helsingør in Denmark banned Google Workspace and Chromebooks altogether—the same year the French Ministry of Education urged schools to stop using free versions of both Google and Microsoft.
Children’s privacy is non-trivial. Data may include school attendance, visits to the nurse, immigration status, test results, disciplinary record, aptitude and personality tests, mental health records, biometric data, or the granular detail of how a child interacted with an educational product—whether they hesitated or misspelled. Between management platforms, multiple connected devices and programmes used for teaching, the data that can be collected on a child is almost infinite and the data protection breathtakingly poor. Pupil data has been made available to gambling firms and advertisers, and has even been used to track children’s use of mental health services.
I turn briefly to the impact on social development. Child development is a multifaceted affair, in which not only the tech itself but the opportunity cost—that is, what the child is not doing—is of equal import. I was in Manchester last week, where a programme to bring professional dancers to nursery schools is being developed because children were arriving unable to play, look each other in the eye or move confidently. Although schools are not to blame if children come in overstimulated and undersocialised, in part because of the sedentary screen time of early years, it is absolutely crucial that school remains a place of movement, singing, playing, drawing, reading and class teaching, supported by tech but not replaced by it, not only in a handful of Manchester nurseries but throughout the school system, and, very importantly, during the teenage years. Decisions about edtech should be in the light of and in response to not simply learning but the whole child and their development needs.
In my final minutes, I will speak briefly about safety tech. Here, I record my gratitude to Ministers and officials in the Department for Education, past and present, who have made very significant progress on this issue this year.
Frankie Thomas was 15 when she accessed a story that promoted suicide on a school iPad that had not been connected to the school filtering system. Subsequently, she took her own life exactly as she had seen online. Since that time, her parents, Judy and Andy, have campaigned tirelessly to bring the governance of safety tech to our notice. They deserve much credit for the advances that have been made. However, we still do not have standards for safety tech in schools. Schools can buy, and are buying, in good faith, systems that fail to search for self-harm or have illegal content filters switched off and so on. Secondly, while we have excellent new guidance, Ofsted inspections do not explicitly ask whether schools are reviewing and checking that their online safety systems are working, meaning that thousands of schools have not properly engaged with that guidance.
I gave the Minister notice of my questions and very much look forward to her response. Will the department introduce quality control for edtech, including peer review and certification that evidences that it is suitable to meet children’s educational and development needs? Will the department use the upcoming Data Protection and Digital Information Bill to introduce a data protection regime for schools, which is so urgently needed? Will the department introduce standard procurement contracts, such as the Netherlands has, recognising that a single school cannot negotiate performance and privacy standards with global companies? Will the department bring forward a requirement for minimum standards of filtering and monitoring so that safety systems are fit for purpose, and simultaneously ensure that Ofsted’s inspecting schools handbook explicitly requires an inspector to ask whether a school is regularly checking its safety tech?
I am deeply grateful to all noble Lords who have chosen to speak and look forward to their contributions. Education makes an extremely precious contribution to child development and is widely regarded as a public good. It must not be undermined by allowing an unregulated market to develop without regard for the learning, privacy and safety of children.
My Lords, it is a great pleasure to follow the noble Baroness, Lady Kidron, who has set out the parameters for today’s short debate so powerfully and with her customary expertise. It is a great pleasure to see a number of other noble Lords in the Chamber with whom I have spent quite a long time debating online safety issues so far in 2023. I mention my honorary position as a member of the political advisers panel of AI in Education. I shall come back to that in a moment.
The noble Baroness, Lady Kidron, set out a very clear case for standards and oversight of tech in education. I know that this is not a new issue; it is something that my noble friend the Minister and the Department for Education have been looking at for quite some time. When I was Secretary of State for Education, quite a long time ago now, I remember being invited to a number of edtech conferences and events, where I was told how technology was going to revolutionise the classroom, make everybody’s lives so much easier and cut workloads. I still think that all those things are possible and that we should see both the risks and opportunities of technology in education. My former constituency of Loughborough experienced, many, many years ago, the Luddites, as they came through and smashed up the cotton frames. I do not think we want to be Luddites about technology in education or say that we need to put the genie back in the bottle. I will be very interested to hear from the Minister how much the department is already doing in this particular space.
Of course, it is not just about government. As with so many other things, government Ministers, officials, and advisory groups can do so much, but there are many other organisations. The noble Baroness, Lady Kidron, talked about one; AI in Education, led by Sir Anthony Seldon, is another; the noble Lord, Lord Clement-Jones, who cannot be here today, has talked about the Institute for Ethical AI in Education. I very much hope that the department is calling on all those institutions, as well as many others in the space, to gather the best expertise, because I do not think that in this fast-moving world government can possibly be expected to solve the issues that today’s short debate will highlight on its own.
As the noble Baroness, Lady Kidron, said, this issue of tech in education has only been accelerated—as so many other things relating to technology were—by the pandemic. The Covid-19 Committee that this House set up in 2020, chaired by the noble Baroness, Lady Lane-Fox, took evidence on the specific issue of technology in education during the pandemic. While, of course, there were issues—technology adopted very quickly, issues relating to privacy and other things thought about later than they should have been—I was also struck at the time by the evidence from parents and others working with, in particular, children with special needs, for many of whom the opportunity to learn online in a quieter environment was something that they welcomed. I think it is fair to say that we all now live in a hybrid world. While there is no doubt that children learn best in a classroom—we all learn and communicate better face-to-face—there will still be times when the hybrid option is suitable.
The noble Baroness, Lady Kidron, talked about safety tech. The first message I urge my noble friend to take back to the department and others is that I really hope that we are not going to play catch-up on all these issues, as we have done with internet regulation. We now have the Online Safety Act, and are all now waiting for the regulator to do what it needs to do, but there is no doubt that we—and not just us but Governments around the world—have been playing catch-up with the growth of the internet. The issues relating to technology and education, and how we keep our young people safe, are not new; we need to think them through and try to keep as ahead as we possibly can of the challenges.
In the time available, I will make two points. One relates to the curriculum and the other relates to character education, my favourite subject. It seems to me that, over the course of the past nine years, since I had the fortune to become the Secretary of State for Education, which is a fantastic role, our curriculum has slipped behind somewhat in being relevant for the 21st century. Knowledge is very important, but the world has changed, along with the way that we all access that knowledge—that genie will not go back in the bottle. As the noble Baroness said, getting young people to understand the risks of sharing their data but also being confident about broader issues relating to data analysis and the use of statistics are things that our curriculum does not adequately teach now. A lot of the rest of us, who are not in school or college, could also benefit from lessons in these things, so a programme of adult education on these matters would not go amiss either.
I used to get a lot of lobbying about the taking of exams: why do we still ask young people to sit in rooms for three hours scribbling on a piece of paper? Again, the recall of knowledge is important, but there are ways of designing the use of technology in exam settings that would stop people accessing information on the internet to help them but also reflect the fact that, when you get out into the big wide world and the workplace, people will be using technology. I say this not just as somebody whose handwriting is abysmal: the fact is that I type every day and do not write that much anymore. We have to reflect that fact.
The other point is the need to be sceptical about what young people are finding out from artificial intelligence and the internet. Again, all of us could benefit from lessons in that. But if young people and those who are teaching them are going to use artificial intelligence in education, let us work with them to make sure that they are confident in how they use it, how they check what it produces and any underlying biases in the AI that they are using.
My final point is on character. I firmly believe that our education system is for teaching not just knowledge but characteristics—values, virtues, things such as integrity, honesty, curiosity and the desire to constantly learn. That is more relevant than ever when you have the influence of technology in our classrooms. I would really welcome my noble friend’s comments on the need to update the curriculum to reflect the use of data and AI technology in the modern world, but also how schools will teach character skills to help young people to really use AI and technology in a way that benefits their education.
My Lords, it is a great pleasure to follow the noble Baronesses. I do not think that there was a word that the noble Baroness, Lady Morgan, said that I did not agree with. I declare my interests at the outset: I, too, am a political adviser to AI in Education; I chair a multi-academy trust, E-ACT; I am a director of Suklaa, whose clients include Iris Software and Goodnotes; I am a director of Macat; and I chair the boards at Century Tech and EDUCATE Ventures Research. I am very proud that the last two are headed up by two great experts on AI and education, Professor Rose Luckin and Priya Lakhani.
I am a long-term evangelist for the use of technology in education, as well as for change in education and our school system, but I recognise the efficacy problem that the noble Baronesses, Lady Kidron and Lady Morgan, talked about. I signed off, and was responsible as a Minister for, the Harnessing Technology grants—rather a lot of money was spent on rather a lot of whiteboards. I am not sure that they made a massive difference when we did not accompany that investment with the training of teachers to transform their pedagogy to go with it, and we need to learn from that.
It is also fair to reflect that, with the current orthodoxy of the curriculum—what we require of young people and how they take tests writing on paper with pens in large sports halls every summer—perhaps we do not need technology. It may well be that, given that that system has not really changed for the last 50 to 70 years, we know how to teach it. If we think that that is right and we should preserve the status quo for ever, then perhaps we do not need technology. But I happen to believe, particularly with the workforce crisis that we face in our schools, and the changing environment externally that the noble Baroness, Lady Morgan, talked about, that we need to change.
I am guided by the work back in the late 1990s of Professor Ruben Puentedura from Boston who talked about his SAMR model—that is, substitution, augmentation, modification and redefinition. It is only when you get to the modification or redefinition of pedagogy that you achieve proper gains with the application of technology in education.
Currently we have a curriculum problem for the reasons outlined by the noble Baroness. We have an opportunity for change enabled by technology assisting teachers, and technology is making that change inevitable and essential. In order to realise that opportunity, we have to be mindful of some of the problems of safety, data and privacy, the digital divide—the divide around access to devices and data—and the confidence of teachers and learners to be able to use technology and of parents to be able to support their children during homework using technology. We have to be mindful of all those things, but they should not be an obstacle to progress.
There are alternative visions. There is a dystopian vision where technology replaces teachers and young people are isolated, learning on screens, cramming for tests of knowledge and ultimately falling behind machines because they leave school unable to compete with highly intelligent machines and their ability to regurgitate knowledge far more accurately than humans ever could. At the same time, in that dystopian world, we would have all the problems of data privacy and privatisation that the noble Baroness, Lady Kidron, talked about.
The utopian vision is of technology as a co-pilot to teachers, keeping them informed about the differences in their class, the scaffolding gaps in their children’s knowledge and the skills that those children need, as technology helps them to interpret how their children are doing. This vision includes the opportunity for flipped learning, so that the instructional, knowledge-based elements of learning can be done at home using technology and school becomes a human place of social interaction and group work, applying knowledge in exciting ways that teachers are not at present equipped and trained to deliver. With the application of technology, there is an opportunity to do that and to develop a more rounded curriculum, powered by novel forms of assessment with portfolios as endpoint qualifications that can deliver higher education entrance—a transformation from where we are at the moment and, to my mind, hugely exciting.
Artificial intelligence represents an opportunity. There are opportunities for tools for workload and workflow and pedagogic tools around adaptive learning, formative assessment on the fly and being able to deliver project-based learning in a way that is currently practically really hard for teachers but could be made a lot easier, thereby engaging all learners with relevant knowledge and skills in a way that is currently inconceivable.
However, we have to be mindful of the risks. I am interested in data trusts for public services and in whether we can set up trusts in statute not only for the NHS but for education so that we can own and control the use of children’s data, navigate which commercial partners we might want to use and get some return on the AI that that data is being used to train so that we can use that to help to fund our education system if that intellectual property is then exploited overseas.
The Minister will not be surprised that I question why we are investing £2 million of public money in Oak National Academy without procurement for it to do AI development, rather than using the private sector and others or even going through any kind of procurement to see how we might do that. Generally, I would love to see Oak repurposed into a modern-day version of BECTA that could properly advise the system on the safety, efficacy and workload implications of technology and generate the best-value procurement possible.
Edtech is a great opportunity. The need for change is pressing. We should chase after the utopian vision, with technology for good being embedded in what we do in our schools.
My Lords, it is an honour to follow the noble Lord, Lord Knight, who has just demonstrated his extraordinary depth of knowledge in both education and technology. In fact, it is rather daunting to speak after him and the two noble Baronesses. Rather than declare my interests, I feel I have to do the opposite and declare my lack of knowledge, as I am not an educationalist at all, other than being the mother of two teenage daughters. I speak solely from my experience in digital transformation and digital regulation in other sectors.
With that caveat, I will dare to make a few points. From other sectors, there are four things that we know, which I would like to pull out. The first is obvious: the huge opportunity coupled with the risks that digital technology brings. The yin and the yang are visible in every single place that digital goes. The second thing we know is that you cannot stop it. As my noble friend described, the Luddites failed, as has everyone else who has attempted to stop technology. Like water in a flood, it finds a way through. You cannot ban it; you also cannot ignore it. We know that from every other sector.
Thirdly, the problem is not the technology, but the people. In every sector, it is people who make technological change hard. While 98% of the population embrace technology in an open, whole-hearted, moral and legal way, there will always be those who use technology in other ways. Change, as the noble Lord, Lord Knight, referred to, involves people changing. We know that from every sector that digital has touched, but it is hard in every sector.
Fourthly, every sector is learning that it has to lean in itself. It is not possible to do what my parents did, which was to abdicate responsibility for the DVD or video player to the younger generation to program, because they did not know how. With technology, it is hugely tempting to want to abdicate responsibility to the “experts”, to the CTO or the technology function. Every sector is learning that you cannot do that. Educationalists, just like politicians, cannot abdicate this to other people. We have to lean in and learn ourselves.
It is here already. As I tried to mug up a little bit on the edtech sector in advance of this debate, I was really struck by some statistics from an RM Technology research pamphlet, published in June 2023. It did some research on 1,000 secondary school students this summer: 67% of them already used chatbots such as ChatGPT—67%, just six months after it launched—and 48% said that excluding it would really hold them back. However, 38% said they felt guilty about using it. Teenagers are expressing the yin and yang already: the opportunity and the threat of that new technology.
Those of us who have worked together on online safety for many years know that we were far too slow to challenge the tech exceptionalism in child online safety. We were far too slow to win the argument that self-regulation was patently not going to be fine. I worry that there is a real risk of almost a double exceptionalism here: the tech exceptionalism, of “Don’t worry, self-regulation will be fine”, coupled with the “Education is different, it’s all a bit too complicated; we need to leave it to the educational establishment and teachers—don’t worry”. Through that double exceptionalism, I was shocked to discover that the age-appropriate design code does not apply to education technology. I do not know why. Can my noble friend the Minister say why we would not extend the age-appropriate design code to edtech? We know that safety by design is the way to build in the right checks and balances for opportunity and risk in digital. If that is not regulated, it does not happen—we have seen that time and again in social media. While it is easy for me to say, “Lean in”, we must really invest to lean in and learn about the technologies. Can my noble friend the Minister say what the Department for Education is doing to build its knowledge as these new technologies grow?
I sit on the Lords Communications and Digital Committee, which is currently doing an inquiry into large language models. We have asked a whole series of regulators how prepared they are to regulate AI. I am ashamed to say that I do not think we have asked anyone in education, so I will do so now. I am keen to understand what the department is doing to build its expertise in large language models, because we can see they are being used. How many AI experts and data scientists does the department have? Is it starting to put together a regulatory sandbox? These are all questions we are asking other regulators and I suggest that the Department for Education should look at them too.
Like the noble Lord, Lord Knight, I too want to highlight the importance of digital inclusion. It is all very well for us to discuss the opportunities and risks of all this wonderful technology, but the harsh reality is that far too many children are growing up in this country without access to it at all. According to Ofcom’s 2023 media use and attitudes report, 19% of 16 to 24 year-olds use only a smartphone to go online. Imagine trying to do your homework just with a smartphone—possibly one that is shared among the whole family. That is a huge disadvantage, which serves to exacerbate all the things that I know the department is working so hard to try to improve.
The report showed that 28% of 16 to 24 year-olds are only “narrow” internet users, which Ofcom defines as those who use the internet for only one to four activities out of a defined list of 13. These are not technical activities—buying things, streaming videos, looking for jobs or using the internet for research. That is a very large proportion of our young people without a broad range of basic digital skills. What are we doing in education to ensure that all pupils have basic digital skills and access to more than just a smartphone?
The opportunities are so great—I am a tech evangelist in so many ways—but the risks are also very real. As the noble Baroness, Lady Kidron, said, standards and oversight need to be in every sector. Probably none is more important than education.
My Lords, I, too, thank the noble Baroness, Lady Kidron, for initiating this relevant and highly pertinent debate. I confess that tech is not my area of expertise, but I have received so many briefings and emails and so much helpful advice that I am now well aware of the importance of edtech in schools.
I was a teacher before technology. If we needed to duplicate, we had a jelly tray on which you put one sheet at a time. I seem to remember the print came out purple, for some reason. The advent of photocopiers was a revolution for teachers—the heady days of yesteryear—but, as we have heard, educational technology is on the rise and, as the noble Baroness, Lady Harding, said, we cannot halt it. However, we need to learn how to manage it so that it is our servant and not our master. Much of what I was going to say has been said, but of course I have not said it. I shall try not to be repetitive.
There is always the danger that students are likely to be one step ahead of teachers, as the young have grown up with technology whereas many teachers have had to learn it. As others have pointed out, there are dangers for the social development of pupils if they rely too much on technology and not enough on their own learning. There is also a danger of taking the personal interaction between teacher and pupil out of the picture.
My daughter was a primary teacher during Covid, working excessive hours to ensure that her four year-old pupils continued their education, albeit in a strange and unusual way. Her first task was always to ensure that they had access to a computer and to an adult who could use it, and then to construct relevant and interesting lessons to ensure that they did not lose out. We share concerns about the Oak National Academy, which was set up during Covid to support remote learning, which was new to pupils and teachers. Can the Minister say what the status of the Oak National Academy is now? AI was supposed to help teachers with lesson planning and other materials that would reduce their workload, but it is not at all certain that that was achieved.
We have heard from Jen Persson, the director of Defend Digital Me, who writes:
“To reduce the debate on edTech to questions of data processing or particular pros and cons of a single product is to misunderstand the socio-political and economic underpinning and goals of the edTech market”.
Jen raises concerns that
“the introduction of many common technology tools, apps and platforms into the school setting means the introduction of hundreds, often thousands, of strangers who influence a child’s life through interactions with companies and their affiliates in the digital world”.
Others have pointed this out. They say these platforms are by no means secure and can
“bypass the gatekeepers within the school system to deliver EdTech directly to young people, their families and lifelong learners”.
In other words, the privacy and safety of children may be compromised by these exciting new tools. The issue of the privacy and safety of children must surely be addressed, as we heard from the noble Baroness, Lady Kidron, and others.
For teachers who are overworked and underpaid, there could be help with their workload if they are provided with a personalised AI lesson-planning assistant, but, once again, we need to know how secure these assistants will be. Schools may decide to use tools and platforms to help with management and administration, monitor the progress of students and communicate with other staff members and even with parents. There are copious uses of AI. However, we raise concerns about the cost of the equipment, such as interactive whiteboards, laptops or tablets. They do not come cheap and, as we know only too well, school budgets are stretched to the limit. So what priority will these have in the decisions of head teachers? If payment for those things means that schools go without other things, we have to address that carefully.
We are certainly well aware of the use of edtech for special educational needs. My colleague, the noble Lord, Lord Addington, who is dyslexic, has always relied heavily on devices to assist him. Many other students with different needs will find invaluable the use of adaptive technologies, such as braille machines and other pieces of equipment for blind students. Edtech can be transformational for students who otherwise would miss out on education.
Could technology also be used to ease teacher workload of lesson planning, marking and assessment? Our teachers provide an amazing service to pupils, parents and the country, and anything that helps to reduce workload has to be welcomed. However, once again, we need to be assured of confidentiality in relation to young people. AI might tackle some of the administrative tasks that might keep teachers from investing more time with their peers or students.
There are arguments that edtech could contribute to pastoral support, mental health and pupils’ well-being, but surely only up to a point. The personal touch of teachers and parents can never be sidelined. According to the Government, the UK’s edtech sector is the largest in Europe. They also report that UK schools already spend an estimated £900 million a year on educational technology. If that means that it improves learner engagement and progress, this has to be money well spent. We know that during Covid edtech was invaluable, but surely machines, however sophisticated, can never replace face-to-face teaching.
I will digress slightly by saying, particularly in response to the comment by the noble Baroness, Lady Morgan, that the noble Lords, Lord Knight and Lord Aberdare, and I are on a committee looking at 11-16 education, and we have concluded that GCSEs have completely failed our young people. Our report will come out in December, and I urge noble Lords to look at it because the whole process of 11-16 education is deeply flawed at the moment.
I look forward to the Minister’s reply and hope that the country’s students will be able to benefit from dedicated teachers and world-class technology.
My Lords, it is a pleasure to speak in this debate. I pay tribute to the work of the noble Baroness, Lady Kidron, in promoting the interests of children in relation to AI and the need to put them at the heart of the debate on AI and online safety. Like the noble Baroness, Lady Garden, I am not an expert in technology, so I feel slightly at a loss compared with those with greater knowledge in the Room, but I have learned a huge amount this week and in this debate.
Every part of our lives is already being affected by AI, but there is a huge divide between those who understand how it works and how it affects us, and those who do not. However, all policy areas should have a renewed focus on the risks and opportunities of AI, and this should be at the front and centre of our work here in Parliament. As the Tony Blair Institute has said, this is a technology with
“a level of impact akin to the internal combustion engine, electricity and the internet, so incrementalism will not be enough”.
As the noble Baroness, Lady Harding, said, we cannot stop it.
I agree with the noble Baroness, Lady Kidron, that each part of the question could fill a debate on its own. She highlighted the global issues of inequality, which we should be concerned about. I will, however, focus on the UK in my remarks. Her examples of the need to ensure that children do not lose the opportunity to socialise and develop socially were powerful. Can the Minister provide reassurance on this and on the online safety issues and the need for safety tech? The noble Baroness, Lady Morgan, noted the advantages to some pupils with special educational needs, as did the noble Baroness, Lady Garden. This offers an immense opportunity. Is the Minister confident that this is being used effectively by schools and promoted effectively by the department?
My noble friend Lord Knight spoke about the need to redefine pedagogy to reflect tech change. This has to be a priority for all of us. I agree that we do not need to assume that we are going to have a dystopian future, but we need to have a balanced debate between this and the utopian vision. Sometimes, there is a big divide between those who see it as a dystopia and those who see it as a utopia. We need to find somewhere in the middle, otherwise we will not be able to embrace the potential, both for the children and for the country, and provide the safeguarding required.
Covid clearly fast-tracked technology in our schools. Technology clearly has the power to transform our education system. But we should not assume that technological advancements in our classrooms will automatically lead to educational advancements. Technology will not be the silver bullet that alone recruits, retains or replaces the teaching staff we desperately need. It will not rebuild our schools or bring a generation of persistently absent children back into classrooms—although there may be some ways in which it can help in terms of the administration of some of these issues.
As the pace of impact of educational technology threatens to outstrip our ability to respond to individual developments, we must work with schools, colleges, universities, employers and unions, as well as pupils, parents and others with parental responsibility, to create an overarching strategy that can address the challenges, risks and opportunities that technology poses. I agree with the noble Baroness, Lady Morgan, that the curriculum needs to change. Her suggestion about adult education would perhaps ensure that policymakers better understand the tech as well, and I would work on that.
The noble Baroness, Lady Garden, and my noble friend Lord Knight raised points about the Oak National Academy. The recent announcement on the new role of AI on the platform warrants additional answers from the department. Concerns have already been raised about the operation, evaluation and assurance at Oak National Academy. AI only serves to amplify these concerns. Could the Minister tell us how much public money is being spent on this and what exactly it will provide? Will it provide exactly what teachers want and need?
Labour knows that we must better prepare our children and young people for the coming digital future. They must be able to use new, emerging and future technology. They must also understand how to shape these technologies and understand their opportunities, risks and limitations. The questioning style and the critical skills we need to teach children in this emerging area are vital. We must ensure that all young people are equipped with both literacy and numeracy skills as well as analytical, critical thinking, problem-solving, creative and collaborative skills that will enable them not only to adapt to change but to lead it and understand what their roles and opportunities are within this new technological world. In this context, I welcome the work undertaken by the organisation AI in Education and note the work done by the noble Baroness, Lady Morgan, and my noble friend Lord Knight. Could the Minister outline how the DfE is engaging with and learning from this group and ensuring its professional perspective and expertise? I was staggered by the number of people involved when I looked through the website. It is a huge resource. How is the DfE utilising this expertise and the expertise of other groups, including those that have been mentioned in this debate?
I want to finish on the third question posed by the noble Baroness, Lady Kidron, on privacy for children and online safety, and also raise questions on the potential for bias in AI algorithms, which may end up causing issues in all settings, and in educational settings in particular.
Can the Minister outline how the Government intend to protect the interests of children, not least in relation to privacy? Are they exploring measures from the Netherlands and Denmark, as the noble Baroness, Lady Kidron, highlighted? What advice are the Government providing to schools about the use of AI, and will they insist on safety by design, as the noble Baroness, Lady Harding, suggests? I will finish with a quote from the World Economic Forum:
“There is no doubt that artificial intelligence will change the way children interact with their surroundings including their learning, play and development environment. However, it is our responsibility to ensure that this change becomes a force for good”.
My Lords, I join other noble Lords in thanking the noble Baroness, Lady Kidron, for her work in establishing standards for online safety and privacy, and for securing this debate. Her speech highlighted many of the risks inherent in these technologies as well as some of the opportunities. My noble friend Lady Harding felt daunted after just a couple of your Lordships’ speeches, but I feel even more daunted coming at the end after such expertise and insight from your Lordships.
I am pleased to say in response to the question from the noble Baroness, Lady Twycross, about our work with AI in Education that we have been working and liaising with it, and I share the noble Baronesses’ respect. I also spent time on its website recently, and I was stunned at the range of resources that it has created. I was fortunate enough to be part of its conference yesterday, which was an incredibly vibrant event bringing together many teachers and educators from around the country.
My noble friend Lady Morgan suggested that the Government need to avoid playing catch-up; I am sure she will recognise that it feels particularly hard for government, which is perhaps not generally famed for its agility, to operate and not play catch-up in an area where the pace of change is so extraordinarily fast. The way I would try to characterise this for my noble friend is that we are looking at this through two lenses. The first is to stay very close to teachers and work closely with them to understand what their immediate needs and worries are in relation to these technologies, and make sure that we can respond to those where appropriate. However, this is also about working very hard on the medium and longer-term issues—I will touch a little more on that in my remarks, but I do not want to underestimate the scale of the task because I know my noble friend Lady Morgan does not either.
We want to create an environment where all schools and trusts can use technology to improve access to education and outcomes, reduce staff workload and run their operations more efficiently. Technology is certainly not an easy solution to all this, and the noble Baroness, Lady Kidron, raised important questions on the role of government in protecting students from the harms of technology. She asked whether we will introduce a data protection regime designed for school settings; we are developing the Education Privacy Assurance Scheme—or EPAS to its friends—to work with education settings to help them understand and deliver their obligations and responsibilities in relation to data protection legislation. However, I will look more closely at the points she raised about the Data Protection and Digital Information Bill, and I will of course come back to the noble Baroness in writing with an update on that.
In relation to whether we will introduce standard procurement contracts, we are currently looking at the ways in which we can make the procurement of technology easier for schools. We have five ICT frameworks in place, which are accessible via the find a framework service, and we are looking at how we can support schools beyond the frameworks, such as by providing support with developing specifications.
In relation to the peer review of education technology, we have the same expectations for robust evidence for education technology as we do elsewhere in education. I think the House would acknowledge that we are genuinely world leading in the quality of our education research, and so only where there is robust evidence of the impact of technology will we go further in actively encouraging adoption of that technology in the classroom. We have provided £137 million to the Education Endowment Foundation. Its upcoming research trials will explore teaching approaches that use educational technology, including which features of the technology, and how they are used, may support academic attainment—or not, as the noble Baroness suggested.
In relation to filtering and monitoring, we have published standards to help schools understand their responsibilities and statutory duties to safeguard children online, and we have embedded these standards in our Keeping Children Safe in Education guidance. That update was launched in March of this year and the standards have had over 100,000 views, so this is clearly touching something that feels very relevant to schools. We have also provided useful links to training materials and guidance to support schools, including commissioning the UK Safer Internet Centre to create and run a series of webinars.
We have set technology standards on connectivity, cybersecurity, filtering and monitoring, use of the cloud, and servers and storage. We want all schools to meet these standards, which is one reason why we have provided £200 million of investment to upgrade schools that fall below our wifi standards. We are also piloting a digital service to help schools to benchmark their technology, identify areas of improvement and implement these recommendations. We are currently testing it in Blackpool and Portsmouth, and will open it up to more schools next year.
We know that technology evolves at pace and that adoption of generative AI is ever more widespread. We must work very closely with the whole education sector to provide support on how best to use the technology, maximising opportunity while minimising risk. My noble friend Lady Harding asked what the department is doing in relation to LLMs and some of the points she raised are certainly on our radar, or are things that we are actively working on. We began by launching a call for evidence on generative AI in education over the summer. We had 567 responses from practitioners, edtech companies and AI experts across all stages of education, and we will publish the responses this autumn.
In October, we began work with Faculty and the National Institute of Teaching to understand the possible uses of generative AI in education, in a safe setting, exploring the opportunities that this technology presents to reduce teacher workload; to improve outcomes, particularly and explicitly for children with special educational needs, as referred to by my noble friend Lady Morgan, and those from disadvantaged backgrounds; and to use the technology to run school operations more efficiently.
We have held our first hackathon, which was huge fun as well as very insightful. I hope that we can expand some of that work in the new year, and we will publish the findings in spring 2024.
I absolutely agree with the suggestion from the noble Lord, Lord Knight, about AI becoming a co-pilot with teachers. There has perhaps been a focus on using technology to substitute for things that teachers already do rather than using it to enhance what they could do.
We have worked closely with Ofqual, Ofsted, the Office for Students and the Education Endowment Foundation as we develop our thinking. We are exploring the role of the Government in relation to the aggregation and curation of content, which the noble Lord, Lord Knight, referred to. We are also exploring our regulatory approach, including the role of a regulatory sandbox for looking at the behaviour of individual products, helping us understand what our regulatory approach should be and, as also picked up by the noble Lord, Lord Knight, looking at how we can maximise the value of our educational IP.
The noble Baroness, Lady Twycross, talked about the importance of children socialising. There are rightly concerns about tools that are serving children directly but, as the Committee has heard, our initial focus has been more on working with teachers and looking at some of the back-office functions. There is a tension and a need to hold on to the short-term pressures that teachers face in relation to the risk of plagiarising, for example; the medium-term issues about curation of content and regulation; and the really big-picture philosophical issues about how we think a classroom will look in five, 10 or 15 years.
My noble friend Lady Morgan and the noble Lord, Lord Knight, asked about and challenged the current curriculum. I remind the Committee that our focus on numeracy and literacy and a knowledge-rich curriculum has helped us to be ranked the highest country in the western world for the reading ability of nine and 10 year-olds. We rank fourth out of 43 countries that assessed children at the same time for the PIRLS 2021 survey. Similarly, we have seen significant improvement in maths. I am happy to write to noble Lords with more detail on the digital content in our curriculum.
My noble friend Lady Harding asked about the exemption from the age-appropriate design code. There are exemptions for low-risk services, which include those managed by education providers that are already subject to regulatory frameworks such as the Keeping Children Safe in Education framework.
Finally, in the last minute—which I do not have—I turn to the questions from the noble Baronesses, Lady Garden and Lady Twycross, about the role of Oak. Oak has been established as an arm’s-length body and is working very collaboratively with the education system and with teachers across the country to develop free curriculum resources.
I end by crediting the hard work and tenacity shown by teachers and leaders up and down the country, and by reassuring the Committee that the Government remain committed not only to supporting schools and students to achieve the best possible results but to consulting and working closely with the sector as we develop our work on the technology that will touch every child and teacher.
Committee adjourned at 4.59 pm.