Faculty AI wins £3m education contract for ‘content store’
Contract is latest in a long line of government wins for AI firm
Faculty AI has won a £3m Department for Education contract to build a ‘content store’, the latest in a long line of public sector wins for the firm.
Curriculum and pupil assessment data will go into the content store, where it will be used to develop AI tools and bots for tasks such as marking and generating lesson plans.
The programme was announced by the government in August. Stephen Morgan, the Early Education Minister, said using AI would help to ease the “pressures and workload burdens” on teachers and allow them to “focus on face-to-face teaching”.
At the time the government said that the content store was targeted at “technology companies specialising in education.”
Faculty AI is not a technology company that specialises in education, although it is working with companies that do in order to deliver the contract. A Faculty statement says it will work with ImpactEd Group, Open Education AI and Sir Anthony Seldon’s AI in Education initiative on this contract.
The company will also work with a law firm to ensure proper safeguards around student data. This means that content from pupils will only be used with parental consent, and data will be anonymised.
Faculty’s history of success in public sector
Faculty AI is led by former government AI adviser Marc Warner, and the company gives the appearance of being well connected across Whitehall, having won approximately £45 million in public sector contracts over the last three years.
Faculty AI was the data science partner of choice for one Dominic Cummings and the Vote Leave campaign in 2016.
Cummings also leaned on the firm for guidance in the early days of the pandemic, and a Byline Times investigation in 2021 found that Faculty had won 20 different public sector contracts between 2018 and 2021, including work for the Home Office, DHSC, DCMS, the Cabinet Office and Transport for London.
More recently, Faculty won its contract for providing testing services for the AI Safety Institute (AISI) without any competitive process. The reason given at the time was that no other company had the deep expertise required to test frontier AI systems – an explanation which some questioned given how much of a tech and science powerhouse we are repeatedly told that the UK is.
Faculty’s links to government have continued to build under the Labour-led administration. The company has advised the Secretary of State for Science, Innovation and Technology, Peter Kyle, seconding a member of staff to his office to provide technical advice on AI policy.
The government has certainly leaned into the possibilities of GenAI. Peter Kyle is currently in California, meeting representatives of the big tech players and promoting the UK AISI — an indication that GenAI will form a key part of the government's platform for delivering on its commitments to grow the economy and improve public services.
Nonetheless, despite the possibilities of GenAI for boosting productivity and cutting down on mundane tasks, there are still more than a few questions over accuracy and reliability. A recent study found that LLMs can be tricked into giving wrong answers to quite simple maths problems by adding extraneous information to the question.
There is also the tendency of LLMs to “hallucinate” and the potential prospect of model collapse due to there simply not being enough fresh data to feed the models.