Google unveiled PaLM 2, its latest large language model (LLM), at its annual developer conference last week, but its claims at the time about using a smaller training data set have since been called into question.
A report by CNBC has found that PaLM 2 actually uses nearly five times as much training data as its predecessor, PaLM (Pathways Language Model), giving it the capability to perform in tasks s...