ChatGPT uses 10 times the electricity of a Google search – but cutting out ‘matrix multiplication’ could cut this number without affecting performance
New research suggests that eliminating the ‘matrix multiplication’ stage of the large language models (LLMs) used in AI could slash power consumption without affecting performance. The research was ...
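One known route to removing matrix multiplication, used in recent MatMul-free language-modelling research, is to constrain weights to the ternary set {-1, 0, +1}, so that a matrix-vector product collapses into additions and subtractions, which are far cheaper in hardware than floating-point multiplies. The sketch below is illustrative only (the `ternary_matvec` function and its values are hypothetical, not from the paper):

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x using only additions and subtractions.

    Assumes every entry of W is -1, 0, or +1 (a 'ternary' weight matrix).
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            w = W[i, j]
            if w == 1:
                out[i] += x[j]    # add instead of multiply
            elif w == -1:
                out[i] -= x[j]    # subtract instead of multiply
            # w == 0: contributes nothing, so it is skipped entirely
    return out

# Tiny example: the ternary version matches an ordinary matmul
W = np.array([[1, 0, -1],
              [-1, 1, 0]])
x = np.array([2.0, 3.0, 4.0])
assert np.allclose(ternary_matvec(W, x), W @ x)  # same result, no multiplies
```

A real implementation would vectorise this (and the cited work pairs ternary weights with other architectural changes), but the sketch shows why the multiply units, a major source of GPU power draw, can in principle be bypassed.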