‘Feels like magic!’: Groq’s ultrafast LPU could well be the first LLM-native processor

Groq, a company led by ex-Google engineer and CEO Jonathan Ross, claims to have built the first Language Processing Unit (LPU), a chip it says delivers the fastest speeds for AI applications. The company’s Tensor Streaming Processor (TSP) handles data tasks in a sequential, organized manner, unlike a GPU, which is more akin to a static workstation. The TSP’s efficiency became evident with the rise of generative AI, leading Ross to rebrand it as the LPU. LPUs are energy-efficient, cutting the overhead of managing multiple threads and avoiding underutilized cores. Groq’s scalable chip design also lets multiple TSPs be linked together without the traditional bottlenecks, simplifying the hardware requirements for large-scale AI models. Groq’s first public demo was a lightning-fast AI answers engine that generated answers of hundreds of words in less than a second.
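
That speed claim is straightforward to sanity-check yourself. The sketch below is a minimal tokens-per-second measurement against an OpenAI-compatible chat-completions endpoint; the base URL, model name, and environment variable names are placeholders of my own choosing, not Groq’s documented values, so swap in whatever provider and credentials you actually use.

```python
# Rough throughput check against an OpenAI-compatible chat endpoint.
# BASE_URL, MODEL, and the env var names are assumptions (placeholders),
# not values taken from Groq's documentation.
import os
import time

import requests

BASE_URL = os.environ.get("LLM_BASE_URL", "https://api.example.com/openai/v1")  # assumed endpoint
API_KEY = os.environ["LLM_API_KEY"]                                             # your provider key
MODEL = os.environ.get("LLM_MODEL", "example-model")                            # assumed model name

prompt = "In a few hundred words, explain how an assembly-line-style processor differs from a GPU."

start = time.perf_counter()
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": MODEL, "messages": [{"role": "user", "content": prompt}]},
    timeout=60,
)
resp.raise_for_status()
elapsed = time.perf_counter() - start

data = resp.json()
text = data["choices"][0]["message"]["content"]
# Prefer the server-reported token count; fall back to a crude word count.
tokens = data.get("usage", {}).get("completion_tokens", len(text.split()))

print(f"Generated ~{tokens} tokens in {elapsed:.2f}s ({tokens / elapsed:.0f} tokens/s)")
```

Dividing the completion tokens by wall-clock time gives a rough tokens-per-second figure, which is the metric behind demos that produce answers of hundreds of words in under a second.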

Read more at: https://www.techradar.com