MediaTek advances generative AI in edge devices with Meta’s Llama 2 LLM
Using Meta’s LLM alongside its newest APUs and the NeuroPilot AI Platform, MediaTek aims to build a comprehensive edge computing ecosystem that accelerates AI application development on smartphones, IoT devices, vehicles, smart homes, and other edge devices.
The majority of generative AI processing now takes place in the cloud, but MediaTek’s implementation of Llama 2 models will allow generative AI apps to operate natively on-device as well.
Developers and consumers stand to gain smoother performance, greater privacy, improved security and reliability, lower latency, the ability to operate in areas with little to no connectivity, and reduced operating costs.
To take full advantage of on-device generative AI, edge device makers will need to adopt high-performance, low-power AI processors and faster, more reliable connectivity.
Every MediaTek-powered 5G smartphone SoC released to date includes an APU capable of a broad range of generative AI functions, including AI Noise Reduction, AI Super Resolution, and AI MEMC.
Furthermore, MediaTek’s next-generation flagship chipset, due later this year, will include a software stack optimized for Llama 2, as well as an upgraded APU with Transformer backbone acceleration and reduced memory footprint and DRAM bandwidth usage, further improving LLM and AIGC performance. These advances will accelerate the development of use cases for on-device generative AI.
“The increasing popularity of Generative AI is a significant trend in digital transformation, and our vision is to provide the exciting community of Llama 2 developers and users with the tools needed to fully innovate in the AI space,” said JC Hsu, Corporate Senior Vice President and General Manager of Wireless Communications Business Unit at MediaTek. “Through our partnership with Meta, we can deliver hardware and software with far more capability on the edge than ever before.”