Recent Edge AI news in the semiconductor industry

6 months ago
➀ GP Singh co-founded Ambient Scientific to develop high-performance, low-power AI microprocessors; ➁ The company's DigAn® technology enables ultra-low power AI applications without cloud dependency; ➂ GPX10 processor addresses inefficiencies in current AI hardware by offering better performance and lower power consumption; ➃ GP Singh emphasizes the importance of semiconductors in improving human lives.
AI, AI Processors, Cloud Computing, Computing, Edge AI, Energy efficiency, Hardware, Innovations, Semiconductor, battery life, technology
2 months ago

➀ Lantronix has launched the Open-Q 8550CS System-on-Module (SOM), built on the Qualcomm Dragonwing QCS8550 processor, offering low-power on-device AI and ML capabilities.

➁ The SOM is designed for demanding AI/ML applications in extreme Edge computing environments, supporting advanced video processing, AI inference, and Edge AI gateway integration.

➂ Key features include low power consumption with a 4nm process, dual eNPU for AI acceleration, and support for high-speed connectivity and multiple display interfaces.

AI, AI Chip, Edge AI, Qualcomm
2 months ago

➀ Synaptics has introduced the SR-Series adaptive microcontrollers (MCUs) to expand its Astra AI-Native platform for Edge AI applications, offering three performance tiers: performance (100 GOPS), efficiency, and ultra-low-power always-on (AON) processing.

➁ These MCUs feature an Arm Cortex-M55 core combined with an Arm Ethos-U55 neural processing unit (NPU), along with multiple camera interfaces, secure memory, and accelerators, targeting applications such as battery-powered security cameras, sensors, and smart appliances.

➂ The SR-Series supports the Astra Machina Micro development kit and an open-source SDK, enabling developers to create context-aware cognitive IoT devices with adaptive vision, audio, and voice processing capabilities.

AI, AI Chip, Edge AI, IoT, MCU
3 months ago

The Fraunhofer Institute for Integrated Circuits IIS has developed an AI chip for processing Spiking Neural Networks (SNNs). The SENNA spiking neural network inference accelerator, inspired by brain function, consists of artificial neurons and can process electrical impulses (spikes) directly. Its speed, energy efficiency, and compact design enable the use of SNNs directly where data is generated: in edge devices.

SNNs consist of a network of artificial neurons connected by synapses. Information is transmitted and processed in the form of electrical impulses (spikes), which positions spiking networks as the next step in artificial intelligence: faster, more energy-efficient, and closer to the way the human brain processes information. Bringing these advantages into applications requires small, efficient hardware that mimics the structure of neurons and synapses. To this end, Fraunhofer IIS developed the neuromorphic SNN accelerator SENNA as part of the Fraunhofer project SEC-Learn.
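For readers unfamiliar with the model, the minimal sketch below shows a leaky integrate-and-fire neuron, the basic building block such accelerators implement in hardware; the leak, threshold, and weight values are illustrative assumptions and do not describe SENNA's actual circuits.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron, the kind of model
# SNN accelerators implement in hardware. Leak, threshold and weight values are
# illustrative assumptions, not SENNA specifics.

def lif_neuron(spike_train, weights, leak=0.9, threshold=1.0):
    """Integrate weighted input spikes over time and report when the neuron fires."""
    membrane = 0.0                                   # membrane potential
    fired_at = []
    for t, spikes in enumerate(spike_train):         # spikes: one 0/1 per input synapse
        membrane = leak * membrane + sum(w * s for w, s in zip(weights, spikes))
        if membrane >= threshold:                    # crossing the threshold emits a spike
            fired_at.append(t)
            membrane = 0.0                           # reset after firing
    return fired_at

# Two input synapses; spikes arrive as binary events per time step.
print(lif_neuron([(1, 0), (1, 1), (0, 0), (1, 1)], weights=[0.4, 0.7]))  # -> [1, 3]
```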

SENNA is a neuromorphic chip for fast processing of low-dimensional time-series data in AI applications. The current version packs 1024 artificial neurons onto less than 11 mm² of chip area. Reaction times as low as 20 nanoseconds ensure precise timing in time-critical applications at the edge, making the chip particularly strong at real-time, event-based sensor data processing and in closed-loop control systems, such as AI-based control of small electric motors. SENNA can also enable AI-optimized data transmission in communication systems, where the AI processor analyzes signal streams and adjusts transmission and reception schemes as needed to improve efficiency and performance.
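As a rough illustration of the event-based idea, the sketch below does work only when a sensor event arrives, which is where spike-driven hardware saves energy; the event format and the control rule are assumptions, not SENNA's actual interface.

```python
# Conceptual sketch of event-based processing: computation happens only when a
# sensor event arrives. Event format and control rule are assumptions, not
# SENNA's interface.

def event_driven_control(events, setpoint=5.0):
    """Consume timestamped sensor events and emit a correction only when an event occurs."""
    commands = []
    for timestamp_us, measured in events:            # idle between events, no polling
        error = setpoint - measured
        commands.append((timestamp_us, "speed_up" if error > 0 else "slow_down"))
    return commands

# Sparse event stream: (time in microseconds, measured motor speed)
print(event_driven_control([(10, 4.0), (250, 6.0)]))  # -> [(10, 'speed_up'), (250, 'slow_down')]
```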

AI, Chip Design, Data Processing, Edge AI
6 months ago
➀ The rise of generative AI, humanoid and service robots; ➁ AI notebooks and servers becoming mainstream; ➂ Advancements in advanced processes and CoWoS; ➃ Enhanced cybersecurity defenses and threat detection; ➄ AMOLED expansion in consumer electronics; ➅ Miniaturization and low-cost production of CubeSats; ➆ Modular end-to-end model production and Level 4 robotaxi commercialization; ➇ EVs and AI data centers driving battery and energy storage innovations.
AI, Battery Technology, Edge AI, Technology Trends, cybersecurity, energy storage, robotics
6 months ago

➀ The rise of edge AI has spurred semiconductor designers to build accelerators for performance and low power, leading to a proliferation of NPUs among in-house, startup, and commercial IP product portfolios.

➁ The complexity of software and hardware around neural network architectures, AI models, and base models is exploding, requiring sophisticated software compilers and instruction set simulators.

➂ The hardware complexity of inference platforms is evolving, with a focus on performance and power efficiency, especially for edge applications.

➃ Combining tensor engines, vector engines, and scalar engines across multiple clusters to meet acceleration demands is complex and costly (see the conceptual sketch after this list).

➄ The supply chain and ecosystem for NPUs are becoming increasingly complex, with intermediate manufacturers and software companies having limited resources to support a wide range of platforms.
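To make point ➃ concrete, here is a purely conceptual sketch, not any vendor's NPU or toolchain, of why heterogeneous engines raise software complexity: some compiler or runtime component must decide which engine executes each operator, and the operator-to-engine mapping used here is an illustrative assumption.

```python
# Purely conceptual sketch (not any vendor's NPU or compiler) of why combining
# tensor, vector and scalar engines increases software complexity: a toolchain
# component must decide which engine executes each operator in a model graph.

ENGINE_FOR_OP = {            # illustrative operator-to-engine mapping (an assumption)
    "matmul": "tensor",      # dense matrix work -> tensor engine
    "conv2d": "tensor",
    "relu":   "vector",      # elementwise ops -> vector engine
    "add":    "vector",
    "argmax": "scalar",      # reductions / control-heavy tails -> scalar core
}

def schedule(graph):
    """Assign each (name, op) pair in a model graph to an engine, defaulting to scalar."""
    return [(name, op, ENGINE_FOR_OP.get(op, "scalar")) for name, op in graph]

model = [("fc1", "matmul"), ("act1", "relu"), ("fc2", "matmul"), ("out", "argmax")]
for name, op, engine in schedule(model):
    print(f"{name:>4} ({op}) -> {engine} engine")
```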

Edge AI
11 months ago
➀ Edge AI is a decentralized approach to AI architecture, processing data near the user rather than in the cloud; ➁ Benefits include lower costs, reduced energy consumption, better data protection, and more robust applications; ➂ Potential applications in industries like automotive, machinery, and medical technology are vast, but a holistic approach is needed to fully leverage Edge AI's capabilities.
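As a minimal sketch of the decentralized idea described above, the example below classifies sensor data on the device and transmits only a compact result rather than raw data; the toy model, threshold, and payload format are illustrative assumptions, not a specific product's API.

```python
# Minimal sketch of edge inference: the model runs on the device and only a
# compact result, never the raw data, is transmitted. Toy model, threshold and
# payload format are illustrative assumptions.

import json

def on_device_inference(sample):
    """Stand-in for a local model: returns (label, confidence) with no network call."""
    score = sum(sample) / len(sample)                # toy scoring for illustration only
    return ("anomaly" if score > 0.8 else "normal", score)

def process_at_edge(sample):
    label, confidence = on_device_inference(sample)  # raw sensor data stays on the device
    return json.dumps({"label": label, "confidence": round(confidence, 2)})

print(process_at_edge([0.9, 0.95, 0.85]))            # -> {"label": "anomaly", "confidence": 0.9}
```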
Cloud Computing, Edge AI, artificial intelligence