Recent #AI Hardware news in the semiconductor industry
➀ The rise of artificial intelligence has driven significant technological advances in the semiconductor industry, with DeepSeek and the LPU (language processing unit) standing out.
➁ The LPU, designed specifically for language-processing tasks, opens a new path in the AI field with its innovative architecture, strong performance, and low cost.
➂ Groq's LPU reportedly delivers an 18x faster processing speed than GPT-4 served on conventional hardware and roughly a 10x performance advantage over NVIDIA's GPUs (a rough illustration of what these speedups mean follows this item).
➃ Chinese companies like Wuwen Xingqiong are also investing in LPU technology.
➄ DeepSeek's release of the DeepSeek-R1 model demonstrates its competitive edge in the AI field through both performance and cost advantages.
➅ The combination of DeepSeek and LPU is expected to revolutionize AI computing.
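To put the headline numbers above in perspective, here is a rough back-of-the-envelope calculation in Python. The 275 tokens/s baseline and 500-token response length are illustrative assumptions, not figures from the article.

```python
# Hypothetical illustration of what an 18x inference speedup means in practice.
# The baseline throughput and response length below are assumptions for
# illustration only, not figures reported in the article.

baseline_tokens_per_s = 275           # assumed GPU-served baseline throughput
speedup = 18                          # headline LPU speedup quoted above
lpu_tokens_per_s = baseline_tokens_per_s * speedup

response_tokens = 500                 # assumed chat-length response
baseline_latency_s = response_tokens / baseline_tokens_per_s
lpu_latency_s = response_tokens / lpu_tokens_per_s

print(f"LPU throughput:  {lpu_tokens_per_s:.0f} tokens/s")
print(f"500-token reply: {baseline_latency_s:.2f}s (baseline) vs {lpu_latency_s:.2f}s (LPU)")
```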
➀ Luo Yonghao has established a new company, Xihongxian Technology Co., Ltd., which initially focused on developing AR operating systems and hardware products but shifted to AI in response to industry trends.
➁ The company is expected to launch its first AI hardware around Chinese New Year, focusing on software solutions and an AI agent alongside AI-native hardware.
➂ Luo Yonghao, regarded as a pioneer in AI hardware, aims to create revolutionary products with his new venture, Xihongxian, despite earlier setbacks with the TNT AI hardware.
➀ Stanford University is researching a hybrid memory that combines the density of DRAM with the speed of SRAM, funded under the CHIPS and Science Act.
➁ The research is part of the California Pacific Northwest AI Hardware Center project, which will receive $16.3 million from the US Department of Defense.
➂ The team, led by H.-S. Philip Wong, focuses on developing more energy-efficient hardware for AI, with memory at the core of that effort.
➃ The hybrid gain-cell memory combines the small footprint of DRAM with speed approaching that of SRAM.
➄ Like a DRAM cell, the gain cell stores data as charge, but on the gate of a second transistor rather than on a capacitor.
➅ Reads are nondestructive in the gain cell, and the read transistor amplifies the stored signal during a read (a minimal behavioral sketch follows this item's summary).
➆ Liu and Wong's hybrid gain-cell memory, which pairs silicon read transistors with indium tin oxide write transistors, overcomes the short retention of earlier gain cells and achieves a data retention time of more than 5,000 seconds.
➇ These hybrid storage cells can be integrated into logic chips, potentially changing the way memory is used in computers.
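To make the gain-cell operation described above concrete, the following is a minimal behavioral sketch in Python. It models only the qualitative behavior reported here (charge stored on the gate of a read transistor, nondestructive amplified reads, slow leakage); the device parameters are illustrative assumptions, not values from the Stanford work.

```python
import math

# Behavioral sketch of a two-transistor (2T) gain cell: the write transistor
# places charge on the gate of the read transistor, the read transistor turns
# that stored voltage into a current (providing gain) without disturbing the
# charge, and leakage slowly erodes the state. All device values here are
# illustrative assumptions.

class GainCell:
    def __init__(self, retention_s=5000.0, gm=1e-4):
        self.retention_s = retention_s   # assumed leakage time constant (s)
        self.gm = gm                     # assumed read-transistor transconductance (A/V)
        self.v_stored = 0.0              # voltage on the storage node (gate)
        self.t_written = 0.0

    def write(self, v_data, t_now):
        """Write transistor charges the storage node to the data voltage."""
        self.v_stored = v_data
        self.t_written = t_now

    def read(self, t_now):
        """Nondestructive read: sense the read-transistor current.
        The stored charge is not removed, unlike a 1T1C DRAM read."""
        elapsed = t_now - self.t_written
        v_now = self.v_stored * math.exp(-elapsed / self.retention_s)
        return self.gm * v_now           # amplified output signal (A)

cell = GainCell()
cell.write(v_data=0.8, t_now=0.0)
print(f"read at t=1 s:    {cell.read(1.0):.3e} A")
print(f"read at t=5000 s: {cell.read(5000.0):.3e} A")  # ~1/e of the original signal
```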
➀ Celestial AI has incorporated a thermally stable optical modulator into its chips; thanks to its high signal-to-noise ratio (SNR) and low bit error rate (BER), it eliminates the need for a digital signal processor (DSP) and the hundreds of watts of power that DSPs would otherwise consume.
➁ Celestial's photonic interconnect module offers a memory capacity of 2.07TB and a bandwidth of 7.2Tbps, with a latency of just 100 nanoseconds.
➂ With 16 modules, the system supports up to 115Tbps of network switching and provides 33TB of memory capacity for backend/scale-out AI (see the consistency check below).
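As a quick sanity check, the quoted system totals are consistent with aggregating 16 of the per-module figures above; the 16-module count is inferred from that arithmetic and should be treated as an assumption.

```python
# Consistency check of the per-module and system-level figures quoted above,
# assuming the system aggregates 16 photonic interconnect modules (the module
# count is inferred from the arithmetic, not stated directly).

modules = 16
bandwidth_per_module_tbps = 7.2
memory_per_module_tb = 2.07

total_bandwidth_tbps = modules * bandwidth_per_module_tbps   # 115.2 ≈ 115 Tbps
total_memory_tb = modules * memory_per_module_tb             # 33.1 ≈ 33 TB

print(f"aggregate switching bandwidth: {total_bandwidth_tbps:.1f} Tbps")
print(f"aggregate memory capacity:     {total_memory_tb:.1f} TB")
```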