Recent machine learning news in the semiconductor industry

7 months ago

➀ This tutorial demonstrates how to implement AI-powered gesture recognition using the Edge Impulse platform and the IndusBoard Coin microcontroller.

➁ It includes a step-by-step guide on setting up the development environment, collecting sensor data, training a machine learning model, and deploying it on the IndusBoard Coin.

➂ The project aims to enable AI tasks on compact devices with limited resources, such as smartwatches and phones.
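The workflow above (collect sensor windows, extract features, train a classifier, deploy) can be sketched in miniature. This is an illustrative Python sketch using scikit-learn on synthetic accelerometer data, not the Edge Impulse pipeline itself; the gesture names, feature choices, and signal shapes are all hypothetical stand-ins.

```python
# Minimal sketch: classifying gesture windows from accelerometer data.
# Synthetic signals stand in for the sensor recordings collected in the
# tutorial; Edge Impulse would normally handle features and training.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_windows(n, label):
    # Each window: 50 samples of one axis; "shake" has larger amplitude.
    t = np.linspace(0, 1, 50)
    amp = 1.0 if label == "wave" else 2.5
    base = amp * np.sin(2 * np.pi * 5 * t)
    windows = base + 0.3 * rng.standard_normal((n, 50))
    # Embedded-friendly features: mean, std, min, max per window.
    return np.column_stack(
        [windows.mean(1), windows.std(1), windows.min(1), windows.max(1)]
    )

X = np.vstack([make_windows(100, "wave"), make_windows(100, "shake")])
y = np.array(["wave"] * 100 + ["shake"] * 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

For deployment on a device like the IndusBoard Coin, a model this small would typically be exported to C via the platform's tooling rather than run in Python.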

AI, IndusBoard Coin, Microcontroller, machine learning
8 months ago

➀ Qualcomm Technologies has unveiled the Snapdragon X, a new SoC designed for AI-enabled laptops and compact desktops, priced starting at $600.

➁ The Snapdragon X boasts advanced CPU and GPU capabilities, along with dedicated AI acceleration, aiming to make powerful computing more accessible.

➂ It is expected to power Windows 11 laptops and mini desktops from manufacturers like Acer, Asus, Dell Technologies, HP, and Lenovo.

AI, AI Chip, Qualcomm, Windows 11, machine learning
8 months ago

➀ Researchers at Karlsruhe Institute of Technology (KIT) have demonstrated that machine learning (ML) can significantly improve the manufacturing process monitoring of perovskite solar cells, a promising photovoltaic technology.

➁ Using deep learning techniques, they were able to predict material properties and solar-cell efficiencies with high precision, even beyond laboratory scale.

➂ The study shows that ML can help identify process errors during cell production, leading to improved quality control without additional testing methods.
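The idea of predicting cell efficiency from in-line process signals can be sketched as a small regression problem. This is a toy illustration only: the process features, the efficiency model, and the network size are hypothetical, and the KIT study's actual deep-learning pipeline works on imaging data and is far more elaborate.

```python
# Minimal sketch: regressing solar-cell efficiency from process features.
# All data here is synthetic; real monitoring would use measured signals.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical features: annealing temp (C), coating speed, humidity (%).
X = rng.uniform([100, 5, 20], [180, 25, 60], size=(500, 3))
# Toy ground truth: efficiency peaks at a temperature/speed sweet spot.
y = (20
     - 0.002 * (X[:, 0] - 150) ** 2
     - 0.01 * (X[:, 1] - 15) ** 2
     + 0.1 * rng.standard_normal(500))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=1),
).fit(X_tr, y_tr)
print(f"R^2 on held-out cells: {model.score(X_te, y_te):.2f}")
```

A model like this could flag process drift when predicted efficiency drops, which is the quality-control use the summary describes.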

AI, machine learning, solar energy
9 months ago
➀ The Technical University of Graz (TU Graz) is developing a real-time lightning risk assessment system to improve safety at outdoor events and construction sites.

➁ The system uses a network of field meters and combines data from lightning location systems and weather radar to predict lightning strikes.

➂ The research aims to reduce downtime and enhance safety by providing more accurate forecasts.
Safety, TU Graz, machine learning
9 months ago

➀ Current state-of-the-art machine learning applications use deep neural network models so large and complex that they exceed the limits of traditional electronic computing hardware.

➁ Researchers at MIT and other institutions have developed a new type of photonic chip that can overcome these barriers.

➂ The optical device can complete the key calculations of machine learning classification tasks in less than half a nanosecond while achieving an accuracy rate of over 92%.

machine learning
10 months ago
➀ The Fraunhofer Institute for Production Systems and Design Technology IPK presents research projects and solutions addressing the challenges of modern manufacturing.

➁ The focus is on data management, flexible manufacturing processes, machine efficiency, human resource support, and sustainable production.

➂ The article highlights the importance of holistic system solutions and interdisciplinary collaboration in research and innovation.
AI, Industry 4.0, Manufacturing, Research and Development, innovation, machine learning, sustainability
10 months ago
➀ AMD hinted that Call of Duty: Black Ops 6 might be the first game to feature its next-gen FSR, likely FSR 4, which will use AI for the first time.

➁ AMD is collaborating with Activision to enhance the game experience with FSR 3.1 and future AI-based FSR.

➂ AMD's Jack Huynh revealed that the company is working with Activision to enable next-generation ML-based FSR on Call of Duty: Black Ops 6.
AI, AMD, Call of Duty: Black Ops 6, FSR 4, machine learning, upscaling
11 months ago
➀ Sean Park discusses Point2 Technology's mission to provide ultra-low power, low-latency interconnect solutions for AI/ML datacenters.

➁ He explains the challenges of scaling bandwidth and maintaining efficiency in AI/ML datacenters.

➂ The article explores the company's e-Tube technology and its potential to revolutionize interconnect technology.
AI, Datacenter Networking, Interconnect Technology, SEMICONDUCTOR, Smart Retimers, UltraWire, e-Tube, machine learning
10 months ago
➀ Alex Bronstein, known for groundbreaking innovations, starts his professorship at the Institute of Science and Technology Austria (ISTA) with a focus on machine learning in the life sciences.

➁ Bronstein has a background in both academic research at the Technion in Israel and industry at Intel, where he developed 3D sensing technology.

➂ He aims to expand the boundaries of machine learning for applications in the life sciences and to contribute to strategic directions in structural and cell biology, as well as single-cell analyses.
AI, Technology Development, Technology Transfer, computer vision, innovation, machine learning, research
11 months ago
➀ Researchers from the Leibniz Institute for Astrophysics Potsdam (AIP) and the Institute of Cosmos Sciences at the University of Barcelona (ICCUB) have used a novel machine learning model to efficiently process observation data from 217 million stars of the Gaia mission. The results are comparable to conventional methods for determining stellar parameters. The new approach opens up exciting possibilities for mapping properties such as interstellar extinction and metallicity across the Milky Way, contributing to our understanding of stellar populations and the structure of our galaxy.

➁ The third data release of the European Space Agency's (ESA) Gaia satellite provided access to improved measurements for 1.8 billion stars, a vast amount of data for studying the Milky Way. Efficiently analysing such a large dataset, however, presents a challenge. The newly published study investigates the use of machine learning to determine important stellar properties from Gaia's spectrophotometric data. The model was trained on high-quality data from 8 million stars and achieved reliable predictions with low uncertainties.

➂ The machine learning technique, extreme gradient-boosted trees, enables the determination of precise stellar properties such as temperature, chemical composition, and interstellar dust extinction with unprecedented efficiency. The resulting model, SHBoost, completes its tasks, including model training and prediction, within four hours on a single graphics processor - a process that previously required two weeks and 3,000 high-performance processors.
Astronomy, machine learning