Recent #Data Processing news in the semiconductor industry

4 months ago

ETH Zurich researchers have developed a method that makes AI answers more reliable over time. Their algorithm is highly selective in choosing data. In addition, AI models up to 40 times smaller can achieve the same output performance as the best large AI models.

ChatGPT and similar tools often amaze us with the accuracy of their answers, but they also regularly give cause for doubt. A big challenge of powerful AI answer machines is that they serve up perfect answers and obvious nonsense with the same ease. Much of the problem lies in how the underlying large language models (LLMs) deal with uncertainty: until now, it has been very difficult to judge whether an LLM focused on text processing and generation bases its answers on a solid foundation of data or whether it is operating on uncertain ground.

Researchers from the Institute for Machine Learning at the Department of Computer Science at ETH Zurich have now developed a method to specifically reduce the uncertainty of AI. 'Our algorithm can specifically enrich the general language model of AI with additional data from the relevant thematic area of the question. In combination with the specific question, we can then specifically retrieve those relationships from the depths of the model and from the enrichment data that are likely to generate a correct answer,' explains Jonas Hübotter from the Learning & Adaptive Systems Group, who developed the new method as part of his PhD studies.
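
How such question-driven data selection could look in code is sketched below. This is only a minimal illustration under assumed details, not the ETH Zurich algorithm: the embeddings are random stand-ins, and the relevance-minus-redundancy score with its 0.5 weighting is an assumption made for the sketch, whereas the researchers' method additionally targets the model's uncertainty.

```python
# Minimal sketch of question-driven data selection for enriching a language
# model at inference time. The scoring rule (relevance minus redundancy) and
# its weighting are assumptions for illustration only.
import numpy as np

def select_enrichment_data(question_emb, doc_embs, k=5):
    """Greedily pick k documents that are relevant to the question
    but not redundant with documents already selected."""
    selected = []
    for _ in range(min(k, len(doc_embs))):
        best_idx, best_score = None, -np.inf
        for i, d in enumerate(doc_embs):
            if i in selected:
                continue
            relevance = float(question_emb @ d)
            redundancy = max(
                (float(doc_embs[j] @ d) for j in selected), default=0.0
            )
            score = relevance - 0.5 * redundancy  # trade-off weight is a guess
            if score > best_score:
                best_idx, best_score = i, score
        selected.append(best_idx)
    return selected

# Toy usage with random unit vectors standing in for real embeddings.
rng = np.random.default_rng(0)
docs = rng.normal(size=(100, 64))
docs /= np.linalg.norm(docs, axis=1, keepdims=True)
question = docs[0] + 0.1 * rng.normal(size=64)
question /= np.linalg.norm(question)
print(select_enrichment_data(question, docs, k=3))
```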

AI, AI Ethics, AI research, Algorithm, Data Processing, ETH Zurich, machine learning
5 months ago

➀ The Fraunhofer IPMS is involved in a research project called InSeKT to develop new technological approaches for integrating AI at the edges of IT networks.

➁ The project aims to enable complex calculations directly where data is generated, improving data protection and real-time capabilities.

➂ Fraunhofer IPMS is working on sensor technology, including gas analysis using IMS, near-infrared photodetector evaluation, and adapted use of CMUTs for improved imaging.

AI, AI Chip, Data Processing, Edge AI, Recycling, environmental monitoring, gaming, medical technology, semiconductor, sensor
5 months ago

The Fraunhofer Institute for Photonic Microsystems (IPMS) is involved in an interdisciplinary research project called 'InSeKT' (Development of Intelligent Sensor Edge Technologies). This project, carried out by the Technical University of Wildau, the Leibniz Institute for Innovative Microelectronics (IHP), and the Fraunhofer IPMS, aims to integrate artificial intelligence (AI) more effectively at the 'edges' of IT networks. The project focuses on miniaturized sensor structures and the integration of electronic components, with the goal of enabling complex calculations directly at the data source, such as at the sensor itself.

Current data processing with AI often occurs through central cloud computing solutions, leading to data transfer over large distances and potential data leaks. The project addresses this by promoting decentralized data processing for improved data security and real-time system capabilities.

The project covers various areas, including gas analysis using ion mobility spectrometers (IMS), data-supported evaluation of photodetectors for the near-infrared wavelength range, and the adapted use of capacitive micromachined ultrasonic transducers (CMUTs) for improved imaging. The generated data will be used to train edge AI systems for fast and accurate data processing.
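
To make the edge-versus-cloud idea from the preceding paragraphs concrete, here is a minimal sketch of a sensor node that evaluates its readings locally and transmits only compact alerts instead of streaming raw data. The IMS-style readings, the fixed threshold, and the stand-in "model" are invented for illustration and are not part of the InSeKT project.

```python
# Minimal sketch of decentralized (edge) processing: raw sensor readings stay
# on the device, and only small alert messages are transmitted. All values and
# names below are hypothetical placeholders.
import random

def tiny_edge_model(window, baseline=0.1):
    """Stand-in for an edge AI model: flags a window of IMS-style readings
    whose mean deviation from the ambient baseline exceeds a threshold."""
    deviation = sum(abs(x - baseline) for x in window) / len(window)
    return deviation > 0.3  # both numbers are arbitrary for this sketch

def edge_node(readings, window_size=8):
    """Process raw readings on-device and yield only alert messages."""
    for start in range(0, len(readings) - window_size + 1, window_size):
        window = readings[start:start + window_size]
        if tiny_edge_model(window):
            # Only this small message would ever leave the device.
            yield {"window_start": start, "alert": "anomalous gas signature"}

random.seed(1)
raw = [0.1 + random.gauss(0, 0.05) for _ in range(64)]
raw[40:48] = [0.8 + random.gauss(0, 0.05) for _ in range(8)]  # injected event
for msg in edge_node(raw):
    print(msg)
```

Because only the alert leaves the device, the raw measurements never travel to a distant cloud, which is the data-protection and latency argument made above.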

AI, Cloud Computing, Data Processing, Edge Computing, MEMS, Photonics, Recycling, Sensor Technology, data security, material science
5 months ago

➀ The Universities of Ulm and Duisburg-Essen, together with Bosch and Nokia, organized a demo event for connected, cooperative driving as part of the EU PoDIUM project.

➁ A live demonstration showcased the seamless coordination of automated vehicles in complex traffic situations.

➂ The research focuses on enabling secure interaction between automated vehicles using sensor data and a cooperative maneuver planner.

5G, Bosch, Connected Vehicles, Data Processing, Nokia, automotive
5 months ago

➀ Retrieval Augmented Generation (RAG) is being developed by Fraunhofer IWU to streamline the process of finding crucial information in extensive technical and legal texts; a minimal sketch of the general pattern follows after this list.

➁ The technology is designed to complement Large Language Models (LLMs) by providing precise and exhaustive information.

➂ The team at IWU is using the EU Machinery Regulation (2023/1230) as a demonstration of the technology's capabilities.
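
As referenced above, the following is a minimal sketch of the general RAG pattern: retrieve the passages most relevant to a question and hand them to an LLM together with the question. The toy passages, the word-overlap scoring, and the prompt template are placeholders rather than Fraunhofer IWU's implementation, and the final LLM call is deliberately left abstract.

```python
# Minimal sketch of Retrieval Augmented Generation: score passages against a
# question, keep the best ones, and build a prompt that asks an LLM to answer
# from those passages only. The corpus below is an invented stand-in.

def score(question, passage):
    """Crude relevance score: number of shared lowercase word tokens."""
    return len(set(question.lower().split()) & set(passage.lower().split()))

def retrieve(question, passages, k=2):
    """Return the k passages with the highest overlap score."""
    return sorted(passages, key=lambda p: score(question, p), reverse=True)[:k]

def build_prompt(question, passages):
    """Assemble a citation-friendly prompt from retrieved passages."""
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer using only the numbered passages and cite them.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

corpus = [  # invented stand-ins for sections of a technical or legal text
    "Machinery must be fitted with an emergency stop device.",
    "The manufacturer shall draw up a declaration of conformity.",
    "Safety components are listed in an annex to the regulation.",
]
question = "Which declaration must the manufacturer draw up?"
prompt = build_prompt(question, retrieve(question, corpus))
print(prompt)  # this prompt would then be passed to an LLM of choice
```

In a production setting the word-overlap score would typically be replaced by embedding-based retrieval over the actual regulation text, but the overall flow stays the same.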

AI, Data Processing, LLM, Large Language Models
6 months ago

The Fraunhofer Institute for Integrated Circuits IIS has developed an AI chip for processing Spiking Neural Networks (SNNs). The SENNA spiking neural network inference accelerator, inspired by brain function, consists of artificial neurons and can process electrical impulses (spikes) directly. Its speed, energy efficiency, and compact design enable the use of SNNs directly where data is generated: in edge devices.

SNNs consist of a network of artificial neurons connected by synapses. Information is transmitted and processed in the form of electrical impulses, which makes spiking networks the next step in artificial intelligence: faster, more energy-efficient, and closer to the way the human brain processes information. To bring these advantages into applications, small, efficient hardware that mimics the structure of neurons and synapses is needed. For this purpose, Fraunhofer IIS has developed the neuromorphic SNN accelerator SENNA as part of the Fraunhofer project SEC-Learn.
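
A minimal sketch of the spiking principle described above is shown below: a leaky integrate-and-fire neuron accumulates weighted input impulses and emits a spike only when its membrane potential crosses a threshold. The leak, weight, and threshold values are illustrative and say nothing about the SENNA hardware itself.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic
# building block of many spiking neural networks. All constants are
# illustrative, not taken from SENNA.
import numpy as np

def lif_neuron(input_spikes, leak=0.9, weight=0.4, threshold=1.0):
    """Simulate one LIF neuron over discrete time steps."""
    potential = 0.0
    output_spikes = []
    for s in input_spikes:
        potential = leak * potential + weight * s  # leak, then integrate input
        if potential >= threshold:
            output_spikes.append(1)   # fire a spike ...
            potential = 0.0           # ... and reset the membrane potential
        else:
            output_spikes.append(0)
    return output_spikes

rng = np.random.default_rng(42)
inputs = (rng.random(30) < 0.5).astype(int)  # random incoming spike train
print("in :", inputs.tolist())
print("out:", lif_neuron(inputs))
```

Because information only moves when a spike occurs, such networks can stay largely idle between events, which is where the energy-efficiency argument above comes from.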

SENNA is a neuromorphic chip for the fast processing of low-dimensional time-series data in AI applications. The current version fits 1024 artificial neurons on less than 11 mm² of chip area. Reaction times as low as 20 nanoseconds ensure precise timing in time-critical applications at the edge. This makes it particularly strong in real-time, event-based sensor data processing and in closed-loop control systems, such as the AI-based control of small electric motors. SENNA also enables AI-optimized data transmission in communication systems, where the AI processor can analyze signal streams and adjust transmission and reception methods as needed to improve efficiency and performance.

AI, Chip Design, Data Processing, Edge AI
7 months ago

➀ The EU's ODISSEE project aims to find ways to use AI to process the volumes of scientific data produced by research infrastructures.

➁ The project, funded by the European Union, targets fundamental mysteries such as the nature of dark matter.

➂ Research infrastructures such as CERN's HL-LHC and the SKAO will generate massive amounts of raw data that current digital technologies cannot handle.

AI, Data Processing, European Union
10 months ago

➀ Teledyne e2v has released engineering models of the LX2160-Space, a 16-core Arm Cortex-A72-based SoC processor designed for demanding space applications.

➁ With around 200k DMIPS of compute performance and efficient power consumption, the processor is suited to heavy computing applications such as Earth observation satellites, early-warning systems, and telecom.

➂ Characterization against radiation is ongoing, with plans to have a range of radiation-tolerant space grades qualified up to NASA Level 1 by H2 2025.

Data Processing, Space, Teledyne e2v, semiconductor, telecommunications
about 1 year ago

1. The article discusses the significant development of optical character recognition (OCR) systems in automatic data processing.

2. It highlights the use of magnetic detection and optical scanning to decipher printed numeric information, which is then converted for input to digital processors.

3. The focus is on how major companies such as IBM, National Cash, and National Data Processing have contributed to the market, particularly in the UK, where they have sold magnetic systems to banks for automatic cheque sorting.

Banking Technology, Data Processing, OCR