Articles & White Papers

2022/01/19

Embedded Logic-NVM Solutions for AI Chips

Last month, eMemory Technology hosted a webinar titled “eMemory’s Embedded Logic-NVM Solution for AI Chips.” While its purpose was to present eMemory’s embedded Logic-NVM solution, the webinar nicely set the stage by highlighting the value of analog NVM for neural networks. The algorithms of neural networks are, of course, core to implementing AI chips, especially for weight storage. Dr. Chen, the presenter, manages one of eMemory’s many technology development departments. What follows is a synthesis of the salient points I gathered from the webinar.

Market Trends

There is a massive migration of AI processing from the cloud to the edge, enabled by emerging AI algorithms. Fast-growing AI applications abound: data inference, image and voice processing and recognition, autonomous driving, cybersecurity, augmented reality, and more. To develop efficient AI chips for these applications, it is important to implement various types of AI processing elements (PEs) with low power consumption and high computing performance.

Neural Networks

Artificial Intelligence (AI) is about emulating the human brain. The human brain, of course, consists of many neural networks, and neurons are the structural and functional units of these networks. It is neurons that perceive changes in the environment and transmit information to other neurons through current conduction. A neuron can be divided into four parts: the receptive zone, the trigger zone, the conducting zone, and the output zone. The basic architecture of an electronic equivalent of the human neural network must include corresponding zones/layers.
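The four zones above map loosely onto the parts of an artificial neuron. The following sketch illustrates that mapping; the function name, weights, and step activation are illustrative choices, not details from the webinar:

```python
# Illustrative mapping of a biological neuron's four zones onto a
# simple artificial neuron. All numbers are made up for demonstration.

def artificial_neuron(inputs, weights, bias):
    # Receptive zone: weighted signals arriving from other neurons.
    # Trigger zone: the summed net input (plus bias) decides whether
    # the neuron "fires".
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Conducting/output zone: the activation (a step function here)
    # shapes the signal passed on to downstream neurons.
    return 1 if net > 0 else 0

out = artificial_neuron([0.5, 0.8], [0.4, -0.2], 0.1)
print(out)  # fires (1): 0.5*0.4 + 0.8*(-0.2) + 0.1 = 0.14 > 0
```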

Analog NVM

A neural network is commonly implemented electronically as a Multi-Layer Perceptron (MLP). The MLP consists of an input layer, hidden layers, and an output layer, all connected via weights (the electronic equivalent of synapses). The input layer maps to the data inputs and weights, the hidden layers to the net-input function, and the output layer to the activation function and output.
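The layered structure described above can be sketched in a few lines of Python. The layer sizes, weight values, and sigmoid activation below are illustrative assumptions, not details from the webinar:

```python
# Minimal MLP forward pass: input layer -> hidden layer -> output layer,
# with weight matrices playing the role of synapses.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Net-input function: a weighted sum per neuron, followed by
    # the activation function.
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

def mlp(inputs, hidden_w, hidden_b, out_w, out_b):
    hidden = layer(inputs, hidden_w, hidden_b)  # hidden layer
    return layer(hidden, out_w, out_b)          # output layer

# 2 inputs -> 3 hidden neurons -> 1 output (all values illustrative)
hidden_w = [[0.1, 0.4], [-0.3, 0.2], [0.5, -0.1]]
hidden_b = [0.0, 0.1, -0.2]
out_w = [[0.3, -0.6, 0.9]]
out_b = [0.05]
print(mlp([1.0, 0.5], hidden_w, hidden_b, out_w, out_b))
```

In an analog-NVM implementation, the weight matrices above would be stored in non-volatile memory cells rather than in digital registers, which is where the webinar's Logic-NVM solution comes in.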
