Semicon Korea 2019 Keynote Speech
"Beyond cloud infrastructure, we expect to see the era of 'on-device AI', in which AI computation runs directly on smartphones and autonomous vehicles. Only when that era arrives will the age of AI be truly realized."
So emphasized Shim Eun-Su, executive director and head of the AI & SW Research Center at the Samsung Advanced Institute of Technology, in his keynote speech at Semicon Korea 2019, held at COEX in Seoul, Korea, on the 23rd.
AI is built on 'learning'. A computer can distinguish cats from puppies because it was fed a large amount of photo data labeled "this is a cat" and "that is a puppy." In the computing industry, such learning is classified by its depth into machine learning and deep learning. Learning has typically been done on cloud infrastructure made up of large servers, so that massive amounts of data can be ingested and learned relatively quickly. Google's AlphaGo system, which competed against the professional Go player Lee Se-Dol, was likewise trained on cloud infrastructure.
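The labeled-photo learning described above can be sketched in miniature. The example below is a hedged illustration, not Samsung's method: a nearest-centroid classifier trained on invented two-dimensional feature vectors (real systems learn from large labeled image datasets, not hand-picked numbers).

```python
# Minimal sketch of supervised learning: each class is summarized by the
# average (centroid) of its training examples, and a new sample is assigned
# to the class whose centroid is closest. All features here are invented.

def centroid(vectors):
    """Average the feature vectors of one class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled_data):
    """Compute one centroid per label from (features, label) pairs."""
    by_label = {}
    for features, label in labeled_data:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(vs) for label, vs in by_label.items()}

def predict(model, features):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Hypothetical 2-D features, e.g. (ear pointiness, snout length)
training_set = [
    ((0.9, 0.2), "cat"), ((0.8, 0.3), "cat"),
    ((0.2, 0.8), "puppy"), ((0.3, 0.9), "puppy"),
]
model = train(training_set)
print(predict(model, (0.85, 0.25)))  # a cat-like sample -> "cat"
```

In production, the same train-then-predict split is what separates the cloud (heavy training) from the device (lightweight inference), which is the tension the rest of the talk addresses.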
"It is important to give the AI learning capability of individual devices higher priority than cloud infrastructure," Shim said. "For real AI, a device must be able to track and learn from everything it sees and hears. But when such data is uploaded to the cloud, there is a real risk of personal information leakage," he added.
Connectivity is also a problem. If AI learning runs only on cloud infrastructure, the service becomes unavailable in communication dead zones. And if large volumes of unrefined data are pushed to the cloud, the resulting communication overload can be fatal. Experts warn that relying solely on cloud infrastructure as the computational base for autonomous vehicles risks accidents caused by communication delays.
These connectivity and computational-overload problems can be solved if AI learning and computation run on the individual device. In that case, however, the biggest challenges are minimizing power consumption and resolving the memory bottleneck inside the chip.
"The Samsung Advanced Institute of Technology anticipated this situation years ago, and its research into on-device AI technology has produced results," Shim said. "The new Galaxy S10 smartphone, which will be launched soon, is equipped with a neural processing unit (NPU) that we developed," he said.
Shim will present a paper by Samsung Electronics at the International Solid-State Circuits Conference (ISSCC) in San Francisco, USA, on February 17th. The paper describes the butterfly-structure dual-core NPU built into an 8-nm system-on-chip (paper title: "Butterfly-Structure Dual-Core Sparsity-Aware Neural Processing Unit in 8-nm system-on-chip (SoC)"). Shim explained that it is the most advanced NPU to date in terms of low power, high performance, and scalability (support for a variety of learning frameworks).
"On-device AI has only limited resources on a mobile device, so previously learned information can be erased as learning progresses, and that is another problem to be solved. When an on-device AI solution that resolves these challenges is implemented, it will bring the true AI age that we dream of," Shim said.