Article Abstract
Wang Bingrui(王秉睿)*, Chen Yunji**. NNL: a domain-specific language for neural networks [J]. High Technology Letters, 2020, 26(2): 160-167
NNL: a domain-specific language for neural networks
  
DOI: 10.3772/j.issn.1006-6748.2020.02.005
Keywords: artificial neural network (NN), domain-specific language (DSL), neural network (NN) accelerator
Authors and Affiliations:
Wang Bingrui(王秉睿)* (*School of Computer Science and Technology, University of Science and Technology of China, Hefei 230027, P.R.China) 
Chen Yunji** (**Intelligent Processor Research Center, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, P.R.China) 
Abstract:
      In recent years, neural networks (NNs) have received increasing attention from both academia and industry. The significant diversity among existing NNs, as well as among their hardware platforms, makes NN programming a daunting task. In this paper, a domain-specific language (DSL) for NNs, the neural network language (NNL), is proposed to deliver productivity of NN programming and portable performance of NN execution on different hardware platforms. The productivity and flexibility of NN programming are enabled by abstracting an NN as a directed graph of blocks. The language is used to describe 4 representative and widely used NNs and to run them on 3 different hardware platforms (CPU, GPU and an NN accelerator). Experimental results show that NNs written in the proposed language perform, on average, 14.5% better than the baseline implementations across these 3 platforms. Moreover, compared with the Caffe framework, which specifically targets the GPU platform, the NNL code achieves similar performance.
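The central abstraction, a directed graph of blocks, can be illustrated with a minimal sketch. The snippet below is not actual NNL syntax (which this page does not show); it is a hypothetical Python illustration, with all names (Block, Graph, dense, relu) invented for the example, of how a small two-layer network might be expressed and evaluated as a directed graph of blocks.

class Block:
    """A node in the network graph: a name, an operation, and its input blocks."""
    def __init__(self, name, op, inputs=()):
        self.name = name
        self.op = op            # callable applied to the inputs' output values
        self.inputs = list(inputs)

class Graph:
    """A directed graph of blocks, evaluated in insertion (topological) order."""
    def __init__(self):
        self.blocks = []

    def add(self, name, op, inputs=()):
        block = Block(name, op, inputs)
        self.blocks.append(block)
        return block

    def run(self, feeds):
        """Evaluate every block; `feeds` maps input block names to values."""
        values = dict(feeds)
        for b in self.blocks:
            if b.name in values:        # externally fed input block
                continue
            args = [values[i.name] for i in b.inputs]
            values[b.name] = b.op(*args)
        return values

# Usage: a tiny two-layer perceptron expressed as a block graph.
def dense(weights, bias):
    return lambda x: [sum(w * xi for w, xi in zip(row, x)) + b
                      for row, b in zip(weights, bias)]

def relu():
    return lambda x: [max(0.0, v) for v in x]

g = Graph()
inp = g.add("input", None)
h = g.add("fc1", dense([[0.5, -0.2], [0.1, 0.3]], [0.0, 0.0]), [inp])
a = g.add("relu1", relu(), [h])
out = g.add("fc2", dense([[1.0, -1.0]], [0.1]), [a])

print(g.run({"input": [1.0, 2.0]})["fc2"])   # -> [-0.5]

In a DSL such as the one the paper describes, a graph like this would be lowered to platform-specific kernels for CPU, GPU or an NN accelerator; the sketch above only evaluates it directly in Python to make the block-graph idea concrete.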