Publication
Title
Designing resource-constrained neural networks using neural architecture search targeting embedded devices
Author
Abstract
Recent advances in the field of Neural Architecture Search (NAS) have made it possible to develop state-of-the-art deep learning systems without extensive human expertise and hyperparameter tuning. Most previous research, however, has paid little attention to the resources required to run the generated systems. In this paper, we present an improvement on a recent NAS method, Efficient Neural Architecture Search (ENAS). We adapt ENAS to take into account not only a network’s performance, but also the constraints that would allow the resulting networks to be ported to embedded devices. Our results show that ENAS can comply with these added constraints. To demonstrate the efficacy of our system, we use it to design a Recurrent Neural Network that predicts words as they are spoken while meeting the constraints set out for operation on an embedded device, as well as a Convolutional Neural Network that classifies 32x32 RGB images at a rate of 1 FPS on an embedded device.
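The abstract describes adapting ENAS so that the controller's reward reflects resource constraints in addition to validation performance. As a minimal illustrative sketch only, and not the authors' published formulation, a constraint-penalized reward could look like the following; the constraint names, limits, and penalty weight are assumptions made for illustration.

# Illustrative sketch only: a constraint-penalized reward for a NAS controller.
# The constraint names, limits, and penalty weights below are hypothetical and
# are not taken from the paper.

def constrained_reward(accuracy, param_count, latency_ms,
                       max_params=250_000, max_latency_ms=1000.0,
                       penalty_weight=1.0):
    """Return the validation accuracy, reduced when the sampled
    architecture exceeds the embedded-device resource budget."""
    # Relative amount by which each constraint is violated (0 if satisfied).
    param_violation = max(0.0, (param_count - max_params) / max_params)
    latency_violation = max(0.0, (latency_ms - max_latency_ms) / max_latency_ms)
    # Subtract a weighted penalty so constraint-violating architectures score lower.
    return accuracy - penalty_weight * (param_violation + latency_violation)

# Example: an architecture with 300k parameters and 1.2 s latency is penalized.
print(constrained_reward(accuracy=0.91, param_count=300_000, latency_ms=1200.0))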
Language
English
Source (journal)
Internet of Things
Publication
2020
ISSN
2542-6605
DOI
10.1016/j.iot.2020.100234
Volume/pages
12 (2020), 14 p.
Article Reference
100234
ISI
000695695600001
Medium
E-only publication
Full text (Publisher's DOI)
Full text (open access)
Full text (publisher's version - intranet only)
UAntwerpen
Faculty/Department
Research group
Publication type
Subject
Affiliation
Publications with a UAntwerp address
External links
Web of Science Record
Identifier
Creation 09.12.2020
Last edited 09.12.2024