Recent Developments in Low-Power AI Accelerators: A Survey
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-0476-4177
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0001-9947-1088
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-8929-7220
2022 (English). In: Algorithms, E-ISSN 1999-4893, Vol. 15, no. 11, article id 419. Article in journal (Refereed). Published.
Abstract [en]

As machine learning and AI continue to develop rapidly, and with the end of Moore’s law drawing ever closer, new avenues and novel ideas in architecture design are being created and utilized. One avenue is accelerating AI as close to the user as possible, i.e., at the edge, to reduce latency and increase performance. Therefore, researchers have developed low-power AI accelerators, designed specifically to accelerate machine learning and AI on edge devices. In this paper, we present an overview of low-power AI accelerators published between 2019 and 2022. Low-power AI accelerators are defined in this paper based on their acceleration target and power consumption. In this survey, 79 low-power AI accelerators are presented and discussed. The reviewed accelerators are discussed based on five criteria: (i) power, performance, and power efficiency, (ii) acceleration targets, (iii) arithmetic precision, (iv) neuromorphic accelerators, and (v) industry vs. academic accelerators. CNNs and DNNs are the most popular accelerator targets, while Transformers and SNNs are on the rise.
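The power-efficiency criterion mentioned in the abstract is conventionally reported as throughput per watt (e.g., TOPS/W) in the accelerator literature. A minimal sketch of that metric follows; the function name and the sample figures are illustrative assumptions, not values taken from the surveyed accelerators:

```python
def power_efficiency_tops_per_watt(throughput_tops: float, power_watts: float) -> float:
    """Power efficiency as peak throughput (TOPS) divided by power draw (W)."""
    if power_watts <= 0:
        raise ValueError("power must be positive")
    return throughput_tops / power_watts

# Illustrative figures only: an accelerator delivering 4 TOPS at a
# 2 W power budget achieves 2.0 TOPS/W.
print(power_efficiency_tops_per_watt(4.0, 2.0))  # → 2.0
```

Note that reported TOPS figures depend on arithmetic precision (e.g., INT8 vs. FP16), which is why the survey treats precision as a separate comparison criterion.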

Place, publisher, year, edition, pages
MDPI, 2022. Vol. 15, no. 11, article id 419
Keywords [en]
survey; hardware accelerator; low-power; performance; machine learning; artificial intelligence; neural networks
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:bth-24171
DOI: 10.3390/a15110419
ISI: 000930705100001
OAI: oai:DiVA.org:bth-24171
DiVA, id: diva2:1724935
Funder
ELLIIT - The Linköping-Lund Initiative on IT and Mobile Communications, C05
Knowledge Foundation, 20170236
Note

open access

Available from: 2023-01-09. Created: 2023-01-09. Last updated: 2023-03-29. Bibliographically approved.

Open Access in DiVA

fulltext (699 kB), 128 downloads
File information
File name: FULLTEXT01.pdf
File size: 699 kB
Checksum (SHA-512): e299e0d64ef3a3a52dbeb5036fd930f595d4861a84704203871934f727e6aa2e75184de1b1c5c38aa16e9dea19f6a20d7e19b1b397f919ff4166243b5414926d
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text: paper in the journal Algorithms

Authority records

Åleskog, Christoffer; Grahn, Håkan; Borg, Anton

Search in DiVA

By author/editor
Åleskog, Christoffer; Grahn, Håkan; Borg, Anton
By organisation
Department of Computer Science
In the same journal
Algorithms
Computer Sciences

Search outside of DiVA

Google, Google Scholar
Total: 128 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 294 hits