
A Novel Multi-Step Finite-State Automaton for Arbitrarily Deterministic Tsetlin Machine Learning

Lookup NU author(s): Dr Rishad Shafik, Professor Alex Yakovlev, Adrian Wheeldon, Jie Lei

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

Due to the high energy consumption and scalability challenges of deep learning, there is a critical need to shift research focus towards dealing with energy consumption constraints. Tsetlin Machines (TMs) are a recent approach to machine learning that has demonstrated significantly reduced energy usage compared to comparable neural networks, while performing competitively on accuracy across several benchmarks. However, TMs rely heavily on energy-costly random number generation to stochastically guide a team of Tsetlin Automata to a Nash Equilibrium of the TM game. In this paper, we propose a novel finite-state learning automaton that can replace the Tsetlin Automata in TM learning, for increased determinism. The new automaton uses multi-step deterministic state jumps to reinforce sub-patterns. Simultaneously, flipping a coin to skip every d'th state update ensures diversification through randomization. The d-parameter thus allows the degree of randomization to be finely controlled: d=1 makes every update random, while d=∞ makes the automaton completely deterministic. Our empirical results show that, overall, only substantial degrees of determinism reduce accuracy. Energy-wise, random number generation contributes to the switching energy consumption of the TM, and high d values save up to 11 mW of power for larger datasets. We can thus use the new d-parameter to trade off accuracy against energy consumption, facilitating low-energy machine learning.
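To make the d-parameter mechanism concrete, the sketch below is a minimal, hypothetical Python illustration of such an automaton. The class name, state layout, step size, and the counter-based gating are assumptions for illustration, not the exact formulation in the paper: state transitions are deterministic multi-step jumps, except that every d'th update is gated by a coin flip, so d=1 randomizes every update and a very large d makes the automaton effectively deterministic.

```python
import random


class MultiStepAutomaton:
    """Hypothetical sketch of an arbitrarily deterministic learning automaton.

    States 1..n_states select action 0 ("exclude"); states n_states+1..2*n_states
    select action 1 ("include"). Feedback moves the state by a multi-step jump,
    and only every d'th update is gated by a coin flip.
    """

    def __init__(self, n_states=100, step=3, d=10):
        self.n_states = n_states   # states per action
        self.step = step           # size of the deterministic jump (illustrative)
        self.d = d                 # every d'th update is randomized; float("inf") disables randomness
        self.state = n_states      # start at the weakest "exclude" state
        self.update_count = 0

    def action(self):
        return int(self.state > self.n_states)   # 1 = include, 0 = exclude

    def reward(self):
        # Reinforce the current action: move deeper into its half of the state space.
        self._update(self.step if self.action() == 1 else -self.step)

    def penalty(self):
        # Weaken the current action: move towards the opposite half.
        self._update(-self.step if self.action() == 1 else self.step)

    def _update(self, delta):
        self.update_count += 1
        if self.d != float("inf") and self.update_count % self.d == 0:
            if random.random() < 0.5:   # coin flip: possibly skip this d'th update
                return
        # Deterministic multi-step jump, clamped to the legal state range.
        self.state = min(max(self.state + delta, 1), 2 * self.n_states)


# Example: with d=10 the first few updates are fully deterministic;
# d=1 would make every update subject to the coin flip.
ta = MultiStepAutomaton(n_states=100, step=3, d=10)
for _ in range(5):
    ta.reward()
print(ta.action(), ta.state)
```

In this reading, the only source of randomness is the periodic coin flip, so larger d values mean fewer random draws per training step, which is where the reported energy savings from avoided random number generation would come from.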


Publication metadata

Author(s): Abeyrathna KD, Granmo O, Shafik R, Yakovlev A, Wheeldon A, Lei J, Goodwin M

Publication type: Article

Publication status: Published

Journal: arXiv

Year: 2020

Volume: 2007.02114

Online publication date: 04/07/2020

Acceptance date: 04/07/2020

URL: https://arxiv.org/abs/2007.02114
