
A Dual-Layer Attention-Based LSTM Network for Fed-batch Fermentation Process Modelling

Lookup NU author(s): Dr Kai Liu, Dr Jie Zhang (ORCiD)

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

© 2021 Elsevier B.V. A recurrent neural network (RNN) is a dynamic neural network in which the current network output depends on previous outputs. The long short-term memory (LSTM) network has emerged as a high-performance RNN. However, the original LSTM does not consider variable and sample relevance for process modelling. To overcome this problem, this paper proposes a dual-layer attention-based LSTM (DA-LSTM) network to model a fed-batch fermentation process. In the proposed DA-LSTM, an LSTM is used to extract features from the input data and produce multiple time series of hidden states, an encoder input attention mechanism is used to select the relevant driving series in the input data sequence, and a temporal decoder attention mechanism is used to measure the importance of the encoder hidden states. With this deep architecture for high-level representations, the model can learn very complex dynamic systems. To demonstrate the effectiveness of the proposed method, a comparative study with the original LSTM and a single-attention-based LSTM is carried out. It is shown that the proposed method gives better modelling performance than the other methods.
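
The full text is not held in this repository, but the abstract outlines a dual-stage attention structure: input attention over the driving series and temporal attention over the encoder hidden states. The following is a minimal PyTorch sketch of that kind of DA-LSTM, assuming a DA-RNN-style encoder-decoder; the class name, layer sizes and the single-output regression head are illustrative assumptions, not the authors' exact architecture.

import torch
import torch.nn as nn


class DALSTM(nn.Module):
    """Minimal DA-LSTM sketch: input attention + temporal attention.

    Hypothetical sizes and names; not the authors' implementation.
    """

    def __init__(self, n_inputs, enc_hidden=32, dec_hidden=32):
        super().__init__()
        self.enc_cell = nn.LSTMCell(n_inputs, enc_hidden)
        # Encoder input attention: scores each driving series at each time step.
        self.input_attn = nn.Linear(2 * enc_hidden + n_inputs, n_inputs)
        self.dec_lstm = nn.LSTM(enc_hidden, dec_hidden, batch_first=True)
        # Temporal decoder attention: scores the encoder hidden states.
        self.temporal_attn = nn.Linear(enc_hidden + dec_hidden, 1)
        self.out = nn.Linear(dec_hidden + enc_hidden, 1)

    def forward(self, x):
        # x: (batch, T, n_inputs) sequence of process measurements.
        batch, T, _ = x.shape
        h = x.new_zeros(batch, self.enc_cell.hidden_size)
        c = x.new_zeros(batch, self.enc_cell.hidden_size)
        enc_states = []
        for t in range(T):
            # Weight each input variable given the current encoder state.
            alpha = torch.softmax(
                self.input_attn(torch.cat([h, c, x[:, t, :]], dim=1)), dim=1)
            h, c = self.enc_cell(alpha * x[:, t, :], (h, c))
            enc_states.append(h)
        enc = torch.stack(enc_states, dim=1)            # (batch, T, enc_hidden)

        # Decode, then attend over all encoder hidden states.
        dec_out, _ = self.dec_lstm(enc)                 # (batch, T, dec_hidden)
        d_last = dec_out[:, -1, :]
        scores = self.temporal_attn(
            torch.cat([enc, d_last.unsqueeze(1).expand(-1, T, -1)], dim=2)
        ).squeeze(-1)                                   # (batch, T)
        beta = torch.softmax(scores, dim=1)
        context = torch.bmm(beta.unsqueeze(1), enc).squeeze(1)
        return self.out(torch.cat([d_last, context], dim=1))   # (batch, 1)


# Example: 8 fed-batch trajectories, 20 time steps, 5 process variables.
model = DALSTM(n_inputs=5)
y_hat = model(torch.randn(8, 20, 5))                    # shape (8, 1)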


Publication metadata

Author(s): Liu K, Zhang J

Publication type: Book Chapter

Publication status: Published

Book Title: 31st European Symposium on Computer Aided Process Engineering

Year: 2021

Volume: 50

Pages: 541-547

Print publication date: 25/06/2021

Online publication date: 18/07/2021

Acceptance date: 02/04/2020

Series Title: Computer Aided Chemical Engineering

Publisher: Elsevier B.V.

Place Published: Amsterdam

URL: https://doi.org/10.1016/B978-0-323-88506-5.50086-3

DOI: 10.1016/B978-0-323-88506-5.50086-3

Library holdings: Search Newcastle University Library for this item

ISBN: 9780323885065
