Stock Prediction with Stacked-LSTM Neural Networks

Abstract

This paper explores a stacked long short-term memory (LSTM) model for predicting stock prices from non-stationary financial time series. The proposed LSTM is designed to overcome gradient explosion and gradient vanishing while preserving long-term memory. First, time series with input windows of different lengths (in days) are built as network input; early stopping and the rectified linear unit (ReLU) activation function are then added to avoid over-fitting during the training stage. Finally, the trained parameter state is saved and a new batch size is set for testing. The results suggest that the developed stacked LSTM produces better predictive power and generalization.
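The windowing and early-stopping steps described in the abstract can be sketched in plain Python. This is an illustrative sketch, not the authors' implementation: the window length, price values, patience setting, and function names are all assumptions.

```python
# Illustrative sketch of two steps from the abstract: building sliding
# windows of daily prices for network input, and an early-stopping rule.

def make_windows(prices, window):
    """Split a price series into (input window, next-day target) pairs."""
    samples = []
    for i in range(len(prices) - window):
        x = prices[i:i + window]   # `window` consecutive days as input
        y = prices[i + window]     # the following day as the target
        samples.append((x, y))
    return samples

def should_stop(val_losses, patience=3):
    """Early stopping: stop once validation loss has not improved
    for `patience` consecutive epochs."""
    if len(val_losses) <= patience:
        return False
    best_so_far = min(val_losses[:-patience])
    return min(val_losses[-patience:]) >= best_so_far

# Example: 7 days of (made-up) closing prices, 3-day input windows.
prices = [10.0, 10.5, 10.2, 10.8, 11.0, 10.9, 11.3]
pairs = make_windows(prices, window=3)
print(pairs[0])  # -> ([10.0, 10.5, 10.2], 10.8)
```

Varying `window` yields the "time series with different days" the abstract mentions; in practice one would feed these windows to a stacked LSTM (e.g. via a deep-learning framework) rather than use raw lists.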

Original language: English
Title of host publication: The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C)
Number of pages: 7
Publisher: IEEE (Institute of Electrical and Electronics Engineers)
Publication date: 1 Apr 2022
Pages: 1119-1125
ISBN (Print): 978-1-6654-7837-3
ISBN (Electronic): 978-1-6654-7836-6
DOIs: https://doi.org/10.1109/QRS-C55045.2021.00166
Publication status: Published - 1 Apr 2022
Series: International Conference on Software Quality, Reliability and Security Companion
ISSN: 2693-9371

Keywords

  • Deep Learning
  • Stacked Long Short Term Memory
  • Time Series
  • neural networks
  • over-fitting
  • predictive models
  • software quality
  • time series analysis


Projects (1 finished)
  • R2P2: Networking for Research and Development of Human Interactive and Sensitive Robotics taking advantage of Additive Manufacturing

    Chrysostomou, D. (PI), LI, C. (Project Participant), Arexolaleiba, N. A. (Project Participant) & Madsen, O. (Project Participant)

    01/01/2020 - 31/12/2022

    Project: Research

Cite this

  • APA
  • Author
  • BIBTEX
  • Harvard
  • Standard
  • RIS
  • Vancouver

Zhang, X., LI, C., Chen, K.-L., Chrysostomou, D., & Yang, H. (2022). Stock Prediction with Stacked-LSTM Neural Networks. In The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C) (pp. 1119-1125). IEEE (Institute of Electrical and Electronics Engineers). https://doi.org/10.1109/QRS-C55045.2021.00166

Zhang, Xiaochun ; LI, Chen ; Chen, Kuan-Lin et al. / Stock Prediction with Stacked-LSTM Neural Networks. The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C). IEEE (Institute of Electrical and Electronics Engineers), 2022. pp. 1119-1125 (International Conference on Software Quality, Reliability and Security Companion).

@inproceedings{34000074292142568f3b8e17fbb1edf8,

title = "Stock Prediction with Stacked-LSTM Neural Networks",

abstract = "This paper explores a stacked long short-term memory (LSTM) model for predicting stock prices from non-stationary financial time series. The proposed LSTM is designed to overcome gradient explosion and gradient vanishing while preserving long-term memory. First, time series with input windows of different lengths (in days) are built as network input; early stopping and the rectified linear unit (ReLU) activation function are then added to avoid over-fitting during the training stage. Finally, the trained parameter state is saved and a new batch size is set for testing. The results suggest that the developed stacked LSTM produces better predictive power and generalization.",

keywords = "Deep Learning, Stacked Long Short Term Memory, Time Series, neural networks, over-fitting, predictive models, software quality, time series analysis",

author = "Xiaochun Zhang and Chen LI and Kuan-Lin Chen and Dimitrios Chrysostomou and Hongji Yang",

year = "2022",

month = apr,

day = "1",

doi = "10.1109/QRS-C55045.2021.00166",

language = "English",

isbn = "978-1-6654-7837-3",

series = "International Conference on Software Quality, Reliability and Security Companion",

publisher = "IEEE (Institute of Electrical and Electronics Engineers)",

pages = "1119--1125",

booktitle = "The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C)",

address = "United States",

}

Zhang, X, LI, C, Chen, K-L, Chrysostomou, D & Yang, H 2022, Stock Prediction with Stacked-LSTM Neural Networks. in The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C). IEEE (Institute of Electrical and Electronics Engineers), International Conference on Software Quality, Reliability and Security Companion, pp. 1119-1125. https://doi.org/10.1109/QRS-C55045.2021.00166

Stock Prediction with Stacked-LSTM Neural Networks. / Zhang, Xiaochun; LI, Chen; Chen, Kuan-Lin et al.
The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C). IEEE (Institute of Electrical and Electronics Engineers), 2022. p. 1119-1125 (International Conference on Software Quality, Reliability and Security Companion).

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

TY - GEN

T1 - Stock Prediction with Stacked-LSTM Neural Networks

AU - Zhang, Xiaochun

AU - LI, Chen

AU - Chen, Kuan-Lin

AU - Chrysostomou, Dimitrios

AU - Yang, Hongji

PY - 2022/4/1

Y1 - 2022/4/1

N2 - This paper explores a stacked long short-term memory (LSTM) model for predicting stock prices from non-stationary financial time series. The proposed LSTM is designed to overcome gradient explosion and gradient vanishing while preserving long-term memory. First, time series with input windows of different lengths (in days) are built as network input; early stopping and the rectified linear unit (ReLU) activation function are then added to avoid over-fitting during the training stage. Finally, the trained parameter state is saved and a new batch size is set for testing. The results suggest that the developed stacked LSTM produces better predictive power and generalization.

AB - This paper explores a stacked long short-term memory (LSTM) model for predicting stock prices from non-stationary financial time series. The proposed LSTM is designed to overcome gradient explosion and gradient vanishing while preserving long-term memory. First, time series with input windows of different lengths (in days) are built as network input; early stopping and the rectified linear unit (ReLU) activation function are then added to avoid over-fitting during the training stage. Finally, the trained parameter state is saved and a new batch size is set for testing. The results suggest that the developed stacked LSTM produces better predictive power and generalization.

KW - Deep Learning

KW - Stacked Long Short Term Memory

KW - Time Series

KW - neural networks

KW - over-fitting

KW - predictive models

KW - software quality

KW - time series analysis

UR - http://www.scopus.com/inward/record.url?scp=85140873800&partnerID=8YFLogxK

U2 - 10.1109/QRS-C55045.2021.00166

DO - 10.1109/QRS-C55045.2021.00166

M3 - Article in proceeding

SN - 978-1-6654-7837-3

T3 - International Conference on Software Quality, Reliability and Security Companion

SP - 1119

EP - 1125

BT - The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C)

PB - IEEE (Institute of Electrical and Electronics Engineers)

ER -

Zhang X, LI C, Chen KL, Chrysostomou D, Yang H. Stock Prediction with Stacked-LSTM Neural Networks. In The 21st IEEE International Conference on Software Quality, Reliability, and Security (QRS-C). IEEE (Institute of Electrical and Electronics Engineers). 2022. p. 1119-1125. (International Conference on Software Quality, Reliability and Security Companion). doi: 10.1109/QRS-C55045.2021.00166
