Abstract: Accurate prediction of stock price movement remains a challenging problem due to the highly volatile and information-driven nature of financial markets. Traditional prediction models relying solely on historical price data fail to capture the complex influence of external information such as financial news. To address this limitation, this project proposes a Hybrid Information Mixing Module (HIMM) that effectively integrates numerical stock data with textual news information for improved stock movement prediction. The proposed mechanism extracts temporal patterns from stock price sequences using recurrent neural network models, including Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), while simultaneously learning semantic representations from financial news text. Unlike conventional fusion approaches, the HIMM architecture employs two independent multilayer perceptron blocks that perform feature-level mixing and interaction-level mixing in a structured row-wise and column-wise manner. This design enables richer cross-modal interactions between time-series and semantic features, allowing the model to better capture market volatility and information flow. Experimental evaluation on volatile stock market data demonstrates that the proposed mechanism achieves superior performance compared to baseline models, with notable improvements in accuracy, Matthews correlation coefficient, and F1-score. The results confirm that structured multimodal information mixing significantly enhances predictive capability, making the proposed hybrid module a promising approach for intelligent stock market forecasting systems.

Keywords: stock price movement prediction, multimodal data fusion, hybrid information mixing module, deep learning, financial time-series analysis, news sentiment analysis, long short-term memory networks, gated recurrent units, multilayer perceptron, market volatility modeling
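To make the mixing scheme concrete, the sketch below illustrates one plausible reading of the HIMM idea: the price-sequence embedding (e.g. a final LSTM/GRU hidden state) and the news-text embedding are stacked into a small matrix, a row-wise MLP mixes features within each modality, and a column-wise MLP (applied after a transpose) mixes across modalities per feature dimension. All dimensions, weight names, and the residual connections here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # Two-layer perceptron with ReLU, applied along the last axis.
    h = np.maximum(x @ w1 + b1, 0.0)
    return h @ w2 + b2

# Hypothetical sizes: 2 modality rows (price, news), d = 8 features, 16 hidden units.
n_rows, d, hidden = 2, 8, 16

price_emb = rng.standard_normal(d)   # stand-in for a final LSTM/GRU hidden state
news_emb = rng.standard_normal(d)    # stand-in for a pooled news-text embedding
X = np.stack([price_emb, news_emb])  # shape (2, 8): rows = modalities, cols = features

# Feature-level (row-wise) mixing: one MLP, shared across rows, mixes features.
Wf1, bf1 = rng.standard_normal((d, hidden)), np.zeros(hidden)
Wf2, bf2 = rng.standard_normal((hidden, d)), np.zeros(d)
X = X + mlp(X, Wf1, bf1, Wf2, bf2)           # residual add; shape preserved

# Interaction-level (column-wise) mixing: transpose so a second, independent MLP
# mixes across rows, letting price and news features interact per dimension.
Wc1, bc1 = rng.standard_normal((n_rows, hidden)), np.zeros(hidden)
Wc2, bc2 = rng.standard_normal((hidden, n_rows)), np.zeros(n_rows)
X = X + mlp(X.T, Wc1, bc1, Wc2, bc2).T       # residual add; shape preserved

fused = X.reshape(-1)  # flattened fused representation for a movement classifier
print(X.shape)         # (2, 8)
```

In a trained model the two weight sets would be learned jointly with the recurrent and text encoders; the point of the transpose trick is that each MLP stays small while the pair still covers both within-modality and cross-modality interactions.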
Published: 20-02-2026 | Issue: Vol. 26 No. 2 (2026) | Pages: 80-85 | Section: Articles
License: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.