Abstract: Human Activity Recognition (HAR) plays a vital role in modern healthcare systems by enabling continuous monitoring of individuals' physical activities and behaviors. Accurate recognition of daily activities such as walking, sitting, running, and sleeping can support early diagnosis, rehabilitation, elderly care, and fitness tracking. Traditional activity recognition systems often rely on a single data modality, such as wearable sensors or video data, which limits their accuracy and robustness. To overcome these limitations, this project proposes an intelligent multimodal human activity recognition system that integrates multiple data sources to improve performance and reliability.

The proposed system utilizes multimodal data inputs such as wearable sensor data (accelerometer, gyroscope), visual data from cameras, and possibly audio signals. Data preprocessing techniques, including normalization, noise removal, and feature extraction, are applied to prepare the dataset. Deep learning models, particularly convolutional neural networks (CNNs) and recurrent neural networks (RNNs), are employed to capture spatial and temporal patterns in the data. Feature fusion techniques combine information from the different modalities, yielding a more comprehensive representation of human activities.

Experimental results demonstrate that the multimodal approach significantly improves recognition accuracy compared to single-modality systems. The system can accurately identify a wide range of human activities in real time, making it suitable for healthcare applications such as patient monitoring, fall detection, and fitness tracking. However, challenges such as data synchronization and computational complexity remain. Overall, the proposed system provides an efficient and scalable solution for intelligent activity recognition in personal healthcare environments.
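As a rough illustration of the preprocessing and feature-fusion steps the abstract describes, the sketch below applies per-channel z-score normalization to two wearable-sensor windows, extracts simple statistical features from each, and fuses them by concatenation. This is a minimal sketch under assumed inputs (the window size, channel counts, and feature set are illustrative, not taken from the paper, which uses learned CNN/RNN features rather than hand-crafted statistics):

```python
import numpy as np

def zscore_normalize(window, eps=1e-8):
    """Normalize each channel (column) of a (time, channels) window to
    zero mean and unit variance; eps guards against constant channels."""
    mean = window.mean(axis=0, keepdims=True)
    std = window.std(axis=0, keepdims=True)
    return (window - mean) / (std + eps)

def extract_features(window):
    """Toy per-channel statistical features: mean, std, min, max.
    A real pipeline would use learned CNN/RNN representations instead."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

def fuse(*modality_features):
    """Feature-level fusion by simple concatenation of modality vectors."""
    return np.concatenate(modality_features)

# Simulated sensor windows (shapes are assumptions): 128 samples, 3 axes each.
rng = np.random.default_rng(0)
accel = rng.normal(size=(128, 3))
gyro = rng.normal(size=(128, 3))

fused = fuse(
    extract_features(zscore_normalize(accel)),
    extract_features(zscore_normalize(gyro)),
)
print(fused.shape)  # (24,): 4 features x 3 channels per modality, 2 modalities
```

Concatenation is only the simplest fusion strategy; attention-weighted or learned fusion layers are common alternatives when modalities differ in reliability.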
Published: 08-4-2026
Issue: Vol. 26 No. 4 (2026)
Page Nos: 1851-1857
Section: Articles
License: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.