DEEP LEARNING APPROACH FOR MULTIMODAL BIOMETRIC RECOGNITION SYSTEM BASED ON FUSION OF IRIS, FACE, AND FINGER VEIN TRAITS

ID: 2356

Abstract: Traditional password-based authentication systems exhibit critical vulnerabilities, including susceptibility to phishing, brute-force attacks, and credential theft. This paper presents a multimodal biometric secure file storage system that integrates face, fingerprint, and dual-iris recognition for quadruple-modal authentication. Leveraging MTCNN and FaceNet for face processing, MobileNetV2 transfer learning for fingerprint feature extraction, and a custom convolutional neural network for iris recognition, the system requires simultaneous verification of all four biometric samples before granting document access. Implemented as a Python Flask web application with SQLite persistence, the system achieves 625 ms average authentication latency with GPU acceleration and a 97.6% pass rate across 42 test cases, and maintains complete user isolation for stored documents. An integrated analytics dashboard provides real-time performance metrics, confidence scores, and comprehensive audit logging. Experimental evaluation confirms that the AND-fusion decision strategy, which requires all modalities to match, provides exponentially stronger security than single-modality or optional multimodal approaches, establishing a practical framework for biometric-secured digital document repositories.

Keywords: multimodal biometrics, face recognition, fingerprint recognition, iris recognition, secure file storage, deep learning, FaceNet, MobileNetV2, authentication
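The AND-fusion decision strategy described in the abstract (grant access only when every enrolled modality clears its match threshold) can be sketched as below. This is a minimal illustration, not the paper's implementation; the modality names and threshold values are assumptions chosen for the example.

```python
def and_fusion(scores, thresholds):
    """AND-fusion: return True only if every modality's match score
    meets or exceeds that modality's threshold."""
    return all(scores[m] >= thresholds[m] for m in thresholds)

# Illustrative thresholds for the four modalities (values are assumed).
thresholds = {"face": 0.80, "fingerprint": 0.75,
              "iris_left": 0.85, "iris_right": 0.85}

# All four modalities match: access is granted.
scores_ok = {"face": 0.91, "fingerprint": 0.82,
             "iris_left": 0.88, "iris_right": 0.90}
print(and_fusion(scores_ok, thresholds))   # True

# A single failed modality (fingerprint below threshold) denies access,
# which is what makes AND-fusion stricter than score-level or OR fusion.
scores_bad = {"face": 0.91, "fingerprint": 0.60,
              "iris_left": 0.88, "iris_right": 0.90}
print(and_fusion(scores_bad, thresholds))  # False
```

Because an impostor must defeat all four matchers simultaneously, the false-accept rate of the fused system is bounded by the product of the individual rates, which is the intuition behind the abstract's security claim.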
Published: 02-04-2026
Issue: Vol. 26 No. 4 (2026)
Page Nos: 168-173
Section: Articles
License: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
How to Cite: Mr. B. Srinivasa Rao, Md. Moulanbi, B. Keerthi, S. Satish, B. Sunil Kumar, "DEEP LEARNING APPROACH FOR MULTIMODAL BIOMETRIC RECOGNITION SYSTEM BASED ON FUSION OF IRIS, FACE, AND FINGER VEIN TRAITS", International Journal of Engineering Sciences and Advanced Technology, 26(4), 2026, pp. 168-173, ISSN: 2250-3676.