Publications

You can also find my articles on my Google Scholar profile.

Journal Articles


PhaseMix: A Periodic Motion Fusion Method for Adult Spinal Deformity Classification

Published in IEEE Access, 2024

Recommended citation: Chen K, Xu J, Asada T, et al. PhaseMix: A Periodic Motion Fusion Method for Adult Spinal Deformity Classification[J]. IEEE Access, 2024.
Download Paper

Two-stage video-based convolutional neural networks for adult spinal deformity classification

Published in Frontiers in Neuroscience, 2023

Recommended citation: Chen K, Asada T, Ienaga N, et al. Two-stage video-based convolutional neural networks for adult spinal deformity classification[J]. Frontiers in Neuroscience, 2023, 17: 1278584.
Download Paper

Enhanced Full Attention Generative Adversarial Networks

Published in IEICE TRANSACTIONS on Information and Systems, 2023

In this paper, we propose an improved Generative Adversarial Network with an attention module in the generator, which enhances the generator's effectiveness. Furthermore, recent work has shown that generator conditioning affects GAN performance. Leveraging this insight, we explore the effect of different normalizations (spectral normalization, instance normalization) on the generator and discriminator. Moreover, an enhanced loss function based on the Wasserstein divergence distance alleviates the difficulty of training the model in practice.

Recommended citation: Chen K X, Yamane S. Enhanced Full Attention Generative Adversarial Networks[J]. IEICE TRANSACTIONS on Information and Systems, 2023, 106(5): 813-817.
Download Paper

Conference Papers


A clinical knowledge-guided attention framework for gait-based adult spinal deformity diagnosis

Published in 2026 International Conference on Electronics, Information, and Communication (ICEIC), 2026

This research proposes a framework that leverages clinical knowledge to guide gait classification for the automated diagnosis of Adult Spinal Deformity (ASD) using monocular gait videos. Existing models often overlook anatomically meaningful features, limiting their clinical applicability. We introduce clinician-informed attention maps to encode expert knowledge about diagnostic joints and motion patterns. These maps guide the spatiotemporal focus of a CNN-based backbone to attend to clinically relevant cues. Experiments show that our approach improves both accuracy and interpretability over traditional baselines, demonstrating its potential as a non-invasive and explainable screening tool for ASD.

Download Paper

FilterNet: A Filtered Gait Motion Fusion Network for Classifying Adult Spinal Deformity

Published in 2026 International Conference on Electronics, Information, and Communication (ICEIC), 2026

Analyzing periodic human motion from videos is vital for applications such as action recognition and healthcare. In gait analysis, models must capture subtle phase-specific motion patterns. However, processing entire sequences often introduces noise and reduces accuracy. We propose FilterNet, a frame-selection framework that identifies phase-relevant frames to improve temporal modeling and enhance discriminative representation. On a clinical gait video dataset, FilterNet achieves 74.5% accuracy, 75.2% precision, and 74.3% F1-score, outperforming baseline methods. Though demonstrated in a medical context, the framework is broadly applicable to other periodic motion analysis tasks. Code is available at: https://github.com/ChenKaiXuSan/FilterNet_ASD_PyTorch.

Recommended citation: K. Chen et al., "FilterNet: A Filtered Gait Motion Fusion Network for Classifying Adult Spinal Deformity," 2026 International Conference on Electronics, Information, and Communication (ICEIC), Macau, China, 2026, pp. 1-6, doi: 10.1109/ICEIC69189.2026.11386438.
Download Paper

Feel What You See: A Novel Sensory Interface Linking Visual Heat Cues and Instant Thermal Feedback

Published in SIGGRAPH Asia 2025 Emerging Technologies, 2025

We present a novel sensory interface that allows users to truly feel what they see by transforming visually implied heat cues into physical temperature sensations. Using a vision-language model, the system semantically analyzes video content to detect heat-related elements—such as fire, ice, or sunlight—and determines the appropriate thermal response based on both type and screen prominence. The core of this experience is a custom-designed, non-contact thermal device combining two physical mechanisms: an ethanol-based aerosol module for cooling, and a visible-spectrum radiation module for heating. The system dynamically adjusts the type and intensity of stimulation to reflect the semantic properties of each visual scene. This prototype demonstrates a new form of semantic-driven multisensory interaction, extending the expressive power of visual media. It opens new possibilities in immersive storytelling, sensory accessibility, and emotion-enhanced interfaces—where the ambient temperature of a scene can be directly felt, not just seen.

Recommended citation: Xu J, Chen K, Kuroda Y, et al. Feel What You See: A Novel Sensory Interface Linking Visual Heat Cues and Instant Thermal Feedback[M]//Proceedings of the SIGGRAPH Asia 2025 Emerging Technologies. 2025: 1-2.
Download Paper

Spinal Disease Classification Using Deep Learning on Dual-View Videos

Published in Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2025

Adult spinal deformity (ASD) is characterized by spinal deformities that cause symptoms such as balance disorders. Although gait observation is used in the diagnosis of ASD, no quantitative evaluation protocol has been established. In this study, we propose a novel approach for distinguishing ASD from non-ASD using a neural network with gait videos captured from two perspectives, enabling the detection of gait fluctuations unique to each patient. By integrating spatiotemporal gait dynamics, the proposed method enhances quantitative diagnostics and provides a more comprehensive gait assessment. The experimental results indicate that, relative to a single-view model, classification accuracy improved modestly while the F1 score improved notably, increasing by 10% to reach 71.86%. These results confirm the efficacy of integrating two-perspective videos for gait analysis in ASD and suggest that further investigation into fusion techniques may be beneficial. Codes and models are available at https://github.com/TsuguTsukumo/2stream_3D_CNN_Walk_Pytorch.git.

Recommended citation: Tsukumo T, Chen K, Asada T, et al. Spinal Disease Classification Using Deep Learning on Dual-View Videos[C]//Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual International Conference. 2025, 2025: 1-5.
Download Paper