Advancements in User Experience and Emotion Integration in Digital Design
Anupriya
Abstract—Traditionally, the process of User Experience (UX) evaluation has been largely disorganized, resulting in a split in the field. On one hand, static text mining of online reviews offers extensive, semantically rich, post-hoc analyses of user sentiment and product-feature opinions [1]. However, this approach lacks real-time applicability for interface adaptation. On the other hand, dynamic affective computing models capture real-time, high-resolution user emotion through modalities such as facial recognition, speech, and physiological biometrics [1]. These signals are immediate, but they are "context-blind": they lack an understanding of the specific product-related *cause* of the user's affective state. To bridge this critical gap, we propose a novel **Hierarchical Affective Fusion (HAF) Model**. HAF is a neural architecture that, for the first time, combines these disparate, trans-temporal data streams. It uses a Static UX Profile Encoder, trained on large review corpora, to generate a product-specific semantic knowledge base. This static profile is then used to contextualize the real-time output of a Dynamic Affect Encoder, which combines video, audio, and physiological signals. The HAF model employs BERT-based topic models, Vision Transformers (ViT), Temporal CNNs, and an innovative fusion layer based on Gated Multimodal Units (GMU) and cross-modal attention. We describe an extensive experimental design to compare the HAF-powered adaptive interface with both static and unimodal adaptive baselines. We predict that the HAF-adaptive interface will produce statistically significant increases in task success rates, measurable decreases in user frustration, and higher scores on measures of self-reported engagement (e.g., SAM/PANAS) [1]. This paper demonstrates the state of the art in building responsive, emotion-aware systems.
The primary contribution of the proposed work is the HAF model for computational UX, bridging the gap between long-term, retrospective sentiment and in-the-moment, immediate affect. Index Terms—Affective Computing, User Experience (UX), Multimodal Fusion, Hierarchical Attention, Adaptive Interfaces, Biometrics, Sentiment Analysis, Human-Computer Interaction
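The fusion layer named in the abstract, the Gated Multimodal Unit, can be sketched as follows. This is a minimal NumPy illustration of the standard GMU gating equations (a learned sigmoid gate blending per-modality projections); all dimensions, weight names, and the use of NumPy are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal GMU sketch: fuse a static (review-derived) profile vector with a
# dynamic (real-time affect) feature vector via a learned sigmoid gate.
# All shapes and weights below are toy values chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)

def gmu_fuse(x_static, x_dynamic, W_s, W_d, W_z):
    """Gated Multimodal Unit fusion of two modality vectors.

    h_s, h_d: tanh projections of each modality into a shared space.
    z:        sigmoid gate from the concatenated inputs, deciding how
              much each modality contributes to the fused output.
    """
    h_s = np.tanh(W_s @ x_static)     # static (semantic profile) branch
    h_d = np.tanh(W_d @ x_dynamic)    # dynamic (affect signal) branch
    z = 1.0 / (1.0 + np.exp(-(W_z @ np.concatenate([x_static, x_dynamic]))))
    return z * h_s + (1.0 - z) * h_d  # gated convex combination

# Toy example: 8-dim static profile, 6-dim affect vector, 4-dim fused output.
d_s, d_d, d_h = 8, 6, 4
x_s = rng.standard_normal(d_s)
x_d = rng.standard_normal(d_d)
W_s = rng.standard_normal((d_h, d_s))
W_d = rng.standard_normal((d_h, d_d))
W_z = rng.standard_normal((d_h, d_s + d_d))

h = gmu_fuse(x_s, x_d, W_s, W_d, W_z)
print(h.shape)  # (4,)
```

Because the gate z lies in (0, 1) and both branches pass through tanh, the fused vector is a bounded, element-wise interpolation between the two modalities, which is what lets the model down-weight a noisy affect stream when the static profile is more informative (and vice versa).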

