MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversation
Updated Mar 10, 2024 - Python
MMSA is a unified framework for Multimodal Sentiment Analysis.
This repository contains various models targeting multimodal representation learning and multimodal fusion for downstream tasks such as multimodal sentiment analysis.
An official implementation of "UniVL: A Unified Video and Language Pre-Training Model for Multimodal Understanding and Generation"
Multimodal sentiment analysis: multiple fusion methods based on BERT + ResNet
A tool for extracting multimodal features from videos.
This repository contains the official implementation code of the paper Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis, accepted at EMNLP 2021.
Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis (ALMT)
Context-Dependent Sentiment Analysis in User-Generated Videos
The code for our IEEE ACCESS (2020) paper Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion.
CM-BERT: Cross-Modal BERT for Text-Audio Sentiment Analysis (MM 2020)
😎 Awesome lists about Speech Emotion Recognition
Towards Robust Multimodal Sentiment Analysis with Incomplete Data
Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Official implementation of the paper "MSAF: Multimodal Split Attention Fusion"
A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations (ACL 2023)
This repository contains the implementation of the paper "Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis".
Codebase for EMNLP 2024 Findings Paper "Knowledge-Guided Dynamic Modality Attention Fusion Framework for Multimodal Sentiment Analysis"
A survey of deep multimodal emotion recognition.
Multimodal sentiment analysis using hierarchical fusion with context modeling