Intent Detection and Slot Filling


Intent Detection and Slot Filling is the task of interpreting user commands/queries by extracting the intent and the relevant slots.

Example (from ATIS):

Query: What flights are available from pittsburgh to baltimore on thursday morning
Intent: flight info
Slots: 
    - from_city: pittsburgh
    - to_city: baltimore
    - depart_date: thursday
    - depart_time: morning

ATIS

ATIS (Air Travel Information System) (Hemphill et al.) is a dataset distributed with Microsoft CNTK and available from its GitHub page. Slots are labeled in the BIO (Beginning-Inside-Outside) format, as in NER. The dataset contains only air-travel-related queries. Most of the ATIS results are based on the work here.
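To make the BIO labeling concrete, here is a minimal sketch (the tag sequence for the example query is illustrative, using the slot names from the example above) that decodes a BIO-tagged token sequence back into slot/value pairs:

```python
# Sketch: decoding BIO slot tags into (slot_name, text) pairs.
# The tag sequence below is illustrative, not taken from the ATIS annotations.

def bio_to_slots(tokens, tags):
    """Collect (slot_name, text) pairs from a BIO-tagged token sequence."""
    slots, current_name, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):            # a new slot span begins
            if current_name:
                slots.append((current_name, " ".join(current_tokens)))
            current_name, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_name == tag[2:]:
            current_tokens.append(token)    # continue the open span
        else:                               # "O" or an inconsistent I- tag
            if current_name:
                slots.append((current_name, " ".join(current_tokens)))
            current_name, current_tokens = None, []
    if current_name:                        # flush a span that ends the sentence
        slots.append((current_name, " ".join(current_tokens)))
    return slots

tokens = ("what flights are available from pittsburgh "
          "to baltimore on thursday morning").split()
tags = ["O", "O", "O", "O", "O", "B-from_city", "O", "B-to_city",
        "O", "B-depart_date", "B-depart_time"]

print(bio_to_slots(tokens, tags))
# [('from_city', 'pittsburgh'), ('to_city', 'baltimore'),
#  ('depart_date', 'thursday'), ('depart_time', 'morning')]
```

A model predicts one tag per token plus one intent label per utterance; the decoding above is what turns the tag sequence into the slot table shown in the example.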

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | ------------- | --------------- | -------------- | ---- |
| Bi-model with decoder | 96.89 | 98.99 | A Bi-model based RNN Semantic Frame Parsing Model for Intent Detection and Slot Filling | |
| CTRAN | 98.46 | 98.07 | CTRAN: CNN-Transformer-based network for natural language understanding | Official |
| SlotRefine + BERT | 96.16 | 97.74 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| SlotRefine | 96.22 | 97.11 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| Stack-Propagation + BERT | 96.10 | 97.50 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| JointBERT-CAE | 96.10 | 97.50 | CAE: Mechanism to Diminish the Class Imbalanced in SLU Slot Filling Task | Official |
| Co-interactive Transformer | 95.90 | 97.70 | A Co-Interactive Transformer for Joint Slot Filling and Intent Detection | Official |
| Heterogeneous Attention | 95.58 | 97.76 | Joint agricultural intent detection and slot filling based on enhanced heterogeneous attention mechanism | |
| Stack-Propagation | 95.90 | 96.90 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| Attention Encoder-Decoder NN | 95.87 | 98.43 | Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling | |
| SF-ID (BLSTM) network | 95.80 | 97.76 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Context Encoder | 95.80 | NA | Improving Slot Filling by Utilizing Contextual Information | |
| Capsule-NLU | 95.20 | 95.00 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Joint GRU model (W) | 95.49 | 98.10 | A Joint Model of Intent Determination and Slot Filling for Spoken Language Understanding | |
| Slot-Gated BLSTM with Attention | 95.20 | 94.10 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |
| Joint model with recurrent slot label context | 94.64 | 98.40 | Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks | Official |
| Recursive NN | 93.96 | 95.40 | Joint Semantic Utterance Classification and Slot Filling with Recursive Neural Networks | |
| Encoder-labeler Deep LSTM | 95.66 | NA | Leveraging Sentence-level Information with Encoder LSTM for Natural Language Understanding | |
| RNN with Label Sampling | 94.89 | NA | Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding | |
| Hybrid RNN | 95.06 | NA | Using recurrent neural networks for slot filling in spoken language understanding | |
| RNN-EM | 95.25 | NA | Recurrent neural networks with external memory for language understanding | |
| CNN-CRF | 94.35 | NA | Convolutional neural network based triangular CRF for joint intent detection and slot filling | |
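The Slot F1 scores reported above are span-level, conlleval-style F1: a predicted slot counts as correct only if both its label and its exact token boundaries match a gold slot. A minimal sketch of that metric (the tag sequences in the usage example are illustrative):

```python
# Sketch: span-level slot F1 over BIO tag sequences, in the conlleval style.
# A predicted span is a true positive only if label AND (start, end) match gold.

def spans(tags):
    """Extract (label, start, end) spans from one BIO tag sequence."""
    out, label, start = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if tag.startswith("B-") or tag == "O" or (
                tag.startswith("I-") and tag[2:] != label):
            if label is not None:
                out.append((label, start, i))
            label, start = (tag[2:], i) if tag.startswith("B-") else (None, None)
    return out

def slot_f1(gold_seqs, pred_seqs):
    """Micro-averaged span-level F1 over parallel lists of tag sequences."""
    tp = fp = fn = 0
    for gold, pred in zip(gold_seqs, pred_seqs):
        g, p = set(spans(gold)), set(spans(pred))
        tp += len(g & p)
        fp += len(p - g)
        fn += len(g - p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return (2 * precision * recall / (precision + recall)
            if precision + recall else 0.0)

gold = [["O", "B-from_city", "O", "B-to_city"]]
pred = [["O", "B-from_city", "O", "B-depart_date"]]
print(slot_f1(gold, pred))  # one span of two is correct -> 0.5
```

Intent accuracy, by contrast, is plain utterance-level classification accuracy, which is why the two columns can diverge so sharply for some models.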

SNIPS

SNIPS is a dataset by Snips.ai for benchmarking Intent Detection and Slot Filling, available from the GitHub page. It covers several day-to-day user command categories (e.g. play a song, book a restaurant).

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | ------------- | --------------- | -------------- | ---- |
| CTRAN | 98.30 | 99.42 | CTRAN: CNN-Transformer-based network for natural language understanding | Official |
| SlotRefine + BERT | 97.05 | 99.04 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| Stack-Propagation + BERT | 97.00 | 99.00 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| JointBERT-CAE | 97.00 | 98.30 | CAE: Mechanism to Diminish the Class Imbalanced in SLU Slot Filling Task | Official |
| Heterogeneous Attention | 96.32 | 98.29 | Joint agricultural intent detection and slot filling based on enhanced heterogeneous attention mechanism | |
| Co-interactive Transformer | 95.90 | 98.80 | A Co-Interactive Transformer for Joint Slot Filling and Intent Detection | Official |
| Stack-Propagation | 94.20 | 98.00 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| SlotRefine | 93.72 | 97.44 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| Context Encoder | 93.60 | NA | Improving Slot Filling by Utilizing Contextual Information | |
| SF-ID (BLSTM) network | 92.23 | 97.43 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Capsule-NLU | 91.80 | 97.70 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Slot-Gated BLSTM with Attention | 88.80 | 97.00 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |