
Bidirectional LSTM

A Bidirectional LSTM, or BiLSTM, is a sequence-processing model that consists of two LSTMs: one processing the input in the forward direction and the other in the backward direction. BiLSTMs effectively increase the amount of information available to the network, improving the context available to the algorithm (e.g. knowing which words immediately precede and follow a given word in a sentence).
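As a minimal sketch of the idea, the snippet below uses PyTorch's nn.LSTM with bidirectional=True, which runs one LSTM left-to-right and a second right-to-left and concatenates their hidden states per time step. The layer sizes and the toy input are illustrative assumptions, not values taken from this page.

```python
import torch
import torch.nn as nn

embedding_dim = 100   # size of each input vector (e.g. a word embedding); assumed for illustration
hidden_dim = 64       # hidden size of each directional LSTM; assumed for illustration

# bidirectional=True creates a forward LSTM and a backward LSTM over the same sequence
# and concatenates their outputs at every time step.
bilstm = nn.LSTM(
    input_size=embedding_dim,
    hidden_size=hidden_dim,
    batch_first=True,
    bidirectional=True,
)

# Toy batch: 8 sequences, 20 time steps, embedding_dim features each.
x = torch.randn(8, 20, embedding_dim)
outputs, (h_n, c_n) = bilstm(x)

# Each time step now carries context from both directions,
# so the feature dimension is doubled: (8, 20, 2 * hidden_dim).
print(outputs.shape)  # torch.Size([8, 20, 128])
```

The doubled output dimension is what gives downstream layers (e.g. a tagger or classifier head) access to both left and right context at every position.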

Image source: Cornegruta et al., Modelling Radiological Language with Bidirectional Long Short-Term Memory Networks.

Tasks


Task                             Papers   Share
Sentence                             72   7.64%
Sentiment Analysis                   41   4.35%
Named Entity Recognition (NER)       38   4.03%
Language Modelling                   37   3.93%
NER                                  34   3.61%
Text Classification                  29   3.08%
General Classification               29   3.08%
Classification                       24   2.55%
Question Answering                   17   1.80%

Components


Component   Type
LSTM        Recurrent Neural Networks
