Research Summary Log

I have started maintaining a summary log of the research papers I read related to my work. The papers are mostly related to:

  • Natural Language Processing (NLP)
  • Natural Language Understanding (NLU)
  • Dialogue Systems
  • Deep Learning

Each entry below records the paper's title, topic, a short summary, keywords, and publication year.

Title: Named Entity Recognition for Novel Types by Transfer Learning
Topic: Named Entity Recognition (NER)
Summary: Given training data in a related domain with similar (but not identical) named entity (NE) types and a small amount of in-domain training data, the authors use transfer learning to learn a domain-specific NE model.
Keywords: NER; CRF; Transfer Learning
Year: 2016
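For intuition, here is a minimal sketch of the generic "pretrain on the source domain, then fine-tune on the small in-domain set" recipe that this kind of transfer learning follows. It uses a plain BiLSTM tagger in PyTorch rather than the paper's CRF-based method, and the sizes, tag counts, and the `source_data` / `target_data` batch iterables are illustrative assumptions.

```python
# Sketch of the generic pretrain-then-fine-tune recipe for NER transfer learning.
# Not the paper's CRF-based method; architecture, sizes, and the source_data /
# target_data iterables of (token_ids, tag_ids) batches are assumptions.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, num_tags)    # per-token tag scores

    def forward(self, token_ids):                     # token_ids: (batch, seq_len)
        states, _ = self.lstm(self.emb(token_ids))
        return self.out(states)                       # (batch, seq_len, num_tags)

def train(model, batches, epochs=5, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for token_ids, tag_ids in batches:
            logits = model(token_ids)
            loss = loss_fn(logits.reshape(-1, logits.size(-1)), tag_ids.reshape(-1))
            opt.zero_grad()
            loss.backward()
            opt.step()

# 1) Pretrain on the large, related source-domain corpus (its own NE tag set).
model = BiLSTMTagger(vocab_size=20000, num_tags=9)
# train(model, source_data)

# 2) Transfer: keep the encoder, replace the output layer for the novel NE types,
#    then fine-tune on the small in-domain set, typically with a lower learning rate.
model.out = nn.Linear(model.out.in_features, 5)       # 5 = novel domain's tag count
# train(model, target_data, epochs=10, lr=3e-4)
```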
Title: FRAMES: A Corpus for Adding Memory to Goal-Oriented Dialog Systems
Topic: Dialogue Systems
Summary: Based on semantic frames, this paper introduces a task called frame tracking, which generalizes state tracking to a setting where several states are tracked simultaneously. The authors show that Frames can also be used to study memory in dialogue management and information presentation through natural language generation, and they provide a baseline model for the frame-tracking task.
Keywords: Frames; Memory; Goal-Oriented Dialogue; NLG; IOB; Frame Tracking
Year: 2016
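As a rough illustration of what frame tracking adds over single-state tracking, the sketch below keeps a list of frames and spawns a new one whenever the user's new constraints contradict the active frame, so earlier frames remain available as memory. The slot names and the contradiction rule are assumptions for illustration only, not the paper's baseline model.

```python
# Sketch of the frame-tracking idea: keep several dialogue states (frames) at once
# and decide, turn by turn, whether a user act updates the active frame or starts a
# new one. Slot names and the contradiction rule are illustrative assumptions.
class Frame:
    def __init__(self, frame_id):
        self.frame_id = frame_id
        self.slots = {}                    # e.g. {"dst_city": "Rome", "budget": 1500}

class FrameTracker:
    def __init__(self):
        self.frames = []
        self.active = None

    def track(self, user_slots):
        """Route the user's new constraints to the right frame."""
        contradicts = self.active is not None and any(
            key in self.active.slots and self.active.slots[key] != value
            for key, value in user_slots.items()
        )
        if self.active is None or contradicts:
            # Start a new frame instead of overwriting: earlier offers stay in memory.
            self.active = Frame(frame_id=len(self.frames) + 1)
            self.frames.append(self.active)
        self.active.slots.update(user_slots)
        return self.active

tracker = FrameTracker()
tracker.track({"dst_city": "Rome", "budget": 1500})
tracker.track({"budget": 2000})                         # contradicts budget -> new frame
print([(f.frame_id, f.slots) for f in tracker.frames])  # both frames are still tracked
```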
Title: A Critical Review of Recurrent Neural Networks for Sequence Learning
Topic: Deep Learning
Summary: This paper provides a detailed explanation of recurrent neural networks (RNNs) and their state-of-the-art variants: the long short-term memory (LSTM) and the bidirectional recurrent neural network (BRNN). The authors synthesize the body of research over the past three decades that has yielded these powerful models. Some interesting conclusions are:
  • Many advances come from novel architectures rather than fundamentally novel algorithms.
  • Extending RNNs to longer-form text in natural language tasks is likely to be fruitful.
  • Dialogue systems could be built along the same principles as the architectures used for translation, encoding prompts and generating responses while retaining the entire conversation history as contextual information.
Keywords: Recurrent Neural Networks; Sequence Learning
Year: 2015
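As a concrete reference point for the variants the review covers, the snippet below runs a bidirectional LSTM over a toy batch of token sequences and produces per-token predictions. The vocabulary size, dimensions, and random input batch are assumptions made only so the example runs end to end; the review itself contains no reference implementation.

```python
# A toy forward pass through a bidirectional LSTM, one of the variants the review
# focuses on. All sizes and the random input are illustrative assumptions.
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=1000, embedding_dim=64)      # toy vocabulary
bilstm = nn.LSTM(input_size=64, hidden_size=128,
                 bidirectional=True, batch_first=True)
classifier = nn.Linear(2 * 128, 4)                             # 4 toy classes per token

tokens = torch.randint(0, 1000, (2, 7))       # batch of 2 sequences, 7 tokens each
outputs, (h_n, c_n) = bilstm(emb(tokens))     # outputs: (2, 7, 256) = forward + backward
logits = classifier(outputs)                  # per-token predictions: (2, 7, 4)
print(logits.shape)                           # torch.Size([2, 7, 4])
```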
