Multiple Different Natural Language Processing Tasks in a Single Deep Model


This, in my opinion, is a very good paper. It demonstrates how a neural model can be trained from low-level to higher-level tasks, in a fashion such that the lower layers correspond to word-level tasks and the higher layers correspond to tasks performed at the sentence level. The authors also show how to retain the information in the lower layers while training the higher layers, by successive regularization. They also clearly show that transfer learning is possible, where different datasets are exploited simultaneously after the word embeddings are jointly pre-trained. Catastrophic interference is a very crucial thing to deal with in this model: it is the disturbance of one layer's already-learned parameters while training another layer. As an example, you want to retain information about POS while later training for, say, chunking!
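To make successive regularization concrete, here is a minimal sketch of the idea, assuming a PyTorch-style model; the helper name, the snapshot dictionary, and the weight delta are illustrative and not this repository's actual code:

import torch

def successive_reg_penalty(model, snapshot, delta=1e-2):
    # L2 penalty pulling the current parameters back toward the values
    # they had before the current task's training began, so lower-layer
    # knowledge (e.g. POS) is not overwritten while training chunking.
    penalty = torch.zeros(())
    for name, param in model.named_parameters():
        penalty = penalty + ((param - snapshot[name]) ** 2).sum()
    return delta * penalty

# Snapshot the parameters after finishing the POS epochs...
# snapshot = {n: p.detach().clone() for n, p in model.named_parameters()}
# ...then add the penalty to the next task's loss:
# loss = chunking_loss + successive_reg_penalty(model, snapshot)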

Model Architecture:

Data:

Tasks:

  • POS Tagging (word-level)
  • Chunking (word-level)
  • Semantic Relatedness (sentence-level)
  • Textual Entailment (sentence-level)
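The word-level versus sentence-level split above maps directly onto the depth of the network. Below is a rough PyTorch-style sketch of the lower two layers only; the class name, dimensions, and exact wiring are assumptions for illustration (the paper additionally feeds label embeddings of lower-level predictions upward, which is omitted here):

import torch
import torch.nn as nn

class JMTSketch(nn.Module):
    # Illustrative slice of the joint many-task stack: POS at the bottom,
    # chunking above it, sentence-level tasks stacking further up.
    def __init__(self, vocab_size, emb_dim=100, hidden=100,
                 n_pos_tags=45, n_chunk_tags=23):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Layer 1: bi-LSTM for word-level POS tagging
        self.pos_lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                bidirectional=True)
        self.pos_head = nn.Linear(2 * hidden, n_pos_tags)
        # Layer 2: bi-LSTM for chunking, fed the word embeddings plus the
        # POS layer's hidden states (a shortcut connection)
        self.chunk_lstm = nn.LSTM(emb_dim + 2 * hidden, hidden,
                                  batch_first=True, bidirectional=True)
        self.chunk_head = nn.Linear(2 * hidden, n_chunk_tags)
        # Sentence-level layers (relatedness, entailment) would stack the
        # same way on top, pooling their hidden states into sentence vectors.

    def forward(self, token_ids):
        e = self.embed(token_ids)                    # (B, T, emb_dim)
        h_pos, _ = self.pos_lstm(e)                  # (B, T, 2*hidden)
        pos_logits = self.pos_head(h_pos)
        h_chunk, _ = self.chunk_lstm(torch.cat([e, h_pos], dim=-1))
        chunk_logits = self.chunk_head(h_chunk)
        return pos_logits, chunk_logits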

Usage:

data.py - Preprocesses the data for the model.

run.py - Runs the main model.

Sample Input:

task_desc = {
    'pos': 'this has increased the risk',
    'chunk': 'this has increased the risk',
    'relatedness': ['two dogs are wrestling and hugging',
                    'there is no dog wrestling and hugging'],
    'entailment': ['Two dogs are wrestling and hugging',
                   'There is no dog wrestling and hugging']
}

Sample Output:

Note:

The original paper contains one more task, dependency parsing. Currently, that is not incorporated in the model due to the non-availability of good public data. Successive regularization also still needs to be added.

Citations:

A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks. Kazuma Hashimoto, Caiming Xiong, Yoshimasa Tsuruoka, Richard Socher.

https://arxiv.org/abs/1611.01587
