sanjay-kv/Semi-supervised-Sequence-Learning
% readme.txt for sagej.cls
% Version 1.20 released 14 January 2017
%
% This software may only be used to prepare an article for publication in a
% SAGE Publications journal. Any other use constitutes an infringement of copyright.
%
% The release consists of the following files:
%
%   readme.txt                  this file
%   sagej.cls                   the LaTeX2e class file
%   Sage_LaTeX_Guidelines.tex   authors' instructions
%   Sage_Latex_Guidelines.pdf   authors' instructions in PDF format
%   SageV.bst                   SAGE Vancouver style bst
%   SageH.bst                   SAGE Harvard style bst
%
% Typeset Sage_LaTeX_Guidelines.tex for instructions and examples, or view the PDF.
%
% Simply place sagej.cls and sagedoc.tex in your system's usual
% directories and typeset using your LaTeX2e/PDFLaTeX command.
%
% *** IMPORTANT NOTE ***
% When you submit your paper, please use the "doublespace" option
% in the documentclass line, which will double-space your document
% and make the task of reviewing much simpler.
About
Two approaches that use unlabeled data to improve sequence learning with recurrent networks. The first approach is to predict what comes next in a sequence, which is a conventional language model in natural language processing.
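Below is a minimal sketch of the first approach described above: pretrain a recurrent network as a next-token language model on unlabeled sequences, then reuse its weights to initialize a supervised sequence classifier. This is not the repository's code; the PyTorch framework, the model sizes, and the toy batches are all illustrative assumptions.

```python
# Sketch of language-model pretraining followed by supervised fine-tuning.
# All hyperparameters and data below are placeholders, not from the repo.
import torch
import torch.nn as nn

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM, NUM_CLASSES = 1000, 64, 128, 2

class RecurrentEncoder(nn.Module):
    """Shared embedding + LSTM used by both the language model and the classifier."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.lstm = nn.LSTM(EMBED_DIM, HIDDEN_DIM, batch_first=True)

    def forward(self, tokens):
        return self.lstm(self.embed(tokens))  # (outputs, (h_n, c_n))

class LanguageModel(nn.Module):
    """Predicts the next token at every position (unsupervised pretraining)."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(HIDDEN_DIM, VOCAB_SIZE)

    def forward(self, tokens):
        outputs, _ = self.encoder(tokens)
        return self.head(outputs)

class Classifier(nn.Module):
    """Classifies a sequence from the final hidden state of the (pretrained) encoder."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(HIDDEN_DIM, NUM_CLASSES)

    def forward(self, tokens):
        _, (h_n, _) = self.encoder(tokens)
        return self.head(h_n[-1])

# 1) Pretrain the language model on unlabeled sequences: predict token t+1 from tokens <= t.
encoder = RecurrentEncoder()
lm = LanguageModel(encoder)
lm_opt = torch.optim.Adam(lm.parameters(), lr=1e-3)
unlabeled = torch.randint(0, VOCAB_SIZE, (32, 20))   # toy unlabeled batch
lm_opt.zero_grad()
logits = lm(unlabeled[:, :-1])
lm_loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB_SIZE), unlabeled[:, 1:].reshape(-1))
lm_loss.backward()
lm_opt.step()

# 2) Fine-tune a classifier whose encoder starts from the pretrained weights.
clf = Classifier(encoder)                            # shares the pretrained encoder
clf_opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
labeled_x = torch.randint(0, VOCAB_SIZE, (8, 20))    # toy labeled batch
labeled_y = torch.randint(0, NUM_CLASSES, (8,))
clf_opt.zero_grad()
clf_loss = nn.functional.cross_entropy(clf(labeled_x), labeled_y)
clf_loss.backward()
clf_opt.step()
```

The design point the description makes is that the pretraining step needs only unlabeled sequences, so the recurrent network can start the supervised task from weights that already capture sequence structure rather than from random initialization.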