* Modified parameter order of DecoderRNN.forward (#85)
* Updated TopKDecoder (#86)
* Fixed topk decoder.
* Use torchtext from PyPI (#87)
* Fixed torchtext sorting order.
* Attention is not required when only using teacher forcing in decoder (#90)
* Updated docs and version.
* Fixed code style.
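The first two items above change how the decoder is called. Below is a minimal sketch of the resulting call pattern: the forward keyword names come from the signature shown later in this diff, but the constructor arguments, tensor shapes, and token ids are illustrative assumptions, not something this changeset confirms. Passing everything by keyword keeps caller code robust to the reordering in #85, and encoder_outputs=None reflects the #90 note that attention is not required when decoding with teacher forcing only.

```python
import torch
import torch.nn.functional as F
from torch.autograd import Variable  # 0.2-era API, matching the docs below

from seq2seq.models import EncoderRNN, DecoderRNN

# Toy sizes, chosen only for illustration.
vocab_size, max_len, hidden_size = 100, 10, 16

# Constructor arguments are assumptions; check the installed version's signatures.
encoder = EncoderRNN(vocab_size, max_len, hidden_size)
decoder = DecoderRNN(vocab_size, max_len, hidden_size,
                     sos_id=1, eos_id=2, use_attention=False)

src = Variable(torch.LongTensor([[4, 5, 6, 2]]))  # one source sequence
tgt = Variable(torch.LongTensor([[1, 7, 8, 2]]))  # one target sequence

encoder_outputs, encoder_hidden = encoder(src)

# Keyword arguments guard against the parameter reordering in #85; with
# teacher_forcing_ratio=1 and no attention, encoder_outputs may stay None (#90).
decoder_outputs, decoder_hidden, ret_dict = decoder(
    inputs=tgt,
    encoder_hidden=encoder_hidden,
    encoder_outputs=None,
    function=F.log_softmax,
    teacher_forcing_ratio=1,
)
```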
<li><strong>mask</strong> (<a class="reference external" href="https://docs.python.org/2/library/functions.html#int" title="(in Python v2.7)"><em>int</em></a><em>,</em> <em>optional</em>) – index of masked token, i.e. weight[mask] = 0.</li>
</ul>
</td>
@@ -354,7 +354,7 @@ <h2>Perplexity<a class="headerlink" href="#perplexity" title="Permalink to this
-<code class="descname">forward_rnn</code><span class="sig-paren">(</span><em>inputs=None</em>, <em>encoder_hidden=None</em>, <em>encoder_outputs=None</em>, <em>function=<function log_softmax></em>, <em>retain_output_probs=True</em><span class="sig-paren">)</span><a class="headerlink" href="#seq2seq.models.TopKDecoder.TopKDecoder.forward_rnn" title="Permalink to this definition">¶</a></dt>
+<code class="descname">forward</code><span class="sig-paren">(</span><em>inputs=None</em>, <em>encoder_hidden=None</em>, <em>encoder_outputs=None</em>, <em>function=<function log_softmax></em>, <em>teacher_forcing_ratio=0</em>, <em>retain_output_probs=True</em><span class="sig-paren">)</span><a class="headerlink" href="#seq2seq.models.TopKDecoder.TopKDecoder.forward" title="Permalink to this definition">¶</a></dt>
<dd><p>Forward rnn for MAX_LENGTH steps. Look at <code class="xref py py-func docutils literal"><span class="pre">seq2seq.models.DecoderRNN.DecoderRNN.forward_rnn()</span></code> for details.</p>
</dd></dl>
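The hunk above renames TopKDecoder's entry point from forward_rnn to forward and adds a teacher_forcing_ratio argument, so the beam-search decoder now mirrors DecoderRNN.forward. A hedged usage sketch, assuming TopKDecoder wraps a trained DecoderRNN and takes a beam width; the constructor arguments and the returned triple are assumptions about the library, not shown in this diff:

```python
import torch.nn.functional as F
from seq2seq.models import TopKDecoder

# `decoder`, `encoder_hidden` and `encoder_outputs` are assumed to come from
# an EncoderRNN/DecoderRNN pair as in the earlier sketch.
beam_search = TopKDecoder(decoder, 3)  # 3 = beam width (assumed constructor)

# Same keyword interface as DecoderRNN.forward after this change; beam search
# decodes on its own, so no target inputs and no teacher forcing are supplied.
# The returned triple is assumed to mirror DecoderRNN.forward.
outputs, hidden, ret_dict = beam_search(
    encoder_hidden=encoder_hidden,
    encoder_outputs=encoder_outputs,
    function=F.log_softmax,
    teacher_forcing_ratio=0,
)
```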
@@ -460,8 +462,8 @@ <h1>Models<a class="headerlink" href="#models" title="Permalink to this headline
<col class="field-body"/>
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Variables:</th><td class="field-body"><ul class="first last simple">
-<li><strong>linear_out</strong> (<a class="reference external" href="http://pytorch.org/docs/master/nn.html#torch.nn.Linear" title="(in PyTorch vmaster (0.2.0+c580352 ))"><em>torch.nn.Linear</em></a>) – applies a linear transformation to the incoming data: <span class="math">\(y = Ax + b\)</span>.</li>
-<li><strong>mask</strong> (<a class="reference external" href="http://pytorch.org/docs/master/tensors.html#torch.Tensor" title="(in PyTorch vmaster (0.2.0+c580352 ))"><em>torch.Tensor</em></a><em>,</em> <em>optional</em>) – applies a <span class="math">\(-inf\)</span> to the indices specified in the <cite>Tensor</cite>.</li>
+<li><strong>linear_out</strong> (<a class="reference external" href="http://pytorch.org/docs/master/nn.html#torch.nn.Linear" title="(in PyTorch vmaster (0.2.0+2e42272 ))"><em>torch.nn.Linear</em></a>) – applies a linear transformation to the incoming data: <span class="math">\(y = Ax + b\)</span>.</li>
+<li><strong>mask</strong> (<a class="reference external" href="http://pytorch.org/docs/master/tensors.html#torch.Tensor" title="(in PyTorch vmaster (0.2.0+2e42272 ))"><em>torch.Tensor</em></a><em>,</em> <em>optional</em>) – applies a <span class="math">\(-inf\)</span> to the indices specified in the <cite>Tensor</cite>.</li>
</ul>
</td>
</tr>
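The updated mask variable above applies -inf to the listed indices. That is the standard way to make attention ignore positions such as padding: a score of -inf turns into a weight of exactly zero after the softmax. The snippet below is a generic illustration of that mechanic in current PyTorch, not the library's internal code:

```python
import torch
import torch.nn.functional as F

# Raw attention scores for one query over four encoder positions.
scores = torch.tensor([[2.0, 0.5, -1.0, 0.3]])

# Boolean mask marking positions to ignore (e.g. padding): here the last two.
mask = torch.tensor([[False, False, True, True]])

# Filling masked scores with -inf gives them softmax weight exactly 0.
scores = scores.masked_fill(mask, float('-inf'))
weights = F.softmax(scores, dim=-1)
print(weights)  # masked positions come out as 0.0
```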
@@ -482,7 +484,7 @@ <h1>Models<a class="headerlink" href="#models" title="Permalink to this headline
<col class="field-name"/>
<col class="field-body"/>
<tbody valign="top">
-<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>mask</strong> (<a class="reference external" href="http://pytorch.org/docs/master/tensors.html#torch.Tensor" title="(in PyTorch vmaster (0.2.0+c580352 ))"><em>torch.Tensor</em></a>) – tensor containing indices to be masked</td>
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>mask</strong> (<a class="reference external" href="http://pytorch.org/docs/master/tensors.html#torch.Tensor" title="(in PyTorch vmaster (0.2.0+2e42272 ))"><em>torch.Tensor</em></a>) – tensor containing indices to be masked</td>
</tr>
</tbody>
</table>
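This parameter table most plausibly documents the attention module's set_mask method, though the truncated hunk does not show the method name, so treat that as an assumption. Typical usage would be to register the padding mask once per batch before decoding:

```python
# Illustrative only: `attention` is an Attention instance and `src` a
# LongTensor batch of source token ids, with 0 assumed as the padding index.
pad_idx = 0
mask = src.eq(pad_idx)    # true where the source position is padding
attention.set_mask(mask)  # these positions will receive -inf attention scores
```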
@@ -588,7 +590,7 @@ <h1>Models<a class="headerlink" href="#models" title="Permalink to this headline