How to keep the HG weights non-negative: the truncated Perceptron reweighing rule
The literature on error-driven learning in Harmonic Grammar (HG) has adopted the Perceptron reweighing rule. Yet this rule is not suited to HG, as it fails to ensure non-negative weights. A variant is therefore considered that truncates the updates at zero, keeping the weights non-negative. Convergence guarantees and error bounds for the original Perceptron are shown to extend to its truncated variant.
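The truncation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the learning-rate parameter, and the convention that updates add the difference between the loser's and the winner's constraint-violation counts are all assumptions for the sake of the example.

```python
import numpy as np

def truncated_perceptron_update(weights, loser_violations, winner_violations, rate=1.0):
    """One truncated Perceptron step on an HG weight vector (illustrative sketch)."""
    # Standard Perceptron-style update (assumed convention): promote constraints
    # violated more by the loser, demote those violated more by the intended winner.
    delta = rate * (np.asarray(loser_violations, dtype=float)
                    - np.asarray(winner_violations, dtype=float))
    # Truncation at zero: clip each updated weight from below,
    # so the HG weight vector stays non-negative after every error.
    return np.maximum(0.0, np.asarray(weights, dtype=float) + delta)
```

For instance, a weight of 1.0 on a constraint the winner violates twice would be driven to -1.0 by the plain Perceptron update; the truncated variant clips it to 0.0 instead.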
Main Author: | Giorgio Magri |
---|---|
Format: | Article |
Language: | English |
Published: | Polish Academy of Sciences, 2015-12-01 |
Series: | Journal of Language Modelling |
Subjects: | Harmonic Grammar; error-driven learning; Perceptron; convergence |
Online Access: | https://jlm.ipipan.waw.pl/index.php/JLM/article/view/115 |
id: | doaj-0ce5d491b88e440c9d4f02ceca0695da |
---|---|
ISSN: | 2299-856X; 2299-8470 |
DOI: | 10.15398/jlm.v3i2.115 |
Volume/Issue: | 3 (2) |
Author affiliation: | UMR 7023 SFL (CNRS, University of Paris 8) |