Parallelization of Neural Network Training for NLP with Hogwild!

Neural networks are prevalent in today's NLP research. Despite their success on different tasks, training times are relatively long. We use Hogwild! to counteract this phenomenon and show that it is a suitable method to speed up the training of neural networks of different architectures and complexity. For P...

Bibliographic Details
Main Authors: Deyringer Valentin, Fraser Alexander, Schmid Helmut, Okita Tsuyoshi
Format: Article
Language: English
Published: Sciendo 2017-10-01
Series: Prague Bulletin of Mathematical Linguistics
Online Access: https://doi.org/10.1515/pralin-2017-0036