7:00 – 4:30 ASRC IRAD
- Talked with Eric yesterday; I'm going to write up a white paper about teachable AI. Two-to-three-week effort.
- Speaking of which, The Evolved Transformer (abstract):
- Recent works have highlighted the strengths of the Transformer architecture for dealing with sequence tasks. At the same time, neural architecture search has advanced to the point where it can outperform human-designed models. The goal of this work is to use architecture search to find a better Transformer architecture. We first construct a large search space inspired by the recent advances in feed-forward sequential models and then run evolutionary architecture search, seeding our initial population with the Transformer. To effectively run this search on the computationally expensive WMT 2014 English-German translation task, we develop the progressive dynamic hurdles method, which allows us to dynamically allocate more resources to more promising candidate models. The architecture found in our experiments – the Evolved Transformer – demonstrates consistent improvement over the Transformer on four well-established language tasks: WMT 2014 English-German, WMT 2014 English-French, WMT 2014 English-Czech and LM1B. At big model size, the Evolved Transformer is twice as efficient as the Transformer in FLOPS without loss in quality. At a much smaller – mobile-friendly – model size of ~7M parameters, the Evolved Transformer outperforms the Transformer by 0.7 BLEU on WMT’14 English-German.
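- My toy reading of the progressive dynamic hurdles idea, just to pin it down for myself: candidates get a small training budget first, and only the ones that clear a fitness hurdle earn the next, larger budget. The train/fitness functions below are stand-ins, not the paper's implementation:
```python
# Toy sketch of progressive dynamic hurdles as I read it from the abstract:
# train everyone a little, compute a hurdle (mean fitness), and only models
# above the hurdle get more training. train_steps()/quality are stand-ins.
import random

def train_steps(candidate, steps):
    # Pretend training: more steps reveal more of the model's "true quality", plus noise.
    candidate["steps"] += steps
    candidate["fitness"] = candidate["quality"] * min(1.0, candidate["steps"] / 3000) + random.gauss(0, 0.02)

def progressive_dynamic_hurdles(candidates, budgets=(1000, 2000, 3000)):
    survivors = candidates
    for budget in budgets:
        for c in survivors:
            train_steps(c, budget)
        hurdle = sum(c["fitness"] for c in survivors) / len(survivors)  # mean fitness as the hurdle
        survivors = [c for c in survivors if c["fitness"] >= hurdle]
        if len(survivors) <= 1:
            break
    return max(survivors, key=lambda c: c["fitness"])

population = [{"name": f"model_{i}", "quality": random.random(), "steps": 0, "fitness": 0.0}
              for i in range(16)]
best = progressive_dynamic_hurdles(population)
print(best["name"], round(best["fitness"], 3))
```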
- Finished running Tymora1 on Slack. Downloaded the export, though it didn't include research_notes. Hmmm. Looks like I can't make it public, either.
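- If the export keeps skipping that channel, one possible workaround is pulling its history straight from the Slack Web API. A minimal sketch, assuming slack_sdk, a token with the history scopes, and a placeholder channel ID:
```python
# Hypothetical workaround: fetch a channel's history via the Slack Web API
# instead of relying on the workspace export. Assumes slack_sdk is installed
# and SLACK_TOKEN has the channels:history / groups:history scopes.
import json
import os
from slack_sdk import WebClient

client = WebClient(token=os.environ["SLACK_TOKEN"])
channel_id = "C0XXXXXXX"  # placeholder ID for the research_notes channel

messages = []
cursor = None
while True:
    resp = client.conversations_history(channel=channel_id, cursor=cursor, limit=200)
    messages.extend(resp["messages"])
    cursor = resp.get("response_metadata", {}).get("next_cursor")
    if not cursor:
        break

with open("research_notes_history.json", "w") as f:
    json.dump(messages, f, indent=2)
print(f"Saved {len(messages)} messages")
```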
- Thinking about writing a tagging app, possibly with a centrality capability.
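- A rough sketch of what the centrality piece could look like, assuming tags form a co-occurrence graph and using networkx (sample data and names are made up):
```python
# Hypothetical sketch: build a tag co-occurrence graph from tagged items and
# rank tags by a centrality score. Assumes networkx; the data below is made up.
from itertools import combinations
import networkx as nx

# Each item carries a set of tags.
tagged_items = [
    {"teachable-ai", "white-paper", "irad"},
    {"transformer", "architecture-search", "irad"},
    {"slack", "tymora1", "irad"},
    {"teachable-ai", "transformer"},
]

# Tags that appear on the same item get an edge; the weight counts co-occurrences.
graph = nx.Graph()
for tags in tagged_items:
    for a, b in combinations(sorted(tags), 2):
        if graph.has_edge(a, b):
            graph[a][b]["weight"] += 1
        else:
            graph.add_edge(a, b, weight=1)

# PageRank as one possible centrality measure over the tag graph.
scores = nx.pagerank(graph, weight="weight")
for tag, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{tag:25s} {score:.3f}")
```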
- Started on the Teachable AI paper. The rough outline is there, and I have a good set of references.