2022.acl-tutorials.1@ACL

Total: 1

#1 A Gentle Introduction to Deep Nets and Opportunities for the Future

Authors: Kenneth Church; Valia Kordoni; Gary Marcus; Ernest Davis; Yanjun Ma; Zeyu Chen

The first half of this tutorial will make deep nets more accessible to a broader audience, following “Deep Nets for Poets” and “A Gentle Introduction to Fine-Tuning.” We will also introduce GFT (general fine-tuning), a little language for fine-tuning deep nets with short (one-line) programs that are as easy to code as regression in statistics packages such as R using glm (generalized linear models). Based on the success of these methods on a number of benchmarks, one might come away with the impression that deep nets are all we need. However, we believe the glass is half-full: while there is much that can be done with deep nets, there is always more to do. The second half of this tutorial will discuss some of these opportunities.
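
To make the "one-line program" analogy concrete, here is a minimal Python sketch using scikit-learn's LogisticRegression as a stand-in for R's glm; this is an illustrative assumption, not the GFT API itself, and is meant only to show the level of brevity the abstract has in mind (fitting a model in one call), which GFT aims to match for fine-tuning deep nets.

```python
# Illustrative analogy only: a one-line model fit in a standard ML package,
# comparable to glm() in R. GFT's goal (per the abstract) is to make
# fine-tuning a deep net about this short; this is NOT GFT code.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy data standing in for a benchmark dataset.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

model = LogisticRegression().fit(X, y)  # the entire "program" is this one line
print(model.score(X, y))                # training accuracy on the toy data
```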