CoCoDev

via Zoom

BabyBERTa: Learning More Grammar With Small-Scale Child-Directed Language

Philip Huebner (University of Illinois, Urbana-Champaign)

Abstract: Transformer-based language models have taken the NLP world by storm. However, their potential for addressing important questions in language acquisition research has been largely ignored. In this work, we examined the grammatical knowledge of RoBERTa (Liu et al., 2019) when […]