line/LINE-DistilBERT-Japanese
A DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.