
[2021-Dec-22] Pre-trained Language Model Is All You Need for Winning NLP Competitions

Institute of Information Systems and Applications

Speaker:

Prof. Tzong-Han Tsai 蔡宗翰教授

Professor, National Central University

Topic:

Pre-trained Language Model Is All You Need for Winning NLP Competitions

Date:

13:20-15:00 Wednesday 22-Dec-2021


Link:

https://reurl.cc/ZjDkW3

Hosted by:

Prof. Hung-Kuo Chu


Abstract

In recent years, large pre-trained transformer-based language models (PLMs), such as the BERT family of models, have taken the natural language processing (NLP) field by storm, achieving state-of-the-art performance on many tasks. PLMs are pre-trained on billions of sentences, which allows them to project the deep semantic meaning of texts into a high-dimensional space, making some difficult NLP tasks much easier to tackle.
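
As a rough illustration of how a PLM maps text into a high-dimensional semantic space, the sketch below uses the Hugging Face transformers library to encode two sentences with a BERT model and compare their embeddings. The model name, pooling strategy, and example sentences are illustrative assumptions, not material from the talk.

# Minimal sketch (assumptions: Hugging Face transformers installed, bert-base-uncased,
# mean pooling over token vectors). Shows how a PLM projects sentences into a
# high-dimensional space where semantic similarity can be measured.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "The patient was given aspirin for the headache.",
    "Aspirin was administered to relieve the patient's headache.",
]

# Tokenize and encode; last_hidden_state has shape (batch, tokens, 768).
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_vectors = model(**inputs).last_hidden_state

# Mean-pool the token vectors (ignoring padding) to get one 768-dim vector per sentence.
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (token_vectors * mask).sum(dim=1) / mask.sum(dim=1)

# Semantically close sentences end up close in this space (cosine similarity near 1).
similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.3f}")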

By using pre-trained language models, the IISR Lab of National Central University has recently achieved excellent results in several international competitions, including BioCreative, BioASQ 9, and FEVEROUS. In this talk, Prof. Richard Tsai will share the basic concepts of PLMs, explain why they matter, and describe his lab's application experiences in detail.


Bio.

Richard Tzong-Han Tsai, National Central University & Academia Sinica

Richard Tzong-Han Tsai is a professor in the Department of Computer Science and Information Engineering at National Central University and the CEO of the Center for GIS at Academia Sinica. His main research interests include natural language processing, deep learning, chatbots/dialogue systems, cross-lingual information access, sentiment analysis, and digital humanities.

He has achieved great success in several international academic competitions. His team won first place in the word segmentation task and second place in the named entity recognition task in the SIGHAN bakeoff. In addition, his team won a series of named entity transliteration tasks held at the NEWS workshop (organized by the Institute for Infocomm Research, Singapore) in 2011, 2012, and 2015. In the biomedical text mining field, his team won several BioCreative competitions (held by the National Institute of Healthcare) from 2010 to 2021 and took first place in BioASQ Task 8b Phase B in 2020 and 2021.

Because of his contributions in the above-mentioned areas, he has received several research awards, including the Outstanding Newly-Recruited Professor Award of National Central University, the PhD Dissertation Advisor Award of the Association for Computational Linguistics and Chinese Language Processing, and the National Science Council Exceptional Research Talent Award of the Ministry of Science and Technology. To date, he has published papers in several top journals, including Bioinformatics, Nucleic Acids Research, IEEE/ACM Transactions on Computational Biology and Bioinformatics, Journal of the American Society for Information Science and Technology, IEEE Transactions on Audio, Speech and Language Processing, and Decision Support Systems. His papers have been accepted at leading artificial intelligence (AI) and natural language processing (NLP) conferences such as ACL, IJCAI, NAACL, COLING, and IJCNLP. Prof. Tsai is also the principal investigator of the university AI Competition project sponsored by the Ministry of Education (Taiwan).


All faculty and students are welcome to join.
