

WAIS Seminar with Professor Jie Tang

Title: WuDao: Pretrain the World

Abstract:
Large-scale pretrained models on web texts have substantially advanced the state of the art in various AI tasks, such as natural language understanding, text generation, image processing, and multimodal modeling. Downstream task performance has also improved steadily over the past few years. In this talk, I will first go through three families of pretrained models: autoregressive models (e.g., GPT), autoencoding models (e.g., BERT), and encoder-decoder models. Then, I will introduce China's first homegrown super-scale intelligent model system, built with the goal of creating an ultra-large-scale, cognition-oriented pretraining model that focuses on essential problems in general artificial intelligence from a cognitive perspective. In particular, as an example, I will elaborate on a novel pretraining framework, GLM (General Language Model), that addresses this challenge. GLM has three major benefits: (1) it performs well on classification, unconditional generation, and conditional generation tasks with one single pretrained model; (2) it outperforms BERT-like models on classification due to improved pretrain-finetune consistency; (3) it naturally handles variable-length blank filling, which is crucial for many downstream tasks. Empirically, GLM substantially outperforms BERT on the SuperGLUE natural language understanding benchmark with the same amount of pre-training data.
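For readers unfamiliar with the blank-filling objective mentioned above, the sketch below illustrates, in plain Python, how a GLM-style training example might be constructed: sampled spans are collapsed to a single mask token in "Part A", and the model is then trained to regenerate the missing spans autoregressively in "Part B". This is a minimal illustration only, not the speaker's implementation; the token names ([MASK], [START], [END]) and the span-sampling policy here are assumptions for exposition.

```python
import random

# Hypothetical special tokens for illustration only.
MASK, START, END = "[MASK]", "[START]", "[END]"

def make_glm_example(tokens, num_spans=2, max_span_len=3, seed=0):
    """Build a toy GLM-style blank-filling example (illustrative sketch)."""
    rng = random.Random(seed)

    # Sample non-overlapping spans to blank out (simplified policy).
    spans, taken = [], set()
    while len(spans) < num_spans:
        length = rng.randint(1, max_span_len)
        start = rng.randrange(0, len(tokens) - length + 1)
        positions = range(start, start + length)
        if taken.isdisjoint(positions):
            taken.update(positions)
            spans.append((start, length))
    spans.sort()

    # Part A: original sequence with each span collapsed to one [MASK].
    part_a, i = [], 0
    for start, length in spans:
        part_a.extend(tokens[i:start])
        part_a.append(MASK)
        i = start + length
    part_a.extend(tokens[i:])

    # Part B: the blanked spans, each to be generated autoregressively.
    # Span order is permuted, so span lengths need not be known in advance.
    rng.shuffle(spans)
    part_b = []
    for start, length in spans:
        part_b.append(START)
        part_b.extend(tokens[start:start + length])
        part_b.append(END)

    return part_a, part_b

tokens = "GLM unifies understanding and generation with blank filling".split()
print(make_glm_example(tokens))
```

Because the same blank-filling format covers a single masked token (classification-style), a short masked span (conditional generation), or a mask covering the whole continuation (unconditional generation), one pretrained model can serve all three task types, which is the unification the abstract refers to.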

Event: B385
Date: Saturday, May 4, 2024 - 04:32
URL: https://www.wais.ecs.soton.ac.uk/events/B385
Categories: ECS Seminars, ECS Events
Location: Building 32, Room 3077