
Main task (section 7)

7 Extensions and Improvements for Additional Downstream Tasks

See the project description for details; illustrative code sketches of these techniques follow the list below.


  1. Additional pre-training, How to Fine-Tune BERT for Text Classification? Sun et al. [2020]
  2. Multiple negatives ranking loss, Efficient Natural Language Response Suggestion for Smart Reply, Henderson et al. [2017]
  3. Cosine-similarity, Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks, Reimers and Gurevych [2019]
  4. Regularized optimization, SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization, Jiang et al. [2020]
  5. Multitask: BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning, Stickland and Murray [2019]; MTRec: Multi-Task Learning over BERT for News Recommendation, Bi et al. [2022]; Gradient Surgery for Multi-Task Learning, Yu et al. [2020]
  6. Contrastive learning, SimCSE: Simple Contrastive Learning of Sentence Embeddings, Gao et al. [2021]
  7. ...
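
For item 1, additional (continued) pre-training usually means running the masked-language-modeling objective on in-domain text before fine-tuning. The sketch below shows BERT-style dynamic masking in PyTorch; the function name and token-id arguments are illustrative rather than part of the project codebase, and the 80/10/10 split follows the original BERT recipe.

```python
import torch

def mask_tokens(input_ids, mask_token_id, vocab_size, special_ids, mlm_prob=0.15):
    """BERT-style dynamic masking for continued (domain) pre-training.

    Of the selected positions: 80% become [MASK], 10% become a random token,
    10% stay unchanged. Unselected positions get label -100 so the
    cross-entropy loss ignores them.
    """
    input_ids = input_ids.clone()
    labels = input_ids.clone()

    prob = torch.full(labels.shape, mlm_prob)
    special = torch.zeros_like(labels, dtype=torch.bool)
    for sid in special_ids:                      # never mask [CLS], [SEP], [PAD], ...
        special |= labels == sid
    prob.masked_fill_(special, 0.0)

    masked = torch.bernoulli(prob).bool()
    labels[~masked] = -100

    # 80% of masked positions -> [MASK]
    replace = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked
    input_ids[replace] = mask_token_id

    # 10% of masked positions -> random token (half of the remaining 20%)
    random_tok = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked & ~replace
    input_ids[random_tok] = torch.randint(vocab_size, labels.shape)[random_tok]
    return input_ids, labels

# Training step, assuming the model exposes per-token vocabulary logits:
# logits = model(input_ids, attention_mask)                       # [B, T, vocab]
# loss = torch.nn.functional.cross_entropy(
#     logits.view(-1, logits.size(-1)), labels.view(-1), ignore_index=-100)
```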
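For item 2, the multiple negatives ranking loss of Henderson et al. treats each (anchor, positive) pair in a batch as the correct match and every other in-batch positive as a negative. A minimal sketch; the scale of 20 is a commonly used default, not a value prescribed by the project.

```python
import torch
import torch.nn.functional as F

def multiple_negatives_ranking_loss(anchor_emb, positive_emb, scale=20.0):
    """In-batch negatives: row i of anchor_emb should match row i of positive_emb;
    every other row in the batch serves as a negative example."""
    a = F.normalize(anchor_emb, dim=-1)
    b = F.normalize(positive_emb, dim=-1)
    scores = a @ b.T * scale                                  # [B, B] cosine similarities
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)                    # diagonal entries are the targets
```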
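For item 3, Sentence-BERT's regression objective for semantic textual similarity computes the cosine similarity between two independently encoded sentences and regresses it against the gold score with mean-squared error. Rescaling the gold scores to [0, 1] in the sketch below is an assumption about how the labels are stored; adjust it to the dataset at hand.

```python
import torch
import torch.nn.functional as F

def cosine_similarity_loss(emb_a, emb_b, gold_scores):
    """Regress cosine(emb_a, emb_b) against the gold similarity with MSE.

    gold_scores is assumed to already be rescaled to [0, 1]
    (e.g. STS scores divided by 5).
    """
    cos = F.cosine_similarity(emb_a, emb_b, dim=-1)           # [B]
    return F.mse_loss(cos, gold_scores)
```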
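For item 4, SMART adds a smoothness-inducing adversarial regularizer: a small perturbation of the input embeddings should not change the model's predictions, measured by a symmetric KL term between the clean and perturbed output distributions. The sketch below is a simplified single-step version for a classification head; `forward_from_embeddings` is a hypothetical hook that re-runs the encoder from perturbed input embeddings (something the model code would need to expose), and the step sizes are illustrative.

```python
import torch
import torch.nn.functional as F

def smart_regularizer(forward_from_embeddings, embeds, clean_logits,
                      noise_eps=1e-5, step_size=1e-3):
    """Single-step smoothness regularizer in the spirit of SMART (Jiang et al.).

    1. Add small Gaussian noise to the input embeddings.
    2. Take one ascent step on the noise to make the perturbation adversarial.
    3. Penalize the symmetric KL divergence between clean and perturbed predictions.
    """
    noise = (torch.randn_like(embeds) * noise_eps).requires_grad_()

    adv_logits = forward_from_embeddings(embeds + noise)
    adv_loss = F.kl_div(F.log_softmax(adv_logits, dim=-1),
                        F.softmax(clean_logits.detach(), dim=-1),
                        reduction="batchmean")
    (grad,) = torch.autograd.grad(adv_loss, noise)
    noise = (noise + step_size * grad / (grad.norm() + 1e-12)).detach()

    adv_logits = forward_from_embeddings(embeds + noise)
    p = F.softmax(clean_logits, dim=-1)
    q = F.softmax(adv_logits, dim=-1)
    sym_kl = (F.kl_div(q.log(), p, reduction="batchmean") +
              F.kl_div(p.log(), q, reduction="batchmean"))
    return sym_kl                     # add lambda * sym_kl to the task loss
```

The paper normalizes the perturbation per example and iterates the ascent step; a single global-norm step is used here only to keep the sketch short.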
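For item 5, gradient surgery (PCGrad, Yu et al.) addresses conflicting task gradients in multitask training: whenever two task gradients point in opposing directions (negative dot product), the conflicting component of one is projected out before the gradients are combined. A minimal sketch over flattened parameter-gradient vectors:

```python
import random
import torch

def pcgrad_combine(task_grads):
    """Project Conflicting Gradients (PCGrad).

    task_grads: list of flattened gradient vectors, one per task.
    For each task gradient, any component that conflicts with another task's
    gradient (negative dot product) is projected out; the altered gradients
    are then summed.
    """
    projected = [g.clone() for g in task_grads]
    for g_i in projected:
        others = list(task_grads)
        random.shuffle(others)                    # random task order, as in the paper
        for g_j in others:
            dot = torch.dot(g_i, g_j)
            if dot < 0:                           # conflict: remove the component along g_j
                g_i.sub_(dot / (g_j.norm() ** 2 + 1e-12) * g_j)
    return torch.stack(projected).sum(dim=0)
```

The combined vector would then be scattered back into the parameters' `.grad` fields before the optimizer step; that bookkeeping is model-specific and omitted here.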
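For item 6, unsupervised SimCSE encodes each sentence twice with different dropout masks and treats the two views as a positive pair, with the rest of the batch as negatives; structurally this is the same in-batch softmax as the ranking loss above, with a temperature instead of a scale. The 0.05 temperature is the value commonly reported for SimCSE and is used here as an assumption.

```python
import torch
import torch.nn.functional as F

def simcse_loss(emb_view1, emb_view2, temperature=0.05):
    """Unsupervised SimCSE: the same sentences embedded under two different
    dropout masks form positive pairs; other sentences in the batch are negatives."""
    z1 = F.normalize(emb_view1, dim=-1)
    z2 = F.normalize(emb_view2, dim=-1)
    sim = z1 @ z2.T / temperature
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)

# Typical usage: keep dropout active and run the encoder twice on the same batch.
# emb_view1 = model(input_ids, attention_mask)
# emb_view2 = model(input_ids, attention_mask)   # different dropout mask -> different view
# loss = simcse_loss(emb_view1, emb_view2)
```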