  • Machine Learning Paper Reviews (Mostly NLP)

Methodologies (5)

STILTs: Supplementary Training on Pretrained Sentence Encoders
Sentence Encoders on STILTs: Supplementary Training on Intermediate Labeled-data Tasks. Jason Phang, Thibault Févry, Samuel R. Bowman. 27 Feb 2019.
A Second Stage of Pretraining: To overcome the flaws of existing encoders, transfer learning has let us improve performance on various tasks by starting from a pretrained model that has already learned relevant features or representa.. 2023. 8. 26.
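The two-stage recipe this teaser describes is easy to sketch. Below is a minimal, hypothetical illustration, not the paper's actual setup: a tiny stand-in encoder and random tensors replace BERT and real datasets such as MNLI, and fine_tune is an invented helper. The point is only the flow: fine-tune on an intermediate labeled task first, then reuse the same encoder for the target task.

# Minimal sketch of the STILTs idea (hypothetical setup, not the paper's).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pretrained sentence encoder (BERT/GPT in the paper).
encoder = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 64))

def fine_tune(encoder, head, features, labels, epochs=3, lr=1e-3):
    """Train encoder + task head jointly on one labeled task."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(head(encoder(features)), labels)
        loss.backward()
        opt.step()
    return encoder

# Random tensors as placeholders for sentence representations.
inter_x, inter_y = torch.randn(256, 128), torch.randint(0, 3, (256,))   # e.g. a 3-class intermediate task
target_x, target_y = torch.randn(64, 128), torch.randint(0, 2, (64,))   # small target task

# Stage 1 (STILTs): supplementary training on the intermediate labeled task.
encoder = fine_tune(encoder, nn.Linear(64, 3), inter_x, inter_y)

# Stage 2: fine-tune the *same* encoder on the target task with a fresh head.
encoder = fine_tune(encoder, nn.Linear(64, 2), target_x, target_y)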
What Is Wrong With Backpropagation
The Forward-Forward Algorithm: Some Preliminary Investigations. Geoffrey Hinton [Google Brain]. 27 Dec 2022.
What Is Wrong With Backpropagation: Despite the mathematical advantages backpropagation gives us, the paper maintains that it is an implausible account of how the actual cortex learns. The cortex does not mirror its bottom-up connections the way backpropagat.. 2023. 3. 25.
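The truncated excerpt stops before the algorithm itself, so here is a minimal sketch of the Forward-Forward idea from the paper: each layer is trained locally so that its "goodness" (the sum of squared activations) is high on positive (real) data and low on negative data, with no backward pass through the rest of the network. The data, dimensions, threshold, and training loop below are placeholder assumptions.

# Minimal sketch of one Forward-Forward layer (placeholder data and sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
layer = nn.Linear(128, 64)
opt = torch.optim.Adam(layer.parameters(), lr=1e-3)
theta = 2.0  # goodness threshold in P(positive) = sigmoid(goodness - theta)

pos = torch.randn(256, 128)        # stand-in for real (positive) inputs
neg = torch.randn(256, 128) * 3.0  # stand-in for corrupted (negative) inputs

for _ in range(100):
    g_pos = layer(pos).relu().pow(2).sum(dim=1)  # goodness on positive data
    g_neg = layer(neg).relu().pow(2).sum(dim=1)  # goodness on negative data
    # -log sigmoid(g_pos - theta) - log sigmoid(theta - g_neg):
    # push goodness above theta on positives, below theta on negatives.
    loss = (F.softplus(theta - g_pos) + F.softplus(g_neg - theta)).mean()
    opt.zero_grad()
    loss.backward()  # gradients touch only this one layer's parameters
    opt.step()

In the full algorithm each layer is trained this way in sequence, with each layer's normalized output (detached from the graph) serving as the next layer's input, so no error signal ever propagates backward through the stack.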
GAN: Generative Adversarial Networks
Generative Adversarial Nets. Ian J. Goodfellow, Jean Pouget-Abadie, Yoshua Bengio, et al. 10 Jun 2014.
Deep Generative Models: The ultimate goal of a generative model is to approximate the ground-truth distribution of the data with a neural network, and to generate data by sampling from that distribution. Surprisingly, generative models weren't particularly useful until around 2015, but they have recently impro.. 2023. 3. 13.
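For reference, the adversarial game the paper defines is a minimax problem over a value function V(D, G): the discriminator D maximizes it while the generator G minimizes it.

\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\bigl[\log D(x)\bigr]
  + \mathbb{E}_{z \sim p_z(z)}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]

In practice the paper suggests training G to maximize log D(G(z)) instead of minimizing log(1 - D(G(z))), since the original generator loss saturates early in training when D easily rejects samples.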
Detour for Domain-Specific Tasks: Avoiding Adaptive Pre-training
KALA: Knowledge-Augmented Language Model Adaptation. Minki Kang, Jinheon Baek, Sung Ju Hwang. 4 Aug 2022.
Abstract 2023. 2. 7.