Curious;'s Personal Blog

acl2022_Debiased Contrastive Learning

Debiased Contrastive Learning of Unsupervised Sentence Representations. My summary of the paper's highlights: 1) The motivation is a very common one: the negatives obtained by sampling are not necessarily reliable, so a…
2022-08-11
research > contrastive learning
#contrastive learning

acl2021_CL-for-BERT-Sentence Representations paper notes

Full title: Self-Guided Contrastive Learning for BERT Sentence Representations. Summary of contributions: two BERTs construct the positive/negative pairs (see the sketch below): ① a BERT whose parameters are not optimized, with max-pooling applied over the hidden states of all transformer layers, i.e. (batch, len, 76…
2022-08-07
research > contrastive learning
#contrastive learning
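
For concreteness, a minimal PyTorch sketch of that two-encoder setup, assuming the Hugging Face `transformers` API. The frozen copy and the max-pooling over all layers come from the excerpt; the [CLS] view for the trainable encoder, the pooling across tokens, and the in-batch InfoNCE loss with temperature 0.05 are my assumptions:

```python
import copy
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
learner = AutoModel.from_pretrained("bert-base-uncased")  # BERT being trained
guide = copy.deepcopy(learner)                            # frozen BERT
for p in guide.parameters():
    p.requires_grad = False

def guide_view(batch):
    # hidden_states: one (batch, len, 768) tensor per layer (plus embeddings)
    hs = guide(**batch, output_hidden_states=True).hidden_states
    stacked = torch.stack(hs)                # (layers+1, batch, len, 768)
    # Max-pool over layers, then over tokens -> (batch, 768). The exact
    # pooling granularity is an assumption; the excerpt only says
    # "max-pooling over the hidden states of all layers".
    return stacked.max(dim=0).values.max(dim=1).values

def learner_view(batch):
    return learner(**batch).last_hidden_state[:, 0]  # [CLS] representation

batch = tok(["a first sentence", "a second sentence"],
            return_tensors="pt", padding=True)
z1 = F.normalize(learner_view(batch), dim=-1)
z2 = F.normalize(guide_view(batch), dim=-1)
logits = z1 @ z2.T / 0.05                    # other sentences = negatives
loss = F.cross_entropy(logits, torch.arange(z1.size(0)))
```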

cvpr2020_MoCo-MomentumCL paper notes

Momentum Contrast for Unsupervised Visual Representation Learning. My summary of contributions: contrastive learning with a momentum update (see the sketch below), which fixes the problem in memory-bank methods that parameter updates leave the entries in the bank too far apart from one another; here the momentum keeps the ba…
2022-08-07
research > contrastive learning
#contrastive learning
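
A minimal sketch of the momentum idea, assuming a generic PyTorch encoder pair; the queue size, feature dimension, and the helper names below are illustrative, not taken from the paper's code:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def momentum_update(query_enc: nn.Module, key_enc: nn.Module, m: float = 0.999):
    # The key encoder is an exponential moving average of the query encoder.
    # With m close to 1 it evolves slowly, so keys it produced a few steps
    # ago stay consistent with keys it produces now -- the inconsistency
    # problem the excerpt describes for plain memory banks.
    for q, k in zip(query_enc.parameters(), key_enc.parameters()):
        k.mul_(m).add_(q, alpha=1.0 - m)

class KeyQueue:
    """FIFO dictionary of encoded keys (illustrative sizes)."""
    def __init__(self, dim: int = 128, size: int = 4096):
        self.keys = torch.randn(size, dim)
        self.ptr = 0

    @torch.no_grad()
    def enqueue(self, new_keys: torch.Tensor):
        n = new_keys.size(0)               # assumes size % batch == 0
        self.keys[self.ptr:self.ptr + n] = new_keys
        self.ptr = (self.ptr + n) % self.keys.size(0)
```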

acl-findings2022_Virtual Augmentation CL paper notes

Virtual Augmentation Supported Contrastive Learning of Sentence Representations. My summary of the paper's highlights: 1) From the hard-negative angle, it feels like a reversed line of thinking: samples within a neighborhood are used to manufacture hard-n…
2022-08-02
research > contrastive learning
#contrastive learning

naacl2022_Token-aware Contrastive Learning paper notes

TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning. Token-aware CL; my summary of the paper's highlights: 1) the first to apply contrastive learning to improving the Transformer model's…
2022-08-02
research > contrastive learning
#contrastive learning

naacl2022_Entity-aware CL for sentence emb paper notes

EASE: Entity-Aware Contrastive Learning of Sentence Embedding. Entity-aware contrastive learning for sentence embeddings; my summary of the paper's highlights: 1) hyperlink ent…
2022-08-02
research > contrastive learning
#contrastive learning

emnlp2021_Raise a Child in Large Language Model paper notes

Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning. My summary of contributions: a method with a somewhat plug-and-play feel…
2022-07-21
research > contrastive learning
#contrastive learning

emnlp2021_SimCSE paper notes

**SimCSE: Simple Contrastive Learning of Sentence Embeddings** (Princeton & Tsinghua University). SimCSE; my summary of the paper's highlights: 1) an unsupervised method that simply uses the dropout inside BERT: passing one sentence through the same BERT twice yields two representations that form a positive pair (see the sketch below); 2…
2022-07-21
research > contrastive learning
#contrastive learning
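
The unsupervised trick is small enough to fit in a few lines. A minimal sketch, assuming Hugging Face `transformers` and plain [CLS] pooling (the 0.05 temperature matches the paper; the sentences and pooling details are illustrative):

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.train()  # keep dropout ON: it is the only "augmentation"

sentences = ["contrastive learning is fun", "the cat sat on the mat"]
batch = tok(sentences, return_tensors="pt", padding=True)

# Same sentences, same encoder, two forward passes: the dropout masks
# differ, so each sentence gets two slightly different representations.
z1 = bert(**batch).last_hidden_state[:, 0]   # view 1 ([CLS])
z2 = bert(**batch).last_hidden_state[:, 0]   # view 2 ([CLS])

z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
sim = z1 @ z2.T / 0.05              # temperature 0.05, as in the paper
labels = torch.arange(sim.size(0))  # the matching view is the positive
loss = F.cross_entropy(sim, labels)
```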

acl2021_CLINE paper notes

Summary of contributions: positive/negative examples are constructed with synonyms/antonyms, and three losses are combined: ① MLM; ② predicting whether each token has been replaced, a 0/1 binary classification; ③ a contrastive loss that pulls positives closer and pushes negatives apart (a sketch of the combined objective follows below). Reference: https://blog.csdn.net/qq_33161208/article/details/123631813 (ACL2021 | one-sentence summaries of 8 contrastive-learning papers) 0. Abst…
2022-07-15
research > contrastive learning
#contrastive learning
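
A sketch of how the three losses might combine, with placeholder tensors standing in for the model outputs; the margin form of the contrastive term and all shapes/names are my assumptions, not the paper's code:

```python
import torch
import torch.nn.functional as F

B, L, D, V = 8, 32, 768, 30522  # batch, seq len, hidden, vocab (illustrative)

# Sentence embeddings of: the original sentence, a synonym-replaced version
# (semantically close -> positive), an antonym-replaced version (negative).
h_orig, h_syn, h_ant = (F.normalize(torch.randn(B, D), dim=-1) for _ in range(3))

# 1) Masked language modeling loss (placeholder logits/labels)
mlm_logits = torch.randn(B, L, V)
mlm_labels = torch.randint(0, V, (B, L))
loss_mlm = F.cross_entropy(mlm_logits.view(-1, V), mlm_labels.view(-1))

# 2) Replaced-token detection: per-token 0/1 classification
rtd_logits = torch.randn(B, L, 2)
rtd_labels = torch.randint(0, 2, (B, L))
loss_rtd = F.cross_entropy(rtd_logits.view(-1, 2), rtd_labels.view(-1))

# 3) Contrastive term: pull the synonym view closer than the antonym view
#    (a triplet/margin formulation; the margin 0.5 is illustrative)
pos = (h_orig * h_syn).sum(-1)   # cosine similarity to the positive
neg = (h_orig * h_ant).sum(-1)   # cosine similarity to the negative
loss_cl = F.relu(neg - pos + 0.5).mean()

loss = loss_mlm + loss_rtd + loss_cl
```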

Contrastive Learning: initial survey & overview

Note: the Chinese term is 对比表示学习, or perhaps 对比表征学习 (contrastive representation learning)? A standard formulation of the loss underlying all of this is given at the end of this page. References: https://www.zhihu.com/zvideo/1296467630460039168 https://zhuanlan.zhihu.com/p/450561239 https://zhuanlan.zhihu.com/p/471018370
2022-07-05
research > contrastive learning
#contrastive learning
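
For orientation, nearly every post above optimizes some variant of the same objective. A standard statement of the InfoNCE loss (notation mine) for an anchor $z_i$, its positive $z_i^{+}$, and $N$ in-batch candidates:

$$
\mathcal{L}_i = -\log \frac{\exp\left(\operatorname{sim}(z_i, z_i^{+})/\tau\right)}{\sum_{j=1}^{N} \exp\left(\operatorname{sim}(z_i, z_j)/\tau\right)}
$$

where $\operatorname{sim}$ is typically cosine similarity and $\tau$ is a temperature hyperparameter.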