Classic Paper References

CV

Classic papers in computer vision

Survey

  1. A Comprehensive Survey on Source-free Domain Adaptation
  2. Domain Generalization in Computational Pathology: Survey and Guidelines
  3. A Survey on Generative Modeling with Limited Data, Few Shots, and Zero Shot
  4. Know Your Self-supervised Learning: A Survey on Image-based Generative and Discriminative Training
  5. On the Design Fundamentals of Diffusion Models: A Survey

Paper

Image Classification

  1. AlexNet_ImageNet Classification with Deep Convolutional Neural Networks
  2. VGGNet_Very Deep Convolutional Networks for Large-Scale Image Recognition
  3. ResNet_Deep Residual Learning for Image Recognition
  4. DenseNet_Densely Connected Convolutional Networks
  5. ViT_An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
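ResNet (item 3) introduced the residual shortcut: instead of learning a mapping H(x) directly, a block learns the residual F(x) and adds the input back, y = F(x) + x. A minimal NumPy sketch (layer shapes and names are illustrative, not from the paper's architecture):

```python
import numpy as np

def residual_block(x, w1, w2):
    """y = ReLU(F(x) + x), where F(x) = W2 @ ReLU(W1 @ x)."""
    f = w2 @ np.maximum(w1 @ x, 0.0)  # residual branch F(x)
    return np.maximum(f + x, 0.0)     # identity shortcut, then ReLU

# With zero weights, F(x) = 0 and the block reduces to ReLU(x):
# identity mappings are easy to represent, which eases optimization.
x = residual_block(np.array([1.0, -2.0, 3.0]), np.zeros((3, 3)), np.zeros((3, 3)))
print(x)  # → [1. 0. 3.]
```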

Object Detection

  1. R-CNN_Rich feature hierarchies for accurate object detection and semantic segmentation
  2. Fast R-CNN
  3. Mask R-CNN
  4. YOLO_You Only Look Once: Unified, Real-Time Object Detection
  5. DETR_End-to-End Object Detection with Transformers

Semantic Segmentation

  1. BiSeNet_Bilateral Segmentation Network for Real-time Semantic Segmentation
  2. FCN_Fully Convolutional Networks for Semantic Segmentation
  3. OCRNet_Object-Contextual Representations for Semantic Segmentation
  4. U-Net_Convolutional Networks for Biomedical Image Segmentation
  5. Swin Transformer_Hierarchical Vision Transformer using Shifted Windows

Generative Models

NLP

Classic papers in natural language processing

Survey

  1. Augmented Language Models: a Survey
  2. Model-tuning Via Prompts Makes NLP Models Adversarially Robust
  3. A Survey on In-context Learning

Paper

Traditional NLP

  1. Word2Vec_Efficient Estimation of Word Representations in Vector Space
  2. CNN_Convolutional Neural Networks for Sentence Classification
  3. RNN_A Critical Review of Recurrent Neural Networks for Sequence Learning
  4. Seq2Seq_Sequence to Sequence Learning with Neural Networks
  5. Convolutional Sequence to Sequence Learning
  6. GloVe_Global Vectors for Word Representation

Large Language Models (LLM)

  1. Transformer_Attention Is All You Need
  2. GPT_Improving Language Understanding by Generative Pre-Training
  3. GPT2_Language Models are Unsupervised Multitask Learners
  4. [GPT3_Language Models are Few-Shot Learners](https://arxiv.org/abs/2005.14165)
  5. [GPT3.5_Training language models to follow instructions with human feedback](https://arxiv.org/abs/2203.02155)
  6. [GPT-4 Technical Report](https://arxiv.org/abs/2303.08774)
  7. BERT_Pre-training of Deep Bidirectional Transformers for Language Understanding
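The operation shared by the Transformer-family papers above (item 1 onward) is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch for a single head, without masking or batching (shapes are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for one attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity logits
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # convex combination of values

# Each query attends over all keys; each row of weights sums to 1.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

With a single key, the softmax weight is 1 and the output is exactly that key's value row, which is a quick sanity check on the normalization.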

LLM Fine-tuning

  1. LoRA: Low-Rank Adaptation of Large Language Models
  2. [The Power of Scale for Parameter-Efficient Prompt Tuning](https://arxiv.org/abs/2104.08691)
  3. [Chain-of-Thought Prompting Elicits Reasoning in Large Language Models](https://arxiv.org/abs/2201.11903)
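LoRA (item 1) freezes the pretrained weight W and learns a low-rank update ΔW = BA, so the adapted layer computes h = Wx + BAx with far fewer trainable parameters than W itself. A minimal NumPy sketch (dimensions and the `alpha` scaling name are illustrative):

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=1.0):
    """h = W x + alpha * B A x, with W frozen; only A and B are trained.
    W: (d_out, d_in), A: (r, d_in), B: (d_out, r), rank r << min(d_out, d_in)."""
    return W @ x + alpha * (B @ (A @ x))

d_out, d_in, r = 6, 8, 2
rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))   # frozen pretrained weight
A = rng.normal(size=(r, d_in))       # trainable down-projection
B = np.zeros((d_out, r))             # zero-initialized, so ΔW = 0 at start
x = rng.normal(size=d_in)

# At initialization the adapter is a no-op, matching the pretrained model.
assert np.allclose(lora_forward(x, W, A, B), W @ x)
```

Here the adapter trains r·(d_in + d_out) = 28 parameters versus 48 in W; the gap widens rapidly at realistic layer sizes.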

Multimodal Language Models

  1. PaLM-E: An Embodied Multimodal Language Model
  2. [Visual Instruction Tuning](https://arxiv.org/abs/2304.08485)
  3. TALLRec: An Effective and Efficient Tuning Framework to Align Large Language Model with Recommendation
  4. LLaRA: Aligning Large Language Models with Sequential Recommenders

RS

Classic papers in recommender systems

Survey

  1. A Survey on User Behavior Modeling in Recommender Systems
  2. Disentangled Representation Learning
  3. A Cookbook of Self-Supervised Learning
  4. Self-Supervised Learning for Recommender Systems: A Survey
  5. Graph Neural Networks in Recommender Systems: A Survey

Paper

  1. Bayesian Personalized Ranking
  2. Neural Collaborative Filtering
  3. Neural Graph Collaborative Filtering
  4. LightGCN_Simplifying and Powering Graph Convolution Network for Recommendation
  5. Self-Attentive Sequential Recommendation
  6. Intent-aware Ranking Ensemble for Personalized Recommendation
  7. LightGT_A Light Graph Transformer for Multimedia Recommendation
  8. Graph Transformer for Recommendation
  9. Learning Disentangled Representations for Recommendation
  10. Deep Interest Network for Click-Through Rate Prediction
  11. Less is More: Reweighting Important Spectral Graph Features for Recommendation
  12. Search-based User Interest Modeling with Lifelong Sequential Behavior Data for Click-Through Rate Prediction
  13. Multi-behavior Self-supervised Learning for Recommendation
  14. Multi-Scenario Ranking with Adaptive Feature Learning
  15. Towards Multi-Interest Pre-training with Sparse Capsule Network
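Bayesian Personalized Ranking (item 1) optimizes a pairwise objective: for each user, an interacted item i should score higher than a sampled non-interacted item j, via the loss −ln σ(x̂_ui − x̂_uj). A minimal stdlib sketch of the per-pair loss (function name is illustrative):

```python
import math

def bpr_loss(score_pos, score_neg):
    """BPR pairwise loss: -ln sigmoid(x_ui - x_uj)."""
    diff = score_pos - score_neg
    return -math.log(1.0 / (1.0 + math.exp(-diff)))

# The loss shrinks as the positive item outranks the negative one,
# and grows when the ranking is inverted.
assert bpr_loss(3.0, 0.0) < bpr_loss(1.0, 0.0) < bpr_loss(0.0, 1.0)
```

In practice the scores x̂_ui come from whichever model is being trained (matrix factorization in the original paper; graph or sequential encoders in several of the papers above), with the loss summed over sampled (u, i, j) triples plus regularization.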
