
LAMDA-CL Lab

🏛 About Us

LAMDA-CL Lab is part of the LAMDA group at Nanjing University, with a research focus on continual, lifelong, and incremental learning: the ability of models to learn continuously without forgetting, adapt to new tasks, and generalize over time, with or without access to past data. We aim to empower intelligent systems with the ability to learn across tasks, domains, and modalities, enabling them to grow more capable and versatile as data evolves.

Our lab is composed of faculty members, PhD students, and interns working together to push the frontier of general-purpose and adaptive AI. Members of LAMDA-CL are actively contributing to top AI venues such as CVPR, ICML, NeurIPS, ICLR, TPAMI, and IJCV.

🔬 Our Research Focus

We investigate both theoretical foundations and practical algorithms for continual learning, with emphasis on:

  • Class-Incremental Learning (CIL): Adapting models to new classes without retraining from scratch or accessing full historical data (see the sketch after this list).
  • Domain-Incremental Learning (DIL): Transferring and adapting models to new environments and data distributions.
  • Pre-trained Model Adaptation: Leveraging foundation models for efficient continual learning with minimal forgetting.
  • Few-shot and memory-efficient continual learning: Solving incremental tasks with limited exemplars or task labels.
  • Toolbox and benchmark development: Providing accessible, extensible toolkits for reproducible continual learning research.
  • ...
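
To make the CIL setting concrete, the short sketch below uses toy data and a nearest-class-mean classifier. It is an illustrative sketch, not code from our toolkits: classes arrive in batches, only the new classes are learned at each step, and evaluation covers everything seen so far. The data generator, class splits, and feature dimension are assumptions made up for the example.

```python
# A minimal, self-contained sketch of class-incremental learning (toy data,
# not from our toolkits): a nearest-class-mean classifier absorbs new classes
# task by task and keeps only one prototype per class, so raw data from
# earlier tasks is never revisited.
import numpy as np

rng = np.random.default_rng(0)

def make_task(class_ids, n_per_class=100, dim=16):
    """Generate a toy task: one Gaussian blob per class."""
    xs, ys = [], []
    for c in class_ids:
        center = rng.normal(loc=c, scale=0.1, size=dim)
        xs.append(center + 0.5 * rng.normal(size=(n_per_class, dim)))
        ys.append(np.full(n_per_class, c))
    return np.concatenate(xs), np.concatenate(ys)

class NCMIncrementalClassifier:
    """Stores a mean-feature prototype per class; prediction picks the nearest prototype."""
    def __init__(self):
        self.prototypes = {}  # class id -> mean feature vector

    def learn_task(self, x, y):
        # Only the classes of the current task are updated; old prototypes stay frozen.
        for c in np.unique(y):
            self.prototypes[int(c)] = x[y == c].mean(axis=0)

    def predict(self, x):
        classes = sorted(self.prototypes)
        protos = np.stack([self.prototypes[c] for c in classes])
        dists = np.linalg.norm(x[:, None, :] - protos[None, :, :], axis=-1)
        return np.asarray(classes)[dists.argmin(axis=1)]

# Three incremental tasks, each introducing two new classes; after every task we
# evaluate on all classes seen so far, as in the standard CIL protocol.
model = NCMIncrementalClassifier()
seen = []
for task_classes in ([0, 1], [2, 3], [4, 5]):
    x_train, y_train = make_task(task_classes)
    model.learn_task(x_train, y_train)
    seen += task_classes
    x_test, y_test = make_task(seen, n_per_class=50)
    acc = (model.predict(x_test) == y_test).mean()
    print(f"after learning classes {seen}: accuracy on all seen classes = {acc:.2f}")
```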

📦 Open-Source Toolkits

We contribute to the community with practical and research-oriented toolkits:

  • PyCIL: An open-source Python toolbox for class-incremental learning (a quick-start sketch follows this list).
  • PILOT: A toolkit for pre-trained model-based continual learning.
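
For a sense of how a config-driven toolbox such as PyCIL is typically run, the sketch below writes a JSON experiment description and launches the trainer. The config keys, file paths, and the `main.py --config` entry point are assumptions for illustration only; the authoritative options live in each repository's README.

```python
# Hypothetical quick-start for a config-driven CIL toolbox such as PyCIL.
# Assumes this script sits in a clone of the repository; the config keys and
# the `python main.py --config ...` entry point are assumptions made here,
# so check the toolbox README for the exact options.
import json
import os
import subprocess

config = {
    "model_name": "icarl",   # assumed key: which CIL method to run
    "dataset": "cifar100",   # assumed key: benchmark dataset
    "init_cls": 10,          # assumed key: number of classes in the first task
    "increment": 10,         # assumed key: new classes added per task
    "memory_size": 2000,     # assumed key: total exemplar budget
    "device": ["0"],         # assumed key: GPU id(s)
}

os.makedirs("exps", exist_ok=True)
with open("exps/my_experiment.json", "w") as f:
    json.dump(config, f, indent=2)

# Launch the experiment (assumed CLI; see the toolbox README).
subprocess.run(["python", "main.py", "--config", "exps/my_experiment.json"], check=True)
```

Keeping the whole experiment in one JSON file makes incremental runs easy to reproduce and to compare across methods.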

🎓 Join Us

We are always looking for motivated PhD students, postdocs, Master's students, and visiting interns who share our interests in lifelong learning, generalization, and pre-trained models. If you are passionate about building adaptive, intelligent systems that learn continually from the world, we welcome you to join our team.

Feel free to check out our latest updates, explore our research, and reach out to us for potential collaboration opportunities.

Repositories

  • LAMDA-PILOT: 🎉 PILOT, a pre-trained model-based continual learning toolbox
  • PyCIL: A Python toolbox for class-incremental learning
  • PROOF: Learning without Forgetting for Vision-Language Models (TPAMI 2025)
  • CIL_Survey: Class-Incremental Learning: A Survey (TPAMI 2024)
  • RevisitingCIL: Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need (IJCV 2024)
  • Awesome-Few-Shot-Class-Incremental-Learning: A curated list for few-shot class-incremental learning
  • TPAMI-Limit: Code for "Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks" (TPAMI 2023)
  • CVPR22-Fact: Forward Compatible Few-Shot Class-Incremental Learning (CVPR'22)
  • MM21-Coil: Code for "Co-Transport for Class-Incremental Learning" (ACM MM'21), in PyTorch
