Schuetze NLP Lab, Center for Information and Language Processing (CIS),
Ludwig Maximilians University of Munich (LMU Munich),
Munich Center for Machine Learning (MCML)
Email: lastname@cis.lmu.de
About Me
- Hi, I am Ercong, a third-year PhD student at the Center for Information and Language Processing (CIS), LMU Munich.
- I am supervised by PD Dr. Helmut Schmid. I am also part of the Schütze Lab led by Prof. Hinrich Schütze and an affiliated doctoral researcher of the Munich Center for Machine Learning (MCML).
- I obtained my M.Sc. degree in computational linguistics and informatics (computer science) at LMU Munich. Prior to LMU, I was an undergraduate at Shanghai Jiao Tong University (SJTU), majoring in German Studies and minoring in Finance. During my bachelor's studies, I spent an exchange semester at the University of Heidelberg in Comparative German Studies.
Research Interest
I have a broad research interest in the field of Natural Language Processing (NLP), especially in multilingual NLP, efficient methods for NLP, and human-inspired NLP.
- Multilingual NLP: multilinguality of LLMs (Nie et al., 2024), cross-lingual transfer (Nie et al., 2023a, Li et al., 2023b, Ma et al., 2024), historical language processing (Nie et al., 2023b).
- Efficient methods for NLP: prompt-based learning (Nie et al., 2023c, Ma et al., 2023, Li et al., 2023a), low-resource learning (Liu et al., 2024), parameter-efficient fine-tuning (Yuan et al., 2024a, Yuan et al., 2024b).
- Human-inspired NLP: NLP inspired by human language processing (Zhang et al., 2023, Chen et al., 2024), computational neurolinguistics (He et al., 2024).
Feel free to reach out if you are interested in topics related to NLP and LLMs, including multilinguality, interpretability, retrieval-augmented methods, human-inspired NLP, and their intersections with digital humanities, the social sciences, or domain-specific applications.
Selected Publications
- Ercong Nie*, Sheng Liang*, Helmut Schmid, Hinrich Schütze. Cross-Lingual Retrieval Augmented Prompt for Low-Resource Languages. In ACL Findings 2023. [Paper], [Code]
- Ercong Nie, Helmut Schmid, Hinrich Schütze. Unleashing the Multilingual Encoder Potential: Boosting Zero-Shot Performance via Probability Calibration. In EMNLP Findings 2023. [Paper], [Code]
- Bolei Ma*, Ercong Nie*, Shuzhou Yuan, Helmut Schmid, Michael Färber, Frauke Kreuter, Hinrich Schütze. ToPro: Token-Level Prompt Decomposition for Cross-Lingual Sequence Labeling Tasks. In EACL 2024. [Paper], [Code]
- Yongkang Liu*, Ercong Nie*, Shi Feng, Zheng Hua, Zifeng Ding, Daling Wang, Yifei Zhang, Hinrich Schütze. A Unified Data Augmentation Framework for Low-Resource Multi-domain Dialogue Generation. In ECML-PKDD 2024. [Paper]
- Shuzhou Yuan, Ercong Nie, Michael Färber, Helmut Schmid, Hinrich Schütze. GNNavi: Navigating the Information Flow in Large Language Models by Graph Neural Network. In ACL Findings 2024. [Paper], [Code]
- Linyang He, Peili Chen, Ercong Nie, Yuanning Li, Jonathan R. Brennan. Decoding Probing: Revealing Internal Linguistic Structures in Neural Language Models Using Minimal Pairs. In LREC-COLING 2024. [Paper]
(* denotes equal contribution)
See more in my publications.
Academic Services
Conference Reviewer
- 2025: ACL ARR (ACL, EMNLP, NAACL), COLING, IJCNN, Second Workshop on Ancient Language Processing @ NAACL, The 19th Linguistic Annotation Workshop @ ACL
- 2024: ACL ARR (ACL, EMNLP, NAACL, EACL), LREC-COLING, SemEval
- 2023: EMNLP, First Workshop on Bangla Language Processing @ EMNLP, Workshop on Instruction Tuning and Instruction Following @ NeurIPS, CoNLL-CMCL BabyLM Challenge @ EMNLP
Journal Reviewer
- ACM Transactions on Intelligent Systems and Technology (ACM TIST)
- Royal Society Open Science (RSOS)
Community Memberships
- Committee member of NICE, an NLP academic exchange platform.
- Member of AI Grid, a community connecting young scientists in AI, funded by the German Federal Ministry of Education and Research.
- Junior member of Munich Center for Machine Learning (MCML), one of six German national AI Competence Centers.
Credits: This page was originally created by Peiqin Lin and has been adapted and modified by me.