My primary research lies in natural language processing and efficient artificial intelligence. I am particularly interested in the sustainability and truthfulness of language models, which open promising research avenues, including:
Language Agents: Reasoning, memory, and planning with multimodal language models.
Efficient Language Models: Reduction of the computational and memory complexity of language models while maintaining their performance on downstream tasks.
Augmented Language Models: Fast knowledge learning and editing of language models from symbolic resources, such as knowledge graphs, texts, and tools.
Language Intelligence Lab @ HUFS
Homepage: https://lilab.hufs.ac.kr/
Education
Ph.D. in Computer Science and Engineering, Korea University
Research Field
Deep Learning
Research Areas
- Efficient Language Models
- Augmented Language Models
- Complex Reasoning with Language Models
Courses Taught
- Natural Language Processing
- Machine Learning
- Programming
- Information Retrieval
- Data Science
Selected Publications
(* denotes equal contribution)
(2025)
- Yerim Oh, Jun-Hyung Park, Junho Kim, SungHo Kim, SangKeun Lee. Incorporating Domain Knowledge into Materials Tokenization. ACL 2025.
- Nayeon Kim*, Eojin Jeon*, Jun-Hyung Park, SangKeun Lee. Handling Korean Out-of-Vocabulary Words with Phoneme Representation Learning. PAKDD 2025.
- Mingyu Lee, Junho Kim, Jun-Hyung Park, SangKeun Lee. Continual Debiasing: A Bias Mitigation Framework for Natural Language Understanding Systems. ESWA.
(2024)
- Jun-Hyung Park, Yeachan Kim, Mingyu Lee, Hyuntae Park, SangKeun Lee. MolTRES: Improving Chemical Language Representation Learning for Molecular Property Prediction. EMNLP 2024.
- Hyuntae Park*, Yeachan Kim*, Jun-Hyung Park, SangKeun Lee. Zero-shot Commonsense Reasoning over Machine Imagination. Findings of EMNLP 2024.
- Junho Kim*, Yeachan Kim*, Jun-Hyung Park, Yerim Oh, Suho Kim, SangKeun Lee. MELT: Materials-aware Continued Pre-training for Language Model Adaptation to Materials Science. Findings of EMNLP 2024.
- Jun-Hyung Park*, Hyuntae Park*, Yeachan Kim, Woosang Lim, SangKeun Lee. Moleco: Molecular Contrastive Learning with Chemical Language Models for Molecular Property Prediction. EMNLP 2024 Industry.
- Yeachan Kim, Jun-Hyung Park, SungHo Kim, Juhyeong Park, Sangyun Kim, SangKeun Lee. SEED: Semantic Knowledge Transfer for Language Model Adaptation to Materials Sciences. EMNLP 2024 Industry.
- Jun-Hyung Park, Mingyu Lee, Junho Kim, and SangKeun Lee. Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models. Findings of ACL 2024.
(2023)
- Jun-Hyung Park*, Hyuntae Park*, Youjin Kang, Eojin Jeon, and SangKeun Lee. DIVE: Towards Descriptive and Diverse Visual Commonsense Generation. EMNLP 2023.
- Yeachan Kim, Junho Kim, Jun-Hyung Park, Mingyu Lee, and SangKeun Lee. Leap-of-Thought: Accelerating Transformers via Dynamic Token Routing. EMNLP 2023.
- Joon-Young Choi, Junho Kim, Jun-Hyung Park, Wing-Lam Mok, and SangKeun Lee. SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts. EMNLP 2023.
- Yeachan Kim*, Junho Kim*, Wing-Lam Mok, Jun-Hyung Park, and SangKeun Lee. Client-Customized Adaptation for Parameter-Efficient Federated Learning. Findings of ACL 2023.
- Jun-Hyung Park, Yeachan Kim, Junho Kim, Joon-Young Choi, and SangKeun Lee. Dynamic Structure Pruning for Compressing CNNs. AAAI 2023.
(2022)
- Jun-Hyung Park*, Mingyu Lee*, Junho Kim, Kang-Min Kim, and SangKeun Lee. Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking. EMNLP 2022.
- Jun-Hyung Park*, Junho Kim*, Mingyu Lee, Wing-Lam Mok, Joon-Young Choi, and SangKeun Lee. Tutoring Helps Students Learn Better: Improving Knowledge Distillation for BERT with Tutor Network. EMNLP 2022.
- Jun-Hyung Park*, Nayeon Kim*, Joon-Young Choi, Eojin Jeon, Youjin Kang, and SangKeun Lee. Break it Down into BTS: Basic, Tiniest Subword Units for Korean. EMNLP 2022.
- Jun-Hyung Park, Kang-Min Kim, and SangKeun Lee. Quantized Sparse Training: A Unified Trainable Framework for Joint Pruning and Quantization of DNNs. ACM TECS.
- Jun-Hyung Park*, Yong-Ho Jung*, Joon-Young Choi, Mingyu Lee, Junho Kim, Kang-Min Kim, and SangKeun Lee. Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference. Findings of ACL 2022.
- Jun-Hyung Park, Byung-Ju Choi, and SangKeun Lee. Examining the Impact of Adaptive Convolution on Natural Language Understanding. ESWA.