Jiuxiang Gu
Machine Learning
Selective Reflection-Tuning: Student-Selected Data Recycling for LLM Instruction-Tuning (Findings of the Association for Computational Linguistics (ACL) 2024)
Self-Cleaning: Improving a Named Entity Recognizer Trained on Noisy Data with a Few Clean Instances (Findings of the Association for Computational Linguistics (NAACL) 2024)
TRINS: Towards Multimodal Language Models that Can Read (Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition 2024)
Tensor Attention Training: Provably Efficient Learning of Higher-Order Transformers (arXiv 2024)
Toffee: Efficient Million-Scale Dataset Construction for Subject-Driven Text-to-Image Generation (arXiv 2024)
Unraveling the Smoothness Properties of Diffusion Models: A Gaussian Mixture Perspective (arXiv 2024)
A Critical Analysis of Document Out-of-Distribution Detection (Findings of the Association for Computational Linguistics (EMNLP) 2023)
Reflection-Tuning: Data Recycling Improves LLM Instruction-Tuning (Workshop on Instruction Tuning and Instruction Following at NeurIPS 2023)
Delving into Out-of-Distribution Detection with Vision-Language Representations (Advances in Neural Information Processing Systems (NeurIPS) 2022)
EI-CLIP: Entity-Aware Interventional Contrastive Learning for E-commerce Cross-Modal Retrieval (Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition 2022)