Xun Zhang

Email: zhangxun04@sjtu.edu.cn


I am an undergraduate student in the Department of Computer Science and Engineering (CSE), Shanghai Jiao Tong University (SJTU), advised by Prof. Yulun Zhang.

Research interests: computer vision, model compression, and multimodal learning.

news

Mar 09, 2026 I was awarded the title of “Merit Student” at the university level for the 2024–2025 academic year. 🌈
Feb 01, 2026 Our work Q-DiT4SR is released: Paper | GitHub | arXiv. 🌸
Jan 26, 2026 Our paper Grounding-IQA is accepted by ICLR 2026. 🎯
Jan 24, 2026 I am serving as a Reviewer for ICML 2026 and ECCV 2026. 🌻
Dec 09, 2025 I received the 2025 University-Level Second-Class Scholarship! 🎈
Dec 06, 2025 Our work TreeQ is released: Paper | GitHub | arXiv. 🌳
Nov 11, 2025 I received the 2025 Yanbao Scholarship (¥10,000 RMB)! 🤩
Oct 03, 2025 I am serving as a Reviewer for CVPR 2026. 🍉
Sep 28, 2025 Our work RobuQ is released: Paper | GitHub | arXiv. 🌱
Sep 24, 2025 My overall assessment score ranked 4th out of 99 students in the 2024–2025 academic year. ✌️
Nov 29, 2024 I received the 2024 University-Level Third-Class Scholarship! 🥳
Nov 27, 2024 Our work Grounding-IQA is released: Paper | Project | GitHub. ✨
May 04, 2024 I was awarded the title of “2024 University-Level Outstanding Communist Youth League Member”. 🎉

selected publications

  1. Grounding-IQA: Multimodal Language Grounding Model for Image Quality Assessment
    Zheng Chen, Xun Zhang, Wenbo Li, Renjing Pei, Fenglong Song, Xiongkuo Min, Xiaohong Liu, Xin Yuan, Yong Guo, and Yulun Zhang
    In International Conference on Learning Representations, 2026
  2. Q-DiT4SR: Exploration of Detail-Preserving Diffusion Transformer Quantization for Real-World Image Super-Resolution
    Xun Zhang, Kaicheng Yang, Hongliang Lu, Haotong Qin, Yong Guo, and Yulun Zhang
    arXiv preprint arXiv:2602.01273, 2026
  3. TreeQ: Pushing the Quantization Boundary of Diffusion Transformer via Tree-Structured Mixed-Precision Search
    Kaicheng Yang, Kaisen Yang, Baiting Wu, Xun Zhang, Qianrui Yang, Haotong Qin, He Zhang, and Yulun Zhang
    arXiv preprint arXiv:2512.06353, 2025
  4. RobuQ: Pushing DiTs to W1.58A2 via Robust Activation Quantization
    Kaicheng Yang*, Xun Zhang*, Haotong Qin, Yucheng Lin, Kaisen Yang, Xianglong Yan, and Yulun Zhang
    arXiv preprint arXiv:2509.23582, 2025