🐨 Me

I am currently a second-year Master's student in computer science at Shanghai Jiao Tong University (SJTU), supervised by Prof. Fan Cheng. Before that, I received my B.S. degree in computer science from the IEEE Honor Class at SJTU. During my undergraduate study, I was a research intern at the SJTU Interpretable ML Lab, advised by Prof. Quanshi Zhang.

From 2021.10 to 2022, I worked as a research intern in the DKI group at Microsoft Research Asia (MSRA), advised by Haoyu Dong. I worked on research topics related to semi-structured tabular data, such as table pre-training and table question answering.

Currently, I work as a research assistant in the XLang Lab @HKUNLP, advised by Prof. Tao Yu. I am also fortunate to work with Qian Liu at Sea AI Lab, who is an excellent collaborator! 🤗

🔥 News

  • 2023.10: 🔥🔥 We've built OpenAgents, an open platform for language agents in the wild!
  • 2023.10: 🙋🙋 We have released Lemur-70B, an agentic language model based on Llama-2!
  • 2023.04: 🔥🔥 New preprint applying symbolic tasks in instruction tuning!
  • 2022.10: 🎉🎉 Our TaCube paper (Table QA) was accepted to EMNLP'22.

🥑 Projects


OpenAgents

Host your own ChatGPT Plus locally!

Demo | Paper | Doc

  • Data Agent: code interpreter augmented with data tools
  • Plugins Agent: 200+ plugins for daily life
  • Web Agent: autonomous web browsing

πŸ“ Publications

OpenAgents: An Open Platform for Language Agents in the Wild

Tianbao Xie*, Fan Zhou*, Zhoujun Cheng*, Peng Shi*, Luoxuan Weng*, Yitao Liu*, Toh Jing Hua, Junning Zhao, Qian Liu, Che Liu, Leo Z. Liu, Yiheng Xu, Hongjin Su, Dongchan Shin, Caiming Xiong, Tao Yu (*=equal contribution)
(2023, Preprint) | 📄 PDF | 🛠 Code | 📓 Blog

Lemur: Harmonizing Natural Language and Code for Language Agents

Yiheng Xu*, Hongjin Su*, Chen Xing*, Boyu Mi, Qian Liu, Weijia Shi, Binyuan Hui, Fan Zhou, Yitao Liu, Tianbao Xie, Zhoujun Cheng, Siheng Zhao, Lingpeng Kong, Bailin Wang, Caiming Xiong, Tao Yu (*=equal contribution)
(2023, Preprint) | 📄 PDF | 🛠 Code | 🤗 HF Models | 📓 Blog

From Zero to Hero: Examining the Power of Symbolic Tasks in Instruction Tuning

Qian Liu*, Fan Zhou*, Zhengbao Jiang, Longxu Dou, Min Lin (*=equal contribution)
(2023, Preprint) | 📄 PDF | 🛠 Code | 🤗 HF Datasets & Models | ✊ Twitter

Reflection of Thought: Inversely Eliciting Numerical Reasoning in Language Models via Solving Linear Systems

Fan Zhou*, Haoyu Dong*, Qian Liu, Zhoujun Cheng, Shi Han, Dongmei Zhang (*=equal contribution)
NeurIPS 2022, 2nd MATH-AI Workshop | 📄 PDF

TaCube: Pre-computing Data Cubes for Answering Numerical-Reasoning Questions over Tabular Data

Fan Zhou, Mengkang Hu, Haoyu Dong, Zhoujun Cheng, Fan Cheng, Shi Han, Dongmei Zhang
EMNLP 2022, Oral | 📄 PDF | 🛠 Code

Table Pre-training: A Survey on Model Architectures, Pretraining Objectives, and Downstream Tasks

Haoyu Dong, Zhoujun Cheng, Xinyi He, Mengyu Zhou, Anda Zhou, Fan Zhou, Ao Liu, Shi Han, Dongmei Zhang
IJCAI 2022 (Survey Track) | 📄 PDF

Exploring Image Regions Not Well Encoded by an INN

Zenan Ling, Fan Zhou, Meng Wei, Quanshi Zhang
AISTATS 2022 | 📄 PDF

Quantification and Analysis of Layer-wise and Pixel-wise Information Discarding

Haotian Ma, Hao Zhang, Fan Zhou, Quanshi Zhang
ICML 2022 | 📄 PDF | 🛠 Code

🎖 Honors and Awards

  • MSRA Stars of Tomorrow (Excellent Intern Award), 2022
  • Outstanding Graduates of SJTU, 2021
  • SJTU Academic Scholarship, 2017~2020
  • Shanghai City Scholarship (≈ top 5%), 2018

📖 Education

  • 2021.09 - 2024.03 (expected), M.S.@SJTU, Computer Science & Engineering
  • 2017.09 - 2021.06, B.S.@SJTU, IEEE Honor Class, Computer Science