Muhan Zhang (张牧涵)

    Email: muhan "at" pku "dot" edu "dot" cn
    Google Scholar, Github, Lab github


    Biography

    Muhan is a tenure-track assistant professor and PhD advisor in the Institute for Artificial Intelligence of Peking University. Before returning to China, he was a research scientist at Facebook AI (now Meta AI), working on large-scale graph learning systems and problems (2019-2021). He received his PhD in computer science from Washington University in St. Louis (2015-2019), advised by Prof. Yixin Chen. Before WashU, he obtained a bachelor's degree from Shanghai Jiao Tong University as a member of the IEEE pilot class, where he worked with Prof. Ya Zhang and Prof. Wenjun Zhang.


    Research Interests

    Graph machine learning. Many real-world problems are inherently graph-structured, e.g., social networks, biological networks, the World Wide Web, molecules, circuits, brains, road networks, and knowledge graphs. Many machine learning algorithms are also defined on graphs, such as neural networks and graphical models. In this field, I develop algorithms and theories for learning over graphs and apply them to problems such as link prediction, graph classification, graph structure optimization, and knowledge graph reasoning. I am also interested in practical applications of graph neural networks, including brain modeling, drug discovery, circuit design, and healthcare.

    Large language models and reasoning. Compared to machines, humans possess extreme flexibility in handling unseen tasks in a zero-shot/few-shot way, much of which is attributed to humans' System-II intelligence for complex logical reasoning, task planning, causal reasoning, and inductive generalization. Recently, large language models (LLMs) have shown promising performance on such tasks, while also proving insufficient on truly difficult cases and remaining ultimately unreliable. In this field, we aim to understand and improve LLMs' reasoning ability using means such as graphs, code, grammars, and external tools. Ultimately, we want to build machines with reasoning ability equal to humans' that are flexible, interpretable, and reliable.


    Prospective Students

    I am looking for highly motivated PhD/undergraduate students who are interested in doing graph ML research with me. Please shoot me an email if you meet at least three of the following criteria: 1) creativity and passion for research, 2) solid math skills, 3) solid coding skills, 4) good English (writing and speaking). I will do my best to support your success, including detailed guidance, plenty of computational resources, and research freedom for senior PhDs. You are especially welcome if you have an interdisciplinary background (such as maths/physics/chemistry/biology) and are proficient in coding. Students at Peking University can schedule one-on-one chats with me at my office. Due to the large number of applicants, competition is intense this year, and I may not be able to respond to every email. I hope you understand!


    News

    1/17/2024: 5 papers accepted at ICLR-24! Congrats to Hao, Xiyuan, Yinan, Zehao, and Ling!

    10/17/2023: We introduced a theoretical framework that views complex answer generation in large language models as a hierarchical "template-content" structure, which can explain how LLMs decompose tasks and perform complex reasoning. See the preprint at Explaining the Complex Task Reasoning of Large Language Models with Template-Content Structure.

    9/22/2023: 5 papers accepted at NeurIPS-23! Congrats to Lecheng, Jiarui, Zian, Junru and Cai!

    6/5/2023: We proposed (k,t)-FWL+, a subgraph GNN achieving new state-of-the-art results on ZINC. See the preprint at Towards Arbitrarily Expressive GNNs in O(n²) Space by Rethinking Folklore Weisfeiler-Lehman.

    5/31/2023: Gave a talk on "Is Distance Matrix Enough for Geometric Deep Learning?" at Amazon Web Services AI Shanghai Lablet.

    5/29/2023: We introduced code prompting, a neural-symbolic prompting method that uses code as intermediate steps to improve LLMs' reasoning ability. See the preprint at Code Prompting: a Neural Symbolic Method for Complex Reasoning in Large Language Models.

    5/24/2023: We systematically tested LLMs' inductive, deductive, and abductive reasoning abilities and found that their performance drops substantially when natural language is replaced with symbols. See the preprint at Large Language Models are In-Context Semantic Reasoners rather than Symbolic Reasoners.

    5/8/2023: We constructed BREC, a new comprehensive dataset for evaluating GNN expressiveness. Our manuscript is available at https://arxiv.org/pdf/2304.07702.pdf, and the dataset and evaluation code are at https://github.com/GraphPKU/BREC. We welcome suggestions, contributions, or collaborations to improve BREC.

    4/25/2023: One paper accepted at ICML-23! From Relational Pooling to Subgraph GNNs. Congrats to Cai and Xiyuan!

    4/20/2023: We provided a complete theory of using graph neural networks (GNNs) for multi-node representation learning with labeling tricks, extending our original labeling trick paper. Our manuscript is available at https://arxiv.org/pdf/2304.10074.pdf.

    4/18/2023: We proposed Neural Common Neighbor (NCN) for link prediction. The preprint can be found at Neural Common Neighbor with Completion for Link Prediction.

    1/21/2023: Two papers accepted at ICLR-23! I2-GNNs and Circuit Graph Neural Network. I2-GNNs is the first linear-time GNN model to count 6-cycles. Congrats to Yinan and Zehao!

    11/24/2022: Two papers accepted at LoG-22! Graph Neural Network with Local Frame and Subgraph-aware Weisfeiler-Lehman. Congrats to Xiyuan and Zhaohui!

    11/18/2022: Invited by BioMap to give a talk on 3DLinker!

    9/15/2022: Three papers accepted at NeurIPS-22! Rethinking KG evaluation, K-hop GNN, and Geodesic GNN. Rethinking KG evaluation is accepted as an oral presentation! Congrats to Haotong, Jiarui and Lecheng!

    7/26/2022: Gave a talk on "How Powerful are Spectral Graph Neural Networks" at "AI + Math" Colloquia of Shanghai Jiao Tong University! Video link (in English).

    7/9/2022: Gave a talk on "How Powerful are Spectral Graph Neural Networks" at LOGS China! Video link at Bilibili (in Chinese).

    5/16/2022: Three papers accepted at ICML-22! 3DLinker, JacobiConv, and PACE. 3DLinker is accepted as a long presentation (118/5630)! Congrats to Yinan, Xiyuan and Zehao!

    4/27/2022: Invited by Twitter (London and New York teams) to give a talk on Labeling Trick!

    4/15/2022: I will serve as an area chair for NeurIPS-22.

    4/9/2022: Invited by AI Time to give a talk on Nested GNNs and ShaDow in the AI2000 Young Scientist Track!

    3/6/2022: Named to the 2022 AI 2000 Most Influential Scholars list by AMiner, ranked 66th globally in the AAAI/IJCAI category.

    2/17/2022: Invited by AI Time to give a talk on Labeling Trick!

    1/24/2022: Two papers accepted at ICLR-22! Subgraph representation learning GNN and positional encoding for more powerful GNNs!

    1/10/2022: Our book Graph Neural Networks: Foundations, Frontiers, and Applications is published! A free version is available on the GNN Book Website. Happy to contribute a chapter on GNNs for link prediction!

    12/22/2021: I will serve as a meta-reviewer (area chair) for ICML-22.

    12/19/2021: Glad to organize a graph workshop at ICLR-22: PAIR2Struct: Privacy, Accountability, Interpretability, Robustness, Reasoning on Structured Data. Welcome to contribute!

    9/29/2021: Three papers accepted at NeurIPS! Labeling Trick, Nested GNN and ShaDow GNN!

    9/16/2021: Gave a presentation on Labeling Trick at 4Paradigm!

    8/19/2021: Gave a presentation on Labeling Trick at Renmin University of China!

    3/13/2021: I will join Peking University as an assistant professor starting May 2021. You are welcome to apply to PKU!


    Publications

      2024

    • [WWW-24] H. Liu, J. Feng, L. Kong, D. Tao, Y. Chen, M. Zhang, Graph Contrastive Learning Meets Graph Meta Learning: A Unified Method for Few-shot Node Tasks, The Web Conference (WWW-24), 2024. (PDF) (Source code)
    • [ICLR-24] H. Liu, J. Feng, L. Kong, N. Liang, D. Tao, Y. Chen, M. Zhang, One For All: Towards Training One Graph Model For All Classification Tasks, International Conference on Learning Representations (ICLR-24), spotlight presentation, 2024. (PDF) (Source code)
    • [ICLR-24] X. Wang, H. Yang, M. Zhang, Neural Common Neighbor with Completion for Link Prediction, International Conference on Learning Representations (ICLR-24), 2024. (PDF) (Source code)
    • [ICLR-24] Y. Huang, W. Lu, J. Robinson, Y. Yang, M. Zhang, S. Jegelka, P. Li, On the Stability of Expressive Positional Encodings for Graph Neural Networks, International Conference on Learning Representations (ICLR-24), 2024. (PDF) (Source code)
    • [ICLR-24] Z. Dong, M. Zhang, P. Payne, M. Province, C. Cruchaga, T. Zhao, F. Li, Y. Chen, Rethinking the Power of Graph Canonization in Graph Representation Learning with Stability, International Conference on Learning Representations (ICLR-24), 2024. (PDF)
    • [ICLR-24] L. Yang, Y. Tian, M. Xu, Z. Liu, S. Hong, W. Qu, W. Zhang, B. Cui, M. Zhang#, J. Leskovec, VQGraph: Rethinking Graph Representation Space for Bridging GNNs and MLPs, International Conference on Learning Representations (ICLR-24), 2024. (#corresponding author) (PDF) (Source code)
      2023

    • [NeurIPS-23] J. Zhou, J. Feng, X. Wang and M. Zhang, Distance-Restricted Folklore Weisfeiler-Lehman GNNs with Provable Cycle Counting Power, Advances in Neural Information Processing Systems (NeurIPS-23), spotlight presentation, 2023. (PDF)
    • [NeurIPS-23] Z. Li, X. Wang, Y. Huang and M. Zhang, Is Distance Matrix Enough for Geometric Deep Learning?, Advances in Neural Information Processing Systems (NeurIPS-23), 2023. (PDF)
    • [NeurIPS-23] L. Kong, J. Feng, H. Liu, D. Tao, Y. Chen and M. Zhang, MAG-GNN: Reinforcement Learning Boosted Graph Neural Network, Advances in Neural Information Processing Systems (NeurIPS-23), 2023. (PDF)
    • [NeurIPS-23] J. Feng, L. Kong, H. Liu, D. Tao, F. Li, M. Zhang and Y. Chen, Extending the Design Space of Graph Neural Networks by Rethinking Folklore Weisfeiler-Lehman, Advances in Neural Information Processing Systems (NeurIPS-23), 2023. (PDF)
    • [NeurIPS-23] C. Zhou, X. Wang and M. Zhang, Facilitating Graph Neural Networks with Random Walk on Simplicial Complexes, Advances in Neural Information Processing Systems (NeurIPS-23), 2023. (PDF)
    • [VLDB-23] H. Yin, M. Zhang, J. Wang and P. Li, SUREL+: Moving from Walks to Sets for Scalable Subgraph-based Graph Representation Learning, Proc. of the VLDB Endowment (VLDB-23), volume 16, 2023. (PDF) (Source code)
    • [ICML-23] C. Zhou, X. Wang and M. Zhang, From Relational Pooling to Subgraph GNNs: A Universal Framework for More Expressive Graph Neural Networks, Proc. International Conference on Machine Learning (ICML-23), 2023. (PDF)
    • [ICLR-23] Y. Huang, X. Peng, J. Ma, M. Zhang, Boosting the Cycle Counting Power of Graph Neural Networks with I2-GNNs, International Conference on Learning Representations (ICLR-23), 2023. (PDF) (Source code)
    • [ICLR-23] Z. Dong, W. Cao, M. Zhang, D. Tao, Y. Chen, X. Zhang, CktGNN: Circuit Graph Neural Network for Electronic Design Automation, International Conference on Learning Representations (ICLR-23), 2023. (PDF)
    • [ASP-DAC-23] Y. Chen, J. Mai, X. Gao, M. Zhang and Y. Lin, MacroRank: Ranking Macro Placement Solutions Leveraging Translation Equivariancy, 28th Asia and South Pacific Design Automation Conference (ASP-DAC-23), 2023. (PDF)
      2022

    • [LoG-22] X. Wang and M. Zhang, Graph Neural Network with Local Frame for Molecular Potential Energy Surface, The First Learning on Graphs Conference (LoG-22), 2022. (PDF)(Source code)
    • [LoG-22] Z. Wang, Q. Cao, H. Shen#, B. Xu, M. Zhang# and X. Cheng, Towards Efficient and Expressive GNNs for Graph Classification via Subgraph-aware Weisfeiler-Lehman, The First Learning on Graphs Conference (LoG-22), 2022. (#corresponding author)(PDF)
    • [NeurIPS-22] H. Yang, Z. Lin and M. Zhang, Rethinking Knowledge Graph Evaluation Under the Open-World Assumption, Advances in Neural Information Processing Systems (NeurIPS-22), oral presentation (1.7% acceptance rate), 2022. (PDF)(Source code)
    • [NeurIPS-22] J. Feng, Y. Chen, F. Li, A. Sarkar and M. Zhang, How Powerful are K-hop Message Passing Graph Neural Networks, Advances in Neural Information Processing Systems (NeurIPS-22), 2022. (PDF)(Source code)
    • [NeurIPS-22] L. Kong, Y. Chen, M. Zhang, Geodesic Graph Neural Network for Efficient Graph Representation Learning, Advances in Neural Information Processing Systems (NeurIPS-22), 2022. (PDF)(Source code)
    • [ICML-22-DyNN] Y. Yang, Y. Liang and M. Zhang, PA-GNN: Parameter-Adaptive Graph Neural Networks, Workshop on Dynamic Neural Networks in International Conference on Machine Learning (ICML-22-DyNN-Workshop), oral presentation, 2022. (PDF)
    • [VLDB-22] H. Yin, M. Zhang, Y. Wang, J. Wang and P. Li, Algorithm and System Co-design for Efficient Subgraph-based Graph Representation Learning, Proc. of the VLDB Endowment (VLDB-22), volume 15, 2022. (PDF)(Source code)
    • [ICML-22] Y. Huang, X. Peng, J. Ma, and M. Zhang, 3DLinker: An E(3) Equivariant Variational Autoencoder for Molecular Linker Design, Proc. International Conference on Machine Learning (ICML-22), long presentation, 2022. (Only 118 out of 5630 submissions are accepted as long presentations) (PDF)(Source code)
    • [ICML-22] X. Wang and M. Zhang, How Powerful are Spectral Graph Neural Networks, Proc. International Conference on Machine Learning (ICML-22), 2022. (PDF)(Source code)
    • [ICML-22] Z. Dong*, M. Zhang*, F. Li, and Y. Chen, PACE: A Parallelizable Computation Encoder for Directed Acyclic Graphs, Proc. International Conference on Machine Learning (ICML-22), 2022. (*co-first author) (PDF)(Source code)
    • [ICLR-22] X. Wang and M. Zhang, GLASS: GNN with Labeling Tricks for Subgraph Representation Learning, International Conference on Learning Representations (ICLR-22), 2022. (PDF)(Source code)
    • [ICLR-22] H. Wang, H. Yin, M. Zhang, and P. Li, Equivariant and Stable Positional Encoding for More Powerful Graph Neural Networks, International Conference on Learning Representations (ICLR-22), 2022. (PDF)(Source code)
      2021

    • [MIA-21] X. Li, Y. Zhou, N. Dvornek, M. Zhang, S. Gao, J. Zhuang, D. Scheinost, L.H. Staib, P. Ventola and J.S. Duncan, BrainGNN: Interpretable Brain Graph Neural Network for fMRI Analysis, Medical Image Analysis (MIA-21), 2021. (PDF)
    • [NeurIPS-21] M. Zhang, P. Li, Y. Xia, K. Wang, and L. Jin, Labeling Trick: A Theory of Using Graph Neural Networks for Multi-Node Representation Learning, Advances in Neural Information Processing Systems (NeurIPS-21), 2021. (PDF)(Source code)
    • [NeurIPS-21] M. Zhang and P. Li, Nested Graph Neural Networks, Advances in Neural Information Processing Systems (NeurIPS-21), 2021. (PDF) (Source code)
    • [NeurIPS-21] H. Zeng, M. Zhang#, Y. Xia, A. Srivastava, A. Malevich, R. Kannan, V. Prasanna, L. Jin, and R. Chen, Decoupling the Depth and Scope of Graph Neural Networks, Advances in Neural Information Processing Systems (NeurIPS-21), 2021. (#corresponding author) (PDF)(Source code)
      2020

    • [ICLR-20] M. Zhang and Y. Chen, Inductive Matrix Completion Based on Graph Neural Networks, International Conference on Learning Representations (ICLR-20), spotlight presentation, 2020. (PDF)(Source code)
    • [KDD-20] M. Zhang, C. King, M. Avidan, and Y. Chen, Hierarchical Attention Propagation for Healthcare Representation Learning, Proc. ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD-20), 2020. (PDF)(Source code)
    • [MICCAI-20] X. Li, Y. Zhou, N. Dvornek, M. Zhang, J. Zhuang, P. Ventola, and J. Duncan, Pooling Regularized Graph Neural Network for fMRI Biomarker Analysis, Proc. International Conference on Medical Image Computing and Computer-Assisted Intervention. (MICCAI-20), 2020. (PDF)
      2019

    • [NeurIPS-19] M. Zhang, S. Jiang, Z. Cui, R. Garnett, and Y. Chen, D-VAE: A Variational Autoencoder for Directed Acyclic Graphs, Advances in Neural Information Processing Systems (NeurIPS-19), 2019. (PDF)(Source code)
    • [BJA-19] B. Fritz, Z. Cui, M. Zhang, Y. He, Y. Chen, A. Kronzer, A. Ben Abdallah, C. King, M. Avidan, A Deep Learning Model for Predicting 30-day Postoperative Mortality, British Journal of Anaesthesia (BJA-19), 2019. (PDF)
      2018

    • [NeurIPS-18] M. Zhang and Y. Chen, Link Prediction Based on Graph Neural Networks, Advances in Neural Information Processing Systems (NeurIPS-18), spotlight presentation, 2018. (Only 168 out of 4856 submissions are accepted as spotlight presentations) (PDF)(Source code)(Website)(Video)
    • [AAAI-18] M. Zhang, Z. Cui, M. Neumann, and Y. Chen, An End-to-End Deep Learning Architecture for Graph Classification, Proc. AAAI Conference on Artificial Intelligence (AAAI-18), 2018. (PDF)(Supplement)(Source code)
    • [AAAI-18] M. Zhang, Z. Cui, S. Jiang, and Y. Chen, Beyond Link Prediction: Predicting Hyperlinks in Adjacency Space, Proc. AAAI Conference on Artificial Intelligence (AAAI-18), 2018. (PDF)(Source code)
    • [ICBK-18] Z. Cui, M. Zhang, and Y. Chen, Deep Embedding Logistic Regression, Proc. IEEE International Conference on Big Knowledge (ICBK-18), 2018. (PDF)
      2017

    • [KDD-17] M. Zhang and Y. Chen, Weisfeiler-Lehman Neural Machine for Link Prediction, Proc. ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD-17), oral presentation, 2017. (Only 64 out of 748 submissions are accepted as oral presentations) (PDF)(Video)(Slides)(Source code)
    • [Bioinformatics-17] T. Oyetunde*, M. Zhang*, Y. Chen, Y. Tang, and C. Lo, BoostGAPFILL: Improving the fidelity of metabolic network reconstructions through integrated constraint and pattern-based methods, Bioinformatics, 33(4):608-611, 2017. (*co-first author) (PDF)
      2016

    • [BMC Bioinformatics-16] L. He, S. Wu, M. Zhang, Y. Chen, and Y. Tang, WUFlux: an open-source platform for 13C metabolic flux analysis of bacterial metabolism, BMC Bioinformatics, 17.1 (2016): 444. (PDF)
    • [TNNLS-16] W. Cai, M. Zhang, and Y. Zhang, Batch Mode Active Learning for Regression With Expected Model Change, IEEE Transactions on Neural Networks and Learning Systems (TNNLS), (2016): 1-14. (PDF)
      2015

    • [IRJ-15] W. Cai, M. Zhang, and Y. Zhang, Active learning for ranking with sample density, Information Retrieval Journal, 18.2 (2015): 123-144. (PDF)
    • [TWEB-15] W. Cai, M. Zhang, and Y. Zhang, Active learning for Web search ranking via noise injection, ACM Transactions on the Web (TWEB), 9.1 (2015): 3. (PDF)

    Group

    Haotong Yang, PhD Student, PKU 2021 (co-advised with Zhouchen Lin)

    Xiyuan Wang, PhD Student, PKU 2022

    Fanxu Meng, PhD Student, PKU 2022

    Xiaojuan Tang, PhD Student, PKU 2022 (co-advised with Song-Chun Zhu)

    Zian Li, PhD Student, PKU 2023

    Yanbo Wang, PhD Student, PKU 2023

    Lecheng Kong, PhD Student, WashU

    Jiarui Feng, PhD Student, WashU

    Zehao Dong, PhD Student, WashU

    Hao Liu, PhD Student, WashU

    Xingang Peng, PhD Student, PKU

    Zuoyu Yan, PhD Student, PKU

    Junru Zhou, Undergraduate, PKU

    Yuran Xiang, Undergraduate, PKU

    Cai Zhou, Undergraduate, THU

    Yi Hu, Undergraduate, PKU

    Alumni

    Yinan Huang, Research Intern, BIGAI

    Yang Hu, Research Intern, PKU

    Yuxin Yang, Research Intern, BIGAI

    Zhaohui Wang, PhD Student, ICT-CAS


    Software

    IGMC (Inductive Graph-based Matrix Completion)

    Code for paper "Inductive Matrix Completion Based on Graph Neural Networks"

    D-VAE (DAG Variational Autoencoder)

    Code for paper "D-VAE: A Variational Autoencoder for Directed Acyclic Graphs" on NeurIPS 2019

    SEAL (learning from Subgraphs, Embeddings, and Attributes for Link prediction)

    Code for paper "Link Prediction Based on Graph Neural Networks" on NeurIPS 2018

    DGCNN (Deep-Graph-CNN)

    Code for paper "An End-to-End Deep Learning Architecture for Graph Classification" on AAAI 2018

    Hyperlink Prediction Toolbox

    Code for paper "Beyond Link Prediction: Predicting Hyperlinks in Adjacency Space" on AAAI 2018

    WLNM (Weisfeiler-Lehman Neural Machine)

    Code for paper "Weisfeiler-Lehman Neural Machine for Link Prediction" on KDD 2017