
Maksim Zhdanov

PhD student at AMLab

Hi!

>>> print(self.status)

3rd year PhD student at AMLab

>>> print(self.supervisors)

Max Welling, Jan-Willem van de Meent, and Alfons Hoekstra

>>> print(self.research_interests)

hierarchical models, sub-quadratic architectures, scalable geometric deep learning


selected publications

  1. (Sparse) Attention to the Details: Preserving Spectral Fidelity in ML-based Weather Forecasting Models
    Maksim Zhdanov, Ana Lucic, Max Welling, and 1 more author
    preprint
  2. Erwin: A Tree-based Hierarchical Transformer for Large-scale Physical Systems
    Maksim Zhdanov, Max Welling, and Jan-Willem van de Meent
    ICML 2025
  3. AdS-GNN - a Conformally Equivariant Graph Neural Network
    Maksim Zhdanov, Nabil Iqbal, Erik J Bekkers, and 1 more author
    ICLR 2026
  4. Clifford-Steerable Convolutional Neural Networks
    Maksim Zhdanov, David Ruhe, Maurice Weiler, and 3 more authors
    ICML 2024

news

Jan 2026 [🥳 Paper accepted] AdS-GNN was accepted to ICLR 2026! See you all in Rio 🇧🇷!
Dec 2025 [🚨 New paper] We developed MSPT, a parallelized multi-scale attention method based on hierarchical partitioning of data. It is incredibly fast and achieves SOTA performance on multiple PDE tasks.
Dec 2025 [🥳 Paper accepted] Our submission Adaptive Mesh Quantization for Neural PDE Solvers was accepted to TMLR! We propose a data-driven way, inspired by speculative decoding, of quantizing message-passing neural networks.