Maximilian Beck

Working on efficient Large Language Model architectures.

I am a final-year PhD student at the Institute for Machine Learning at Johannes Kepler University (JKU) Linz, advised by Sepp Hochreiter, and a part-time PhD Researcher at NXAI.

My research focuses on efficient architectures for Large Language Models (LLMs) with sub-quadratic complexity. I’m particularly interested in optimizing training and inference efficiency, understanding scaling laws, and exploring the application of these architectures in domains such as computer vision and robotics.

Currently, I am interning at Meta FAIR, where I work on LLMs for code generation and understanding.

Before starting my PhD, I completed both my Bachelor’s and Master’s studies in Mechatronics and Information Technology at the Karlsruhe Institute of Technology (KIT), with a focus on Control Theory, graduating in 2017 and 2021, respectively.

news

May 12, 2025 I started a summer internship at Meta FAIR in Paris working on code Large Language Models.
May 10, 2025 Our two papers, xLSTM 7B and A Large Recurrent Action Model, have been accepted at ICML 2025!
Apr 28, 2025 Two main and two workshop papers at ICLR 2025.
Jan 22, 2025 Two papers accepted at ICLR 2025!
Our FlashRNN and Vision-LSTM papers will be presented in Singapore.
Sep 25, 2024 🚨 The xLSTM paper has been accepted as a spotlight at NeurIPS 2024! 🎉
I am looking forward to presenting and discussing xLSTM in Vancouver.

selected publications

  1. xLSTM: Extended Long Short-Term Memory
    Maximilian Beck*, Korbinian Pöppel*, Markus Spanring, and 6 more authors
    In Advances in Neural Information Processing Systems (NeurIPS), 2024
  2. Tiled Flash Linear Attention: More Efficient Linear RNN and xLSTM Kernels
    Maximilian Beck, Korbinian Pöppel, Phillip Lippe, and 1 more author
    In arXiv, 2025
  3. xLSTM 7B: A Recurrent LLM for Fast and Efficient Inference
    Maximilian Beck*, Korbinian Pöppel*, Phillip Lippe*, and 5 more authors
    In International Conference on Machine Learning (ICML), 2025
  4. FlashRNN: I/O-Aware Optimization of Traditional RNNs on Modern Hardware
    Korbinian Pöppel, Maximilian Beck, and Sepp Hochreiter
    In The International Conference on Learning Representations (ICLR), 2025
  5. Vision-LSTM: xLSTM as Generic Vision Backbone
    Benedikt Alkin, Maximilian Beck, Korbinian Pöppel, and 2 more authors
    In The International Conference on Learning Representations (ICLR), 2025