King Abdullah University of Science and Technology
Scientific Computing and Machine Learning (SCML)


Structure-conforming Operator Learning via Transformers

Prof. Shuhao Cao, University of Missouri–Kansas City

Apr 22, 16:00 - 17:00

KAUST


Abstract: GPT, Stable Diffusion, AlphaFold 2, and other state-of-the-art deep learning models are all built on a neural architecture called the "Transformer". Since the publication of "Attention Is All You Need", the Transformer has become the ubiquitous architecture in deep learning. At the Transformer's heart and soul is the "attention mechanism". In this talk, we shall give a specific example of the following research program: whether and how can one benefit from the theoretical structure of a mathematical problem to develop task-oriented and structure-conforming deep neural networks? An attention-based deep direct
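For readers unfamiliar with the attention mechanism the abstract refers to, the following is a minimal, illustrative sketch of scaled dot-product attention, the core operation of the Transformer from "Attention Is All You Need". The shapes and variable names are assumptions chosen for clarity and are not taken from the talk.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Q, K, V: (sequence_length, d) arrays of queries, keys, and values.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # pairwise query-key similarities
    weights = softmax(scores, axis=-1)  # each row is a probability distribution
    return weights @ V                  # convex combination of the value vectors

# Illustrative usage with random inputs (sequence length 4, feature dim 8).
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Each output row is a weighted average of the value rows, with weights determined by how strongly the corresponding query matches each key; full Transformer blocks add learned projections, multiple heads, and residual connections on top of this primitive.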
