Scientific Computing and Machine Learning (SCML)
Structure-conforming Operator Learning via Transformers
Prof. Shuhao Cao, University of Missouri–Kansas City
Apr 22, 16:00 - 17:00
KAUST
Abstract

GPT, Stable Diffusion, AlphaFold 2: all of these state-of-the-art deep learning models use a neural architecture called the "Transformer". Since the publication of "Attention Is All You Need", the Transformer has become the ubiquitous architecture in deep learning. At the Transformer's heart and soul is the "attention mechanism". In this talk, we shall give a specific example of the following research program: whether, and how, one can benefit from the theoretical structure of a mathematical problem to develop task-oriented and structure-conforming deep neural networks. An attention-based deep direct
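For readers unfamiliar with the attention mechanism mentioned in the abstract, below is a minimal NumPy sketch of the standard scaled dot-product attention from "Attention Is All You Need" (Vaswani et al., 2017): Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The matrix shapes and random inputs are illustrative assumptions only, not taken from the talk.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    as defined in Vaswani et al. (2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise query-key similarity, scaled
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted average of value rows

# Toy example: 4 tokens with embedding dimension 8 (sizes chosen arbitrarily)
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)  # shape (4, 8)
```

The softmax weights form a row-stochastic matrix, so each output token is a convex combination of the value vectors; structure-conforming operator-learning approaches exploit and modify exactly this kind of algebraic structure.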