Low energy learning from the analysis of learning landscapes in deep and recurrent neural networks

  • Date: 24 March 2025
    Timeframe: 16:00 – 17:00 CET (Geneva)
    Duration: 60 minutes
    We investigate the non-convex loss landscapes of deep and recurrent networks through the lens of statistical physics, uncovering geometric characteristics that inform the design of algorithms relying on fully local learning schemes. Our study examines the coexistence of typical minima, which satisfy the Overlap Gap Property, with rare flat regions that do not. We then discuss the algorithmic implications for the design of energy-preserving local learning in deep architectures.
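
    As a hedged illustration (not code from the speakers), one way to make the distinction between typical sharp minima and rare flat regions concrete is to probe how a loss grows under random weight perturbations around a minimizer: flat regions keep the loss low over a wide neighbourhood, sharp minima do not. The toy loss and function names below are assumptions introduced only for this sketch.

    ```python
    # Minimal sketch: measure the average loss around a minimizer w* when the
    # weights are perturbed by random directions of increasing radius.
    # A "flat" minimum keeps the loss near zero for large radii; a "sharp"
    # (typical) minimum does not. The 2-D toy loss below is purely illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    def loss(w):
        # Non-convex toy loss with one sharp basin (at [1, 0]) and one wide basin (at [-1, 0]).
        sharp = 1.0 - np.exp(-50.0 * np.sum((w - np.array([1.0, 0.0])) ** 2))
        wide = 1.0 - np.exp(-0.5 * np.sum((w + np.array([1.0, 0.0])) ** 2))
        return sharp * wide

    def flatness_profile(w_star, radii, n_samples=200):
        """Average loss over perturbations w* + r * u, with u uniform on the unit sphere."""
        profile = []
        for r in radii:
            u = rng.normal(size=(n_samples, w_star.size))
            u /= np.linalg.norm(u, axis=1, keepdims=True)
            profile.append(np.mean([loss(w_star + r * ui) for ui in u]))
        return np.array(profile)

    radii = np.linspace(0.0, 0.5, 6)
    for name, w_star in [("sharp minimum", np.array([1.0, 0.0])),
                         ("wide minimum", np.array([-1.0, 0.0]))]:
        print(name, np.round(flatness_profile(w_star, radii), 3))
    ```

    Running the sketch shows the perturbed loss rising quickly with the radius around the sharp minimum and staying close to zero around the wide one, which is the kind of geometric signature the landscape analysis in the talk is concerned with.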
