Important Information

We will livestream our workshop at 7:10 am PDT on Sunday, April 26th, 2020. The livestream of our workshop can be found below or on our ICLR workshop page:
https://iclr.cc/virtual/workshops_5.html

Rocket Chat channel for discussion: #workshop_deepdiffeq
  • Detailed Schedule
  • Accepted Papers
  • Questions for Panelists on how researchers from the applied math and machine learning communities can join forces to solve challenging problems in both fields

Zoom links to the poster Q&A sessions are provided next to the papers listed in the Accepted Papers section below; you can also find them in our Detailed Schedule. All Q&A sessions run from 6:35 pm to 8:35 pm PDT, except for the following posters, whose Q&A sessions will be held during the lunch break, from 12:35 pm to 2:00 pm PDT. The password for all poster Q&A sessions is Diffeq20.

Posters whose Q&A sessions are during the lunch break (12:35 pm – 2:00 pm PDT):

11. Differential Equations as Model Prior for Deep Learning and Applications to Robotics (Zoom Link to Q&A Poster Session)

13. Comparing recurrent and convolutional neural networks for predicting wave propagation (Zoom Link to Q&A Poster Session)

15. Learning To Solve Differential Equations Across Initial Conditions (Zoom Link to Q&A Poster Session)

17. Learning-Based Strong Solutions to Forward and Inverse Problems in PDEs (Zoom Link to Q&A Poster Session)

20. Can auto-encoders help with filling missing data? Marek Śmieja, Maciej Kołomycki, Łukasz Struski, Mateusz Juda, Mário A. T. Figueiredo (Zoom Link to Q&A Poster Session).

21. Neural Differential Equations for Single Image Super-Resolution (Zoom Link to Q&A Poster Session)

26. Bringing PDEs to JAX with forward and reverse modes automatic differentiation (Zoom Link to Q&A Poster Session)

27. Urban air pollution forecasts generated from latent space representation (Zoom Link to Q&A Poster Session)

32. Generative ODE Modeling with Known Unknowns (Zoom Link to Q&A Poster Session)

Instructions for the live Zoom discussion:
– When joining the live discussion, please mute the player where you are watching the livestream; otherwise it will feed back into Zoom.
– You will be muted on entry. To ask a question, use the Raise Hand feature and the moderator will unmute you.
– We recommend using headphones for the live discussion.


Workshop Abstract

Differential equations form the bedrock of scientific computing, while neural networks have emerged as the preferred tool of modern machine learning. These two methods are not only closely related to each other but also offer complementary strengths: the modelling power and interpretability of differential equations, and the approximation and generalization power of deep neural networks.

While progress has been made on combining differential equations and deep neural networks, most existing work has been disjointed, and a coherent picture has yet to emerge. As a result, the theoretical foundation for integrating deep neural networks and differential equations remains poorly understood, with many more questions than answers. For example: How can we incorporate a given ordinary/partial differential equation (ODE/PDE) into the architecture of a deep neural network? Under what assumptions can we approximate a system of ODEs/PDEs by deep neural networks, and how good are these approximations? How can we interpret deep neural networks from the perspective of ODEs/PDEs? How can the well-developed mathematical tools for ODEs/PDEs be leveraged to help us gain a better understanding of deep neural networks and improve their performance? Substantive progress will require a principled approach that integrates ideas from disparate fields, including differential equations, machine learning, numerical analysis, optimization, optimal transport, computer graphics, and physics.
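One concrete instance of the first question above is the well-known correspondence between residual networks and ODEs: a residual block x_{k+1} = x_k + h·f(x_k) is exactly one forward-Euler step of the continuous dynamics dx/dt = f(x). The sketch below illustrates this view with a toy fixed "layer" f; the function names and the linear-plus-tanh dynamics are illustrative choices, not taken from any workshop paper:

```python
import numpy as np

def forward_euler(f, x0, t0, t1, steps):
    """Integrate dx/dt = f(x) with forward Euler.

    Each update x <- x + h*f(x) has the same form as a ResNet
    residual block, so `steps` plays the role of network depth.
    """
    h = (t1 - t0) / steps
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + h * f(x)
    return x

# Toy "network layer" dynamics: a fixed linear map with a tanh nonlinearity.
W = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # rotation-like weight matrix
f = lambda x: np.tanh(W @ x)

# Depth-100 "network" = 100 Euler steps of the continuous dynamics.
x_final = forward_euler(f, x0=[1.0, 0.0], t0=0.0, t1=1.0, steps=100)
```

Refining the step size (increasing `steps`) changes the discrete network's output only up to the integrator's truncation error, which is one way to make the "deep network as discretized ODE" viewpoint quantitative.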

The goal of this workshop is to provide a forum where theoretical and experimental researchers of all stripes can come together not only to share reports on their progress but also to find new ways to join forces towards the goal of coherent integration of deep neural networks and differential equations. Topics to be discussed include, but are not limited to:

  • Deep learning for high dimensional PDE problems
  • PDE and stochastic analysis for deep learning
  • PDE and analysis for new architectures; stable architecture design using numerical stability approaches
  • Inverse problems approaches to learning theory; regularization of the loss in deep learning, convergence in the data sampling limit
  • PDEs on graphs
  • Physics-inspired neural networks
  • Numerical tools and libraries for interfacing deep learning models and ODE/PDE solvers
  • Deep learning for computer graphics
  • Optimal transport for deep generative models
  • Applications of deep learning + differential equations in scientific problems

Confirmed Speakers

       
  • Tom Goldstein, University of Maryland (Plenary Talk)
  • Claire Monteleoni, University of Colorado Boulder (Plenary Talk)
  • Xavier Bresson, Nanyang Technological University (Plenary Talk)
  • Matthew Thorpe, University of Manchester (Invited Talk)
  • Ricky T. Q. Chen, University of Toronto (Invited Talk)
  • Gavin Portwood, Los Alamos National Laboratory (Invited Talk)

Schedule

The detailed schedule of our workshop can be found here. All times are in PDT (Los Angeles local time) on April 26th, 2020.

7:00 am – 7:10 am    Welcome and Opening Remarks (Richard Baraniuk)
7:10 am – 7:40 am    Invited Talk 1: Image Processing, Differential Equations, and Graph Neural Nets (Xavier Bresson)
7:40 am – 8:00 am    Contributed Talk 1: Solving ODE with Universal Flows: Approximation Theory for Flow-Based Models (Chin-Wei Huang, Laurent Dinh, Aaron Courville)
8:00 am – 8:30 am    Invited Talk 2: Correcting the Bias in Laplacian Learning at Low Label Rates (Matthew Thorpe)
8:30 am – 8:50 am    Contributed Talk 2: Neural Operator: Graph Kernel Network for Partial Differential Equations (Anima Anandkumar, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Nikola Kovachki, Zongyi Li, Burigede Liu, Andrew Stuart)
8:50 am – 9:00 am    Coffee Break
9:00 am – 9:50 am    Plenary Talk 1: Generalization in neural nets: a perspective from science (not math) (Tom Goldstein)
9:50 am – 10:10 am   Contributed Talk 3: A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth (Yiping Lu, Chao Ma, Yulong Lu, Jianfeng Lu, Lexing Ying)
10:10 am – 11:35 am  Poster Lightning Talks 1
11:35 am – 12:35 pm  Live Panel Discussion (Claire Monteleoni, Tom Goldstein, Matthew Thorpe, Gavin Portwood, Bao Wang)
12:35 pm – 2:00 pm   Lunch Break
2:00 pm – 2:50 pm    Plenary Talk 2: AI meets Dynamical Systems for the study of Climate Change (Claire Monteleoni)
2:50 pm – 3:10 pm    Contributed Talk 4: A Free-Energy Principle for Representation Learning (Pratik Chaudhari, Yansong Gao)
3:10 pm – 3:40 pm    Invited Talk 3: Subtleties of Neural ODEs: Learning with Constraints (Ricky T. Q. Chen)
3:40 pm – 3:50 pm    Coffee Break
3:50 pm – 4:20 pm    Invited Talk 4: TBA (Gavin Portwood)
4:20 pm – 4:40 pm    Contributed Talk 5: Amortized Finite Element Analysis for Fast PDE-Constrained Optimization (Tianju Xue, Alex Beatson, Sigrid Adriaenssens, Ryan P. Adams)
4:40 pm – 5:00 pm    Contributed Talk 6: Nano-Material Configuration Design with Deep Surrogate Langevin Dynamics (Thanh V. Nguyen, Youssef Mroueh, Samuel Hoffman, Payel Das, Pierre Dognin, Giuseppe Romano, Chinmay Hegde)
5:00 pm – 6:25 pm    Poster Lightning Talks 2
6:25 pm – 6:35 pm    Closing Remarks
6:35 pm – 8:35 pm    Live Poster Q&A Session

Accepted Papers

All accepted papers are posted on our OpenReview site.

Contributed Talks
1. Solving ODE with Universal Flows: Approximation Theory for Flow-Based Models. Chin-Wei Huang, Laurent Dinh, Aaron Courville (Zoom Link to Q&A Poster Session).
2. Neural Operator: Graph Kernel Network for Partial Differential Equations. Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar (Zoom Link to Q&A Poster Session).
3. A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth. Yiping Lu, Chao Ma, Yulong Lu, Jianfeng Lu, Lexing Ying (Zoom Link to Q&A Poster Session).
4. A Free-Energy Principle for Representation Learning. Pratik Chaudhari, Yansong Gao (Zoom Link to Q&A Poster Session).
5. Amortized Finite Element Analysis for Fast PDE-Constrained Optimization. Tianju Xue, Alex Beatson, Sigrid Adriaenssens, Ryan P. Adams (Zoom Link to Q&A Poster Session).
6. Nano-Material Configuration Design with Deep Surrogate Langevin Dynamics. Thanh V. Nguyen, Youssef Mroueh, Samuel Hoffman, Payel Das, Pierre Dognin, Giuseppe Romano, Chinmay Hegde (Zoom Link to Q&A Poster Session).

Poster Lightning Talks 1
7. Nonlinear Differential Equations with External Forcing. Paul Pukite (Zoom Link to Q&A Poster Session).
8. On the space-time expressivity of ResNets. Johannes Christoph Müller (Zoom Link to Q&A Poster Session).
9. Enforcing Physical Constraints in CNNs through Differentiable PDE Layer. Chiyu “Max” Jiang, Karthik Kashinath, Prabhat, Philip Marcus (Zoom Link to Q&A Poster Session).
10. Deep Ritz revisited. Johannes Müller, Marius Zeinhofer (Zoom Link to Q&A Poster Session).
11. Differential Equations as Model Prior for Deep Learning and Applications to Robotics. Michael Lutter, Jan Peters (Zoom Link to Q&A Poster Session).
12. Differentiable Physics Simulation. Junbang Liang, Ming C. Lin (Zoom Link to Q&A Poster Session).
13. Comparing recurrent and convolutional neural networks for predicting wave propagation. Stathi Fotiadis, Eduardo Pignatelli, Mario Lino Valencia, Chris Cantwell, Amos Storkey, Anil A. Bharath (Zoom Link to Q&A Poster Session).
14. Time Dependence in Non-Autonomous Neural ODEs. Jared Quincy Davis, Krzysztof Choromanski, Vikas Sindhwani, Jake Varley, Honglak Lee, Jean-Jacques Slotine, Valerii Likhosterov, Adrian Weller, Ameesh Makadia (Zoom Link to Q&A Poster Session).
15. Learning To Solve Differential Equations Across Initial Conditions. Shehryar Malik, Usman Anwar, Ali Ahmed, Alireza Aghasi (Zoom Link to Q&A Poster Session).
16. How Chaotic Are Recurrent Neural Networks? Pourya Vakilipourtakalou, Lili Mou (Zoom Link to Q&A Poster Session).
17. Learning-Based Strong Solutions to Forward and Inverse Problems in PDEs. Leah Bar, Nir Sochen (Zoom Link to Q&A Poster Session).
18. Embedding Hard Physical Constraints in Convolutional Neural Networks for 3D Turbulence. Arvind T. Mohan, Nicholas Lubbers, Daniel Livescu, Michael Chertkov (Zoom Link to Q&A Poster Session).
19. Wavelet-Powered Neural Networks for Turbulence. Arvind T. Mohan, Daniel Livescu, Michael Chertkov (Zoom Link to Q&A Poster Session).
20. Can auto-encoders help with filling missing data? Marek Śmieja, Maciej Kołomycki, Łukasz Struski, Mateusz Juda, Mário A. T. Figueiredo (Zoom Link to Q&A Poster Session).
21. Neural Differential Equations for Single Image Super-Resolution. Teven Le Scao (Zoom Link to Q&A Poster Session).
22. Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View. Yiping Lu, Zhuohan Li, Di He, Zhiqing Sun, Bin Dong, Tao Qin, Liwei Wang, Tie-yan Liu (Zoom Link to Q&A Poster Session).
23. Neural Dynamical Systems. Viraj Mehta, Ian Char, Willie Neiswanger, Youngseog Chung, Andrew Oakleigh Nelson, Mark D Boyer, Jeff Schneider (Zoom Link to Q&A Poster Session).

Poster Lightning Talks 2
24. Progressive Growing of Neural ODEs. Hammad A. Ayyubi, Yi Yao, Ajay Divakaran (Zoom Link to Q&A Poster Session).
25. Fast Convergence for Langevin with Matrix Manifold Structure. Ankur Moitra, Andrej Risteski (Zoom Link to Q&A Poster Session).
26. Bringing PDEs to JAX with forward and reverse modes automatic differentiation. Ivan Yashchuk (Zoom Link to Q&A Poster Session).
27. Urban air pollution forecasts generated from latent space representation. Cesar Quilodran Casas, Rossella Arcucci, Yike Guo (Zoom Link to Q&A Poster Session).
28. Dissipative SymODEN: Encoding Hamiltonian Dynamics with Dissipation and Control into Deep Learning. Yaofeng Desmond Zhong, Biswadip Dey, Amit Chakraborty (Zoom Link to Q&A Poster Session).
29. Neural Ordinary Differential Equation Value Networks for Parametrized Action Spaces. Michael Poli, Stefano Massaroli, Sanzhar Bakhtiyarov, Atsushi Yamashita, Hajime Asama, Jinkyoo Park (Zoom Link to Q&A Poster Session).
30. Stochasticity in Neural ODEs: An Empirical Study. Alexandra Volokhova, Viktor Oganesyan, Dmitry Vetrov (Zoom Link to Q&A Poster Session).
31. Generating Control Policies for Autonomous Vehicles Using Neural ODEs. Houston Lucas, Richard Kelley (Zoom Link to Q&A Poster Session).
32. Generative ODE Modeling with Known Unknowns. Ori Linial, Uri Shalit (Zoom Link to Q&A Poster Session).
33. Encoder-decoder neural network for solving the nonlinear Fokker-Planck-Landau collision operator in XGC. Marco Andres Miller, Randy Michael Churchill, Choong-Seock Chang, Robert Hager (Zoom Link to Q&A Poster Session).
34. Differentiable Molecular Simulations for Control and Learning. Wujie Wang, Simon Axelrod, Rafael Gómez-Bombarelli (Zoom Link to Q&A Poster Session).
35. Port-Hamiltonian Gradient Flows. Michael Poli, Stefano Massaroli, Atsushi Yamashita, Hajime Asama, Jinkyoo Park (Zoom Link to Q&A Poster Session).
36. Lagrangian Neural Networks. Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, Shirley Ho (Zoom Link to Q&A Poster Session).
37. Constrained Neural Ordinary Differential Equations with Stability Guarantees. Aaron Tuor, Jan Drgona, Draguna Vrabie (Zoom Link to Q&A Poster Session).
38. Stochastic gradient algorithms from ODE splitting perspective. Daniil Merkulov, Ivan Oseledets (Zoom Link to Q&A Poster Session).
39. The equivalence between Stein variational gradient descent and black-box variational inference. Casey Chu, Kentaro Minami, Kenji Fukumizu (Zoom Link to Q&A Poster Session).
40. Towards Understanding Normalization in Neural Ordinary Differential Equations. Julia Gusak, Larisa Markeeva, Talgat Daulbaev, Alexander Katrutsa, Andrzej Cichocki, Ivan Oseledets (Zoom Link to Q&A Poster Session, Paper).

Call for Papers and Submission Instructions

We invite researchers to submit anonymous extended abstracts of up to 4 pages (including the abstract but excluding references). No specific formatting is required; authors may use the ICLR style file or any other style, as long as it uses a standard font size (11pt) and margins (1in).

Submissions should be anonymous and are handled through the OpenReview system. Please note that at least one coauthor of each accepted paper will be expected to attend the workshop in person to present a poster or give a contributed talk.

Papers can be submitted at the address:

https://openreview.net/group?id=ICLR.cc/2020/Workshop/DeepDiffEq

Important Dates

  • Submission Deadline (EXTENDED): 11:59 pm PST, Tuesday, February 18th
  • Acceptance notification: Tuesday, February 25th
  • Camera ready submission: Sunday, April 19th
  • Workshop: Sunday, April 26th

Organizers

Richard G. Baraniuk (richb@rice.edu)
Stanley Osher (sjo@math.ucla.edu)
Anima Anandkumar (anima@caltech.edu)
Animesh Garg (garg@cs.toronto.edu)
Bao Wang (wangbao@math.ucla.edu)
Tan M. Nguyen (mn15@rice.edu)

Please email iclr2020deepdiffeq@gmail.com with any questions.