Speaker: Zhiqin Xu, Associate Professor, Shanghai Jiao Tong University
Time: 3:00 PM, December 22nd, 2020
Place: Room 301, No. 3 Building
Sponsor: SHNU College of Mathematics and Mechanical Engineering
We demonstrate a universal Frequency Principle (F-Principle) --- DNNs often fit target functions from low to high frequencies --- on high-dimensional benchmark data sets and deep neural networks. We use the F-Principle to understand how DNNs differ from traditional methods. We then propose novel multi-scale DNNs (MscaleDNN) based on the idea of radial scaling in the frequency domain together with activation functions of compact support. The radial scaling converts the problem of approximating the high-frequency content of a PDE's solution into one of lower frequency, and the compact-support activation functions facilitate the separation of scales, each approximated by a corresponding DNN. As a result, MscaleDNNs achieve fast uniform convergence over multiple scales. The proposed MscaleDNNs are shown to be superior to traditional fully connected DNNs and can serve as an effective mesh-less numerical method for elliptic PDEs.
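To make the MscaleDNN construction concrete, here is a minimal NumPy sketch of the forward pass: the input is radially scaled by several factors, each scaled copy is fed to its own small subnet with a compact-support activation, and the subnet outputs are summed. The scale factors, subnet sizes, and the sReLU activation used below are illustrative assumptions, not the exact configuration from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def srelu(x):
    # sReLU: a compact-support activation, nonzero only on (0, 1).
    # Compact support helps separate frequency scales between subnets.
    return np.maximum(x, 0.0) * np.maximum(1.0 - x, 0.0)

def init_subnet(hidden=16):
    # One small fully connected subnet: 1 -> hidden -> 1 (sizes assumed)
    return (rng.normal(size=(1, hidden)), rng.normal(size=hidden),
            rng.normal(size=(hidden, 1)), rng.normal(size=1))

def subnet_forward(x, params):
    W1, b1, W2, b2 = params
    h = srelu(x @ W1 + b1)
    return h @ W2 + b2

# Radial scaling factors (assumed); scale a*x shifts frequency content
# of the target down by a factor of a for the subnet handling that scale.
scales = [1.0, 2.0, 4.0, 8.0]
subnets = [init_subnet() for _ in scales]

def mscale_forward(x):
    # Sum the contributions of all scaled subnets.
    return sum(subnet_forward(a * x, p) for a, p in zip(scales, subnets))

x = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
y = mscale_forward(x)
print(y.shape)  # one scalar output per input point
```

In practice the parameters would be trained against a PDE residual loss (as in physics-informed approaches); the sketch only shows the multi-scale architecture itself.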