This article discusses the use of residual neural networks (ResNets) to approximate solutions of Kolmogorov partial differential equations (PDEs). It is shown that ResNets can approximate solutions of such PDEs without suffering from the curse of dimensionality, meaning that the number of parameters of the approximating ResNets grows at most polynomially in both the reciprocal of the approximation accuracy and the dimension of the considered PDE. The article adapts to ResNets a proof by Jentzen et al., who showed a similar result for feedforward neural networks (FNNs).
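To make the architectural difference from FNNs concrete, here is a minimal sketch of the residual block that defines a ResNet: the output is the input plus a learned perturbation (a skip connection). This is an illustrative toy in NumPy, not the construction used in the article; all names and the single-layer ReLU form are assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W, b):
    # Identity skip connection plus a learned term: the defining
    # feature distinguishing a ResNet from a plain feedforward net.
    return x + relu(W @ x + b)

# Toy usage: one block acting on a 4-dimensional input.
rng = np.random.default_rng(0)
d = 4
x = rng.standard_normal(d)
W = 0.1 * rng.standard_normal((d, d))
b = np.zeros(d)
y = residual_block(x, W, b)
```

With zero weights the block reduces to the identity map, which is why stacking many such blocks remains trainable in practice.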
