In this paper, we discuss some analytic properties of the hyperbolic tangent function and estimate approximation errors of neural network operators with the hyperbolic tangent activation function. First, a partition-of-unity equation for the hyperbolic tangent function is given. Then, two kinds of quasi-interpolation neural network operators are constructed to approximate univariate and bivariate functions, respectively. The approximation errors are estimated by means of the modulus of continuity of the function. Moreover, for functions with high-order derivatives, the approximation errors of the constructed operators are also estimated.
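The abstract does not reproduce the density function or the operators themselves. As a minimal numerical sketch, assuming the tanh-based density $\Psi(x) = \frac{1}{4}(\tanh(x+1) - \tanh(x-1))$ commonly used for such constructions (the paper's exact density and operator may differ), the partition-of-unity identity $\sum_{k} \Psi(x-k) = 1$ and a univariate quasi-interpolation operator can be checked as follows; the names `psi` and `F_n`, the node range $[-1, 1]$, and the test function $\cos$ are illustrative choices, not taken from the paper.

```python
import numpy as np

def psi(x):
    # Assumed tanh density: psi(x) = (tanh(x+1) - tanh(x-1)) / 4;
    # its integer shifts sum to 1 by a telescoping argument.
    return 0.25 * (np.tanh(x + 1.0) - np.tanh(x - 1.0))

# Numerical check of the partition of unity: sum_k psi(x - k) = 1.
x = np.linspace(-3.0, 3.0, 7)
ks = np.arange(-60, 61)            # truncated sum; the tails decay exponentially
pu = psi(x[:, None] - ks[None, :]).sum(axis=1)
print(np.max(np.abs(pu - 1.0)))    # ~1e-16, up to truncation error

def F_n(f, n, x, a=-1.0, b=1.0):
    """Quasi-interpolation operator F_n(f, x) = sum_{k=na}^{nb} f(k/n) psi(nx - k)."""
    ks = np.arange(np.ceil(n * a), np.floor(n * b) + 1)
    return psi(n * x[:, None] - ks[None, :]) @ f(ks / n)

f = np.cos
x = np.linspace(-0.5, 0.5, 201)
for n in (10, 100, 1000):
    err = np.max(np.abs(F_n(f, n, x) - f(x)))
    print(n, err)  # error shrinks with n, consistent with omega(f, .)-type bounds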
There have been many studies on density theorems for approximation by radial basis feedforward neural networks, and some approximation problems for Gaussian radial basis feedforward neural networks (GRBFNs) in special function spaces have also been investigated. This paper considers approximation by GRBFNs in the space of continuous functions. It is proved that the rate of approximation by GRBFNs with $n^d$ neurons to any continuous function $f$ defined on a compact subset $K \subset \mathbb{R}^d$ can be controlled by $\omega(f, n^{-1/2})$, where $\omega(f, t)$ is the modulus of continuity of the function $f$.
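For reference, in the standard notation (assumed here to match the abstract's $\omega(f,t)$), the modulus of continuity and the rate statement read as follows, where $G_n$ denotes a GRBFN with $n^d$ Gaussian neurons; that the constant $C$ is independent of $n$ is the usual form of such Jackson-type bounds and is an assumption, since the abstract does not state it:

```latex
\omega(f,t) \;=\; \sup_{\substack{x,\,y \in K \\ \|x-y\| \le t}} \bigl| f(x) - f(y) \bigr|,
\qquad
\| f - G_n \|_{C(K)} \;\le\; C\,\omega\!\bigl(f,\, n^{-1/2}\bigr).
```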
This paper investigates some approximation properties and learning rates of the Lipschitz kernel on the sphere. A convergence rate for approximation by shifts of the Lipschitz kernel on the sphere is obtained that is faster than $O(n^{-1/2})$, where $n$ is the number of parameters needed in the approximation. By means of this approximation result, a learning rate of the regularized least squares algorithm with the Lipschitz kernel on the sphere is also deduced.
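The abstract names the regularized least squares algorithm without giving its form. A minimal sketch, assuming the standard kernel ridge regression formulation $f_z = \arg\min_{f \in \mathcal{H}_K} \frac{1}{m}\sum_{i=1}^m (f(x_i) - y_i)^2 + \lambda \|f\|_K^2$ on $S^2$; the specific kernel $\exp(-\|x-y\|)$ (Lipschitz and positive definite, though not necessarily the paper's kernel), the sample size, $\lambda$, and the target function are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def lipschitz_kernel(X, Y):
    # An assumed Lipschitz kernel on the sphere: K(x, y) = exp(-||x - y||).
    # Positive definite on R^3, hence on S^2; the paper's kernel may differ.
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return np.exp(-d)

def sample_sphere(m):
    # Uniform samples on S^2 via normalized Gaussian vectors.
    X = rng.standard_normal((m, 3))
    return X / np.linalg.norm(X, axis=1, keepdims=True)

# Regularized least squares in the RKHS of K: the minimizer's coefficients
# solve the linear system (K + m * lam * I) alpha = y.
m, lam = 200, 1e-3
X = sample_sphere(m)
f_star = lambda x: x[:, 0] * x[:, 1] + x[:, 2]     # illustrative target on S^2
y = f_star(X) + 0.05 * rng.standard_normal(m)      # noisy samples

K = lipschitz_kernel(X, X)
alpha = np.linalg.solve(K + m * lam * np.eye(m), y)

Xtest = sample_sphere(1000)
pred = lipschitz_kernel(Xtest, X) @ alpha
print("test RMSE:", np.sqrt(np.mean((pred - f_star(Xtest)) ** 2)))
```

The learning-rate analysis mentioned in the abstract bounds how fast the error of such an estimator decays as $m$ grows, for a suitably chosen $\lambda = \lambda(m)$.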