-
Radius of information for two intersected centered hyperellipsoids and implications in optimal recovery from inaccurate data J. Complex. (IF 1.7) Pub Date : 2024-03-08 Simon Foucart, Chunyang Liao
For objects belonging to a known model set and observed through a prescribed linear process, we aim at determining methods to recover linear quantities of these objects that are optimal from a worst-case perspective. Working in a Hilbert setting, we show that, if the model set is the intersection of two hyperellipsoids centered at the origin, then there is an optimal recovery method which is linear
-
A space-time adaptive low-rank method for high-dimensional parabolic partial differential equations J. Complex. (IF 1.7) Pub Date : 2024-02-09 Markus Bachmayr, Manfred Faldum
An adaptive method for parabolic partial differential equations that combines sparse wavelet expansions in time with adaptive low-rank approximations in the spatial variables is constructed and analyzed. The method is shown to converge and to satisfy complexity bounds similar to those of existing adaptive low-rank methods for elliptic problems, establishing its suitability for parabolic problems on high-dimensional
-
Asymptotic analysis in multivariate worst case approximation with Gaussian kernels J. Complex. (IF 1.7) Pub Date : 2024-02-07 A.A. Khartov, I.A. Limar
We consider a problem of approximation of d-variate functions defined on R^d which belong to the Hilbert space with tensor product-type reproducing Gaussian kernel with constant shape parameter. Within the worst case setting, we investigate the growth of the information complexity as d → ∞. The asymptotics are obtained for the case of a fixed error threshold ε and for the case when ε goes to zero as d → ∞.
-
Thomas Jahn, Tino Ullrich and Felix Voigtlaender are the Winners of the 2023 Best Paper Award of the Journal of Complexity J. Complex. (IF 1.7) Pub Date : 2024-01-30 Erich Novak
Abstract not available
-
Tamed-adaptive Euler-Maruyama approximation for SDEs with superlinearly growing and piecewise continuous drift, superlinearly growing and locally Hölder continuous diffusion J. Complex. (IF 1.7) Pub Date : 2024-01-18 Minh-Thang Do, Hoang-Long Ngo, Nhat-An Pho
In this paper, we consider stochastic differential equations whose drift coefficient is superlinearly growing and piecewise continuous, and whose diffusion coefficient is superlinearly growing and locally Hölder continuous. We first prove the existence and uniqueness of solution to such stochastic differential equations and then propose a tamed-adaptive Euler-Maruyama approximation scheme. We study
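The taming idea behind such schemes fits in a few lines. Below is a minimal non-adaptive sketch in Python (the paper's scheme additionally adapts the stepsize, which is not reproduced here); the drift −x³ and the diffusion constant are illustrative choices, not taken from the paper.

```python
import math
import random

def tamed_euler_maruyama(x0, drift, diffusion, T, n, seed=0):
    """One sample path of a tamed Euler-Maruyama scheme on [0, T].

    The drift increment is damped by 1 / (1 + h * |drift|), so a
    superlinearly growing drift cannot blow up within a single step.
    """
    rng = random.Random(seed)
    h = T / n
    x = x0
    for _ in range(n):
        b = drift(x)
        dw = rng.gauss(0.0, math.sqrt(h))  # Brownian increment
        x = x + h * b / (1.0 + h * abs(b)) + diffusion(x) * dw
    return x

# Superlinear, mean-reverting drift -x^3 with constant diffusion 0.5.
x_T = tamed_euler_maruyama(2.0, lambda x: -x**3, lambda x: 0.5, T=1.0, n=1000)
```

Without taming, an explicit Euler step with drift −x³ can diverge for large initial values; the damping factor keeps each drift increment bounded by 1 regardless of the state.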
-
Online regularized learning algorithm for functional data J. Complex. (IF 1.7) Pub Date : 2024-01-09 Yuan Mao, Zheng-Chu Guo
In recent years, functional linear models have attracted growing attention in statistics and machine learning for recovering the slope function or its functional predictor. This paper considers an online regularized learning algorithm for functional linear models in a reproducing kernel Hilbert space. It provides a convergence analysis of the excess prediction error and the estimation error with polynomially decaying
-
On a class of linear regression methods J. Complex. (IF 1.7) Pub Date : 2024-01-09 Ying-Ao Wang, Qin Huang, Zhigang Yao, Ye Zhang
In this paper, a unified study is presented for the design and analysis of a broad class of linear regression methods. The proposed general framework includes the conventional linear regression methods (such as the least squares regression and the Ridge regression) and some new regression methods (e.g. the Landweber regression and Showalter regression), which have recently been introduced in the fields
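As a hint of how such iterative methods differ from closed-form least squares, here is a minimal Python sketch of the Landweber iteration on a tiny overdetermined system; the step size and iteration count are illustrative choices, with early stopping acting as the regularization parameter.

```python
def landweber_regression(A, b, step, iters):
    """Landweber iteration x <- x + step * A^T (b - A x): gradient descent
    on the least-squares objective, where the iteration count (early
    stopping) plays the role of the regularization parameter."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = b - A x
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        # gradient step along A^T r
        x = [x[j] + step * sum(A[i][j] * r[i] for i in range(m))
             for j in range(n)]
    return x

# Tiny consistent overdetermined system with exact solution (1, 2).
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
x_hat = landweber_regression(A, b, step=0.3, iters=200)
```

Convergence requires `step` below 2 divided by the largest eigenvalue of AᵀA (here 3), which the choice 0.3 satisfies.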
-
Nonlinear Tikhonov regularization in Hilbert scales for inverse learning J. Complex. (IF 1.7) Pub Date : 2024-01-06 Abhishake Rastogi
In this paper, we study a Tikhonov regularization scheme in Hilbert scales for a nonlinear statistical inverse problem with general noise. The regularizing norm in this scheme is stronger than the norm of the Hilbert space. We focus on developing a theoretical analysis for this scheme based on conditional stability estimates. We utilize the concept of the distance function to establish high probability
-
Sharp lower error bounds for strong approximation of SDEs with piecewise Lipschitz continuous drift coefficient J. Complex. (IF 1.7) Pub Date : 2024-01-02 Simon Ellinger
We study pathwise approximation of strong solutions of scalar stochastic differential equations (SDEs) at a single time in the presence of discontinuities of the drift coefficient. Recently, it has been shown by Müller-Gronbach and Yaroslavtseva (2022) that for all p∈[1,∞) a transformed Milstein-type scheme reaches an Lp-error rate of at least 3/4 when the drift coefficient is a piecewise Lipschitz-continuous
-
Randomized complexity of parametric integration and the role of adaption II. Sobolev spaces J. Complex. (IF 1.7) Pub Date : 2024-01-02 Stefan Heinrich
We study the complexity of randomized computation of integrals depending on a parameter, with integrands from Sobolev spaces. That is, for f from a Sobolev class we seek to approximate the parameter-dependent integral of f with error measured in the Lq-norm. Information is standard, that is, function values of f. Our results extend previous work of Heinrich and Sindambiwe (1999) for p = q = ∞ and of Wiegand (2006) for 1 ≤ p = q < ∞. Wiegand's analysis was carried
-
Complexity for a class of elliptic ordinary integro-differential equations J. Complex. (IF 1.7) Pub Date : 2023-12-27 A.G. Werschulz
Consider the variational form of the ordinary integro-differential equation (OIDE) −u″ + u + ∫₀¹ q(·, y) u(y) dy = f on the unit interval I, subject to homogeneous Neumann boundary conditions. Here, f and q respectively belong to the unit ball of Hr(I) and the ball of radius M1 of Hs(I²), where M1 ∈ [0,1). For ε > 0, we want to compute ε-approximations for this problem, measuring error in the H1(I) sense in the worst
-
Randomized complexity of parametric integration and the role of adaption I. Finite dimensional case J. Complex. (IF 1.7) Pub Date : 2023-12-27 Stefan Heinrich
We study the randomized n-th minimal errors (and hence the complexity) of vector valued mean computation, which is the discrete version of parametric integration. The results of the present paper form the basis for the complexity analysis of parametric integration in Sobolev spaces, which will be presented in Part 2. Altogether this extends previous results of Heinrich and Sindambiwe (1999) [12] and
-
On the information complexity for integration in subspaces of the Wiener algebra J. Complex. (IF 1.7) Pub Date : 2023-12-27 Liang Chen, Haixin Jiang
Recently, Goda proved the polynomial tractability of integration on the following function subspace of the Wiener algebra: Fd := {f ∈ C(Td) : ‖f‖Fd := ∑k∈Zd |f̂(k)| max(1, minj∈supp(k) log|kj|) < ∞}, where T := R/Z = [0,1), f̂(k) is the k-th Fourier coefficient of f, and supp(k) := {j ∈ {1,…,d} : kj ≠ 0}. Goda raised an open question as to whether the upper bound on the information complexity for integration in Fd can be improved
-
A duality approach to regularized learning problems in Banach spaces J. Complex. (IF 1.7) Pub Date : 2023-12-15 Raymond Cheng, Rui Wang, Yuesheng Xu
Regularized learning problems in Banach spaces, which often minimize the sum of a data fidelity term in one Banach norm and a regularization term in another Banach norm, are challenging to solve. We construct a direct sum space based on the Banach spaces for the fidelity term and the regularization term, and recast the objective function as the norm of a quotient space of the direct sum space. We then
-
On a unified convergence analysis for Newton-type methods solving generalized equations with the Aubin property J. Complex. (IF 1.7) Pub Date : 2023-11-22 Ioannis K. Argyros, Santhosh George
A plethora of applications from diverse disciplines reduce to solving generalized equations involving Banach space valued operators. These equations are mostly solved iteratively: a sequence is generated that approximates a solution, provided certain conditions hold on the starting point and on the operators appearing in the method. Secant-type methods are developed whose specializations reduce
-
Optimal recovery and generalized Carlson inequality for weights with symmetry properties J. Complex. (IF 1.7) Pub Date : 2023-11-07 K.Yu. Osipenko
The paper concerns problems of the recovery of operators from noisy information in weighted Lq-spaces with homogeneous weights. A number of general theorems are proved and applied to finding exact constants in multidimensional Carlson type inequalities with several weights and problems of the recovery of differential operators from a noisy Fourier transform. In particular, optimal methods are obtained
-
Kateryna Pozharska is the winner of the 2023 Joseph F. Traub Information-Based Complexity Young Researcher Award J. Complex. (IF 1.7) Pub Date : 2023-10-20 David Krieg, Erich Novak, Mathias Sonnleitner, Michaela Szölgyenyi, Henryk Woźniakowski
Abstract not available
-
Changes of the Editorial Board J. Complex. (IF 1.7) Pub Date : 2023-10-20 Erich Novak
Abstract not available
-
Nonexact oracle inequalities, r-learnability, and fast rates J. Complex. (IF 1.7) Pub Date : 2023-10-13 Daniel Z. Zanger
As an extension of the standard paradigm in statistical learning theory, we introduce the concept of r-learnability, 0 < r ≤ 1
-
High-order lifting for polynomial Sylvester matrices J. Complex. (IF 1.7) Pub Date : 2023-10-10 Clément Pernet, Hippolyte Signargout, Gilles Villard
A new algorithm is presented for computing the resultant of two generic bivariate polynomials over an arbitrary field. For p, q in K[x,y] of degree d in x and n in y, the resultant with respect to y is computed using O(n^1.458 d) arithmetic operations if d = O(n^(1/3)). For d = 1, the complexity estimate is therefore reconciled with the estimates of Neiger et al. 2021 for the related problems of modular composition
-
Expected multivolumes of random amoebas J. Complex. (IF 1.7) Pub Date : 2023-09-28 Turgay Bayraktar, Ali Ulaş Özgür Kişisel
We compute the expected multivolume of the amoeba of a random half-dimensional complete intersection in CP^2n. We also give a relative generalization of our result to the toric case.
-
A strongly monotonic polygonal Euler scheme J. Complex. (IF 1.7) Pub Date : 2023-09-01 Tim Johnston, Sotirios Sabanis
In recent years tamed schemes have become an important technique for simulating SDEs and SPDEs whose continuous coefficients display superlinear growth. The taming method involves curbing the growth of the coefficients as a function of stepsize, but so far has not been adapted to preserve the monotonicity of the coefficients. This has arisen as an issue in [4], where the lack of a strongly monotonic
-
Changes of the Editorial Board J. Complex. (IF 1.7) Pub Date : 2023-08-22 Erich Novak
Abstract not available
-
On the power of standard information for tractability for L∞ approximation of periodic functions in the worst case setting J. Complex. (IF 1.7) Pub Date : 2023-08-22 Jiaxin Geng, Heping Wang
We study multivariate approximation of periodic functions in the worst case setting with the error measured in the L∞ norm. We consider algorithms that use standard information Λ^std consisting of function values or general linear information Λ^all consisting of arbitrary continuous linear functionals. We investigate equivalences of various notions of algebraic and exponential tractability for Λ^std and
-
Bypassing the quadrature exactness assumption of hyperinterpolation on the sphere J. Complex. (IF 1.7) Pub Date : 2023-08-18 Congpei An, Hao-Ning Wu
This paper focuses on the approximation of continuous functions on the unit sphere by spherical polynomials of degree n via hyperinterpolation. Hyperinterpolation of degree n is a discrete approximation of the L2-orthogonal projection of the same degree with its Fourier coefficients evaluated by a positive-weight quadrature rule that exactly integrates all spherical polynomials of degree at most 2n
-
Sampling numbers of smoothness classes via ℓ1-minimization J. Complex. (IF 1.7) Pub Date : 2023-08-05 Thomas Jahn, Tino Ullrich, Felix Voigtlaender
Using techniques developed recently in the field of compressed sensing we prove new upper bounds for general (nonlinear) sampling numbers of (quasi-)Banach smoothness spaces in L2. In particular, we show that in relevant cases such as mixed and isotropic weighted Wiener classes or Sobolev spaces with mixed smoothness, sampling numbers in L2 can be upper bounded by best n-term trigonometric widths in
-
Convergence of the Gauss-Newton method for convex composite optimization problems under majorant condition on Riemannian manifolds J. Complex. (IF 1.7) Pub Date : 2023-08-09 Qamrul Hasan Ansari, Moin Uddin, Jen-Chih Yao
In this paper, we consider convex composite optimization problems on Riemannian manifolds, and discuss the semi-local convergence of the Gauss-Newton method with quasi-regular initial point and under the majorant condition. As special cases, we also discuss the convergence of the sequence generated by the Gauss-Newton method under Lipschitz-type condition, or under γ-condition.
-
The minimal radius of Galerkin information for the problem of numerical differentiation J. Complex. (IF 1.7) Pub Date : 2023-08-05 S.G. Solodky, S.A. Stasyuk
The problem of numerical differentiation for periodic functions with finite smoothness is investigated. For multivariate functions, different variants of the truncation method are constructed and their approximation properties are obtained. Based on these results, sharp bounds (in the power scale) of the minimal radius of Galerkin information for the problem under study are found.
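The basic tension in numerical differentiation can already be seen in the simplest finite-difference rule: for noisy data the stepsize h trades the data error (amplified like 1/h) against the truncation error (of order h²). A minimal sketch, unrelated to the paper's Galerkin setting but illustrating the underlying problem:

```python
def central_difference(f, x, h):
    """Second-order central difference approximation of f'(x).

    With data error of size delta, the total error behaves like
    delta / h + h**2, so h must balance the two terms.
    """
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Exact for quadratics, O(h^2)-accurate in general.
d1 = central_difference(lambda t: t * t, 1.0, 1e-5)   # derivative of t^2 at 1
d2 = central_difference(lambda t: t ** 3, 2.0, 1e-4)  # derivative of t^3 at 2
```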
-
Random-prime–fixed-vector randomised lattice-based algorithm for high-dimensional integration J. Complex. (IF 1.7) Pub Date : 2023-08-02 Frances Y. Kuo, Dirk Nuyens, Laurence Wilkes
We show that a very simple randomised algorithm for numerical integration can produce a near optimal rate of convergence for integrals of functions in the d-dimensional weighted Korobov space. This algorithm uses a lattice rule with a fixed generating vector and the only random element is the choice of the number of function evaluations. For a given computational budget n of a maximum allowed number
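A minimal Python sketch of the rule's structure: an equal-weight rank-1 lattice rule with a fixed generating vector, where the only randomness is the (prime) number of points. The generating vector below is an arbitrary placeholder, not an optimised one, and the sketch omits the paper's weighted-Korobov analysis.

```python
import math
import random

def random_prime_lattice_rule(f, d, z, n, seed=0):
    """Rank-1 lattice rule with a FIXED generating vector z; the only
    random element is the number of points, a prime drawn uniformly
    from (n/2, n]."""
    rng = random.Random(seed)
    primes = [p for p in range(max(2, n // 2 + 1), n + 1)
              if all(p % q for q in range(2, math.isqrt(p) + 1))]
    p = rng.choice(primes)
    # points {i * z / p mod 1}, i = 0, ..., p-1, averaged with equal weights
    return sum(f([(i * z[j] / p) % 1.0 for j in range(d)])
               for i in range(p)) / p

# Integrand on [0,1]^2 with exact integral 1.
est = random_prime_lattice_rule(
    lambda x: math.prod(1.0 + (t - 0.5) for t in x), d=2, z=[1, 182667], n=101)
```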
-
Rates of approximation by ReLU shallow neural networks J. Complex. (IF 1.7) Pub Date : 2023-07-31 Tong Mao, Ding-Xuan Zhou
Neural networks activated by the rectified linear unit (ReLU) play a central role in the recent development of deep learning. The topic of approximating functions from Hölder spaces by these networks is crucial for understanding the efficiency of the induced learning algorithms. Although the topic has been well investigated in the setting of deep neural networks with many layers of hidden neurons,
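A one-hidden-layer (shallow) ReLU network is just a linear combination of ramps; for instance, two neurons already represent |x| exactly, a building block behind many approximation-rate arguments. A minimal sketch:

```python
def relu(t):
    """Rectified linear unit."""
    return max(0.0, t)

def shallow_relu_net(x, neurons):
    """One-hidden-layer ReLU network: sum of c * relu(a * x + b)
    over neurons given as triples (a, b, c)."""
    return sum(c * relu(a * x + b) for a, b, c in neurons)

# Two hidden neurons reproduce |x| exactly: |x| = relu(x) + relu(-x).
abs_net = [(1.0, 0.0, 1.0), (-1.0, 0.0, 1.0)]
```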
-
Approximating smooth and sparse functions by deep neural networks: Optimal approximation rates and saturation J. Complex. (IF 1.7) Pub Date : 2023-07-27 Xia Liu
Constructing neural networks for function approximation is a classical and longstanding topic in approximation theory. In this paper, we aim at constructing deep neural networks with three hidden layers using a sigmoidal activation function to approximate smooth and sparse functions. Specifically, we prove that the constructed deep nets with controllable magnitude of free parameters can reach the optimal
-
Worst case tractability of linear problems in the presence of noise: Linear information J. Complex. (IF 1.7) Pub Date : 2023-07-26 Leszek Plaskota, Paweł Siedlecki
We study the worst case tractability of multivariate linear problems defined on separable Hilbert spaces. Information about a problem instance consists of noisy evaluations of arbitrary bounded linear functionals, where the noise is either deterministic or random. The cost of a single evaluation depends on its precision and is controlled by a cost function. We establish mutual interactions between
-
On the complexity of a unified convergence analysis for iterative methods J. Complex. (IF 1.7) Pub Date : 2023-07-11
Local and semi-local convergence analyses of general iterative methods for solving nonlinear operator equations in Banach spaces are developed under ω-continuity conditions. Our approach unifies existing results and provides a new way of studying iterative methods. The main idea is to find a more accurate domain containing the iterates; no extra effort is required to obtain it. Also, the results of the numerical
-
Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs J. Complex. (IF 1.7) Pub Date : 2023-06-15 Dinh Dũng, Van Kien Nguyen, Duong Thanh Pham
We investigate non-adaptive methods of deep ReLU neural network approximation in Bochner spaces L2(U^∞, X, μ) of functions on U^∞ taking values in a separable Hilbert space X, where U^∞ is either R^∞ equipped with the standard Gaussian probability measure, or [−1,1]^∞ equipped with the Jacobi probability measure. Functions to be approximated are assumed to satisfy a certain weighted ℓ2-summability of the
-
The rate of convergence for sparse and low-rank quantile trace regression J. Complex. (IF 1.7) Pub Date : 2023-06-19 Xiangyong Tan, Ling Peng, Peiwen Xiao, Qing Liu, Xiaohui Liu
Trace regression models are widely used in applications involving panel data, images, genomic microarrays, etc., where high-dimensional covariates are often involved. However, the existing research involving high-dimensional covariates focuses mainly on the conditional mean model. In this paper, we extend the trace regression model to the quantile trace regression model when the parameter is a matrix
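Quantile regression replaces squared error by the pinball loss; a minimal sketch of that loss (the trace-regression and low-rank machinery of the paper is not reproduced here):

```python
def pinball_loss(r, tau):
    """Quantile ('pinball') loss at level tau in (0, 1): its minimiser
    over constants is the tau-quantile of the residual distribution."""
    return tau * r if r >= 0 else (tau - 1.0) * r
```

For tau = 0.5 the loss is symmetric and quantile regression reduces to median regression.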
-
Optimal recovery and volume estimates J. Complex. (IF 1.7) Pub Date : 2023-06-17 Alexander Kushpel
We study volumes of sections of convex origin-symmetric bodies in Rn induced by orthonormal systems on probability spaces. The approach is based on volume estimates of John-Löwner ellipsoids and expectations of norms induced by the respective systems. The estimates obtained allow us to establish lower bounds for the radii of sections which gives lower bounds for Gelfand widths (or linear cowidths)
-
The curse of dimensionality for the Lp-discrepancy with finite p J. Complex. (IF 1.7) Pub Date : 2023-06-08 Erich Novak, Friedrich Pillichshammer
The Lp-discrepancy is a quantitative measure for the irregularity of distribution of an N-element point set in the d-dimensional unit cube, which is closely related to the worst-case error of quasi-Monte Carlo algorithms for numerical integration. Its inverse for dimension d and error threshold ε ∈ (0,1) is the minimal number of points in [0,1)d such that the minimal normalized Lp-discrepancy is less
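For orientation, in one dimension the star discrepancy (the L∞ end of the Lp scale discussed here) can be computed exactly by a classical formula over the sorted points; a minimal sketch:

```python
def star_discrepancy_1d(points):
    """Exact star discrepancy of a 1-d point set in [0, 1), via the
    classical formula over the sorted points x_(1) <= ... <= x_(N):
    D*_N = max_i max(i/N - x_(i), x_(i) - (i-1)/N)."""
    xs = sorted(points)
    n = len(xs)
    return max(max(i / n - x, x - (i - 1) / n)
               for i, x in enumerate(xs, start=1))

# The centred regular grid {(2i-1)/(2N)} attains the optimal value 1/(2N).
d_star = star_discrepancy_1d([(2 * i - 1) / 8 for i in range(1, 5)])
```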
-
Tractability of L2-approximation and integration in weighted Hermite spaces of finite smoothness J. Complex. (IF 1.7) Pub Date : 2023-06-01 Gunther Leobacher, Friedrich Pillichshammer, Adrian Ebert
In this paper we consider integration and L2-approximation for functions over R^s from weighted Hermite spaces. The first part of the paper is devoted to a comparison of several weighted Hermite spaces that appear in the literature, which is interesting in its own right. Then we study tractability of the integration and L2-approximation problems for the introduced Hermite spaces, which describes the growth rate
-
Discrepancy bounds for normal numbers generated by necklaces in arbitrary base J. Complex. (IF 1.7) Pub Date : 2023-05-23 Roswitha Hofer, Gerhard Larcher
Mordechay B. Levin (1999) has constructed a number λ which is normal in base 2, and such that the sequence ({2^n λ})n=0,1,2,… has very small discrepancy N·DN = O((log N)^2). This construction technique was generalized by Becher and Carton (2019), who generated normal numbers via nested perfect necklaces, for which the same upper discrepancy estimate holds. In this paper we derive an upper discrepancy bound
-
Numerical weighted integration of functions having mixed smoothness J. Complex. (IF 1.7) Pub Date : 2023-05-05 Dinh Dũng
We investigate the approximation of weighted integrals over R^d for integrands from weighted Sobolev spaces of mixed smoothness. We prove upper and lower bounds on the convergence rate of optimal quadratures with respect to n integration nodes for functions from these spaces. In the one-dimensional case (d = 1), we obtain the right convergence rate of optimal quadratures. For d ≥ 2, the upper bound is performed
-
Dmitriy Bilyk and Feng Dai are the winners of the 2023 Joseph F. Traub Prize for Achievement in Information-Based Complexity J. Complex. (IF 1.7) Pub Date : 2023-04-18 Erich Novak
Abstract not available
-
A continuous characterization of PSPACE using polynomial ordinary differential equations J. Complex. (IF 1.7) Pub Date : 2023-03-27 Olivier Bournez, Riccardo Gozzi, Daniel S. Graça, Amaury Pouly
In this paper we provide a characterization of the complexity class PSPACE by using a purely continuous model defined with polynomial ordinary differential equations.
-
Average case tractability of non-homogeneous tensor product problems with the absolute error criterion J. Complex. (IF 1.7) Pub Date : 2023-02-27 Guiqiao Xu
We study average case tractability of non-homogeneous tensor product problems with the absolute error criterion. We consider algorithms that use finitely many evaluations of arbitrary linear functionals. For general non-homogeneous tensor product problems, we obtain the matching necessary and sufficient conditions for strong polynomial tractability in terms of the one-dimensional eigenvalues. We give
-
Nonasymptotic analysis of robust regression with modified Huber's loss J. Complex. (IF 1.7) Pub Date : 2023-02-28 Hongzhi Tong
To achieve robustness against the outliers or heavy-tailed sampling distribution, we consider an Ivanov regularized empirical risk minimization scheme associated with a modified Huber's loss for nonparametric regression in reproducing kernel Hilbert space. By tuning the scaling and regularization parameters in accordance with the sample size, we develop nonasymptotic concentration results for such
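For reference, the classical Huber loss, quadratic near zero and linear in the tails, is the prototype of such robust losses (the paper's modified variant is not reproduced here); a minimal sketch:

```python
def huber_loss(r, tau):
    """Classical Huber loss with threshold tau > 0: quadratic for
    |r| <= tau, linear beyond, which caps the influence of large
    residuals (outliers)."""
    a = abs(r)
    return 0.5 * r * r if a <= tau else tau * (a - 0.5 * tau)
```

The linear tail means a single gross outlier shifts the fitted value by a bounded amount, unlike squared loss.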
-
Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality J. Complex. (IF 1.7) Pub Date : 2023-03-01 Philipp Grohs, Shokhrukh Ibragimov, Arnulf Jentzen, Sarah Koppensteiner
Artificial neural networks (ANNs) have become a very powerful tool in the approximation of high-dimensional functions. Especially, deep ANNs, consisting of a large number of hidden layers, have been very successfully used in a series of practically relevant computational problems involving high-dimensional input data ranging from classification tasks in supervised learning to optimal decision problems
-
On Huber's contaminated model J. Complex. (IF 1.7) Pub Date : 2023-02-28 Weiyan Mu, Shifeng Xiong
Huber's contaminated model is a basic model for data with outliers. This paper aims at addressing several fundamental problems about this model. We first study its identifiability properties. Several theorems are presented to determine whether the model is identifiable for various situations. Based on these results, we discuss the problem of estimating the parameters with observations drawn from Huber's
-
Low-energy points on the sphere and the real projective plane J. Complex. (IF 1.7) Pub Date : 2023-02-21 Carlos Beltrán, Ujué Etayo, Pedro R. López-Gómez
We present a generalization of a family of points on S^2, the Diamond ensemble, containing collections of N points on S^2 with very small logarithmic energy for all N ∈ N. We extend this construction to the real projective plane RP^2 and obtain upper and lower bounds with explicit constants for the Green and logarithmic energy on the latter space.
-
On oracle factoring of integers J. Complex. (IF 1.7) Pub Date : 2023-02-10 Andrzej Dąbrowski, Jacek Pomykała, Igor E. Shparlinski
We present an oracle factorisation algorithm which, in deterministic polynomial time, finds a nontrivial factor of almost all positive integers n, based on knowledge of the number of points on certain elliptic curves in residue rings modulo n.
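The flavour of oracle factoring can be seen in a classical analogue: for n = pq, an oracle for Euler's φ(n) immediately yields the factors via a quadratic equation (the paper's oracle, elliptic-curve point counts modulo n, plays a comparable role). A minimal sketch:

```python
import math

def factor_with_phi_oracle(n, phi):
    """Recover p, q from n = p*q (distinct primes) given an oracle value
    phi = (p-1)*(q-1): then p + q = n - phi + 1, so p and q are the two
    roots of x^2 - (p+q) x + n = 0."""
    s = n - phi + 1                 # p + q
    r = math.isqrt(s * s - 4 * n)   # |p - q|
    return (s - r) // 2, (s + r) // 2

# 187 = 11 * 17, and phi(187) = 10 * 16 = 160.
p, q = factor_with_phi_oracle(187, 160)
```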
-
Consistency of randomized integration methods J. Complex. (IF 1.7) Pub Date : 2023-02-06 Julian Hofstadler, Daniel Rudolf
We prove that a class of randomized integration methods, including averages based on (t,d)-sequences, Latin hypercube sampling, Frolov points as well as Cranley-Patterson rotations, consistently estimates expectations of integrable functions. Consistency here refers to convergence in mean and/or convergence in probability of the estimator to the integral of interest. Moreover, we suggest median modified
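A minimal sketch of one of the randomisations listed, the Cranley-Patterson rotation: each random shift (mod 1) of a fixed point set gives an unbiased estimate of the integral, so averaging over shifts is consistent. The lattice point set below is an arbitrary illustrative choice.

```python
import random

def cranley_patterson_estimate(f, d, points, n_shifts, seed=0):
    """Average of n_shifts randomly shifted (mod 1) copies of a fixed
    point set.  Each shifted average is an unbiased estimate of the
    integral of f over [0,1]^d, so the average over shifts is consistent
    as n_shifts grows."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_shifts):
        shift = [rng.random() for _ in range(d)]
        total += sum(f([(x[j] + shift[j]) % 1.0 for j in range(d)])
                     for x in points) / len(points)
    return total / n_shifts

# Fixed point set: a small rank-1 lattice in dimension 2.
pts = [[i / 13.0, (5 * i / 13.0) % 1.0] for i in range(13)]
est = cranley_patterson_estimate(lambda x: x[0] + x[1], d=2, points=pts,
                                 n_shifts=50)
```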
-
The BMO-discrepancy suffers from the curse of dimensionality J. Complex. (IF 1.7) Pub Date : 2023-01-24 Friedrich Pillichshammer
We show that the minimal discrepancy of a point set in the d-dimensional unit cube with respect to the BMO seminorm suffers from the curse of dimensionality.
-
Bit-Complexity of Classical Solutions of Linear Evolutionary Systems of Partial Differential Equations J. Complex. (IF 1.7) Pub Date : 2023-01-04 Ivan Koswara, Gleb Pogudin, Svetlana Selivanova, Martin Ziegler
We study the bit-complexity intrinsic to solving the initial-value and (several types of) boundary-value problems for linear evolutionary systems of partial differential equations (PDEs), based on the Computable Analysis approach. We compute classical solutions to such problems approximately up to guaranteed precision 1/2^n, so that n corresponds to the number of reliable bits of the output; algorithmic
-
On the cardinality of lower sets and universal discretization J. Complex. (IF 1.7) Pub Date : 2023-01-02 F. Dai, A. Prymak, A. Shadrin, V.N. Temlyakov, S. Tikhonov
A set Q in Z_+^d is a lower set if (k1,…,kd) ∈ Q implies (l1,…,ld) ∈ Q whenever 0 ≤ li ≤ ki for all i. We derive new and refine known results regarding the cardinality of the lower sets of size n in Z_+^d. Next we apply these results to universal discretization of the L2-norm of elements from n-dimensional subspaces of trigonometric polynomials generated by lower sets.
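The defining downward-closedness of a lower set is easy to check directly for small sets; a brute-force Python sketch:

```python
from itertools import product

def is_lower_set(Q):
    """Check downward closedness: Q (a set of tuples in Z_+^d) is a lower
    set iff, together with each point k, it contains every l with
    0 <= l_i <= k_i for all coordinates i."""
    S = set(Q)
    return all(l in S
               for k in S
               for l in product(*(range(ki + 1) for ki in k)))

# The staircase {(0,0), (1,0), (0,1)} is lower; {(1,1)} alone is not,
# since it omits (0,0), (1,0) and (0,1).
```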
-
Information-based complexity young researcher award J. Complex. (IF 1.7) Pub Date : 2023-01-05 Alexey Khartov, David Krieg, Erich Novak, Michaela Szölgyenyi, Henryk Woźniakowski
Abstract not available
-
Changes of the Editorial Board J. Complex. (IF 1.7) Pub Date : 2023-01-04 Josef Dick, Aicke Hinrichs, Erich Novak, Klaus Ritter, Grzegorz Wasilkowski, Henryk Woźniakowski
Abstract not available
-
Best Paper Award of the Journal of Complexity J. Complex. (IF 1.7) Pub Date : 2023-01-04 Erich Novak
Abstract not available
-
Nominations for 2023 Joseph F. Traub Information-Based Complexity Young Researcher Award J. Complex. (IF 1.7) Pub Date : 2023-01-04 Erich Novak
Abstract not available