Bracketing numbers of convex and -monotone functions on polytopes
Section snippets
Introduction and motivation
To quantify the size of an infinite-dimensional set, the pioneering work of [34] studied the so-called metric entropy of the set, which is the logarithm of the metric covering number of the set. In this paper we are interested in a related quantity, the bracketing entropy of a class of functions, which serves a similar purpose to metric entropy. Metric and bracketing entropies quantify the amount of information it takes to approximate any element of a set to a given accuracy . This
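For reference, the two notions can be written out explicitly (these are the standard definitions from empirical process theory, not verbatim from this paper; the norm below stands for whichever metric is in force):

```latex
N(\varepsilon, \mathcal{F}, \|\cdot\|)
  := \min\Bigl\{ n : \exists\, f_1,\dots,f_n \ \text{with}\
     \mathcal{F} \subseteq \textstyle\bigcup_{i=1}^{n}\{ f : \|f - f_i\| \le \varepsilon \} \Bigr\},
\qquad
N_{[\,]}(\varepsilon, \mathcal{F}, \|\cdot\|)
  := \min\Bigl\{ n : \exists\, [\ell_1,u_1],\dots,[\ell_n,u_n] \ \text{with}\
     \|u_i - \ell_i\| \le \varepsilon \ \text{and}\
     \mathcal{F} \subseteq \textstyle\bigcup_{i=1}^{n}[\ell_i,u_i] \Bigr\},
```

where a bracket $[\ell,u]$ is the set of functions $f$ with $\ell \le f \le u$ pointwise. The metric entropy is $\log N$ and the bracketing entropy is $\log N_{[\,]}$; since the midpoints of $\varepsilon$-brackets form a cover, $N(\varepsilon, \mathcal{F}, \|\cdot\|) \le N_{[\,]}(2\varepsilon, \mathcal{F}, \|\cdot\|)$, so bracketing bounds imply covering bounds.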
Bracketing with Lipschitz constraints
If we have sets , , for , and then for , , and any class of functions , where, for a set , we let denote the class where is the restriction of to the set . We will apply (5) to a cover of by sets with the property that for some bounded vector , so that we can apply bracketing results for classes of convex functions with Lipschitz bounds. Thus, in this section, we develop the
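The combination step alluded to here (the display (5), whose symbols were lost in this snippet) follows a standard pattern: over a partition of the domain, brackets multiply in number and add in squared size. A sketch in the $L_2(Q)$ case, with notation of my own choosing: if $A = \bigcup_i A_i$ is a partition and each restricted class $\mathcal{F}|_{A_i}$ admits $\varepsilon_i$-brackets $[\ell_{i,j}, u_{i,j}]$, then gluing one bracket per piece gives

```latex
\Bigl\| \sum_i \bigl(u_{i,j_i} - \ell_{i,j_i}\bigr)\mathbf{1}_{A_i} \Bigr\|_{L_2(Q)}^2
  = \sum_i \bigl\| \bigl(u_{i,j_i} - \ell_{i,j_i}\bigr)\mathbf{1}_{A_i} \bigr\|_{L_2(Q)}^2
  \le \sum_i \varepsilon_i^2,
\qquad \text{hence} \qquad
N_{[\,]}\Bigl( \bigl(\textstyle\sum_i \varepsilon_i^2\bigr)^{1/2},\,
   \mathcal{F}|_A,\, L_2(Q) \Bigr)
  \le \prod_i N_{[\,]}\bigl(\varepsilon_i,\, \mathcal{F}|_{A_i},\, L_2(Q)\bigr),
```

where the equality uses that the pieces $A_i$ have disjoint supports.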
Bracketing without Lipschitz constraints
In the previous section we bounded the bracketing entropy of classes of functions with Lipschitz constraints. In this section we remove those Lipschitz constraints. With Lipschitz constraints we could consider arbitrary domains , but without the Lipschitz constraints we need more restrictions: now we will take  to be a simple polytope (defined below). We now define the notation and assumptions we will use for the remainder of the document.
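For orientation, the relevant piece of standard polytope terminology (not specific to this paper) is:

```latex
P \subseteq \mathbb{R}^d \ \text{a $d$-dimensional polytope is \emph{simple}}
\iff \text{every vertex of } P \text{ lies on exactly } d \text{ facets}
\ (\text{equivalently, exactly } d \text{ edges}).
```

For example, the cube $[0,1]^3$ is simple (three facets meet at each vertex), while the octahedron $\operatorname{conv}\{\pm e_1, \pm e_2, \pm e_3\}$ is not (four facets meet at each vertex).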
Properties of
In this section we show how to embed the domains , which partition , into hyperrectangles. We used this in the proof of Theorem 3.5 so we could apply Theorem 2.1. Theorem 2.1 says that the bracketing entropy of convex functions on domain with Lipschitz constraints along directions depends on (since that gives the maximum “rise” in “rise over run”). In our proof of Theorem 3.5 we partitioned into sets related to parallelotopes. Thus we will study these parallelotopes. We
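As a concrete illustration of the embedding step (a toy sketch with notation of my own choosing, not the paper's construction): a parallelotope generated by vectors $v_1,\dots,v_k$ based at $x_0$ fits inside the axis-aligned hyperrectangle whose side length along coordinate $j$ is $\sum_i |v_{ij}|$, because each coefficient $t_i$ ranges over $[0,1]$.

```python
def bounding_box(x0, vectors):
    """Axis-aligned hyperrectangle containing the parallelotope
    { x0 + sum_i t_i * v_i : t_i in [0, 1] }.

    Along coordinate j, sum_i t_i * v[i][j] is minimized by taking
    t_i = 0 when v[i][j] > 0 and t_i = 1 when v[i][j] < 0, and
    symmetrically for the maximum; so the box has side length
    sum_i |v[i][j]| in coordinate j.
    """
    d = len(x0)
    lo = [x0[j] + sum(min(v[j], 0.0) for v in vectors) for j in range(d)]
    hi = [x0[j] + sum(max(v[j], 0.0) for v in vectors) for j in range(d)]
    return lo, hi

# A sheared unit square based at the origin:
lo, hi = bounding_box([0.0, 0.0], [[1.0, 0.0], [0.5, 1.0]])
# lo == [0.0, 0.0], hi == [1.5, 1.0]
```

The same bound is what makes the "rise over run" picture work: the Lipschitz bound along each coordinate direction only needs the extent of the enclosing hyperrectangle in that direction.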
Further applications
We now consider further entropy bounds that rely on the above ideas, results, or their proofs. In Section 5.1 we consider so-called univariate and multivariate -monotone functions. In Section 5.2 we briefly consider estimation of level sets of convex functions and the question of adaptation to polytopal level sets. Further discussion is given at the beginning of the two subsections.
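To make the level-set question of Section 5.2 concrete at a toy level (this is a plain plug-in sketch on a grid, not the paper's procedure, and all names here are my own): the sublevel set $\{f \le \lambda\}$ of a convex function is a convex region, and one can approximate its volume by counting grid cells.

```python
import math

def sublevel_area(f, lam, lo, hi, n):
    """Grid estimate of the area of the sublevel set {(x, y): f(x, y) <= lam}
    inside the square [lo, hi]^2, counting the n*n cell centers that land
    in the set and weighting each by the cell area."""
    h = (hi - lo) / n
    count = 0
    for i in range(n):
        for j in range(n):
            x = lo + (i + 0.5) * h
            y = lo + (j + 0.5) * h
            if f(x, y) <= lam:
                count += 1
    return count * h * h

# The sublevel set of f(x, y) = x^2 + y^2 at level 1 is the unit disk,
# so the estimate should be close to pi:
area = sublevel_area(lambda x, y: x * x + y * y, 1.0, -1.5, 1.5, 600)
```

Here the true level set happens to be a disk, which no polytope matches exactly; the adaptation question in Section 5.2 concerns the opposite situation, where the level set itself is polytopal.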
References (45)
- Entropy estimate for high-dimensional monotonic functions, J. Multivariate Anal. (2007)
- Estimation of a k-monotone density: limit distribution theory and the spline connection, Ann. Statist. (2007)
- Estimation of a k-monotone density: characterizations, consistency and minimax lower bounds, Statist. Neerlandica (2010)
- Ellipsoids of maximal volume in convex bodies, Geom. Dedicata (1992)
- Rates of convergence for minimum contrast estimators, Probab. Theory Related Fields (1993)
- An introduction to convex polytopes
- ε-Entropy of convex sets and functions, Sib. Math. J. (1976)
- Methods for estimation of convex sets, Statist. Sci. (2018)
- Maximum likelihood estimation of a multi-dimensional log-concave density, J. R. Stat. Soc. Ser. B Stat. Methodol. (2010)
- Approximation and regularization of Lipschitz functions: convergence of the gradients, Trans. Amer. Math. Soc.
- Extremal problems for geometric hypergraphs, Discrete Comput. Geom.
- Unimodality, convexity, and applications
- Bracketing numbers of convex functions on polytopes
- Global rates of convergence of the MLEs of log-concave and s-concave densities, Ann. Statist.
- Inference for the mode of a log-concave density
- Mode-constrained estimation of a log-concave density
- Bandwidth selection for kernel density estimators of multivariate level sets and highest density regions, Electron. J. Stat.
- Kolmogorov entropy for classes of convex functions, Constr. Approx.
- Central limit theorems for empirical measures, Ann. Probab.
- A course on empirical processes
- Uniform central limit theorems
1. Supported by National Science Foundation grant DMS-1712664.