Project Details

Quantitative Uniform Complexity Theory of Multivalued Real Functions and Operators in Analysis

Applicant Professor Dr. Thomas Streicher, since 8/2015
Subject Area Theoretical Computer Science
Mathematics
Term from 2013 to 2016
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 229100744
 
Recursive Analysis was initiated by Alan Turing as the theory of computability over the real numbers in the sense of rational approximations up to prescribable absolute error 1/2^N and, more generally, over any universe of continuum cardinality with respect to a fixed encoding of its elements as infinite binary sequences processed by type-2 machines (cf. Weihrauch 2000). This view complements common numerics, which is typically concerned with 'large' input dimensions (e.g. the matrix dimension) at fixed precision (double, say), and it captures arithmetic of adaptive accuracy as provided by libraries like MPFR, GMP, or iRRAM. Univalent tests become discontinuous and thus uncomputable in this setting ('Main Theorem'), which forces one to resort to multivalued (i.e. intensional) functions.

Complexity-theoretic refinements of mere computability have been developed since the 1980s and 1990s, e.g. by H. Friedman, K.-I. Ko, and N. T. Müller, for (single-valued) real functions and for operators like maximum, integration, and the solution of ordinary differential equations: in the nonuniform sense that, for instance, a polynomial-time computable analytic function is known to have a polynomial-time computable integral, but not how to obtain that integral, from which data, or what its running time actually is (other than polynomial) and depends on. In fact a reasonable notion of efficient uniform computability of operators was found by Akitoshi Kawamura and Stephen A. Cook (2010) in terms of so-called second-order polynomials, cf. Mehlhorn (1976) and Kapron & Cook (1996).

We stratify this structural theory into a quantitative complexity theory and extend it to multivalued functions and operators. More precisely, we analyze known, and develop new, algorithms with respect to the exponent of their polynomial running time (in the prescribed output precision N) and its dependence on parameters of the (function) space under consideration. This will allow for a more realistic estimate of their practical efficiency and strengthen the connections to the practice of interval computation / validated numerics. Complementing lower complexity bounds often emerge from a quantitative analysis of the modulus of continuity of the functions and operators under consideration, and of its multivalued generalization (Pauly & Ziegler 2011). To obtain such estimates we adapt adversary methods from Information-Based Complexity (IBC, closely related to the Blum-Shub-Smale model) to the setting of Recursive Analysis, where function evaluation involves both approximate output and approximate input, i.e. provides non-local information.
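The following minimal Python sketch (an illustration added here, not part of the project and independent of MPFR, GMP, or iRRAM) shows what 'computing a real number up to prescribable absolute error 1/2^N' means in practice: from the precision parameter N alone, a dyadic rational approximation of sqrt(2) is produced using exact integer arithmetic.

    from fractions import Fraction
    from math import isqrt

    def sqrt2_approx(n: int) -> Fraction:
        """Return a dyadic rational q with |q - sqrt(2)| <= 2^-n."""
        # floor(sqrt(2 * 4^n)) equals floor(sqrt(2) * 2^n) and is computed
        # exactly with integer arithmetic; dividing by 2^n yields q.
        m = isqrt(2 << (2 * n))
        return Fraction(m, 1 << n)

    # The map n -> sqrt2_approx(n) is a 'name' of sqrt(2): it delivers
    # rational approximations up to any prescribed absolute error 2^-n.
    for n in (10, 50, 100):
        print(n, sqrt2_approx(n))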
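To illustrate why tests must become multivalued, as claimed above, here is a hypothetical 'soft' comparison in the same Python setting: it decides the order of two reals whenever they are at least 2^-k apart, but for closer values either answer is admissible and may depend on the chosen names. The exact test x < y, in contrast, is discontinuous and hence uncomputable.

    from fractions import Fraction
    from typing import Callable

    # A 'name' of a real x: a function mapping n to some rational q with
    # |q - x| <= 2^-n (one convenient encoding; the project's setting is
    # more general).
    Name = Callable[[int], Fraction]

    def soft_less(x: Name, y: Name, k: int) -> bool:
        """Multivalued comparison with tolerance 2^-k.

        A True answer guarantees x < y + 2^-k, a False answer guarantees
        y < x + 2^-k.  If |x - y| >= 2^-k the answer coincides with the
        exact comparison x < y; otherwise both answers are legal.
        """
        qx, qy = x(k + 2), y(k + 2)   # approximations good to 2^-(k+2)
        return qx < qy

    # Two different names of the same real number 1/2 may yield different,
    # equally legal answers: the output depends on the name, not only on
    # the real number it denotes (intensionality).
    below = lambda n: Fraction(1, 2) - Fraction(1, 2 ** (n + 1))
    above = lambda n: Fraction(1, 2) + Fraction(1, 2 ** (n + 1))
    print(soft_less(below, above, 10), soft_less(above, below, 10))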
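For reference, two notions invoked above admit concise standard formulations (recalled here in LaTeX as commonly stated in the literature; the project extends both to multivalued maps): the second-order polynomials of Kawamura and Cook that bound uniform running times, and the modulus of continuity from which lower bounds are derived.

    % Second-order polynomials: terms in a number variable n and a function
    % variable L, closed under sum, product, and application of L.
    \[
      P \;::=\; c \;\mid\; n \;\mid\; P + P \;\mid\; P \cdot P \;\mid\; L(P),
      \qquad\text{e.g.}\quad P(L,n) = L\bigl(L(n)+n\bigr) + L(n)\cdot n + 3 .
    \]
    % A machine runs in second-order polynomial time if, on every name
    % \varphi with size bound |\varphi| and every output precision n, its
    % running time is at most P(|\varphi|, n).
    %
    % Modulus of continuity: \mu : \mathbb{N} \to \mathbb{N} is a modulus of
    % continuity of f if
    \[
      |x - y| \le 2^{-\mu(n)} \;\Longrightarrow\; |f(x) - f(y)| \le 2^{-n};
    \]
    % quantitative bounds on \mu translate into lower complexity bounds.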
DFG Programme Research Grants
International Connection Japan
Former Applicant Professor Dr. Martin Ziegler, until 8/2015
 
 
