
Dagstuhl Seminar on Measuring the Complexity of Computational Content 2018

This double issue of the journal Computability is a special issue dedicated to the Dagstuhl Seminar on Measuring the Complexity of Computational Content: From Combinatorial Problems to Analysis, which took place in September 2018 at the Leibniz Center for Informatics at Dagstuhl Castle, Germany [1]. The seminar focused on topics related to Weihrauch complexity and reverse mathematics, with applications ranging from combinatorial problems to analysis.

Reducibilities such as many-one, Turing or polynomial-time reducibility have been an extraordinarily important tool in theoretical computer science from its very beginning. In recent years these reducibilities have been transferred to the continuous setting, where they allow us to classify computational problems on real numbers and other continuous data types.

In the late 1980s, Weihrauch introduced a reducibility that can be seen as an analogue of many-one reducibility for (multi-valued) functions on infinite data types. This reducibility, now called Weihrauch reducibility, has been studied since the 1990s by Weihrauch's school of computable analysis and flourished more recently when Gherardi and Marcone proposed it as a tool for a uniform approach to reverse mathematics.
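
For readers less familiar with the notion, the following display sketches Weihrauch reducibility in the form commonly used in the literature; the realizer notation (G ⊢ g meaning "G is a realizer of g") and the computable pairing ⟨·,·⟩ on Baire space are the usual conventions of that literature and are assumed here only for illustration, not taken from this issue.

% Sketch of the standard definition for multi-valued functions f, g on
% represented spaces: f is Weihrauch reducible to g if pre- and
% post-processing by fixed computable maps K and H turns every realizer
% of g into a realizer of f.
\[
  f \leq_{\mathrm{W}} g
  \;:\Longleftrightarrow\;
  \exists\, \text{computable } H, K :\subseteq \mathbb{N}^{\mathbb{N}} \to \mathbb{N}^{\mathbb{N}}
  \;\;\forall G \vdash g :\;
  \bigl(p \mapsto H\langle p, G(K(p))\rangle\bigr) \vdash f.
\]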

Reverse mathematics aims to classify theorems according to the axioms that are needed to prove them in second-order arithmetic. This proof-theoretic approach yields non-uniform classifications of the computational content of certain theorems. However, many of these classifications also have uniform content, and Weihrauch complexity allows us to study this uniform computational content directly, using methods of computability theory.
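
As a concrete illustration (this example is a standard one from the literature and is not taken from the editorial itself): the proof-theoretic strength of weak König's lemma is captured in reverse mathematics by the subsystem WKL_0, while on the Weihrauch side the lemma, read as a computational problem, is equivalent to closed choice on Cantor space:

% Weak König's lemma as a computational problem is Weihrauch equivalent
% to choice on Cantor space, refining the non-uniform reverse-mathematics
% classification to a uniform equivalence of problems.
\[
  \mathrm{WKL} \equiv_{\mathrm{W}} \mathrm{C}_{2^{\mathbb{N}}}.
\]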

This perspective has motivated Dorais, Dzhafarov, Hirst, Mileti and Shafer, on the one hand, and Hirschfeldt and Jockusch, on the other, to study combinatorial problems using this approach. This research has led to a number of further reducibilities (computable reducibility, generalized Weihrauch reducibility and others) that can be seen as non-uniform or less resource-sensitive versions of Weihrauch reducibility. With this toolbox of reducibilities one can now adjust the instruments precisely to the degree of uniformity and resource sensitivity that one wants to capture.¹

Altogether, the seminar proceeded in a highly productive atmosphere, thanks to the many excellent contributions of its participants. This special issue contains eight carefully selected articles of high quality that have been reviewed following the usual standards of our discipline. These articles represent a cross-section of the topics presented and discussed during the seminar. We would like to take this opportunity to thank the authors for their excellent contributions and the additional reviewers for their thorough work!

Notes

1 An up-to-date bibliography on Weihrauch complexity can be found at http://cca-net.de/publications/weibib.php.

References

[1] V. Brattka, D.D. Dzhafarov, A. Marcone and A. Pauly (eds), Measuring the complexity of computational content: From combinatorial problems to analysis (Dagstuhl Seminar 18361), Technical Report 9, Dagstuhl, Germany, 2019. ISSN 2192-5283. http://drops.dagstuhl.de/opus/volltexte/2019/10327. doi:10.4230/DagRep.8.9.1.