A Trip Down Memory Lane: Desc comp, Constant Round Sorting, Division Breakthrough, Derandomization.
Computational Complexity 2024-10-14
I came across (by accident) the link to all of the BEATCS complexity columns from 1987 to 2016. See HERE. (If you know a link to a more recent webpage then email me or make a comment. There is a link to all of the issues of BEATCS here; however, I want a page with just the complexity theory columns.)
This gave me a trip down memory lane and a series of blog posts: briefly describe some articles and also get commentary from the original authors on what they think about the area now.
I did the first in this series, on two articles by Eric Allender, here. Today is the second: articles by Neil Immerman, Gasarch-Golub-Kruskal, Eric Allender (again!), and Valentine Kabanets.
67) Progress in Descriptive Complexity by Neil Immerman, 1999. We usually think of P, NP, and other classes in terms of TIME or SPACE. Descriptive complexity is about defining complexity classes in terms of how they can be described in a logical language. (For reviews of three books on this topic, including Neil Immerman's, see here.) I asked Neil Immerman to comment on the state of Descriptive Complexity now and he said:
- Graph Neural Nets: see Martin Grohe, "The Descriptive Complexity of Graph Neural Networks'', (see here); and
- Graph Isomorphism: see Martin Grohe, Descriptive Complexity, Canonisation, and Definable Graph Structure Theory, Cambridge University Press, 2017, and Anuj Dawar, "The Nature and Power of Fixed-Point Logic with Counting", ACM SIGLOG News, Jan. 2015, Vol. 2, No. 1.
72) A Survey of Constant Round Sorting by Bill Gasarch, Evan Golub and Clyde Kruskal in 2000. The model of computation is that in each round one could do lots of comparisons. Transitive closure was considered free, so the model was not practical and didn't claim to be; however, lower bounds on this model implied lower bounds on more realistic models. I asked Bill Gasarch to comment on the paper from the perspective of 2023 and he said:
I don't think the paper is outdated in terms of later results, though the model of parallelism used is not studied anymore. A survey on sorting on more realistic models would be a good idea and may actually exist.
74) The Division Breakthrough by Eric Allender, 2001. How complicated are addition, subtraction, multiplication, and division? The addition and subtraction algorithms that you learned in grade school are log space algorithms (your teacher should have pointed that out to you). Multiplication is also in log space---that's an easy exercise. But what about division? When Eric Allender was doing division in grade school he wondered: can this be done in log space? This was open for quite some time, but it was solved---Division is in Log Space---by Chiu, Davida, and Litow in 2001 (actually done by Chiu in his MS thesis in 1995 but not published until 2001. Fun fact: Chiu went to law school). Log space is an upper bound; what about a lower bound? Hesse showed that Division is complete for DLOGTIME-uniform \(TC^0\). What was needed was a complete write-up of all of this in a uniform notation. Eric Allender has provided that! I asked Eric if there are more open problems in the area of complexity-of-arithmetic. He responded:
What complexity class best captures the complexity of GCD (computing the Greatest Common Divisor)? GCD is hard for \(TC^0\) (proved in a paper I wrote with Mike Saks and Igor Shparlinski) but we have no upper bound better than P. Similarly: What is the complexity of modular exponentiation (given x, y, and m, compute \(x^y \bmod m\))? Again, we have no upper bound better than P. Modular exponentiation is an important subroutine in the polynomial-time algorithms for checking primality, and the lack of a better upper bound for this problem is one of the main reasons that we have no better upper bound for primality-checking than P. ...and while we're talking about the primes, let me mention that my SIGACT News article (see here) from early 2023 explains that I'm offering a $1,000 reward to anyone who can provide a modest improvement on the \(TC^0\)-hardness of primality-checking. We currently only know a non-uniform \(AC^0\) reduction from MAJORITY to primality-checking. I'm asking for a uniform \(AC^0\) reduction.
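To see why grade-school addition is a log space algorithm, here is a minimal sketch in Python (the function name and digit-string interface are my own illustration, not from the column): beyond read-only access to the inputs, the only working state is a carry bit and two position counters, which take \(O(\log n)\) bits for inputs of length n.

```python
def add_digit_strings(a: str, b: str) -> str:
    """Grade-school addition of two decimal strings.

    The working state is just a carry bit and two position indices,
    i.e. O(log n) bits of workspace -- the essence of the log-space
    argument (the output is written, never re-read).
    """
    i, j = len(a) - 1, len(b) - 1  # position counters: O(log n) bits each
    carry = 0                      # a single bit (well, 0 or 1)
    out = []                       # write-only output
    while i >= 0 or j >= 0 or carry:
        d = carry
        if i >= 0:
            d += int(a[i])
            i -= 1
        if j >= 0:
            d += int(b[j])
            j -= 1
        out.append(str(d % 10))    # emit the next digit
        carry = d // 10
    return "".join(reversed(out))
```

Python lists and integers hide the bookkeeping, of course; the point is only that nothing other than the carry and the indices needs to be remembered.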
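The modular exponentiation Eric mentions is computed in polynomial time by repeated squaring; the open question is whether one can do better than P. A standard square-and-multiply sketch (my illustration, not code from the article):

```python
def mod_exp(x: int, y: int, m: int) -> int:
    """Compute x^y mod m with O(log y) modular multiplications."""
    result = 1
    x %= m
    while y > 0:
        if y & 1:                  # current low bit of the exponent is set
            result = result * x % m
        x = x * x % m              # square the base
        y >>= 1                    # move to the next bit of the exponent
    return result
```

This runs in time polynomial in the bit-lengths of x, y, and m, which is why primality tests that use it as a subroutine sit in P; no circuit class below P is known to capture it.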
76) Derandomization: A Brief Survey by Valentine Kabanets, 2002. A very nice survey. Has there been progress since then? I asked Valentine and he said: