Sketch of a Proof That \(P \neq NP\)
Here is a sketch of how one might prove that \(P\) is not equal to \(NP\). Our ideas arose from, and were developed directly through, discussions with C.A. Feinstein. Ultimately the question is whether random search over an unstructured set is reducible. We argue that it is not: there are no shortcuts to searching a large structureless set. This is formalized by the conception of computation as deciding the supports of measures, with algorithms as compositions of measures.
Measures and Algorithms:
(Deciding supports is the universal decision problem) Every decision problem is equivalent to the problem of deciding the support of a measure \(\mu\), or, almost equivalently, of deciding the image of a function \(F\). For example, in the subset sum problem with input set \(X\) and target \(t\), the relevant measure is the pushforward onto \(\mathbf{Z}\) of the counting measure on \(2^X\) under the sum-total map \(\epsilon: 2^X \to \mathbf{Z}\), \(\epsilon(S) = \sum_{x \in S} x\); the instance is a yes-instance exactly when \(t\) lies in the support of this pushforward, i.e. in the image of \(\epsilon\).
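As a toy illustration of this pushforward picture (our own, purely for concreteness, and deliberately brute force), the following Python sketch builds the pushforward of the counting measure on \(2^X\) under the sum-total map and decides an instance by testing membership in its support; the names pushforward_of_sum and subset_sum_decision are ours and carry no significance.

```python
from itertools import combinations
from collections import Counter

def pushforward_of_sum(X):
    """Pushforward onto Z of the counting measure on 2^X under the sum-total
    map epsilon(S) = sum(S): maps each achievable total to the number of
    subsets realizing it (the mass sitting above that integer)."""
    mu = Counter()
    for r in range(len(X) + 1):
        for S in combinations(X, r):
            mu[sum(S)] += 1
    return mu

def subset_sum_decision(X, t):
    """Decide the subset sum instance (X, t) by testing whether t lies in the
    support of the pushforward measure.  Brute force: exponential in |X|."""
    return pushforward_of_sum(X)[t] > 0

# Example: 19 = 7 + 12 is achievable, 23 is not.
print(subset_sum_decision([3, 7, 12, 29], 19))   # True
print(subset_sum_decision([3, 7, 12, 29], 23))   # False
```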
(Algorithms are compositions of measures, equivalently paths in the category of finite measure spaces) An algorithm for a decision problem is a decomposition of the measure \(\mu\), or of the function \(F\), into a composition of intermediate measures or function evaluations. We note that there is a natural composition of measures here, cf. the gluing arguments used to prove the triangle inequality in Wasserstein spaces.
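One way to read “an algorithm is a decomposition into intermediate measures” is the following sketch, again our own and only illustrative: the subset sum map is factored into stages, each stage pushing the current measure on partial sums forward under “include the next element or not”, so the whole computation is literally a composition of pushforwards applied to the point mass at \(0\). The helper names stage and compose_stages are ours.

```python
from collections import Counter

def stage(x):
    """One factor of the decomposition: a map sending a measure mu on partial
    sums to its pushforward under 'either add x or do not'."""
    def push(mu):
        nu = Counter()
        for s, mass in mu.items():
            nu[s] += mass        # exclude x
            nu[s + x] += mass    # include x
        return nu
    return push

def compose_stages(X):
    """Evaluate the composition F = F_n o ... o F_1 on the point mass at 0;
    the result is the same pushforward measure as in the previous sketch."""
    mu = Counter({0: 1})
    for x in X:
        mu = stage(x)(mu)
    return mu

mu = compose_stages([3, 7, 12, 29])
print(mu[19] > 0, mu[23] > 0)   # True False
```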
What is the Complexity of a Measure or Function or Algorithm?
The classic definition of computational complexity counts the number of time steps, RAM accesses, or function evaluations required in the worst case. Here we require a definition of complexity for measures. The definition must be absolutely simple, and the only thing we can say is that the complexity of a measure is the “size” of its support. In this sense, complexity is equivalent to “entropy” as defined by Gromov [insert ref].
- (Complexity and Entropy: the Same Quantity) The complexity of a measure or of a function is the “size” of its support. Complexity is subadditive (or is it superadditive?) with respect to composition. In particular, decomposing \(F\) into a very long composition \(F = \cdots \circ F_i \circ F_{i-1} \circ \cdots\) does not necessarily reduce the complexity.
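To make the (sub- or super-)additivity question concrete, here is a toy computation, again only an illustration using our staged decomposition of subset sum: it records the size of the support of each intermediate measure, showing that a long composition can keep the intermediate supports large when the elements are spread out, while highly structured elements keep them small. The name support_sizes is ours.

```python
from collections import Counter

def support_sizes(X):
    """Sizes of the supports of the intermediate measures mu_0, mu_1, ..., mu_n
    arising from the staged decomposition of the subset sum map."""
    mu = Counter({0: 1})
    sizes = [len(mu)]
    for x in X:
        nu = Counter()
        for s, mass in mu.items():
            nu[s] += mass
            nu[s + x] += mass
        mu = nu
        sizes.append(len(mu))
    return sizes

# Spread-out elements: the support doubles at every stage.
print(support_sizes([1, 2, 4, 8, 16]))   # [1, 2, 4, 8, 16, 32]
# Highly structured elements: the support grows only linearly.
print(support_sizes([1, 1, 1, 1, 1]))    # [1, 2, 3, 4, 5, 6]
```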
Every Algorithm Is a Composition of Random Searches
- (Algorithmic Factorization Theorem) Every algorithm factors as a composition of deterministic function evaluations and random searches over subsets. (This requires a more formal definition and proof.) This also formalizes the idea that Lebesgue (uniform) measure on a set amounts to random search over its support.
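A minimal way to write down the format asserted by the factorization claim, assuming our own toy conventions rather than any formal definition from the literature, is to represent an algorithm as a list of steps of exactly two kinds, deterministic maps and random searches over finite sets; the guess-and-check procedure for subset sum below is one instance of this format, not a proof of the theorem. The helpers deterministic, random_search, and run are ours.

```python
import random
from itertools import combinations

def deterministic(f):
    """A deterministic step: evaluate the function f on the current state."""
    return lambda state: f(state)

def random_search(candidates, is_solution, trials):
    """A random-search step: sample uniformly from candidates(state) until
    is_solution accepts, giving up (returning None) after `trials` samples."""
    def step(state):
        pool = list(candidates(state))
        for _ in range(trials):
            guess = random.choice(pool)
            if is_solution(state, guess):
                return guess
        return None
    return step

def run(steps, state):
    """In this format, an algorithm is just the composition of its steps."""
    for step in steps:
        state = step(state)
    return state

# Guess and check for subset sum (X, t): random search over 2^X for a witness,
# followed by a deterministic verification step.
X, t = [3, 7, 12, 29], 19
algorithm = [
    random_search(
        candidates=lambda st: (S for r in range(len(st[0]) + 1)
                               for S in combinations(st[0], r)),
        is_solution=lambda st, S: sum(S) == st[1],
        trials=1000),
    deterministic(lambda witness: witness is not None),
]
print(run(algorithm, (X, t)))   # True with high probability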
Random Search Is Computationally Irreducible
- (Invariance of Domain and Irreducibility of Measure Complexity) The Brouwer-Lebesgue invariance of domain theorem applies here: for \(k < n\) there is no continuous injective map from \(\mathbf{R}^n\) into \(\mathbf{R}^k\), and in particular no such map that is also measure preserving. The intended reading is that the complexity of a measure, i.e. the size of its support, cannot be collapsed into a smaller space by any continuous reparametrization, so random search over a large support cannot be shortcut.
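For reference, the precise statement being invoked is the standard one; nothing beyond the textbook theorem is claimed here. If \(U \subseteq \mathbf{R}^n\) is open and \(f: U \to \mathbf{R}^n\) is continuous and injective, then \(f(U)\) is open and \(f\) is a homeomorphism onto its image. The corollary used above (invariance of dimension) is
\[
k < n \implies \text{there is no continuous injective map } \mathbf{R}^n \to \mathbf{R}^k,
\]
and a fortiori none that is additionally measure preserving.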
The five claims above (deciding supports is the universal decision problem; algorithms are compositions of measures; complexity equals the size of the support; the algorithmic factorization theorem; and the irreducibility of random search), when rigorously established, would essentially prove that \(P \neq NP\).