## Posts Tagged ‘P Versus NP’

## Gödel admits P=NP

**Why Undecidability is Too Important to be Left to Gödel**

How difficult is it to answer a question like “Where will the Dow Jones Industrial Average sit at the end of 1994?”, or “What are the odds of the Dallas Cowboys repeating as the NFL champions?”, or even “On what day will the Santa Fe Institute really move to its new campus off Hyde Park Road?”

While none of these questions is particularly “scientific,” at least in the usual sense of that term, they all have far more immediacy and importance for the daily lives of most of us than do far more scientifically respectable queries such as “What is the minimal cost of touring the 50 state capitals?” or “What is the complexity of Champernowne’s number?” The point here is that the traditional measures of computational and algorithmic complexity used in mathematics and computer science don’t even begin to characterize the difficulties inherent in trying to come up with believable, let alone correct, answers to these sorts of questions. So there most certainly are limits to our knowledge – scientific or otherwise.
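The touring-the-capitals question is the classic Travelling Salesman Problem, and the “traditional measure” of its difficulty is easy to make concrete: an exact brute-force search must examine every ordering of the cities, a number that grows factorially. A minimal sketch (the four-city distance table is invented purely for illustration):

```python
# Brute-force Travelling Salesman: the classical "hard" problem the
# essay contrasts with everyday questions. With n cities there are
# (n-1)! orderings from a fixed start, so exact search explodes factorially.
from itertools import permutations

def shortest_tour(cities, dist):
    """Return (cost, tour) of the cheapest round trip visiting every city once."""
    start, *rest = cities
    best_cost, best_tour = float("inf"), None
    for perm in permutations(rest):
        tour = (start, *perm, start)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

# Tiny invented symmetric distance table, purely illustrative.
dist = {
    "A": {"A": 0, "B": 2, "C": 9, "D": 10},
    "B": {"A": 2, "B": 0, "C": 6, "D": 4},
    "C": {"A": 9, "B": 6, "C": 0, "D": 3},
    "D": {"A": 10, "B": 4, "C": 3, "D": 0},
}
cost, tour = shortest_tour(list(dist), dist)  # cost 18, tour A-B-D-C-A
```

Four cities need only 6 orderings; 50 state capitals would need 49! of them, which is why the question is “scientifically respectable” as a measure of computational hardness even though it says nothing about the market or the Cowboys.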

But for the most part these limits are not the kind envisioned by Gödel, Turing, and their many able successors. Rather they are limits of a far subtler and more insidious kind: limits due to the ill-posed nature of most questions, limits stemming from the vagueness in natural human languages, limits coming from our inability to make error-free measurements, limits arising out of weaknesses in our standard modes of reasoning and, in general, limits generated by the inherently complicated mutual interactions between the many components of most natural and human systems.

So, in my view, the most productive way to go about understanding whether any of these kinds of limits to our knowledge are permanent or merely temporary barriers to our understanding of the world we live in is to think about developing new ways of looking at complexity in formal terms, ways that more faithfully mirror our informal ideas about what is and isn’t “complex.” These ways must necessarily include the crucially important fact that real-world complexity (whatever that might be) is not a context-free property of anything. Like truth, beauty, justice and “the good,” complexity resides as much in the eye of the beholder as it does in the beholden.

Whatever such a real-world measure of complexity turns out to involve, it almost certainly will not be something as simple and naive as a single number characterizing the complexity of a given problem, question or situation. Rather, I envision it being some kind of multi-criterion, interactive type of measure, saying in effect that some kinds of problems are very complex indeed if you attack them by, say, deductive modes of reasoning, but evaporate away like a trickle of water in the desert if you use other modes of reasoning like induction, abduction or (fill in your favorite “-ism” here). Sad to say, I have no such “grand unified theory of complexity” to offer up for your entertainment at the moment. But all my intuitions say that such a theory is not beyond hope.

Furthermore, I feel strongly that only when a skeletal framework of this kind of theory emerges will we finally be on the way to banishing the ghost of Gödel from the scientific banquet table.

## Great Lazarus! Grigori Perelman resurrected!

Great Lazarus! Grigori Perelman resurrected! (In a strictly Quantum Naturalization [P=NP] novel-proof sense.)

It has been three years since the famed New Yorker article told the tale of an unsung, principled academic who called a spade a spade, and what a commemoration this will be:

PROOF: [P = NP]

Date #1 on title page: February 1, 2008

Date #2 on title page: 11 November 2002

As a tribute to my friend I have never met: “The entropy formula for the Ricci flow and its geometric applications” by Grisha Perelman

The production of this paper, in “e-paper” terms, six years later and inside the melting pot of mathematical alchemy that is the government-funded arXiv research archive, is simply the product of the repetition of the digits “211” in frame.

The engine and cryptography that drive NASA and the FDIC screech to a halt with three digits that rhyme with a Bannette’s song or a piece of improv.

[P=NP] system scripting is a technique to take problems such as deductive reasoning over massive amounts of written and numerical data and strip away the parts that do not pertain to the subject matter, like a colander.

Yes, I will be collecting the Clay Mathematics Institute Millennium Prize, but I will donate half to charity, and the other half I will invest in forming an innovative business which gives back to the community. The idea is an endeavor which does something to shift the concentration of wealth away from a minute minority. Currently the top 1% of the world population holds 40% of the global assets, while the lower 50% has barely 2% of the total wealth. I should like to contribute to fostering a corporate and social economy where everyone who is sick can go to a doctor and everyone who wants to learn can go to school, and where everyone who works a full-time job, in this country and around the world, does not have to weigh feeding their family against paying their mortgage. (Unforgotten are the homeless and those without clean water.)

Truly, whether this claim of computational complexity is knowable or unknowable, it is the last leg of all economic and capital support, which hides itself in the name of prosperity while those who work for a living struggle to make ends meet. For a cure to cancer and AIDS. For a solution to so many things.

M.M.Musatov