NP complexity class
- NP-hard
- This problem is at least as hard as every problem in NP.
- In NP
- This problem can be solved in polynomial time on a non-deterministic Turing machine.
- NP-complete
- Both NP-hard and in NP.
It is widely suspected that no NP-hard problem can be solved in polynomial time, but this has not been proven. The P != NP question is one of the greatest open problems in computer science and mathematics and is among the seven million-dollar Clay Mathematics Institute Millennium Prize Problems.
Technicalities
Variants
- Strongly NP-hard
- the problem remains NP-hard even if all numbers in the problem are expressed in unary.
- Weakly NP-hard
- the problem is NP-hard in general but becomes polynomial-time solvable if all numbers in the problem are expressed in unary; equivalently, it admits a pseudo-polynomial-time algorithm.
Optimization problems
The definition of NP given above is for yes/no decision problems only. However, many natural problems are optimization problems, for example the traveling salesman problem, where the question is not whether or not a tour exists, but the cost of the cheapest one. Optimization problems are brought into this framework by means of a decision variant. For example, the decision variant of the traveling salesman problem asks whether or not there exists a tour with cost less than a given value. An optimization problem is classified as NP-hard, in NP, or NP-complete based on the classification of its decision variant.
Example NP-complete problems
Two NP-complete problems are particularly important because many other problems can be easily reduced to them: 3-SAT and Integer Programming.
3-SAT
Satisfiability is determining whether or not a given Boolean formula in conjunctive normal form admits an assignment of values to the variables which satisfies all the clauses. The special case of 3-SAT is the restriction of satisfiability to at most 3 literals per clause; one can easily convert a general satisfiability problem into a 3-SAT problem by adding extra variables and splitting clauses.
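The clause-splitting step is mechanical enough to sketch in code. The following is a minimal Python sketch under the assumption that clauses are lists of signed integers (a positive integer denotes a variable, a negative one its negation); the representation and function name are choices made for this example only.

    # Split one CNF clause into an equisatisfiable set of clauses with at most
    # 3 literals each, introducing fresh variables to chain the pieces together.
    # Literals are signed ints: +i means variable i, -i means its negation.
    def split_clause(clause, next_var):
        """next_var must be larger than every variable already in use."""
        if len(clause) <= 3:
            return [clause], next_var
        result = []
        y = next_var                      # fresh variable linking the first piece
        next_var += 1
        result.append([clause[0], clause[1], y])
        for lit in clause[2:-2]:          # middle pieces chain fresh variables
            result.append([-y, lit, next_var])
            y = next_var
            next_var += 1
        result.append([-y, clause[-2], clause[-1]])
        return result, next_var

    # Example: (x1 v x2 v x3 v x4 v x5) becomes three 3-literal clauses.
    print(split_clause([1, 2, 3, 4, 5], next_var=6)[0])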
Integer Programming
Integer programming asks for an assignment of integer values to variables that satisfies a given set of linear constraints and optimizes a linear objective. If one removes the constraint that the variables be integer-valued and allows them to be real-valued, the result is the linear relaxation of the problem. Linear relaxations are crucial both for branch and bound and for approximation algorithms (discussed below).
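As a small worked example (the numbers are chosen purely for illustration), consider the integer program

$$\max\ 5x_1 + 4x_2 \quad \text{subject to} \quad 2x_1 + 3x_2 \le 12, \quad 2x_1 + x_2 \le 6, \quad x_1, x_2 \in \{0, 1, 2, \ldots\}.$$

Dropping the integrality requirement and asking only $x_1, x_2 \ge 0$ gives the linear relaxation, whose optimum $19.5$ (at $x_1 = 1.5$, $x_2 = 3$) is an upper bound on the integer optimum $18$ (at $x_1 = 2$, $x_2 = 2$); this bounding property is exactly what branch and bound exploits.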
MAX CUT
MAX CUT asks for a partition of the vertices of a graph into two sets that maximizes the number of edges crossing between them. Many important techniques, such as semi-definite programming for approximation algorithms, were first used to solve MAX CUT and later extended to other problems.
Traveling salesman problem
This problem is particularly famous because it is easy to explain: given a list of cities and the distances between them, find the cheapest tour that visits every city exactly once and returns to the starting city.
Knapsack
The knapsack problem is only weakly NP-hard: it can be solved exactly by dynamic programming in time polynomial in the number of items and the numeric value of the knapsack capacity, i.e. in pseudo-polynomial time.
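A minimal sketch of that dynamic program in Python (the function name and the small example instance are illustrative only):

    # Pseudo-polynomial dynamic program for 0/1 knapsack: O(n * capacity) time,
    # which is polynomial in the magnitude of `capacity` but exponential in the
    # number of bits needed to write it down -- hence "weakly" NP-hard.
    def knapsack(values, weights, capacity):
        best = [0] * (capacity + 1)       # best[c] = max value with capacity c
        for v, w in zip(values, weights):
            # Iterate capacities downward so each item is used at most once.
            for c in range(capacity, w - 1, -1):
                best[c] = max(best[c], best[c - w] + v)
        return best[capacity]

    # Example: the optimum is 9, taking the items with values 5 and 4.
    print(knapsack([5, 4, 3], [4, 3, 2], 7))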
Example problems whose hardness is unknown
Factoring
The problem of determining the factors of a composite integer is famously difficult. Factoring is in NP, but it is not known whether or not it is NP-hard. Factoring $n$ as such is not a decision problem, but one can instead ask whether $n$ has a nontrivial factor smaller than a given bound $k$. In fact, the related decision problem of determining whether an integer is prime or composite has been shown to be in P.
Graph Isomorphism
Determining whether or not two given graphs are isomorphic is another important problem which is not known to be either polynomial-time solvable or NP-complete.
Nash equilibrium
Many problems related to Nash equilibria are not known to be polynomial-time solvable but are also not known to be NP-complete. However, finding a Nash equilibrium is PPAD-complete.
Solving NP-complete problems
NP-complete problems are extremely important in applications, so one cannot simply give up when a problem is shown to be NP-hard. Fortunately, there are at least three ways to get around NP-hardness. NP-hardness is an asymptotic statement about worst-case instances, so it is often possible to use branch and bound to solve the instances one actually cares about in a reasonable amount of time. Local search techniques cannot generally be proven to work well, but they often work excellently in practice. For optimization problems, one can use approximation algorithms, which run in polynomial time and return solutions provably close to optimal.
Branch and Bound
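Branch and bound explores partial solutions as a search tree, remembering the best complete solution found so far and pruning any subtree whose relaxation bound shows it cannot do better. The Python sketch below applies the idea to 0/1 knapsack, using the fractional (linear) relaxation as the upper bound; it is one possible realization chosen for illustration, not the only way to implement the technique.

    # Branch and bound for 0/1 knapsack, pruning with the fractional relaxation.
    def fractional_bound(items, capacity):
        """Upper bound: greedy fill by value density, last item taken fractionally."""
        total = 0.0
        for value, weight in items:
            if weight <= capacity:
                total += value
                capacity -= weight
            else:
                total += value * capacity / weight
                break
        return total

    def branch_and_bound(values, weights, capacity):
        # Pre-sort by value density so the greedy relaxation bound is valid.
        items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
        best = 0

        def recurse(i, value, remaining):
            nonlocal best
            best = max(best, value)
            if i == len(items):
                return
            # Prune: even the relaxed optimum of this subtree cannot beat `best`.
            if value + fractional_bound(items[i:], remaining) <= best:
                return
            v, w = items[i]
            if w <= remaining:
                recurse(i + 1, value + v, remaining - w)   # branch: take item i
            recurse(i + 1, value, remaining)               # branch: skip item i

        recurse(0, 0, capacity)
        return best

    print(branch_and_bound([5, 4, 3], [4, 3, 2], 7))  # prints 9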
Local Search and other heuristic approaches
In computer science, a heuristic is a technique which works well in practice but comes with no theoretical guarantees of performance.
Local search techniques start with a solution and try to improve it. Local search techniques include:
- Simulated annealing
- inspired by nature's ability to find low-energy configurations in statistical physics, accept moves that reduce the quality of the solution with probability $e^{-\Delta/T}$, where $\Delta$ is the decrease in quality and $T$ is a temperature parameter that is gradually lowered (a code sketch follows this list). Some simulated annealing algorithms can be theoretically shown to converge to the right answer if one waits infinitely long, but these theoretical guarantees are worse than exhaustive search and therefore of minimal import.
- Tabu Search
- to prevent the search from repeating past mistakes, keep an explicit list of recent changes and forbid undoing them for some number of steps.
- Genetic Algorithms
- Inspired by biological evolution. Rather than keeping a single current solution, keep a population of solutions and simulate mutation and sexual reproduction.
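As a concrete illustration of one of these techniques, here is a small Python sketch of simulated annealing for MAX-CUT; the move set (flip one vertex), the linear cooling schedule, and the parameter values are arbitrary choices made for this example.

    import math
    import random

    def cut_value(edges, side):
        return sum(1 for u, v in edges if side[u] != side[v])

    def anneal_max_cut(n, edges, steps=10000, start_temp=2.0):
        """Maximize the cut; worsening moves accepted with prob. exp(delta/temp)."""
        side = [random.randint(0, 1) for _ in range(n)]
        value = cut_value(edges, side)
        best_side, best_value = side[:], value
        for step in range(steps):
            temp = start_temp * (1 - step / steps) + 1e-9   # simple linear cooling
            v = random.randrange(n)
            side[v] ^= 1                        # propose: flip one vertex
            new_value = cut_value(edges, side)
            delta = new_value - value           # negative means a worse cut
            if delta >= 0 or random.random() < math.exp(delta / temp):
                value = new_value               # accept the move
                if value > best_value:
                    best_side, best_value = side[:], value
            else:
                side[v] ^= 1                    # reject: undo the flip
        return best_side, best_value

    # Example: on a 5-cycle the best cut has 4 edges.
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
    print(anneal_max_cut(5, edges)[1])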
Approximation Algorithms
Determining the exact optimum value of an NP-complete optimization problem in polynomial time is not possible unless P=NP, but many such problems can be solved approximately in polynomial time. For example, the MAX-CUT problem (see above) can be approximated to within a factor of two, in expectation, by simply choosing a random cut! To see that this is a 2-approximation, note first that the optimum cut can cut at most all the edges, and second that every edge is cut with probability 1/2, so on average half of the edges are cut.
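The random-cut algorithm is short enough to write out; in this Python sketch the graph is assumed to be given as an edge list over vertices 0..n-1.

    import random

    def random_cut(n, edges):
        """Assign each vertex to a side uniformly at random and count cut edges."""
        side = [random.randint(0, 1) for _ in range(n)]
        cut = sum(1 for u, v in edges if side[u] != side[v])
        return side, cut

    # Example: on a 5-cycle the optimum cut is 4; the expected random cut is 2.5.
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
    print(random_cut(5, edges)[1])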