Easy-ish Intractable Reductions

In the last article we covered the most elementary examples of how you can reduce one problem to another. In this article, we’re going to work through some more reductions in the context of computer science, particularly in relation to our extensive study of graphs. But first, onto the Daily Problem:

This article is part of the Data Structures and Algorithms Series following the course The Analysis of Algorithms. If you missed the previous article, check that out first, along with a copy of The Algorithm Design Manual.

Answers to the previous Daily Problem

We wish to compute the laziest way to dial a given n-digit number on a standard push-button telephone using two fingers. We assume that the two fingers start out on the * and # keys, and that the effort required to move a finger from one button to another is proportional to the Euclidean distance between them. Design and analyze an algorithm that computes in time O(n) the method of dialing that involves moving your fingers the smallest amount of total distance.

And you thought you could get rid of dynamic programming, didn’t you? So we’ve got 10 digit keys on the keypad plus the two special characters. In other words, you’re asking yourself “what is the cheapest way to dial this number with my two fingers?”

This breaks down into a collection of subproblems like “what is the cheapest way to dial a 10-digit number with my fingers on * and #?”, which contains the configuration where you slide your left finger up one unit and press 7, leaving “what is the cheapest way to dial the remaining 9 digits with my fingers on 7 and #?”. That in turn contains the configuration where you slide your left finger up one unit again and press 4, leaving “what is the cheapest way to dial the remaining 8 digits with my fingers on 4 and #?”. I think you see where this is going.

To drill this into your brains: the key here is that we notice this repetition and cache previous results to speed up our program. So once we’ve computed the cheapest way to finish dialing from the configuration “fingers on 7 and #”, we store it, and we never have to figure it out again for any other number that reaches that same configuration.

Because we’re saving time by spending a little space, we reduce this algorithm from something exponential to something linear. The algorithm, at a high level, looks like this:

  1. Set * and # as the default starting points for your two fingers. For a 1-digit or 2-digit number, the cost is simply the distance from a starting key to the dialed digit for one finger (or for both fingers, one digit each).
  2. For each remaining digit (in the range 0 to 9), store the minimum cost in a 3D matrix mtx[digit][fromLeft][fromRight], where the first axis digit is how many digits into the sequence you are, and fromLeft (or fromRight) is the key your left (or right) finger is resting on. Each cell holds the cheapest total distance that reaches that configuration.
  3. Return the smallest score in the matrix mtx after processing all n digits, having started from the * and # keys.

Since we’re filling one cell at a time, this is a single pass over our n digits through a matrix of roughly 100n cells: about 10 resting positions per finger gives about 100 finger configurations per digit. Because that 100 is a constant, O(100n) reduces to O(n), a linear-time algorithm.
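Here’s a minimal sketch of that idea in JavaScript. Everything in it is my own illustration rather than code from the course: it assumes a standard 3x4 keypad laid out on a unit grid, and it keeps one map of finger configurations per digit instead of a literal 3D array, which is the same bookkeeping:

// Keypad coordinates (column, row) on a standard 3x4 layout.
const POS = {
  '1': [0, 0], '2': [1, 0], '3': [2, 0],
  '4': [0, 1], '5': [1, 1], '6': [2, 1],
  '7': [0, 2], '8': [1, 2], '9': [2, 2],
  '*': [0, 3], '0': [1, 3], '#': [2, 3],
};

// Euclidean distance between two keys.
const dist = (a, b) =>
  Math.hypot(POS[a][0] - POS[b][0], POS[a][1] - POS[b][1]);

function minDialDistance(number) {
  // cost maps a "left,right" finger configuration to the cheapest
  // total distance that reaches it after the digits dialed so far.
  let cost = new Map([['*,#', 0]]);
  for (const digit of number) {
    const next = new Map();
    for (const [key, c] of cost) {
      const [left, right] = key.split(',');
      // Either finger may press the next digit; keep the cheaper route
      // into each resulting configuration.
      for (const [k, v] of [
        [`${digit},${right}`, c + dist(left, digit)],
        [`${left},${digit}`, c + dist(right, digit)],
      ]) {
        if (!next.has(k) || next.get(k) > v) next.set(k, v);
      }
    }
    cost = next;
  }
  return Math.min(...cost.values());
}

console.log(minDialDistance('5551234')); // cheapest total finger travel

Only a constant number of configurations survives each step (one finger always sits on the digit just dialed), so the whole run is O(n).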

Three examples of easier reductions

To reiterate, a reduction is simply a way of turning one problem into another: translating it into a different (and sometimes better-understood) problem. Here are a few more examples that extend on last time.

1. Hamiltonian Cycle & TSP

We all know the Traveling Salesman Problem: visiting every point exactly once and returning home while covering the smallest total distance. The Hamiltonian cycle problem is similar, with two major distinctions:

  1. The graph is unweighted (i.e. in TSP every edge carries a cost, but in HC we only care whether an edge exists)
  2. The goal is simply to decide whether a tour visiting every vertex exactly once exists at all. A TSP tour also visits every vertex exactly once, but its goal is to find the cheapest such tour

Due to their similarities, we can express Hamiltonian cycle as a reduction to the Traveling Salesman Problem:

  1. First, build a complete weighted copy of your Hamiltonian graph on the same n vertices
  2. For every vertex against every vertex (an n^2 loop), if that pair of vertices forms an edge in your original graph, set the weight of that edge to 1. Otherwise, set the weight to 2.
  3. Use this newly-weighted graph to decide whether there is a TSP tour of total cost n across all n of these vertices.

We know this works because a TSP tour of cost exactly n must use n edges that all have weight 1: a single weight-2 edge would push the total above n. Translating back to the Hamiltonian cycle graph, which has no weights, those weight-1 edges are exactly the original graph’s edges, so the tour is precisely the cycle we’re aiming to find.
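In code, the reduction is just a weight assignment. Here tspDecision is a hypothetical black box that answers whether a tour of cost at most the given bound exists; the rest of this sketch is my own illustration:

// Hamiltonian Cycle -> TSP: weight 1 for original edges, 2 for everything else.
function hamiltonianCycleViaTSP(n, edges, tspDecision) {
  const weight = Array.from({ length: n }, () => new Array(n).fill(2));
  for (const [i, j] of edges) {
    weight[i][j] = 1;
    weight[j][i] = 1;
  }
  // A tour of cost n exists exactly when the original graph
  // has a Hamiltonian cycle.
  return tspDecision(weight, n);
}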

2. Vertex Cover & Independent Set

Still looking at graphs, we see two more similar problems revolving around subsets of vertices.

A vertex cover is a subset of vertices that touches every edge in the graph. Conversely, an independent set is a subset of vertices in which no two members touch each other via an edge. In any graph, coloring in the vertices of a vertex cover gives the exact inverse of an independent set: a set S covers every edge precisely when the vertices outside S form an independent set.

So, to handle the reduction, thus proving that both problems are just as hard as each other, we’ve got a pretty straightforward algorithm:

  1. Make a copy of the vertex cover graph, unchanged
  2. Translate the target size: a vertex cover of k vertices corresponds to an independent set of the remaining n - k vertices
  3. Ask whether your copied graph has an independent set of size n - k; the answer is yes exactly when a vertex cover of size k exists

The important thing here is that we transform the inputs, not the solution. In fact, we don’t even need to know what the solution looks like. Simply by translating the inputs into the independent set problem, we carry the hardness of vertex cover over to independent set.
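In code, the whole transformation is barely more than arithmetic on the target size. This sketch assumes a hypothetical black-box solver independentSetDecision that reports whether an independent set of at least a given size exists:

// Vertex Cover -> Independent Set: same graph, complementary target size.
function vertexCoverViaIndependentSet(n, edges, k, independentSetDecision) {
  // A set S touches every edge exactly when V - S contains no edge,
  // so a vertex cover of size k exists iff an independent set
  // of size n - k does.
  return independentSetDecision({ n, edges }, n - k);
}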

Oh, and to make things more fun: we reduced Hamiltonian cycle to TSP above, and vertex cover can itself be reduced to Hamiltonian cycle, just as independent set reduces to vertex cover.

3. Clique & Independent Set

For our last example we look at cliques in graphs. Yes, like social cliques of friends: a clique in a graph is a subset of vertices in which every vertex has an edge to every other vertex in the subset.

An example of a five-vertex clique would be the pentagram formed within a pentagon. The pentagon’s outline connects each corner to its two adjacent vertices, and the pentagram’s diagonals add the edges to the remaining non-adjacent corners, so every pair of vertices ends up connected. And what do you know, finding an independent set is just one reduction away from finding a clique.

  1. Copy your independent set graph, except the edges in this copy are the complement of the edges in your original. In other words, if the edge (i,j) is not in your original graph, add that edge to your copy (and leave out every edge that was there).
  2. Ask whether this new graph contains a clique of the same size k as the independent set you wanted, as sketched below
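A sketch of that complement construction, with cliqueDecision again standing in as a hypothetical black-box solver:

// Independent Set -> Clique: complement the edge set, keep the size k.
function independentSetViaClique(n, edges, k, cliqueDecision) {
  const key = (i, j) => `${Math.min(i, j)},${Math.max(i, j)}`;
  const has = new Set(edges.map(([i, j]) => key(i, j)));
  const complement = [];
  for (let i = 0; i < n; i++) {
    for (let j = i + 1; j < n; j++) {
      // Add exactly the edges missing from the original graph.
      if (!has.has(key(i, j))) complement.push([i, j]);
    }
  }
  // S is independent in G exactly when S is a clique in G's complement.
  return cliqueDecision({ n, edges: complement }, k);
}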

As we can see here, the hardness of clique follows from the hardness of independent set, and thus from the hardness of vertex cover. Where does this chain of hardness reductions end? That’s where satisfiability comes in.

Satisfiability: the one reduction to rule them all

NP-completeness traces all the way back to satisfiability. Hard problems can be reduced into graph constraints like the three we saw above, and those problems all trace back, through chains of reductions like these, to satisfiability. A boolean formula is satisfied when at least one assignment of truth values makes every clause in it true. This is super easy to describe with JavaScript:

// Each clause is an OR of literals; the formula ANDs the clauses together.
const satisfied = (v1, v2) => {
  const clause1 = v1 || !v2;
  const clause2 = !v1 || v2;
  return clause1 && clause2; // true means this assignment satisfies the formula
};

console.log(satisfied(true, true));   // satisfied
console.log(satisfied(false, false)); // satisfied
console.log(satisfied(true, false));  // not satisfied

In the above example we can find assignments of v1 and v2 that satisfy the whole boolean formula. Certain formulas, no matter how hard you try, cannot be satisfied:

const unsatisfiable = (v1, v2) => {
  const clause1 = v1 || v2;
  const clause2 = v1 || !v2;
  const clause3 = !v1;
  // clause3 forces v1 to be false, but then clause1 demands v2
  // while clause2 demands !v2: a contradiction.
  return clause1 && clause2 && clause3; // never satisfied
};

console.log(unsatisfiable(true, true)); // false, like every other assignment

Go ahead and try the above code in your console. No matter how you arrange v1 and v2, you will never get unsatisfiable to return true.

One of the elementary reductions from satisfiability (the ancestor of vertex cover) is 3-Satisfiability, or 3-SAT. This is just a specialized form of the above where every clause must contain exactly 3 literals. And since we know satisfiability is hard, we know 3-SAT is hard, too. This extends to all numbers larger than 3, so a formula made up of four-literal clauses is called 4-SAT, a formula made up of five-literal clauses is called 5-SAT, and so on.
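For instance, here’s a tiny (made-up) 3-SAT instance in the same style as the snippets above, where every clause has exactly three literals:

const threeSat = (a, b, c) =>
  (a || b || !c) && // clause 1
  (!a || b || c) && // clause 2
  (a || !b || c);   // clause 3

console.log(threeSat(true, true, true)); // true: this instance is satisfiable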

Why does this only go up and not down in numbers?

With one literal per clause in 1-SAT, you simply set each clause’s literal to true and check that no variable is forced both ways, which is a trivial problem. For 2-SAT, you can solve it in linear time with depth-first search over the implication graph, a polynomial-time algorithm and thus not in the realm of the hard problems we care about here.
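For the curious, here’s a compact sketch of that linear-time 2-SAT approach. It builds the implication graph and runs DFS twice to label strongly connected components (Kosaraju’s algorithm, which is itself built on DFS); the literal encoding and the twoSat helper are my own illustration:

// Literal encoding: variable x is node 2*x, its negation is node 2*x + 1.
function twoSat(numVars, clauses) {
  const N = 2 * numVars;
  const adj = Array.from({ length: N }, () => []);
  const radj = Array.from({ length: N }, () => []);
  const neg = (lit) => lit ^ 1;
  for (const [a, b] of clauses) {
    // (a OR b) implies: if a is false then b, and if b is false then a.
    adj[neg(a)].push(b); radj[b].push(neg(a));
    adj[neg(b)].push(a); radj[a].push(neg(b));
  }
  // Pass 1: DFS on the implication graph, recording finish order.
  const order = [];
  const seen = new Array(N).fill(false);
  const dfs1 = (u) => {
    seen[u] = true;
    for (const v of adj[u]) if (!seen[v]) dfs1(v);
    order.push(u);
  };
  for (let u = 0; u < N; u++) if (!seen[u]) dfs1(u);
  // Pass 2: DFS on the reversed graph in reverse finish order
  // labels the strongly connected components.
  const comp = new Array(N).fill(-1);
  const dfs2 = (u, c) => {
    comp[u] = c;
    for (const v of radj[u]) if (comp[v] === -1) dfs2(v, c);
  };
  let c = 0;
  for (let i = N - 1; i >= 0; i--) {
    if (comp[order[i]] === -1) dfs2(order[i], c++);
  }
  // Unsatisfiable exactly when some variable and its negation
  // land in the same component.
  for (let x = 0; x < numVars; x++) {
    if (comp[2 * x] === comp[2 * x + 1]) return false;
  }
  return true;
}

// (x0 OR x1) AND (!x0 OR x1): satisfiable by setting x1 = true.
console.log(twoSat(2, [[0, 2], [1, 2]]));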

From 3-SAT (and its larger cousins) derive vertex cover and all of the other elementary reductions above, so we’ll stop there.

Onto the next Daily Problem

Suppose we are given a subroutine which can solve the traveling salesman decision problem in, say, linear time. Give an efficient algorithm to find the actual TSP tour by making a polynomial number of calls to this subroutine.

More problems to practice on

For easier reduction problems, check out these from the homework set:

  1. Problem 9-8.
  2. Problem 9-10.
  3. Problem 9-11.
  4. Problem 9-12.
