Down to Zero II

We’ll tackle this one through dynamic programming, building up solutions for every value in increasing order. Each value $n$ starts out seeded with $n$ itself (the cost of subtracting one all the way down to zero), and the smallest values are already correct as seeded: 2, for instance, must take exactly 2 moves.

min_steps = list(range(1000001))

for n in range(2, 1000001):
    # One option from n is to subtract one, continuing from n-1's answer.
    min_steps[n] = min(min_steps[n-1]+1, min_steps[n])

    # min_steps[n] is now final, so relax every multiple n*i with i <= n:
    # from n*i, one factoring move reaches max(n, i) = n.
    for i in range(2, min(n, 1000000//n) + 1):
        min_steps[n*i] = min(min_steps[n]+1, min_steps[n*i])

def downToZero(n):
    return min_steps[n]
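
A couple of quick sanity checks, with the expected answers worked out by hand:

print(downToZero(4))   # 3, via 4→2→1→0
print(downToZero(12))  # 4, via 12→4→2→1→0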

We might consider using recursion instead. Obviously, caching would be important, but that alone would not save us: one recursive path subtracts one at each step, leading to a recursion depth of $n$, and if that chain doesn’t hit a cached value it will easily exceed Python’s recursion depth limit.
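
For illustration, a memoized recursive version might look something like this (the function name is ours; this is a sketch of the idea rather than a recommended solution):

from functools import lru_cache

@lru_cache(maxsize=None)
def down_to_zero_recursive(n):
    if n <= 3:
        return n  # 0, 1, 2, 3 each need exactly n subtract-one moves
    # The subtract-one branch alone recurses n levels deep before caching helps.
    best = down_to_zero_recursive(n - 1) + 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            # n = d * (n // d), so one factoring move reaches max(d, n // d) = n // d
            best = min(best, down_to_zero_recursive(n // d) + 1)
        d += 1
    return best

Even with the cache, the very first chain of calls is n→n-1→n-2→…, so something like down_to_zero_recursive(100000) exceeds Python’s default recursion limit (1000) long before memoization can help.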

Will a greedy algorithm work?

It is tempting to try to solve this with a greedy algorithm, “greedy” in the sense that it moves to the smallest number that can be reached at each step.
This turns out to not always work—sometimes it’s better to take a smaller step first that sets you up for more efficient steps later.

The smallest example of this is $n=33$. The “greedy” solution would be 33→11→10→5→4→2→1→0 (7 steps), but it’s faster to step down by one first: 33→32→8→4→2→1→0 (6 steps).

The same trap appears when choosing among several factorizations. For $n=60$, it would be tempting to go 60→10→5→4→2→1→0 (6 steps), but it’s actually shorter to move to 12 on the first step: 60→12→4→2→1→0 (5 steps).
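
To make these comparisons easy to reproduce, here is a sketch of the greedy rule (greedy_steps is our own helper, not part of the solution above):

def greedy_steps(n):
    # Repeatedly jump to the smallest value reachable in one move:
    # either n-1, or n // d for a divisor d <= sqrt(n) (the most balanced split).
    steps = 0
    while n > 0:
        best = n - 1
        d = 2
        while d * d <= n:
            if n % d == 0:
                best = min(best, n // d)
            d += 1
        n = best
        steps += 1
    return steps

print(greedy_steps(33), downToZero(33))  # 7 greedy steps vs. 6 optimal
print(greedy_steps(60), downToZero(60))  # 6 greedy steps vs. 5 optimal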