State an algorithm that converts from base 10 to base 2 in essentially linear time. The input to the algorithm is a sequence of decimal digits a_0, a_1, a_2, ..., a_{k-1}, least significant digit first. The output is the binary representation of a_0 + 10 a_1 + 100 a_2 + ... + 10^{k-1} a_{k-1}.
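One standard approach is divide and conquer: split the digit sequence in half, convert each half recursively, and combine via value = low + 10^{k/2} * high. With an asymptotically fast integer multiplication (e.g. FFT-based), this recursion runs in essentially linear time in k. The following is a minimal Python sketch of that recursion, not a definitive solution; the function names are my own, and Python's built-in big integers use Karatsuba multiplication, so this sketch is subquadratic rather than truly essentially linear.

```python
def digits_to_int(a):
    """Value of the digit list a = [a_0, ..., a_{k-1}] (least significant
    first), i.e. a_0 + 10 a_1 + ... + 10^{k-1} a_{k-1}, computed by
    divide and conquer: value = low + 10^m * high with m = k // 2."""
    k = len(a)
    if k <= 8:
        # Base case: Horner's rule on a handful of digits.
        v = 0
        for d in reversed(a):
            v = 10 * v + d
        return v
    m = k // 2
    low = digits_to_int(a[:m])    # a_0 + ... + 10^(m-1) a_{m-1}
    high = digits_to_int(a[m:])   # a_m + ... + 10^(k-1-m) a_{k-1}
    return low + pow(10, m) * high

def decimal_to_binary(a):
    """Binary representation (as a string of bits) of the number whose
    decimal digits, least significant first, are given in a."""
    return bin(digits_to_int(a))[2:]
```

For example, the digits [3, 1] encode 3 + 10·1 = 13, whose binary representation is 1101. The combine step costs one multiplication of k/2-digit numbers, so the total work satisfies T(k) = 2 T(k/2) + M(k/2); with M(n) = O(n log n) multiplication this gives the "essentially linear" bound O(k log^2 k), up to log factors.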