Integer multiplication
The traditional grade-school algorithm multiplies two $n$-digit numbers in $O(n^2)$ time.
Let $x$ and $y$ be two $n$-bit binary numbers (the same idea works in any base), and assume for simplicity that $n$ is even.
Let $x = 2^{\frac{n}{2}} \cdot x_1 + x_0$ and $y = 2^{\frac{n}{2}} \cdot y_1 + y_0$.
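This split is just a shift and a mask in code. A minimal illustration (the specific value `0b10110110` is an arbitrary example, not from the text):

```python
x = 0b10110110                   # 182, an 8-bit number, so n = 8
half = 4                         # n // 2
x1 = x >> half                   # high half: 0b1011 = 11
x0 = x & ((1 << half) - 1)       # low half:  0b0110 = 6
assert x == (x1 << half) + x0    # x = 2^(n/2) * x1 + x0
```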
The product, $xy$, can then be written as
$$ \begin{aligned} xy &= (2^{\frac{n}{2}} \cdot x_1 + x_0)(2^{\frac{n}{2}} \cdot y_1 + y_0)\\ &= 2^n(x_1y_1) + 2^{\frac{n}{2}}(x_1y_0 + x_0y_1) + x_0y_0 \end{aligned} $$

The multiplication can now be done with four recursive calls on $\frac{n}{2}$-bit numbers, plus a constant number of additions and shifts of $n$-bit numbers. The running time therefore satisfies the recurrence $T(n) \leq 4T(n/2) + O(n)$. Solving this recurrence gives $T(n) = O(n^2)$, so this is no more efficient than standard multiplication.
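The four-call scheme can be sketched directly with Python's arbitrary-precision integers. This is a minimal sketch, not an optimized implementation; the function name `multiply4` and the base-case cutoff are my own choices:

```python
def multiply4(x, y):
    """Divide-and-conquer multiply of non-negative ints using 4 recursive calls.

    Runs in O(n^2) time, matching the recurrence T(n) <= 4T(n/2) + O(n).
    """
    if x < 16 or y < 16:             # base case: small enough to multiply directly
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    mask = (1 << half) - 1
    x1, x0 = x >> half, x & mask     # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & mask     # y = 2^half * y1 + y0
    # xy = 2^(2*half) * x1*y1 + 2^half * (x1*y0 + x0*y1) + x0*y0
    return ((multiply4(x1, y1) << (2 * half))
            + ((multiply4(x1, y0) + multiply4(x0, y1)) << half)
            + multiply4(x0, y0))
```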
Karatsuba's Algorithm
The equation can be manipulated further to give $$ xy = 2^n(x_1y_1) + 2^{\frac{n}{2}}((x_0+x_1)(y_0+y_1)-x_1y_1-x_0y_0) + x_0y_0 $$

Now only three recursive calls need to be made, to calculate $x_1y_1$, $x_0y_0$, and $(x_0+x_1)(y_0+y_1)$; the middle term is recovered by subtraction rather than two extra multiplications. This gives the recurrence $T(n) \leq 3T(n/2) + O(n)$, which solves to $O(n^{\log_2 3}) \approx O(n^{1.585})$, better than the $O(n^2)$ traditional method.
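The three-multiplication version changes only the recursive step. A minimal sketch under the same assumptions as before (hypothetical function name and base-case cutoff):

```python
def karatsuba(x, y):
    """Multiply non-negative ints with Karatsuba's 3-recursive-call scheme.

    Runs in O(n^log2(3)) time via T(n) <= 3T(n/2) + O(n).
    """
    if x < 16 or y < 16:             # base case: small enough to multiply directly
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    mask = (1 << half) - 1
    x1, x0 = x >> half, x & mask     # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & mask     # y = 2^half * y1 + y0
    a = karatsuba(x1, y1)            # high product
    c = karatsuba(x0, y0)            # low product
    # middle term: (x0+x1)(y0+y1) - x1*y1 - x0*y0 = x1*y0 + x0*y1
    b = karatsuba(x0 + x1, y0 + y1) - a - c
    return (a << (2 * half)) + (b << half) + c
```

In practice, production big-integer libraries use a larger base-case cutoff, since the recursion overhead outweighs the asymptotic savings on small inputs.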