@The-Darkin-Blade
A polynomial is an expression that looks like this:
$$ a_n x^n + a_{n-1} x^{n-1} + a_{n-2} x^{n-2} + \ldots + a_1 x^1 + a_0 x^0 $$
where the \(a_n, a_{n-1}, \ldots , a_1, a_0\) are all constants (numbers, not variables). The only variable is \(x.\) It's quite common that some of the \(a_n\) numbers are equal to \(0,\) like in this example:
$$ 3x^3 + x = 5 $$
Here we are missing the \(x^2\) term (its coefficient is \(0\)), but that's perfectly fine! Polynomials come in all shapes and sizes. The Greek root "poly" means many, so you might guess that a single term, like
$$ 2x,$$
wouldn't pass the test of being a polynomial. But despite the name, it does count: a one-term expression like \(2x\) is still a polynomial (it has its own name, a monomial). And if the polynomial isn't simplified, like
$$ x^5 + 3 + x^5 + 2 + 5x^5 + 1 = 7x^5 + 6 $$
then it's still a polynomial, since combining like terms gives \(x^5 + x^5 + 5x^5 = 7x^5\) and \(3 + 2 + 1 = 6.\) One thing you definitely can't have, though, is \(x\) in the denominator of a fraction (that's really a negative exponent, since \(\frac{1}{x} = x^{-1}\)), like this:
$$ x + \frac{1}{x} = 3 $$
$$ \text{ Not a polynomial, sadly.} $$
But that's all right, since there are so many more polynomials out there!
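To tie it all together, here is the example from above matched against the general form term by term, with the coefficient labels \(a_3, a_2, a_1, a_0\) taken from the formula at the top (the \(x^2\) and \(x^0\) terms are "missing" precisely because their coefficients are zero):
$$ 3x^3 + x = 3x^3 + 0 \cdot x^2 + 1 \cdot x^1 + 0 \cdot x^0, \qquad a_3 = 3,\; a_2 = 0,\; a_1 = 1,\; a_0 = 0. $$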
Question 2
Why did we have to go through so many steps? Well, that's because in this bonus section, Prof. Loh was doing something different from what he usually does: he was trying to prove Heron's Formula. Theorems that are elegant and simple to state often aren't easy to prove. That's why we have theorems in the first place: if the steps to derive the result were already simple, we wouldn't need the theorem, because we could just remember those few simple steps. The fact that we use a theorem as a shortcut tool pretty much means that getting the answer the long way is a pain.
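For reference, here is the theorem all those steps were building toward. Heron's Formula gives the area of a triangle directly from its three side lengths \(a, b, c,\) with no heights or angles needed:
$$ s = \frac{a + b + c}{2}, \qquad \text{Area} = \sqrt{s(s-a)(s-b)(s-c)}, $$
where \(s\) is the semi-perimeter. Notice how short the statement is compared to the proof: that contrast is exactly the point.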
And another way to look at it is, well, at least the proof wasn't so hard that Prof. Loh didn't show it at all! Later on when we learn things like Pick's Theorem or Stewart's Theorem, Prof. Loh might not be able to prove these to us, and we'll just have to memorize them straight.