As you may know, a computer program is made up of ones and zeros, like this: 110010101011000. The way they are arranged tells the computer what to do.
Pet peeve: I get it, but I tire of that example, because that is not how programs are written. Binary is a numerical representation, and it shows up in computing because a transistor is either on or off, either holding enough charge or not. Binary logic and arithmetic are important, but don't kid yourself into thinking programmers write entire programs in binary. That would be dumb. They write source code in a high-level language, and compilers or interpreters translate it into machine code for them.
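To make the point concrete, here's a quick sketch in Python (my choice of language for illustration; the bit string is the one from the example above). Binary is just one way of writing a number, and the machine translation is something a tool does for you, not something you type:

```python
# The bit string from the example above is just a number,
# written in the base the hardware happens to use.
bits = "110010101011000"

n = int(bits, 2)   # interpret the string as base-2
print(n)           # the same value in decimal: 25944
print(hex(n))      # ...in hexadecimal: 0x6558
print(bin(n))      # ...and back to binary: 0b110010101011000

# A programmer writes the line below; the interpreter is what
# turns it into ones and zeros, not the programmer.
print(2 + 2)       # 4
```

Same value, four spellings. Nobody is arranging those bits by hand.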