And here's a simplified explanation of why:
Whenever you launch a program, Windows loads what it needs into RAM and uses it as and when it needs to. As you work, Windows fetches and carries parts of the program back and forth between the hard drive and RAM. Often you'll be using more than one program at once, perhaps while surfing the net and checking your email.
While you're doing all that, Windows is rushing around like a mad thing, writing bits of files to the drive and reading them back again, and there isn't always a single free space big enough to hold a file in one piece; new and growing files get split across whatever gaps are available. Gradually, over time, this fragmentation gets worse, until eventually Windows begins to slow down under the strain of having to collect all the pieces of a program from all over your hard drive each time you launch Word, for instance.
Windows knows where to go, because each time it puts anything anywhere on your drive it records the position very accurately in a kind of index (on older systems this is the File Allocation Table, or FAT). The trouble is that once the drive gets well and truly fragmented, reading all those scattered pieces takes quite a while, because the drive's read head has to hop from spot to spot, and Windows has to charge around the drive collecting enough pieces to put a window on your screen and let you work.
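To make the idea concrete, here's a tiny toy model of that index: picture the disk as a row of numbered blocks, with a table recording which blocks each file occupies. The file names and layout below are invented purely for illustration; this is not how the real FAT is structured internally.

```python
# Toy "allocation table": maps each file to the disk blocks it occupies.
# A file whose blocks are scattered is fragmented; one whose blocks form
# a single consecutive run can be read in one smooth pass.
allocation = {
    "letter.doc": [0, 5, 9],   # fragmented: three separate spots on the disk
    "photo.jpg":  [1, 2, 3],   # contiguous: one neat run
}

def is_fragmented(name):
    """A file is fragmented if its blocks aren't one consecutive run."""
    blocks = sorted(allocation[name])
    return blocks != list(range(blocks[0], blocks[0] + len(blocks)))

print(is_fragmented("letter.doc"))  # True  -> the read head must jump around
print(is_fragmented("photo.jpg"))   # False -> one smooth read
```

The slowdown comes from those jumps: each gap between a file's blocks means another physical seek of the read head.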
Defragmenting the drive sorts all the pieces back into order and packs each file into one neat, continuous chunk. If the drive was badly fragmented, you'll notice a definite speed increase afterwards.
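In the same toy model, what a defragmenter does boils down to this: move each file's blocks so every file sits in one consecutive run, packed from the start of the disk. Again, this is a sketch of the idea under invented names, not how any real defragmenter is implemented.

```python
def defragment(allocation):
    """Repack every file into one consecutive run of blocks."""
    compacted = {}
    next_free = 0
    for name, blocks in allocation.items():
        # Give this file a contiguous run starting at the next free block.
        compacted[name] = list(range(next_free, next_free + len(blocks)))
        next_free += len(blocks)
    return compacted

before = {"letter.doc": [0, 5, 9], "photo.jpg": [1, 2, 3]}
after = defragment(before)
print(after)  # {'letter.doc': [0, 1, 2], 'photo.jpg': [3, 4, 5]}
```

Afterwards every file can be read in a single pass, which is exactly why a badly fragmented drive feels noticeably quicker once it's been defragmented.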
Does that help? The process is in reality quite complex, and I've kept to the basics.