Dear Americans: Whatever You Do, Don't Ban College Football
Banning college football is un-American.
As a European, I can't help but marvel at the recent debates in the United States about the NFL and college football. A growing number of analysts believe that these sports should be banned, or at least 'reformed', which of course means they would lose what makes them unique and appealing to sports fans everywhere.
Now, make no mistake about it: I'm just as convinced as the average Joe that football is a very dangerous sport. But why has this fact taken analysts by surprise? Why do they make such a big issue out of concussions and other football-related injuries? Are they simply trying to wash their hands of the whole matter?
I still remember the first time I watched the NFL: I was shocked, truly shocked. These weren't athletes; they were gladiators. Anyone with eyes could see that they were out to hurt each other, and that the crowd loved them for it.
Once I got into the NFL, I started watching college football too. It was just as compelling, if not more so, simply because it was less commercial. These youngsters were trying to prove themselves; they wanted to be the best they could possibly be while hoping for a professional career in the NFL. They were willing to run through brick walls to reach their goals.
Of course, they too were carried off the field regularly: one with a concussion, another with a broken leg. Some of the injured players were probably scarred for life. That much was clear.