It's hard to take American sports seriously given the arrogant way they insist on attaching "world" to their national leagues. Well done, you're the best in the world at something you invented and no one else cares about. What turns me off football is the lack of humility; American sports are worse, and I haven't even got the initial interest to hook me in.