IMO, it's because the rest of the world has never lived here. All they see of the US is from Hollywood. Not trying to be political either, but I wonder how many Americans actually ask why, if it's such a splendid country, most of the rest of the world — even its allies — doesn't particularly like it, especially these days.
Do Americans know as much as they should about the rest of the world? I can tell you unequivocally: NO.
From my admittedly microscopic view of the rest of the world (I've only been to a few other countries), I wouldn't want to live anywhere else.