That tells you how they view the situation. They truly believe they're performing a righteous duty to protect their land, while we're apparently an evil force that needs to be routed. It's the same the world over. Does the United States ever portray itself as an aggressor when it invades and destroys lands in the Middle East? Does it ever describe its missions in foreign lands in aggressive, domineering terms, or does it use peaceful language to paint a picture completely at odds with its actual aims? These world powers have overstepped their limits. They're the very tyrants they cry about in the media, a media that's in their pocket despite the occasional fake stab at objectivity to muddy the waters. It's a sick world, and for some reason they've suddenly stopped caring about covering their tracks. I'd like to know what they know that's given them such confidence.