The first movie I ever saw about an alien invasion was Mars Attacks! (the spoof), and when I was younger and more innocent I always convinced myself that if aliens found Earth, they would "come in peace". But then of course there are movies like Independence Day, which my grown-up self thinks are a more likely reality. If alien life were advanced enough to find Earth, would it really come with the mission of forming a peaceful alliance with us? What exactly could we offer them? We have seen how human nature makes those in power behave, with poorer countries being invaded for resources such as oil under the pretence of "protecting peace".

If aliens found Earth, the likelihood is that we would be severely disadvantaged and have nothing to offer, even if they were willing to form an alliance. And even if they simply wanted to observe and study us, I can't see that being a positive thing for us either. They are far more likely to be interested in our space and our resources, which would be more plentiful if we were not there. So we could be killed, or worse, enslaved.

I think that of all the apocalyptic scenarios, an alien invasion would be one of the worst. Worse than a zombie outbreak and worse than a deadly virus, because I find humans so much more threatening than animals, illnesses, and so on, and I can only imagine an alien race being humanity amplified.

I guess I just wanted to share my thoughts here. What does everyone else think?