When did we become proud of hatred, violence, public abuse, lies and chaos?
When did it become okay to excuse rapists, yet hand out maximum penalties to someone caught with weed?
When did it become acceptable to be represented by a person who tells your country bold-faced lies?
When did the world become so blind as to believe that politicians are going to save us?
When did it stop being about the people?
Or was it ever?
It seems like it’s only been about the 1% who run the world and what they want.
It’s been about making the rich, richer. Always.
Here we are, hating each other over skin color and outward appearance, all because of folklore our long-dead ancestors passed down to us.
I hate to burst everyone’s bubble, but you’ve been fed lies since birth.
Your life was predetermined for you, by your parents and your government.
You’re told your name, what religion you belong to, what social class you were born into, and then you’re taught how to hate.
We’re not born evil or racist; we are taught to be.
We are told “x, y, z” about different races and religions, and we never question it.
The movies, the media, and our surroundings have taught us that “blacks can’t use the same things or you’ll catch something,” or to “stay away from that Muslim man, he’s a terrorist.”
No one dared to ask why, because they were afraid of the backlash.
Fear is what the world runs on.
What’s crazy is that so much of what you see in the media is just stereotypical propaganda, pushed to promote whatever movement or law the government wants passed.
It’s all a big performance and we’ve all been a part of it.
One day, everyone will get tired of the people in power letting them down, and only then will the world be ready to truly take back its power.
Divided, we are powerless. United, we are unstoppable.