

People use the term "the West" interchangeably with "the US and its post-WW2 allies in Europe and North America, which I will pretend are a single coherent bloc with the same history, social issues, internal affairs, and foreign policy (the US one), and which I will hate because of war/colonialism/slavery/lgbtq+."
The fact that France, Italy, Denmark, Poland, the USA, and Brazil are radically different Western countries is unimportant to most people who use the word "West" in normal conversation.
I do not really mind "West" being used even where it makes no geographic sense; we are full of labels that make no sense. But I would at least like a consistent definition.


Italy has always been a trendsetter. Trump is just a cheap mashup of Berlusconi and Mussolini.