  #72  
Posted Aug 9, 2018, 12:10 AM
Docere
Registered User
 
Join Date: Jul 2014
Posts: 7,364
Quote:
Originally Posted by Capsicum View Post
When did the west coast (of North America more broadly) become the "left coast," or become associated with being more left-wing? Was it during the 1960s, the 1970s counterculture, or the environmentalist movement?

I know some say there has always been a more "libertarian," individualistic streak in the West, since it was settled by Americans leaving the more "establishment" East.

There's also a trend showing westerners as less religious than those out east (in both the US and Canada). Places like BC, Washington State, etc. are among the least religious on the continent.
In Canada, BC had a much more radical labor movement than elsewhere in the country. Even today, it probably has the strongest class-based voting on the continent.