Posted Aug 4, 2018, 11:04 PM by Capsicum (Registered User, Western Hemisphere)
Quote:
Originally Posted by Denscity
The west coast states and province have never been right-wing, from Alaska all the way down to California.
Quote:
Originally Posted by Docere
Never been right-wing? California used to be a GOP stronghold and both presidents Nixon and Reagan were from there.
Quote:
Originally Posted by Firebrand
So what happened in California that made it become left-wing? (Other than inland California.)
When did the west coast (of North America more broadly) become the "left coast," or come to be associated with being more left-wing? Was it during the 1960s–70s counterculture or the environmentalist movement?

I know some say the West has always had a more "libertarian," individualistic streak, since it was settled by Americans leaving the more "establishment" East.

There's also the trend of westerners being less religious than those out east (in both the US and Canada). Places like BC and Washington state are among the least religious on the continent.