Based on your response, which sounds hateful, your desire to see the West Coast states gone from your nation sounds hypocritical in the context of calling it a great nation.
As an outsider, I feel America has long held a belief about what it is that runs counter to reality: talk of greatness, liberty, and freedom while US citizens are denied decent healthcare and are removed from electoral rolls, and the ultra-wealthy are given tax cuts they have no need for.
I feel the reality of America is catching up with the dream, and some states, like California and New York, can no longer square what they see with what they believe.
If you believe I'm wrong, I'd ask you to read your Declaration of Independence and question whether America, and your state, lives up to the ideals written in that document. Do the actions of your state still match those principles? Do the actions of California fit with the actions of your founding fathers?