I really don’t understand why people want government healthcare. It just doesn’t seem better, and that’s coming from someone who’s always had health insurance. The only person I know who’s ever lived somewhere with public healthcare hates it because she had to wait two weeks to have her appendix removed, even though appendicitis is life-threatening. The doctor literally told her, “Go to America if you want it fixed faster.” Still, I’m open to someone giving me a logical, no-BS explanation of why it’s better, because liberal garbage like this irritates me. I don’t buy that “most people want government healthcare.” Prove it. I’m listening.