De-Romanticizing Europe

So Europe is actually kind of poor. I don’t understand the notion that Europe is some uniformly rich, clean, wealthy place. Have we romanticized it to the point that “Western European” culture is treated as the highest form of all culture?
Like honestly, my first experience of Europe was living in the poorer suburbs of Paris, France, and the wealth gap there is incredibly noticeable. People from Iberian, Southern European (Italian/Sicilian), and Slavic countries often have to move to France, England, or Germany just to get a proper education and make a decent living; otherwise they risk ending up in low-wage jobs, or even homeless and begging on the streets of their own cities.
Much like how Hispanics are discriminated against in North America, or Filipinos in East Asia, powerful Western European countries view places like Portugal, Romania, or Italy in the same way: as poor Europeans who migrate with little to no connections and end up working in the “service” industry to make a living.
