Using Cramer’s V to determine the strength of association between Maltese student and national politics

Charles Mercieca
Feb 25, 2017

In Malta, the student political landscape has split along partisan lines in a manner akin to the main political one. SDM, founded in 1974, pitches a Christian Democratic platform. The newer Pulse is less clear about its position on the political spectrum; however, its support for the decriminalisation of drugs, for instance, indicates that it is left leaning.

The stereotypical view of Pulse and SDM voters is that the former go home to a television unflinchingly set on One, while the latter return to one just as steadfastly locked on Net. It's a gross oversimplification, as all stereotypes are. But how strong, in reality, is the relationship between support in student politics and support in national politics?

Should there even be a relationship?

After all, three major Maltese political parties have independent youth wings where young people can hone their skills and advocate for causes near and dear to them. Some might point out that your location on the political spectrum is largely irrelevant when leading a student council executive. For instance, would an organisation that identifies as leftist hypothetically implement a student tax and subsidise the canteen's pastizzi at 10 cents each? Would a right-leaning organisation turn the room allocation report into a purely capitalist exercise where the largest organisations could erect massive towers on the quad?

And yet, despite this, factors like social identity theory, in which a person's membership in a social group plays an important role in their self-concept, could still motivate such a relationship. Regardless, this is a question that has not elicited any serious research.

This was one of the questions I wanted to answer as I was ramping up the Insite February 2017 poll.

The Insite February 2017 Poll… and its limitations

To get our sample, we elected to use the mall intercept method, which involves walking up to students and asking them if they'd like to take our questionnaire. It is important to note that this method is not without limitations or criticism. It is what's technically called a non-probability sample, which means that not every member of the population has an equal chance of being selected; students might, for instance, choose to stay at home or go to lunch off campus. Not giving your population an equal probability of being chosen is always a limitation when attempting to generalise to the whole. This is why traditional political polls still use telephones: a phone book is something very easy to randomly pluck numbers from.

The university equivalent of a probability sample would be one drawn through the registrar, but this avenue was not pursued because of survey fatigue on that medium, low response rates (usually below 15%) and what would probably be a long wait. The main criticism of mall intercepts is that they often aren't representative of society at large. The reason for that is simple: shopping malls in the United States, where the method is mostly deployed, tend to be frequented by women of a specific age bracket. Additionally, in most malls, other demographics like social class and race also come into play.

These things aren’t really that much of an issue for university — to be eligible to vote, people must be students, and students can overwhelmingly be found going to lectures. To further decrease the risk of an unrepresentative sample, students were intercepted as they were entering or leaving certain faculty buildings and across the whole campus. Care was taken to ensure that the demographic distribution of the sample by faculty and year of study was as close to that of the university at large population as possible. This ensures that no one faculty is disproportionately over or under represented.

How representative was our sample?

The way we assessed this was to look at the distribution of university students across faculties and see whether it matched the distribution in our sample. Luckily for us, the numbers for the 2016/2017 academic year are available online. To put it simply: since we know that 11% of all university students attend the Faculty of Arts, if 30% of the respondents in our sample came from that faculty, we'd know it was over-represented.

Table showing the percentage distribution of students by faculty in the university population and those observed in our sample. The third column indicates the percentage by which each faculty is over or under represented in our sample.
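The third column is just the difference, in percentage points, between each faculty's share of the sample and its share of the university population. A minimal sketch of that comparison in Python, using made-up percentages rather than the actual figures from the table, would look something like this:

```python
# Representativeness check: compare each faculty's share of the sample
# with its share of the university population. The percentages below
# are illustrative placeholders, not the real survey or registrar data.
population_share = {"Arts": 11.0, "Science": 6.0, "FEMA": 15.0}  # % of all students
sample_share = {"Arts": 12.5, "Science": 8.0, "FEMA": 12.0}      # % of our respondents

for faculty, pop_pct in population_share.items():
    diff = sample_share[faculty] - pop_pct
    status = "over-represented" if diff > 0 else "under-represented"
    print(f"{faculty}: {diff:+.1f} percentage points ({status})")
```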

However this didn’t happen by much and distribution was satisfactory across most faculties. Some, like Science and Social Wellbeing were slightly over represented in our sample, while others, like FEMA or Health Science students were under represented.

Gender also provided us with an opportunity to check: we know that 58.3% of all university students are female; in our sample, 61.9% of respondents were.

Lastly, year of study can also shed some light on this. It's actually pretty hard to come across statistics on which year students are in, but logic would dictate that the number of students in their 1st year would be the largest, followed by relatively equal 2nd and 3rd year cohorts and a slight drop by the 4th year (since most Bachelor of Arts courses are only 3 years long).

A histogram of the distribution of our sample across year of study.

In this regard, the sample seems to be slightly younger in its year of study, and the 4th year and 4th year or more categories could stand to be a little higher. It's also a bit more female, and misses some faculties (Theology) altogether.

The Survey

The main question we sought to answer in the survey was which student political organisation was in the lead. A related question was whether that number would change once responses were weighted by students' likelihood of voting, which was assessed on a 7-point Likert scale ranging from "Not very likely" to "Very likely". Two additional closed-ended questions were intended to gauge KSU's approval rating and whether the majority of students would like to see more than two organisations contest.

The final two questions attempted to look into two other factors that might drive student politics: party affiliation at the national level, and where a student is from. While casual observation of how a prototypical Pulse voter differs from an SDM voter might elicit some stereotypical answers, to our knowledge no real research into the matter has ever been attempted; our aim was to change that.

The overarching concern was to extract as much relevant information as we could analyse in the shortest amount of time.

The survey protocol and script were kept as uniform as possible. Participants were politely asked if they could spare a minute; the interviewer identified himself as a member of Insite and introduced the topic of the survey; and, if the participant had no objections, he or she was handed a survey sheet and asked to fill it in, fold it and deposit it into a bag. Participants were informed of the confidentiality of the survey and of the option to skip any question they did not feel like answering. Following this, a few pleasantries were exchanged, and the participant was reminded that he or she could follow the outcome of the survey on Insite's website and social media accounts.

Out of 251 individuals approached, 243 replied, a response rate of 96%. Just one response was invalid, yielding 242 valid responses. The study was conducted on Monday 20th February, over a six-hour period from 9am to 3pm.

How do students vote in a General Election and a Student one?

In summary:

  • 89% of SDM voters would vote PN
  • 78% of Pulse voters would vote PL
  • 7% of SDM voters would vote PL
  • 17% of Pulse voters would vote PN
  • 90% of PD and 73% of AD voters wouldn’t vote in KSU elections

Surprisingly, nearly a fifth of students whose sentiments lie with Pulse would vote for PN if a general election were held tomorrow. The corresponding figure for SDM voters is much smaller, though 4% of students backing them would vote for AD rather than PN.

However, figures like the above don't tell the whole story. It is, for instance, possible that the above distribution was obtained merely by chance and means nothing. To make sure this isn't the case, we can deploy a few statistical tools. Pearson's chi-squared test allows us to check whether the distribution in the data is due to chance alone or whether there is in fact a statistically significant relationship between who you support in the general election and how you're likely to vote in the KSU one.

All the analysis that follows was done using the SPSS Statistics software package.
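For readers who prefer code to menus, the same kind of chi-squared test can be run in Python with scipy. The contingency table below is purely illustrative (hypothetical counts of KSU vote by general election vote), not the actual survey data, which was cross-tabulated in SPSS:

```python
# Pearson's chi-squared test of independence on a hypothetical
# KSU vote (rows) by general election vote (columns) table.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [80,  6],   # SDM voters:   PN, PL
    [15, 72],   # Pulse voters: PN, PL
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
# A p-value below the usual 0.05 threshold suggests the association is
# unlikely to be due to chance alone.
```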

The Pearson’s chi-squared test result indicated an effect between who an individual supports in student politics and national politics — the relationship is larger than one you’d expect to see by chance alone.

Additionally, we can run another test that determines the strength of that relationship. Cramer's V takes the chi-squared statistic as its input and measures just how strong the relationship is. The result is a number between 0 (no relationship) and 1 (a perfect relationship).
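Concretely, for a contingency table with r rows, c columns and n respondents, Cramer's V is the square root of chi-squared divided by n × (min(r, c) − 1). A small Python helper, again run on illustrative counts rather than the survey data, might look like this:

```python
# Cramer's V as an effect size for an r x c contingency table:
# V = sqrt(chi2 / (n * (min(r, c) - 1))).
import numpy as np
from scipy.stats import chi2_contingency

def cramers_v(table: np.ndarray) -> float:
    # Use the uncorrected chi-squared statistic for the effect size.
    chi2, _, _, _ = chi2_contingency(table, correction=False)
    n = table.sum()
    r, c = table.shape
    return float(np.sqrt(chi2 / (n * (min(r, c) - 1))))

# Hypothetical counts, reused from the sketch above.
observed = np.array([[80, 6], [15, 72]])
print(f"Cramer's V = {cramers_v(observed):.3f}")
```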

The Cramer’s V value for the relationship between how a student would vote in the KSU election and how a student would vote in the general election is .449 — that’s defined as a extremely strong relationship. Indeed, some lecture notes would call it “worrisomely strong [and one that’s] either an extremely good relationship or [a case where] the two variables are measuring the same concept”.

Notes & Caveats

Only responses from 173 participants were used, as 69 elected not to reveal their voting intentions. Additionally, 4 responses for Moviment Patrijotti Maltin and 10 for Partit Demokratiku were excluded from this analysis, since retaining them would force a number of cells to have a count of 0, violating an assumption of the chi-squared test.

It needs emphasising that, strictly speaking, a probability sample is required for a chi-squared test like this to generalise to the whole population. However, as we've already covered in the methodology write-up, we have sufficient reason to believe that our sample generalises well to the student population at university, so the results are probably representative of university students.
