Retrospective on

Alex Parsons
Mar 21, 2019 · 12 min read

A few years ago I spent a lot of time looking at letting agent websites, seeing how much they charged people, and putting that information on a different website!

Boring? Yes. Worthwhile? Let’s see!

The big goal was to help the case for a ban on letting fees, and this has now pretty much happened. Broadly, I think the ban resulted from multiple campaigns by various groups and campaigners, so I don't want to over-claim success. But I also spent a good few months working on this, and it fed into effective lobbying on the issue, so I've been meaning to write a retrospective on the site for a while and think about what worked and what didn't.

The basic idea came out of a project I was involved in with the Waltham Forest Renters group, trying to work out the costs of fees in the borough and report agents that weren't displaying their fees to the council. For that project, we mostly worked in a spreadsheet and then uploaded that to a data display service that has since shut down, so the original page can't be seen (remember to take screenshots of everything you're ever going to want to point back at, kids).

Basically, this kind of fees research addresses two problems:

  • Letting agents are supposed to display their fees, and they can be fined if they don't. By collecting details of non-compliant agents, you can pass them to the council, which might fine them.
  • Transparency is supposed to stop tenants being ripped off, but in reality this doesn't work. By collecting information about fees and about how little agents respond to transparency requirements, you can make a stronger case for a change in the system.

Coming out of that project, I wanted to try to build a tool to make that kind of research easier and to scale it. Talking to Generation Rent, we worked out a plan for a website that could help volunteers in areas across the country research fees in their area. The goals of the project were to a) make information more accessible to renters and b) generate research that could be used for activism and lobbying.

Trying to work out the general cost of letting fees to tenants wasn't a new idea (Shelter and Citizens Advice both had numbers based on surveys of tenants), so to collect something new, we built the site so it could understand different kinds of fees, giving a better picture of typical charges for similar services. The overall 'comparison fee' would then be automatically calculated from all the other fees charged. This also helps with understanding fees that are not paid by every tenant, but are paid often enough to be an under-explored feature of the rental market.

After building a basic version of the website, we found a small group of volunteers covering different areas, who started trying to record the fees for agents. As recorded in the main report, this is a tricky task because letting agents do not make it easy to find fees, and even when found, the fees can still be confusing.

Fees were also complicated, and this required constant revision of the data entry page to account for different variations and to speed up the work of researchers. The system had to understand both cases where a £Y cost per person might fall to £X for each person after the second, and cases where having three people might trigger a group discount which lowers the cost of the first payment. Inclusion of VAT was very variable (and, as discussed in the report, this was already against the rules and reflective of the weakness of enforcement), which meant the system had to keep track of which values did and didn't include VAT for comparisons.
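As a rough illustration of what the system had to handle, here is a minimal sketch of the three cases described above. None of the function names or figures come from the site's actual code; the 20% VAT rate is the standard UK rate and the amounts are hypothetical.

```python
# Hypothetical sketch of the fee-handling logic described above;
# not the site's actual code. All amounts are illustrative.

VAT_RATE = 0.20  # standard UK VAT rate

def per_person_total(per_person, discounted, tenants):
    """£Y per person, falling to £X for each person after the second."""
    full_price_tenants = min(tenants, 2)
    return per_person * full_price_tenants + discounted * max(tenants - 2, 0)

def group_total(per_person, tenants, group_size, group_price):
    """A flat group price kicks in once the group is large enough."""
    if tenants >= group_size:
        return group_price
    return per_person * tenants

def with_vat(amount, vat_included):
    """Normalise every stored fee to a VAT-inclusive figure for comparison."""
    return amount if vat_included else amount * (1 + VAT_RATE)

# Three tenants at £120 each, falling to £60 after the second person:
print(per_person_total(120.0, 60.0, 3))    # 300.0
# Three tenants trigger a £250 group rate instead of 3 x £100:
print(group_total(100.0, 3, 3, 250.0))     # 250.0
# A £150 fee advertised without VAT, normalised for comparison:
print(with_vat(150.0, vat_included=False))
```

The point is less the arithmetic than the bookkeeping: every stored fee needed a flag for whether VAT was included before any two agents could be compared fairly.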

None of this is difficult technically, but the complexity of how fees were described raised the complexity of the data entry process and made it more difficult for volunteers. The final entry process looked like this:

Researchers would get an initial suggestion of common fees, and a drop-down of fees already entered by other researchers. There was a separate reconciliation page where admins could combine fee categories that were functionally identical ('admin fees' + 'administrative fees', etc).

On completion of the form, the database would generate a 'comparison fee': the total mandatory cost to two tenants using that letting agent. This was used to understand the general picture of fees in an area. The individual fees were then also useful for understanding different charges for the same item.
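The aggregation step can be sketched like this. The schema is hypothetical (the real database presumably had more structure), but it shows the idea of collapsing many itemised fees into a single comparable number for a standard two-tenant household:

```python
# Illustrative sketch, not the site's actual schema: sum every mandatory
# fee for a standard two-tenant household to get one 'comparison fee'.

def comparison_fee(fees, tenants=2):
    """Total mandatory cost to a household of `tenants` people."""
    total = 0.0
    for fee in fees:
        if not fee["mandatory"]:
            continue  # optional charges don't count towards the comparison
        if fee["per_person"]:
            total += fee["amount"] * tenants
        else:
            total += fee["amount"]
    return total

agent_fees = [
    {"name": "admin fee",     "amount": 150.0, "mandatory": True,  "per_person": False},
    {"name": "referencing",   "amount": 60.0,  "mandatory": True,  "per_person": True},
    {"name": "check-out fee", "amount": 90.0,  "mandatory": False, "per_person": False},
]
print(comparison_fee(agent_fees))  # 270.0 -- 150 + 2 x 60; optional fee excluded
```

Because the comparison fee is derived rather than entered, a correction to any single fee automatically flows through to the agent's ranking.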

Once we had over 700 agents across 11 local authorities, we felt this was enough to start making use of the data. This involved a data cleaning stage before we made the data from each council live: typically double-checking the top and bottom ranked councils for errors, and all those flagged as not having fees, as well as looking for outliers in each category of fees.

The website had a page for the national picture, each local authority, and separate pages for each kind of fee. There was a summary showing a set of curated important fees on a single page. Each individual agent had a page showing the fees charged.

Research Findings

From this we wrote up our findings about what was being charged, exploring the specifics of different kinds of fees. This produced some conclusions that validated looking at individual fees and at a larger-than-usual sample of letting agents:

  • The large sample produced a useful range of fees, showing a wide variation that raises questions as to why some agents are able to charge far less than others. A useful line of critique wasn't just that these fees were expensive, but that they seemed to be based on no consistent underlying costs.
  • Tracking individual fees let us discover that letting agents charging for ongoing references also charged above-average core fees. Similarly, agents who charged renewal fees were charging more for the initial contract than those who weren't. This showed that agents who felt able to charge highly would do so at every opportunity, and this was visible in the numbers.
  • There was a small group of letting agents who were charging for deposit protection — which, because those businesses use deposit money as working capital, meant that some tenants were (perfectly legally) being charged to loan their letting agents money at 0% interest.


The initial launch got some press, especially in areas outside London where we researched a local area:

Letting agent fees range from zero to 780 for no apparent reason — Business Insider UK

Generation Rent beefs up calls to have letting agent fees abolished — Property Industry Eye

‘Captive’ renters paying letting agents £780 just to move in — Mirror

Tenants Charged Up To £780 In Fees To Move Home — Londonist

Call to ban letting agents’ fees —

High letting fees for York tenants are revealed and slammed — York Press

Pressure group running comparison site for letting agent fees — Letting Agent Today

The highest and lowest letting agent fees in Manchester revealed — Manchester Evening News

Moving into 2017 and 2018, the release of our second report and news of a proposed ban got references in a few more publications:

Letting agents found charging tenants more than £800 in fees — Guardian

Can you evade the dirty tricks of the rental trade? — The Times

The promised ban on rip-off letting agent fees STILL hasn’t materialised — but the Government claims it’s getting close — Daily Mail

The letting fees ban can’t come soon enough — I had to find £2,000 just to move flats — iNews

Autumn statement: Letting fees are awful, and Philip Hammond is right to ban them — CityMetric

Figures were quoted in parliamentary debates, and, importantly, our stats were used in the government consultation on banning fees, and then later in the ARLA-commissioned research opposing a ban. So generally the research was being read in the right places, alongside output from longer-standing research-producing charities.

It was also referenced in a Welsh Government report on letting agent fees (48/2017). We weren't particularly looking for academic impact here, but it did get a mention in a paper in Landlord and Tenant Review.

By the end of 2018, the website had received over 40,000 users, with over 13,500 page views of the report (average read time of three and a half minutes), including two thousand views of the report on the day the ban was announced. The follow-up report, much less impressively, had 443 page views.

Most users of the site arrived through organic search (80%), with referrals in the hundreds from links on Manchester Evening News, The Guardian, Tenant Voice, the main Generation Rent page, Facebook and Twitter.

Less directly traceable (but the most important impact) was that the research was able to form part of Generation Rent's lobbying efforts (see written evidence, but also a lot of offline work).

Counter Research

A year on, we updated the site with another 300 agents and revisited the previous agents to see if there had been changes. Our follow-on report detailing this is here (basically some went up, some went down), but in this report I also tried to use our data to respond to the strategies of the letting agents opposing the ban.

While I was happy to see our research showing letting fees were expensive and varied wildly taken up uncontested by industry-funded research, I was less happy to see those numbers being mangled a bit to push a new 'young and rich' vs 'old and poor' line on a letting fee ban.

As I go into in the write-up, there's a calculation error in how they annualise the cost of moving, but more substantially they do not account for the annual cost of renewal fees. Correcting these problems shows a benefit from a reduction in fees even for quite long tenancies, so the young-vs-old line just doesn't work. But as this speech by a generally pro-ban MP shows, the idea that a ban on fees would hurt low-income tenants was certainly loose in the world.
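The renewal-fee point can be shown with some back-of-envelope arithmetic. The £206-a-year projected rent rise is the industry figure quoted later in this piece; the £400 upfront and £240-per-renewal fees are hypothetical, chosen only to illustrate how including renewal fees changes the annualised comparison:

```python
# Illustrative arithmetic only. The £206/year rent rise is the industry
# projection; the fee figures below are hypothetical examples.

RENT_RISE_PER_YEAR = 206.0   # projected rent increase for two tenants
UPFRONT_FEES = 400.0         # hypothetical one-off fees at the start
RENEWAL_FEE = 240.0          # hypothetical fee charged at each annual renewal

def annualised_fees(years, include_renewals):
    """Average yearly fee cost over a tenancy of `years` years."""
    total = UPFRONT_FEES
    if include_renewals:
        total += RENEWAL_FEE * (years - 1)
    return total / years

for years in (1, 3, 5, 10):
    naive = annualised_fees(years, include_renewals=False)
    corrected = annualised_fees(years, include_renewals=True)
    print(years, round(naive), round(corrected), corrected > RENT_RISE_PER_YEAR)
```

Ignoring renewals, the annualised upfront fee shrinks with tenancy length and soon falls below the projected rent rise, which is where the 'a ban hurts long-term tenants' line comes from. Once annual renewal fees are included, the yearly cost can never drop below the renewal fee itself, so (on numbers like these) the saving from a ban persists however long the tenancy runs.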

To be fair to the consultancy, they do say the projected rent increase is smaller than the current cost of fees (and hence a ban would be a net gain to most tenants even by their numbers), but for some reason this isn't the bit of the report that the people who funded it like to quote. For instance, ARLA's David Cox says that:

Research commissioned by ARLA Propertymark carried out by Capital Economics demonstrated that if a full ban comes into force, two tenants will end up paying an extra £206 per year in rent. Therefore, rather than making savings, this policy will end up costing tenants more the longer they live in their home.


We do not believe the Bill will achieve its aims as our own research last year demonstrated that tenants will end up worse off and banning fees will not result in a more affordable private rented sector.

This last quote especially is a misrepresentation of what the research they commissioned actually found, before we even get to the issue that the research itself is flawed. While the research was produced independently, once it reaches the media quote stage, the fiddly details of what the research actually produced don't really matter. This needed some thought about the correct thing to do in response.

Having produced research that showed this was untrue, I don't think we were particularly good at making use of it to counter this idea, but I'm genuinely unsure what the best approach would have been. If the resources had been there, would more forcefully opposing this line have been effective? Would trying for more "report shows no negative effect on long-term tenants of a fees ban" coverage have been a) possible and b) actually positive, or would it just have reminded people of the ARLA position?

But in the end it didn’t do them any good, so there’s some karma there.

What would I do differently

While the dataset was built by the hard work of many volunteers, to reach a large number of agents in the initial dataset I researched a large proportion of the original data myself, and for the entries I didn't do, I had usually done a basic check where they were outliers. I wanted to be sure that we weren't making obvious mistakes with high or low charges that would call the wider dataset into question.

This played to the large amount of free time I had to do this work. Generally, the site could have benefited from a different, more volunteer-centred approach of cross-checking, with more of a community to it (which might have helped with scaling the result). On the other hand, fees are now banned, so from a goal perspective it doesn't really matter.

I was obviously happy to see the research picked up as much as it was, but found it a bit interesting that while it showed up nicely in DCLG publications, Commons Library briefings tended not to mention it. There might just be a credibility filter here, but the fact that I didn't present it as typical citable research might also be a factor.

I was in part using these reports as a test of a long-form publishing platform, as opposed to the blog post and PDF that would be typical, the thinking being that it should be easier to read research on phones. I think this thinking holds up generally, but my new opinion, based on working on mySociety research, is that it needs to be both PDF and online to fit into different kinds of workflows. I've now retrospectively created PDF copies of these reports, but on future similar work I would do this from the very start. It does annoyingly increase the amount of work required to publish the results, but I suspect it's worth it.

We didn't really capitalise adequately on the good search real estate of the domain and the traffic from people looking up fees for individual agents. While the traffic wasn't big in the grand scheme of things, it was much larger than expected, and it could have been funnelled better towards getting people to take action in more general campaigns or signed up to mailing lists. This was a bit of a casualty of having less time to spend on the project as I became more properly employed, and an example of something going better than expected without the strategy being adjusted to match.

Another approach would have been to double down on reporting letting agents to councils' trading standards teams, but we had mixed success with this in the early councils. Where trading standards are under-resourced, this isn't necessarily an easy path. In aggregate, the fines from non-compliant agents across the country would easily be enough to fund a slimmed-down version of the process and a researcher to find them (and turn a profit!), but because each local authority runs an independent system, the scalability of the discovery process can't unlock that funding. Successfully getting half the maximum fine for just half of the non-compliant agents discovered would unlock £187,000, and this represents less than 10% of local authorities. But as it is, most non-compliant agents are undiscovered and un-fined, and I'm down money in hosting costs and time. Problems that can be detected entirely over the internet can allow far more effective enforcement, but only if the structure is correctly aligned.
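For what it's worth, the funding claim is easy to sanity-check. The £5,000 maximum penalty for failing to display fees is the statutory figure under the Consumer Rights Act 2015; the count of non-compliant agents below is an assumption, chosen to land near the £187,000 quoted above rather than taken from the dataset:

```python
# Back-of-envelope check of the funding claim. The £5,000 maximum fine
# is statutory; the agent count is an assumption, not a dataset figure.

MAX_FINE = 5000.0
non_compliant_found = 150  # hypothetical: agents flagged as not displaying fees

# Half the maximum fine, collected from half of those discovered:
recovered = (non_compliant_found / 2) * (MAX_FINE / 2)
print(recovered)  # 187500.0 -- roughly the £187,000 quoted above
```

Even at those deliberately pessimistic rates of enforcement, the recovered fines from 11 local authorities would cover a part-time researcher, which is the mismatch between scalable discovery and council-by-council enforcement the paragraph above describes.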

What will happen now

There’s some thinking going on about what kind of project would be useful in this space after the ban. The website itself hasn’t been seriously updated in a year and is likely to be out of date. As such:

But generally, isn't it great that no one's going to have to go looking for fees on those awful websites anymore?
