There was an interesting item on the Today Programme this morning - the BBC's Mark Easton and Sir Andrew Foster presented new research from the 2020 Public Services Trust which they claim reveals the public's true attitude to public services, localism and public spending.
There were a number of key claims made:
- People are happy with the way public services are currently performing.
- The public "don't want massive change".
- People "don't want to be any more involved in running" services.
- The public don't believe spending on public services needs to be cut.
The problem is, no proof of any sort has been produced for these findings - and the hints about how they were produced are sufficient in themselves to raise serious doubts about their reliability.
Given that this is a key topic for the TPA, as soon as I got to my desk this morning I went to the 2020 Trust's website to read the full report. Unfortunately, all that is available there is a blogpost presenting their chosen headline findings, with no methodology or full report. Apparently we're supposed to just take the Trust's word for all of this as fact for the next few weeks: "The full report will be published in early May."
This is quite remarkable in itself - even people who disagree with the conclusions of the TPA's factual reports cannot deny that we publish all of our data when we publish them. That applies even more strongly to reports like this one which purport to represent public opinion - withholding the evidence is the equivalent of Brass Eye's famous "there's no actual scientific evidence, but it is a fact" line.
But the blogpost does provide a few snippets that suggest serious flaws in their methodology.
For a start, this isn't a survey of public opinion at all - it's the subjective findings of a series of focus groups. Don't get me wrong, focus groups can be useful. They are a particularly good way to test political and campaign messages on tightly focused demographics. What they emphatically do not provide is an accurate representation of public opinion. Anyone who has run a focus group can tell you how easily one strong character can sway a whole group's discussion. Indeed, Ipsos MORI, who carried out the research (and who, I'm sure from experience, will have done a good job), have a handy examination of the weaknesses of focus groups available here.
A typical opinion poll would talk to 1,000 people - or maybe 500 at a pinch, though that would start to raise questions of accuracy. The 2020 Trust say they have based these findings on 13 focus groups in 5 towns. Assuming 10 people per group, which would be fairly typical, that is around 130 people.
It gets worse. According to Sir Andrew Foster of the 2020 Trust on Today, the focus groups were deliberately weighted towards "heavy users" of public services. This means the findings not only come from a very small group of people, but those people were specifically chosen to have an abnormally close relationship with public services. Someone who requires daily medical care or who is reliant on benefits is inevitably going to have a different view from someone who has been a working taxpayer all their life and gets very little back in return from the state.
All things considered, rather than titling the report "The disconnect between voters and politicians on public services" it might more appropriately be called "The disconnect between intentionally biased focus groups and actual public opinion."
That methodology puts the findings in a very, very different light. On localism, for example, it explains why the focus groups supported the idea of individual budgets (which would allow the people in the focus group to control spending) but were sceptical of the involvement of the electorate as a whole (which would allow people other than themselves to be involved).
The findings on public spending are even more vague. On Radio 4, percentage figures were cited that "only 24% of the public believe that spending on public services needs to be cut". The 2020 Trust website contains no information at all about these numbers, but they appear to be based on opinion polling rather than the focus groups.
If so, then the 2020 Trust is not going to be able to keep its findings under wraps until "early May". According to British Polling Council rules, if you publish any poll findings in the media then you have to publish the full data from your poll within two working days.
In 48 hours, therefore, we should know more, assuming 2020 have thought their media strategy through. Even at this stage, though, you can see some problems. Judging from this morning's discussion, the poll offered people a false choice: whether savings should be made by cutting spending "on public services" or through "efficiency savings".
Given that choice, it is no surprise that most people went for the latter - but that doesn't justify the sneering comments this morning that this showed the public were suffering from "startling optimism" and needed "educating". In reality, people probably mean back office, waste and quango cuts when they tick the "efficiency savings" box, while they assume cutting "public services" means slashing the front line.
So what lies behind a headline? Not much, it seems. Try as they might the 2020 Trust have not in fact produced evidence for public opposition to cuts or to localism. Frankly, it is a mystery why they have gone to such lengths to overplay what little evidence they do have. Oh, did I mention that they're funded by the Department of Communities and Local Government?