There were 491 new HIV diagnoses reported in Ireland in 2015. That’s more than in any previous year and an increase of about 30% from the number reported in 2014.
That’s a significant and worrying rise in new diagnoses, urgently deserving of a serious response.
But for the Irish Times that unprecedented 30% rise wasn’t dramatic enough. Citing preliminary figures from the first few months of 2016, the IT reported an even more alarming 75% increase in reported diagnoses compared to last year!
The first appearance of this figure was a headline in early April that declared “HIV cases up by 75% this year, HSE report finds.” Another IT article from mid-June repeated the claim of “an increase of 75 per cent,” as did yet another article just yesterday.
This figure has been echoed uncritically by other sources, as in a June 24 post at theoutmost.com (the web presence of Dublin’s monthly gay lifestyle magazine GCN) that reported that new diagnoses had “increased by 75% in the first three months of this year alone.”
Other outlets made similar claims, though phrased differently. TheJournal.ie breathlessly reported in May that new diagnoses “almost doubled in the first four months of 2016,” although they seemed confused about the numbers, describing an increase from 106 to 175 as “about 40%” (it’s actually 65%). The Irish Examiner at least got the math right with the same figures.
So is it true? Are this year’s figures showing an increase more than twice the already huge increase from last year?
In a word, no.
Let’s start by figuring out where this 75% figure came from in the first place.
This is how Paul Cullen, author of the initial report in the Irish Times, explained his calculations:
There were 128 cases of HIV in the first three months of the year, compared with 73 in the same period last year, according to the latest weekly report. This is an increase of 75 per cent.
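The arithmetic behind these figures is easy to check. Here's a minimal sketch in Python of the same year-over-year calculation, using the numbers quoted above and TheJournal.ie's figures (the function name is mine, just for illustration):

```python
def pct_change(old, new):
    """Relative increase from old to new, as a rounded whole-number percentage."""
    return round(100 * (new - old) / old)

# Q1 2015 vs Q1 2016 notifications, per the HPSC weekly report
print(pct_change(73, 128))   # → 75

# TheJournal.ie's figures: 106 vs 175 is a 65% increase, not "about 40%"
print(pct_change(106, 175))  # → 65
```

So the calculation itself is straightforward; the problem, as we'll see, is what's being compared.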
What’s the “latest weekly report”? Well, the Irish Health Protection Surveillance Centre (HPSC) is the agency that keeps track of reports of diagnoses of a range of infectious diseases including HIV and other STIs. They provide a weekly report of notifications of HIV & STI diagnoses which gives a rough idea (because numbers are preliminary) of the trends for the current year.
One of the figures that is listed in these reports is the increase or decrease in notifications for each disease compared to the same period last year. Now there’s probably a good reason for doing this, though I’m not sure exactly what it is, because unfortunately this comparison of “the same period last year” invites exactly the sort of careless and misleading calculations that occurred in the example above.
If you’re going to compare quarters you don’t compare the current quarter to one a year ago, you compare it to the immediately previous quarter. Or you look at a trend over several successive quarters.
Here’s a chart from the most recent weekly report which shows notifications by month over the last 4 years:
Look at the last 3 months of 2015 (circled in blue) and you’ll notice that (if you account for the dips and spikes) the rate of notifications is pretty darn close to the first 3 months of 2016 (circled in red). Which isn’t at all surprising since one follows the other. They are both among the quarters with the highest number of notifications in the last 4 years.
Now look at the first 3 months of 2015 (circled in green). That quarter has one of the lowest numbers of notifications in the last 4 years.
It’s no surprise that comparing quarters which are among the highest and lowest of recent years will reveal a dramatic difference. But it’s not an “increase” that’s meaningful. Headlining your back-of-the-envelope calculations as something that comes from an “HSE report” is really getting into tabloid levels of distortion.
It would be completely accurate to say that the rate of notifications remains high going into 2016. If the rate remains that high it’s likely that there will be yet another increase in total new diagnoses at the end of the year. But a 75% increase? 860 new HIV diagnoses at year’s end? Based on current trends that’s simply not realistic.
Even worse is the way that this misleading figure keeps reappearing in articles more than 2 months later. Heck, if you’re going to do this kind of ignorant calculation, the least you can do is update it as more data becomes available.
Let’s check the most recent figures available, again from the week 25 report which covers notifications up to June 25. That report shows 270 notifications so far in 2016, compared to 202 in 2015. That’s still more than the same period in the previous year (remember the chart above: the first half of 2015 had a much lower rate of notifications than the second half and this year started off with a high rate of notifications).
But here’s the thing: it’s only 34% more, not 75% more. It’s still misleading, it’s still not a meaningful indicator of what the current trends are, but at least it reflects some attempt to refer to more current information.
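Running the same naive year-over-year comparison on the week 25 figures shows where that smaller number comes from:

```python
def pct_change(old, new):
    """Relative increase from old to new, as a rounded whole-number percentage."""
    return round(100 * (new - old) / old)

# Cumulative notifications to week 25: 202 (2015) vs 270 (2016)
print(pct_change(202, 270))  # → 34
```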
Indeed, the Irish Examiner managed to find more current data for a June 15 story which noted that “[t]he provisional data means there has been an increase of 38% compared with the first 22 weeks of 2015.” So why does the Irish Times keep referring to a supposed 75% increase when, even by its own erroneous method, it’s clearly out of date?
As an advocate I know there are times when it’s useful to present statistics in a way that emphasises particular things. If an absolute increase isn’t very big you might choose to describe a relative increase instead.
The thing is, when you start to make claims which turn out not to be true, you lose credibility. If you don’t understand the data you’re working with and try to make it show something that isn’t there, people don’t take you seriously.
That is, if anyone was actually paying attention in the first place.