Seasonally adjusted data and the Anti-Christ
About seasonal adjustment: the good, the bad and the ugly
Seasonal adjustment can ‘distort’ data. But, for the most part, if you know how it works, you can get some idea of what sort of distortion certain circumstances might introduce. It is bad to ignore regular seasonal patterns when they occur, because you can miss short-term shifts in trend that are important and that would appear more readily if the data were adjusted. Economists use this form of data for a reason (hunker down for the criticism landslide here). If you ignore seasonal adjustment altogether, you do not SOLVE most of the problems people blame seasonal adjustment for creating. In fact, you introduce a whole new kind of error by making clumsy and stupid comparisons. Seasonal adjustment is not perfect, but it is good where the data clear the statistical hurdle for it to be applied. When it is workable, it is bad to ignore it or not to use it. Ignoring it can produce some ugly results. Yes, when you use seasonal adjustment, the ‘seasonal factor’ kicks in on everything that happens, whether it is seasonal or not, and therein lies the potential for distortion. But unless you think we are being ‘shocked’ all the time, that is a small price to pay for a more stable and meaningful data series.
Ignoring seasonal adjustment can be ugly. There are things that are 'seasonal' but that defy the calendar. For example, Easter follows the lunar calendar, not the Gregorian calendar. It pops up all over the Gregorian calendar more than Kurt Vonnegut’s Billy Pilgrim pops in and out as he gets unstuck in time. If you were to compare the 'same' calendar week to the calendar week in the prior year, just before Easter, at Easter, and just after Easter, you would get distorted garbage.
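Easter's drift can be made concrete. Here is a minimal sketch using the well-known 'anonymous Gregorian' computus (the Meeus/Jones/Butcher algorithm, a published formula and not anything specific to the statistical agencies), showing how widely the date roams:

```python
from datetime import date

def easter(year: int) -> date:
    """Gregorian Easter via the anonymous (Meeus/Jones/Butcher) computus."""
    a = year % 19
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)

# Easter wanders across more than a month of the Gregorian calendar:
dates = [easter(y) for y in range(2000, 2030)]
print(min(dates), max(dates))  # earliest and latest dates in the window
```

Any fixed "same calendar week as last year" comparison will therefore sometimes land before Easter one year and after it the next.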
Moreover, many of the people who rant about seasonally adjusted data claim that the weather is distorting the data today. The approach of looking at NSA (not seasonally adjusted) data does NOTHING to alter that. Having a warmer, colder, drier, or wetter spring, winter, summer, or fall is a factor to contend with whether data are seasonally adjusted or not. It is a problem that is independent of how the data are presented.
Also, this is a leap year. How do you account for that? What is the comparable 'week' for last year, when there was no leap day? Seasonal adjustment takes account of that; NSA data do not.
When you RESTRICT yourself to NSA data, you can only look at year-over-year trends, and you run the risk of missing the boat when trends turn. You cannot look at six-month or even three-month growth rates. And picking the proper 'analog week' for last year is not nearly as straightforward as it seems... The week that lies 52 weeks back may not be the best comparison.
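The "52 weeks back" trap is easy to demonstrate with nothing but the Python standard library. Some ISO years contain 53 weeks rather than 52, so a naive 52-week step does not always land in the analog week of the prior year (a sketch; the Dec 28 trick works because that date always falls in the last ISO week of its year):

```python
from datetime import date, timedelta

# How many ISO weeks does each year contain? Dec 28 always falls in the
# last ISO week of its year, so its week number gives the count.
for year in (2019, 2020, 2021):
    print(year, date(year, 12, 28).isocalendar()[1])
# 2020 has 53 ISO weeks; 2019 and 2021 have 52.

# Stepping back 52 weeks from ISO week 1 of 2021 lands in ISO week 2 of
# 2020, not week 1, because 2020 had 53 weeks:
d = date(2021, 1, 4)  # a Monday in ISO week 1 of 2021
print((d - timedelta(weeks=52)).isocalendar()[:2])
```

So even the mechanical step of choosing last year's comparison week needs calendar logic that seasonal adjustment routines already handle.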
There are 14 different 'types' of years: a year can start on any of the seven days of the week, and each of those can be a leap year or not (7 × 2 = 14). Then we can multiply that by the holidays that move around... to create many, many more ‘types’ of years. Easter alone roams over a very wide range (look it up; there is a good internet site on Easter and how it aligns with the Gregorian calendar; it’s a mess!)
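The 14-types claim is easy to verify: over a full 400-year Gregorian cycle, every combination of starting weekday and leap-year status actually occurs. A quick check with the standard library:

```python
from datetime import date
import calendar

# Classify each year by (weekday of Jan 1, is-leap): 7 x 2 = 14 possible
# calendar 'shapes'. A 400-year span is one full Gregorian cycle.
types = {(date(y, 1, 1).weekday(), calendar.isleap(y))
         for y in range(1900, 2300)}
print(len(types))  # 14
```

And that is before the moving holidays multiply the count further.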
Seasonal adjustment programs have their problems; but we know what the methods are and can think about how the process might distort the data under certain circumstances.
Ignoring seasonal adjustment takes us back to the Stone Age (stoned age, dude?).
No thank you.
There is nothing highbrow or insightful about eschewing seasonally adjusted data. Zinging seasonal adjustment methodology because it is not perfect is throwing the baby out with the bathwater. Many people love to be critical of what they cannot, or refuse to, understand.
Don't be one of those. Of course, if you do not LIKE or do not WANT TO ACCEPT the message in the ‘SA’ data, the NSA data may provide a refuge for the naysayer. And I suppose that is its true purpose, more than people actually thinking seasonal adjustment is done by the anti-Christ or something.