Well now,
I'm not sure whether I'm posting this in the right place, but here goes nothing.
Having been around this forum for a while now, one question keeps popping up in my head... Is it really true that the majority of people in the USA don't fit their cars with any kind of snow tires in winter? Because it sure sounds that way after reading some of the posts here, especially the ones where people are discussing rwd vs fwd. Someone actually said that fwd is better because no one has the money to switch to a different set of tires for the winter season!! (as if fwd cars on summer tires in winter do anything other than plow straight into the nearest snow bank.)
Now I'm obviously not talking about Miami residents here, but people further up in the northern regions of the country.
To me, the very notion of driving through winter on standard summer tires (or, God forbid, even low-profile ones) just sounds like pure idiocy. Here in Iceland you'd be laughed out of town for suggesting it, and our Gulf-Stream-warmed behinds enjoy far less extreme cold than the northernmost parts of the USA. :blink:
And as for the price of a set of either quality all-seasons or winter tires (or preferably, studded ones) holding people back: what you save from your laughably low gas prices would cover it in about 3 months compared with ours. Here, gas is going for close to 7 dollars a gallon. :(
But none of that matters, of course, since I'm probably way off base here. American drivers are most likely more responsible than I could ever hope to be, and this is all a big misunderstanding on my part.
Isn't it???