I freak out about losing my car keys. Still, it always makes me mad when I am burdened by the bad decisions of others. Apparently denormals were really controversial when floating point was standardized, and, as so often happens, the wrong people won.
Don't worry about it.
The only place I can imagine this having any noticeable effect is when you repeatedly multiply a large array of numbers by 0.9999 or something. Damping object speeds seems like a place where this could happen, but if you're worried about performance, you wouldn't update physics for objects that move that slowly anyway.
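To make that concrete, here's a toy C sketch (the constants and names are mine, not from anyone upthread) showing how a repeatedly damped speed drifts into the denormal range:

```c
#include <stdio.h>
#include <float.h>

/* Damping a speed by 0.9999 per frame eventually pushes it below
 * FLT_MIN (~1.18e-38f) into the denormal range, where many CPUs
 * drop into a microcoded slow path. */
int main(void)
{
    float speed = 1.0f;
    long frame = 0;

    while (speed > 0.0f) {
        speed *= 0.9999f;
        frame++;
        if (speed < FLT_MIN) {  /* speed is now denormal (or zero) */
            printf("speed = %g went denormal after %ld frames\n",
                   speed, frame);
            break;
        }
    }
    return 0;
}
```

It takes on the order of 900,000 multiplies to get there, which is why you'd only ever notice it on values that sit near zero for a long time.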
A hundred times slower sounds awful, yeah, but since denormal numbers are so rare in practice, it's likely that your code is filled with much worse bottlenecks.
But even testing that the speed is that low would be 100x slower when the speeds are denormalized, unless you explicitly flush the value to 0.0. Maybe that would have a significant impact with a hundred or so objects. Or maybe not, but it is still annoying to have another little worry lodged in the brain.
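For illustration, a minimal sketch of that explicit flush in C. The 1e-6f threshold and the function names are made up for this example; the flush-to-zero intrinsic is the standard x86 SSE one:

```c
#include <math.h>
#include <xmmintrin.h>  /* _MM_SET_FLUSH_ZERO_MODE (x86 SSE only) */

/* Option 1: clamp tiny speeds to zero by hand. The 1e-6f cutoff is
 * arbitrary but sits far above the denormal range, so later
 * arithmetic never touches a denormal operand. */
static float damp_speed(float speed, float factor)
{
    speed *= factor;
    if (fabsf(speed) < 1e-6f)
        speed = 0.0f;  /* treat "basically stopped" as stopped */
    return speed;
}

/* Option 2: flip the SSE flush-to-zero bit so the hardware rounds
 * denormal results to zero and the per-object test goes away. */
static void enable_flush_to_zero(void)
{
    _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
}

int main(void)
{
    enable_flush_to_zero();
    float v = 1.0f;
    for (long i = 0; i < 1000000; i++)
        v = damp_speed(v, 0.9999f);
    return (int)v;  /* keep the loop from being optimized away */
}
```

The hardware route changes rounding for the whole thread rather than one value, so it trades IEEE strictness for never having to think about this again.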