I disagree with that, because on a retina display you can't tell the difference between something that's anti-aliased and something that's not, since you can't make out the individual pixels. But with HFR, you will always be able to tell the difference between something with motion blur and something without. I'm just saying the two aren't comparable.
That's exactly what I'm saying, and it's perfectly comparable. As Schoq alluded to earlier, anti-aliasing is spatial interpolation and motion blur is temporal interpolation. How exactly would you tell the difference between something with and without motion blur at an extremely high frame rate? For a simple example, suppose an object moves 10 pixels in one frame at a low frame rate, so it's motion-blurred across those 10 pixels to simulate a higher frame rate. If instead the frame rate is multiplied by 10 and the object moves discretely by one pixel per frame, essentially the same information reaches your eyes over that period of time, just in 10 increments instead of all at once.
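To put numbers on that, here's a minimal sketch (Python with NumPy; the 1-D "screen", the WIDTH and STEPS values, and the single unit of light are all made up for illustration, not anything from the posts above) comparing the two cases: one motion-blurred low-frame-rate frame versus ten sharp high-frame-rate frames. Since the eye integrates light over time, compare the time-averaged signal each pixel delivers across the same window:

```python
import numpy as np

WIDTH = 16   # hypothetical 1-D screen width, in pixels
STEPS = 10   # the object crosses 10 pixels during one low-frame-rate frame

# Low frame rate + motion blur: one frame, with the object's single unit
# of light smeared evenly across the 10 pixels it traversed.
blurred = np.zeros(WIDTH)
blurred[:STEPS] = 1.0 / STEPS

# 10x frame rate, no blur: ten short frames, the object sharp at one
# pixel per frame. Each frame is displayed for 1/10 of the time window.
sharp = np.zeros((STEPS, WIDTH))
for i in range(STEPS):
    sharp[i, i] = 1.0

# Time-averaged signal per pixel over the same window is identical.
print(np.allclose(blurred, sharp.mean(axis=0)))  # -> True
```

In this toy model, both cases deposit 1/10 of the object's light on each of the 10 pixels over the window; the only difference is whether it arrives all at once or in 10 increments.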
Now, that's not to say you couldn't extrapolate motion blur so that information from more than two adjacent frames is represented in a single frame, just as you could extrapolate anti-aliasing so that a pixel's color bleeds farther in one direction than just into the pixel directly next to it...but that would be a weird stylistic choice that would make things blurrier than necessary.
As you say, you can't see anti-aliasing on a retina display, since you can't make out individual pixels. With a high enough frame rate (which 48 fps is not), you won't be able to make out individual frames. Same deal. How do you see those two as fundamentally different?