It's funny how Ubisoft just exported the latest videos at a higher bitrate and turned motion blur on in the settings, and suddenly everyone goes "Wow, the game looks so smooth and polished now."

And then the first thing those same people do once they've installed the game is turn motion blur right back off, because "motion blur sucks and causes motion sickness."
If you were to export the initial gameplay showcase with a higher bitrate and motion blur enabled, you'd see that the overall graphics and animations are unchanged; they weren't massively improved during the delay. The footage would simply look sharper and smoother, just like the latest gameplay clips.
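To be concrete about what "export it at a higher bitrate" means: the source footage stays the same, only the encode changes. Something like the sketch below is roughly what a higher-bitrate re-export looks like (the file names and the 50 Mbps figure are my own placeholders, not anything from Ubisoft's actual pipeline), and that alone is the difference between a macroblocked mess and a clean-looking clip.

```python
# Rough sketch of a higher-bitrate re-encode with ffmpeg.
# File names and the bitrate value are made up for illustration.
import subprocess

def reencode(src: str, dst: str, bitrate: str = "50M") -> None:
    """Re-encode a capture at a higher video bitrate."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,           # original capture
            "-c:v", "libx264",   # H.264 encode
            "-b:v", bitrate,     # target video bitrate: this is what removes
                                 # the macroblocking / "pixelated mess" look
            "-preset", "slow",   # better compression efficiency per bit
            "-c:a", "copy",      # leave the audio untouched
            dst,
        ],
        check=True,
    )

reencode("reveal_capture.mkv", "reveal_high_bitrate.mp4")
```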
I'm really excited about the game (it's my most anticipated AC since Origins), but the low-bitrate export and the decision to disable motion blur for the reveal were just two of Ubisoft's many poor choices.
"Hey team, let's reveal our game, which is actually really beautiful, detailed, and the last chance to save the company’s reputation, but let’s make the video quality so bad that it looks like a pixelated mess. This will lead people to think the game looks terrible—even worse than Origins or even than Unity, which came out 11 years ago."

As a filmmaker and video editor, I find it funny how people dislike things like 30 fps cutscenes or motion blur. Cinematic cutscenes should be rendered at 24–30 fps. Anything above that makes them look like a cheap TV show because the fluidity takes away the cinematic feel. There’s no such thing as "60 fps cinematic."
24–30 fps combined with proper motion blur is one of the core elements that make a video or movie appear "cinematic." Remember when The Hobbit: The Desolation of Smaug was marketed with "enjoy a unique 48 fps experience"? It looked absolutely wrong and horrible, at least to me.
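The reason frame rate and motion blur are tied together: film conventionally uses a 180-degree shutter, meaning each frame is exposed for half the frame interval, so the amount of blur per frame shrinks as the frame rate climbs. A quick back-of-the-envelope calculation (assuming that 180-degree convention):

```python
# Exposure time per frame under the film-standard 180-degree shutter:
# half of the frame interval, so higher fps = less blur per frame.
def shutter_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame (seconds) for a given shutter angle in degrees."""
    return (shutter_angle / 360.0) / fps

for fps in (24, 30, 48, 60):
    print(f"{fps:>2} fps -> {shutter_time(fps) * 1000:.1f} ms of blur per frame")

# 24 fps -> 20.8 ms  (the classic "cinematic" smear)
# 30 fps -> 16.7 ms
# 48 fps -> 10.4 ms  (The Hobbit's HFR: half the blur, hence the crisper, TV-ish look)
# 60 fps ->  8.3 ms
```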
People dislike motion blur mainly because they've experienced poorly implemented motion blur that smears the entire screen. However, well-done, object-based motion blur (like in RDR2 or The Last of Us) enhances the smoothness and polish of animations. Ironically, players often turn it off and then complain that the game feels choppy and poorly optimized.
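For anyone unclear on the difference: "object-based" blur works from per-pixel motion vectors, so only things that are actually moving get smeared. Here's a minimal numpy sketch of that idea (my own toy version, not RDR2's or Naughty Dog's actual code; real engines do this in a shader with depth-aware weighting): each pixel is averaged along its own motion vector, so a sprinting character blurs while the background stays crisp. Full-screen camera blur, by contrast, smears everything at once, which is the version people rightly hate.

```python
# Toy velocity-buffer ("object-based") motion blur: blur each pixel along its
# own per-pixel motion vector. Static pixels (zero velocity) are left untouched.
import numpy as np

def object_motion_blur(image: np.ndarray, velocity: np.ndarray, samples: int = 8) -> np.ndarray:
    """image: (H, W, 3) float array; velocity: (H, W, 2) motion in pixels/frame."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros_like(image)
    for i in range(samples):
        # Sample positions spread from -0.5 to +0.5 of the motion vector
        t = i / (samples - 1) - 0.5
        sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
        sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
        out += image[sy, sx]
    return out / samples
```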

TL;DR:
Many players instinctively turn off motion blur in games due to bad past experiences with it, even though well-implemented motion blur can enhance smoothness and animation quality. Ubisoft's poor showcase choices, like low bitrate and disabled motion blur, made their beautiful game look worse than it is. Cinematic cutscenes work best at 24–30 fps with motion blur, as higher frame rates ruin the cinematic feel.