Doesn't matter.
Why? Because in today's games, at 1080p, the discrepancy between PCIe 5.0 and 3.0 is no more than 6 percent (and the OP might have a 4.0 board, which makes it under 1% performance loss), and it only loses in games where it would have lost to an AMD card anyway. Also, game system requirements don't stand still, especially with the new generation of consoles getting so close to release. Today's 4K is tomorrow's 1440p and next week's 1080p.
A loss like that, 5-6%, is de facto impossible to tell in a blind test. And in the games of tomorrow there will be no such loss at all, because you'll run out of raw GPU speed long before the bus becomes the bottleneck.
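If you want to sanity-check why that's imperceptible, here's a minimal back-of-the-envelope sketch (the 60 fps baseline is an assumed example figure, not anyone's benchmark result):

```python
# Back-of-the-envelope: what a ~5% fps drop means in frame time.
# baseline_fps = 60 is an assumed example figure, not a measured number.
baseline_fps = 60.0
loss = 0.05  # the ~5% PCIe 3.0 penalty discussed above

slower_fps = baseline_fps * (1 - loss)                # 57.0 fps
delta_ms = 1000 / slower_fps - 1000 / baseline_fps    # extra time per frame

print(f"{baseline_fps:.1f} fps -> {slower_fps:.1f} fps")
print(f"frame time grows by {delta_ms:.2f} ms")       # ~0.88 ms per frame
```

Under a millisecond per frame. Nobody is picking that out in a blind test.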
Why do I still recommend a 5060 Ti over a 9060 XT when they're effectively the same performance? Because the list of games where FSR4 is present is obscenely short. Even if we include FSR 3.1 titles, availability is still A LOT worse than DLSS 4's. I'm not talking frame gen, I'm talking actual upscaling. And because RDNA4 handles VRAM leaks worse, which shows in Spider-Man 2, for example, where the 5060 Ti beats the 9060 XT precisely because the latter runs out of memory despite both cards having the same amount of VRAM.