For instance, I have a monitor that defaults to 60Hz but can be set to 75Hz from the AMD software.
Does it matter at all?
What that usually means is that the monitor can be driven at up to 75Hz when it's not running at its native resolution. Since you never want to run a display outside its native resolution anyway, setting it to 75Hz isn't going to do anything. The monitor will still run at 60Hz; you can verify this yourself in the display's own on-screen menu, which reports the actual signal it's receiving.
There was some talk a while back about overclocking your monitor (yes, that's a thing), and one of the advantages was a higher refresh rate. Unfortunately, on my old monitor the best I could get was 71Hz, and it looked horrible as well. I undid everything right after finding that out, since an extra 11Hz wasn't worth the trouble.
It matters to me. Transitioning from 60Hz to 144Hz (or 120Hz with LightBoost, at the moment) made games feel more fluid and responsive, with very little motion blur, while also removing tons of screen tearing without the need for vsync. It definitely makes the biggest difference in FPS games; other genres don't benefit as much, even though everything looks cleaner.