For Cat-5 carrying 100 Mb/s Ethernet (100BASE-TX), the MLT-3 line coding at 125 MBd puts the worst-case fundamental at 31.25 MHz, which has a free-space wavelength of about 9.6 meters. Gigabit Ethernet on Cat-5e also signals at 125 MBd per pair (PAM-5), for a fundamental of 62.5 MHz and a free-space wavelength of 4.8 meters. I don't see how a slight impedance change over a distance of 3-4 millimeters is likely to have much effect on signals that slow. Has anybody set this up and looked at an eye plot? I doubt that you're going to see any changes in the eyes, but I can't say for sure.
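To put a rough number on the intuition above, here's a quick back-of-envelope sketch (mine, not from the original post) comparing a 4 mm discontinuity against the 62.5 MHz gigabit fundamental. It uses free-space wavelength for simplicity; in real twisted pair the velocity factor (~0.6-0.7) makes the wavelength shorter, but not by enough to change the conclusion.

```python
# Back-of-envelope: how big is a 3-4 mm impedance bump relative
# to the signal wavelength? (Free-space wavelength; the in-cable
# wavelength is shorter by the cable's velocity factor, ~0.6-0.7.)

C = 3.0e8  # speed of light, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in meters for a given frequency."""
    return C / freq_hz

f = 62.5e6          # gigabit Ethernet fundamental, Hz
bump = 0.004        # 4 mm discontinuity, in meters

lam = wavelength_m(f)
fraction = bump / lam

print(f"Wavelength at {f/1e6:.1f} MHz: {lam:.2f} m")
print(f"4 mm bump is {fraction*100:.3f}% of a wavelength")
# The bump is well under 0.1% of a wavelength -- electrically tiny,
# which is why little effect on the eye diagram is expected.
```

Even with a velocity factor applied, the discontinuity stays far below the few-percent-of-a-wavelength scale where reflections usually start to matter.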