I did an analysis of the CATV system in my home when I first got HDTV, and was astonished to discover I was splitting my signal down to a little over 1% of its original strength by the time it reached the box. All the splits really add up...

A 5-way splitter at the drop: four outputs to the four TVs in the old house, the fifth to another 4-way splitter in the addition. One of those outputs then goes to a 2-way splitter to feed the DVR and TV inputs, and the DVR has an internal 2-way splitter for its dual tuners.

20% x 25% x 50% x 50% = 1.25% per tuner in the DVR, and both tuners had crappy signals and a horrible picture that would constantly break up on several channels and never work at all on a few others. What the DVR DID have, though, was an excellent signal meter! I added a 15 dB amp, which increased signal strength but didn't solve the problem, because the problem was never signal strength (despite the losses), it was signal-to-noise ratio, and the noise remained. I troubleshot all the connections and the other lines, but the difference was negligible. (Even with the cable disconnected, I was getting a clear picture on several channels on one TV; there was just that much EMI bouncing around.)

The additional grounding point the amp offered is what finally fixed it: grounded the amp, and BAM, every analog TV in the house cleared up considerably, and the error count at the DVR dropped to zero. Problem solved! My analog pictures are still quite lousy, though. I blame my neighbors; gotta be their fault, probably using a coat hanger instead of RG6 or something.
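If you want to sanity-check the math, here's a quick back-of-the-envelope sketch of the cascade loss, assuming ideal, lossless power splits (real splitters have a bit more insertion loss per port, so the actual numbers would be slightly worse):

```python
import math

# 5-way at the drop, 4-way in the addition, 2-way at the DVR input, 2-way inside the DVR
splits = [5, 4, 2, 2]

fraction = 1.0
for ways in splits:
    fraction /= ways  # each output port gets roughly 1/N of the input power

loss_db = -10 * math.log10(fraction)

print(f"Signal reaching each DVR tuner: {fraction:.2%}")  # ~1.25%
print(f"Total cascade loss: {loss_db:.1f} dB")            # ~19 dB
```

That ~19 dB of cascade loss also shows why the 15 dB amp alone couldn't make up for all the splits, not that raw strength was the real issue anyway.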

So... ah... don't mind the haters, just use a splitter in the attic. ;)