If you’ve worked with radios or other high-frequency circuits, you’ve probably noticed the prevalence of 50 ohm coax. Sure, you sometimes see 75 ohm coax, but overwhelmingly, RF circuits work at 50 ohms.
[Microwaves 101] has an interesting article about how this became the ubiquitous match. Apparently, in the 1930s, radio transmitters were pushing toward higher power levels. You'd generally assume that thicker conductors mean less loss. For coax cable carrying RF, though, it's a bit more complicated.
First, RF signals exhibit the skin effect: at high frequencies, current crowds into a thin layer at the conductor's surface instead of using the full cross-section. Second, the dielectric material (that is, the insulator between the inner and outer conductors) contributes loss of its own. The characteristic impedance, meanwhile, depends on that dielectric and on the ratio of the outer conductor's inner diameter to the center conductor's diameter.
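The dependence on geometry and dielectric is captured by the standard coax impedance formula, Z0 = (60/√εr)·ln(D/d). As a quick sketch (the function name and the 3.6:1 example ratio are just illustrative choices):

```python
import math

def coax_impedance(D, d, er=1.0):
    """Characteristic impedance of a coaxial line.

    D  -- inner diameter of the outer conductor
    d  -- diameter of the center conductor
    er -- relative permittivity of the dielectric (1.0 for air)
    """
    return (60.0 / math.sqrt(er)) * math.log(D / d)

# An air-dielectric line with roughly a 3.6:1 diameter ratio
# lands near the 77 ohm loss optimum discussed below:
print(round(coax_impedance(3.6, 1.0), 1))
```

Note that only the ratio D/d matters, not the absolute size — which is why the same impedance shows up in cables from tiny semi-rigid lines to thick hardline.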
When you put all this together, you learn that cable loss is minimized at 77 ohms for an air dielectric. That's not 50 ohms, of course, but it is close to the 75 ohms used to carry weak antenna signals in TV systems. According to [Microwaves 101], though, that near-match isn't the reason; the TV industry's 75 ohm figure is related to using cheap steel for the center conductor instead of copper.
Transmitting is a different problem: there you want to handle as much power as you can, and the peak power handling of the same cable also varies with impedance. An air-dielectric cable whose loss is lowest at 77 ohms handles maximum peak power at 30 ohms. The mean between 30 and 77 ohms is 53.5 ohms.
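You can recover both optima numerically from well-known figures of merit for an air line with fixed outer diameter: conductor loss scales as (1 + x)/ln(x) and breakdown-limited peak power as ln(x)/x², where x = D/d. A rough sketch (the coarse grid search is just for illustration):

```python
import math

# Sweep the diameter ratio x = D/d for an air-dielectric cable
# with a fixed outer diameter.
xs = [1.1 + 0.001 * i for i in range(9000)]

# Conductor loss per unit length scales as (1 + x) / ln(x); minimize it.
x_loss = min(xs, key=lambda x: (1 + x) / math.log(x))

# Breakdown-limited peak power scales as ln(x) / x**2; maximize it.
x_power = max(xs, key=lambda x: math.log(x) / x ** 2)

z = lambda x: 60.0 * math.log(x)  # air-dielectric impedance for ratio x
print(round(z(x_loss)), round(z(x_power)))  # ≈ 77 and 30 ohms
```

Splitting the difference between those two optima is exactly the 53.5 ohm compromise the article describes, which in practice got rounded to 50.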
If you want to learn more about RF design, you could do worse than watch [Michael Ossmann's] workshop (see below). If you are more interested in coax terminations, we've got you covered there, too.
via radio hacks – Hackaday http://ift.tt/29siUJN