
A Tale of Two Currents: AC vs. DC

Difference Between Alternating Current (AC) and Direct Current (DC)


Ever wonder why you need a wall adapter to charge your phone while your toaster plugs directly into the wall? The answer lies in the difference between the alternating current (AC) supplied to your house and the direct current (DC) required to power an electronic device. Understanding the difference between AC and DC is important for anyone looking to build their own digital electronic devices.

 

Tesla vs. Edison: A Battle for the Future of America’s Electrical Grid

At this point you may be wondering: if digital devices require DC, why do we still live in a world where every outlet supplies AC power? The answer traces back to a battle between the technologies of two famous inventors: Nikola Tesla and Thomas Edison.

 

In the late 1880s the world was at a crossroads. Edison had set up 121 DC power stations across the United States, while Ganz Works had just electrified all of Rome with AC power lines. Both technologies seemed viable until Westinghouse Electric, with the help of Tesla's AC power patents, greatly improved the AC distribution system. The DC power stations of the time could only supply power within a one-mile radius, but the new AC power stations could distribute power over long distances thanks to the transformer, a device that can readily step the voltage of an AC line up or down. For a fixed amount of delivered power (P = V × I), a higher voltage means a lower current, and since resistive loss in a wire grows with the square of the current (I² × R), high-voltage lines waste far less power as heat over long distances. By the time long-distance high-voltage direct current (HVDC) became viable, the world had already largely switched to an AC grid. Interestingly, while DC voltage-conversion equipment is still too costly at the distance scale of our basic electrical grid, HVDC becomes more efficient over extremely long runs, such as the underwater power cables between countries.
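To see why transmission voltage matters so much, here's a quick back-of-the-envelope sketch in Python. The line resistance and power figures are made-up round numbers for illustration, not data from this article; the point is the I²R relationship:

```python
# A rough sketch (made-up numbers) of why raising the transmission voltage
# slashes resistive line loss for the same amount of delivered power.

LINE_RESISTANCE_OHMS = 5.0       # assumed total resistance of the line
POWER_DELIVERED_W = 1_000_000    # assumed load: 1 MW delivered to customers

for volts in (2_000, 120_000):   # a low-voltage line vs. a high-voltage line
    current = POWER_DELIVERED_W / volts           # P = V * I  =>  I = P / V
    loss_w = current ** 2 * LINE_RESISTANCE_OHMS  # resistive loss: I^2 * R
    print(f"{volts:>7} V: current {current:7.1f} A, "
          f"line loss {loss_w / 1000:8.1f} kW")
```

With these assumed numbers, the low-voltage line would burn off more power as heat than it actually delivers, while the high-voltage line loses well under a kilowatt.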

 

What is Direct Current?

DC is the unidirectional flow of electric charge. TVs, smartphones, laptops, and other digital devices all run off DC power. If you remember the water-tank analogy from school, picture a simple tank with a hose near the bottom: water flows out of the hose in one direction until the tank is empty. This is analogous to a DC power supply such as a battery, which outputs current in one direction until its capacity is depleted.

 

Common DC generation techniques include:

 

  • Batteries, which store energy electrochemically.
  • Photovoltaic cells, which convert sunlight into DC.
  • Devices such as commutators and rectifiers, which convert AC into DC (see the sketch after this list).
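As a rough illustration of that last item, here's a minimal Python sketch of an ideal full-wave rectifier, which simply flips the negative half of the AC waveform (v_out = |v_in|). Real rectifiers also drop a small voltage across their diodes and need a smoothing capacitor to flatten the output, both of which this sketch ignores:

```python
# A minimal sketch of ideal full-wave rectification: the rectifier flips
# the negative half of the AC waveform, so v_out(t) = |v_in(t)|.
# The 170 V peak / 60 Hz values match the US mains example later on.
import math

PEAK_V = 170.0
FREQ_HZ = 60.0

for step in range(9):                  # one full 60 Hz cycle (~16.7 ms)
    t = step * 2e-3                    # one sample every 2 milliseconds
    v_in = PEAK_V * math.sin(2 * math.pi * FREQ_HZ * t)
    v_out = abs(v_in)                  # ideal full-wave rectifier
    print(f"t={t * 1000:4.0f} ms  v_in={v_in:7.1f} V  v_out={v_out:6.1f} V")
```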

 

DC is the preferred current for powering digital devices because the direction of charge flow (and therefore the polarity of the voltage) remains constant, making it easier to work with. Many components in digital electronics, such as transistors, are polarized (they have positive and negative poles). Since direction matters in these components, a DC bias is necessary for them to function properly. Furthermore, DC is more efficient for low-power devices and provides a constant voltage level.

 

What is Alternating Current?

AC is a bit more complicated than DC because the direction of the charge flow and the polarity of the voltage periodically reverse. The primary application of AC is distributing power over our electrical grids; your toaster, dryer, and refrigerator can all run directly off AC power.

 

The textbook water analogy for AC is a closed loop of pipes connected to a pump wheel that drives the flow of water. The pump rapidly reverses the direction of the flow, causing the water to slosh back and forth within the pipes. The primary means of producing AC is a specialized electric generator called an alternator, which generates AC by rotating a magnetic field around a set of stationary wire coils (an application of Faraday's law of electromagnetic induction).
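To make Faraday's law concrete, here's a minimal Python sketch of the EMF an idealized alternator produces. The coil parameters (turns, field strength, area) are made-up values for illustration; the sinusoidal shape and the sign reversal every half cycle are the point:

```python
# A minimal sketch, with made-up coil parameters, of the EMF an idealized
# alternator produces. By Faraday's law, a coil of N turns and area A
# rotating at angular speed w in a uniform field B induces
#     e(t) = N * B * A * w * sin(w * t)
import math

N_TURNS = 100      # turns of wire in the coil (assumed)
FIELD_T = 0.1      # magnetic flux density in teslas (assumed)
AREA_M2 = 0.02     # coil area in square meters (assumed)
FREQ_HZ = 60.0     # rotation frequency, giving a 60 Hz output

omega = 2 * math.pi * FREQ_HZ

def emf(t: float) -> float:
    """Instantaneous EMF (volts) of the rotating coil at time t (seconds)."""
    return N_TURNS * FIELD_T * AREA_M2 * omega * math.sin(omega * t)

print(f"peak EMF ~ {N_TURNS * FIELD_T * AREA_M2 * omega:.1f} V")  # ~75.4 V
for step in range(5):                     # quarter-cycle samples
    t = step / (4 * FREQ_HZ)
    print(f"t = {t * 1000:5.2f} ms  e = {emf(t):6.1f} V")
```

Note how the EMF passes through zero and reverses sign twice per rotation; that alternation is exactly what gives AC its name.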

 

To formally describe AC it's necessary to think in terms of waveforms, and the sine wave is the most common waveform used to describe AC circuits. For example, the standard mains power supply in the United States is 120 V RMS at a frequency of 60 Hz. The 60 Hz means the voltage completes 60 full cycles, swinging between −170 V and +170 V, every second. If you were to plot your mains voltage over time with an oscilloscope (WARNING: do not actually attempt to measure a mains outlet with an oscilloscope; this is dangerous!), you would see a sine wave with peaks close to +170 V and troughs close to −170 V, depending on the quality of your power lines. Why 170 V? Because the 120 V figure used to rate US power outlets is actually the RMS (root mean square) of the 170 V peak AC waveform:

 

V(t) = 170 sin(2π · 60 · t)

 

RMS is used to make AC and DC easy to compare: for a pure sine wave, V_rms = V_peak / √2 ≈ 170 / 1.414 ≈ 120 V, and a 120 V RMS AC supply produces the same heating effect in a resistive load as a 120 V DC supply.
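As a sanity check, here's a short Python sketch that samples one full cycle of the waveform above and computes the RMS numerically (assuming a pure, undistorted sine wave):

```python
# A short sanity check: sample one full cycle of V(t) = 170 sin(2*pi*60*t)
# and compute the root mean square numerically. For a pure sine wave the
# result should equal V_peak / sqrt(2), i.e. the familiar 120 V rating.
import math

PEAK_V = 170.0
FREQ_HZ = 60.0
SAMPLES = 10_000
PERIOD_S = 1.0 / FREQ_HZ                     # one full cycle (~16.7 ms)

voltages = [PEAK_V * math.sin(2 * math.pi * FREQ_HZ * k * PERIOD_S / SAMPLES)
            for k in range(SAMPLES)]

rms = math.sqrt(sum(v * v for v in voltages) / SAMPLES)

print(f"numeric RMS:   {rms:.2f} V")                      # ~120.21 V
print(f"170 / sqrt(2): {PEAK_V / math.sqrt(2):.2f} V")    # 120.21 V
```

Both lines print about 120.21 V, matching the nominal 120 V outlet rating.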

 

Summary

In a DC circuit, the flow of electric charge is unidirectional and the voltage is constant. In an AC circuit, the flow of electric charge reverses periodically, and the voltage reverses along with it. Now that you know the difference between AC and DC, you'll understand why all your electronics must compete for adapter real estate on your power strip. More importantly, you'll have the solid foundation you need to tackle more advanced topics, such as isolating AC and DC signals in your PCB circuits for noise reduction. Even though electronic devices are powered by DC, it's important to remember that there are many cases where AC components may need to be used within a circuit.