Beginning Electronics

This tutorial will attempt to explain the very basics of electricity. There are two fundamental units that you are almost guaranteed to have heard of: voltage and amperage (also called current). We will go over exactly what these are and how they relate to a third unit, watts. Building an intuitive understanding of these units and how they interact is an incredibly important foundation for just about every future tutorial, so take your time. It took me about a year to build that intuition, so don't be discouraged if things don't make sense at first.

Voltage
Voltage is the driving force that makes any circuit work. It is defined as the electrical potential difference between two points. But what does that mean? Electrical potential can be thought of as pressure, and pressure, like voltage, is always measured relative to something else. When you measure the pressure of your bike's tires, you are really measuring the difference between the pressure inside the tire and the atmosphere outside. Likewise, a AA battery's 1.5V rating means that between its two terminals there is a difference of 1.5V of electrical potential.

Most electrical devices have a voltage rating, and you need to supply them with the proper voltage for them to operate. With too little voltage, there just won't be enough force to make them work properly. With too much voltage, you will likely damage the component. It would be like using a sledgehammer to ring someone's doorbell: the doorbell is designed for the force of a single finger, and the force of a sledgehammer will obliterate it.

Amperage
Amperage, also known as current, is the measure of how much charge is actually moving. The unit of current is the ampere, usually shortened to amp, and one amp is equal to one coulomb of charge per second. A coulomb is roughly 6.24 x 10^18 (over six quintillion) electrons! That means if something is drawing 1A of current, about 6.24 x 10^18 electrons are moving past any fixed point in the wire every second.
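To get a feel for that scale, here is a quick sketch in Python. The only outside fact it relies on is the charge of a single electron, about 1.602 x 10^-19 coulombs, which is a well-established physical constant:

```python
# One ampere is defined as one coulomb of charge passing a point per second.
ELECTRON_CHARGE = 1.602176634e-19  # coulombs per electron (physical constant)

current_amps = 1.0
electrons_per_second = current_amps / ELECTRON_CHARGE
print(f"{electrons_per_second:.3e} electrons per second")  # about 6.242e+18
```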

It is often convenient to compare electricity to water: pipes are like wires, pressure is like voltage, and flow rate is like amperage. While voltage and amperage are related within a given circuit, there is no universal relation between them. Just as you can have a small amount of water pushing very hard (think of a water gun, or partially covering the end of a hose with your thumb), you can have a large amount of water barely pushing at all (think of a wide, deep, slow-moving river). Electrical equivalents of these examples would be a static discharge, which is very high voltage but very low current, and an electric stove, which is relatively low voltage but high current.

Most small electronic devices don't require a lot of current and typically specify their requirements in milliamps (mA) instead of amps. A milliamp is simply 1/1000th of an amp. For example, a typical LED requires somewhere around 20-35mA.

It is impossible to provide too many amps to a circuit as long as the voltage is correct; a circuit only draws the current it needs. Think of the faucet in your bathroom: you can easily use the handle to control how much water comes out, even though the water supply to your house is capable of supplying far more water than the faucet could ever let out.

This is something that confused me for quite a while, and what added to my confusion is that power supplies are rated with both voltage and amperage values. For example, a typical phone charger will be rated for 5V and 1A. But what if your phone only needs 500mA (0.5A)? This is totally fine! The phone will only draw 500mA, and the charger simply won't operate at its peak capacity. The amperage rating is the maximum current the supply can deliver, not the current it must deliver.

You will run into trouble, however, if you try to power something that requires more current than the power supply can deliver. The voltage may drop, causing your device to malfunction, and/or the power supply can overheat and break.
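The matching rule from the last two paragraphs can be sketched as a tiny check. The function name and the example ratings here are made up for illustration:

```python
def supply_is_adequate(supply_volts, supply_max_amps, device_volts, device_amps):
    """A supply must match the device's required voltage and be able
    to deliver at least as much current as the device draws."""
    return supply_volts == device_volts and supply_max_amps >= device_amps

# A 5V 1A charger with a phone drawing 500mA: the charger just runs
# below its peak capacity, which is fine.
print(supply_is_adequate(5.0, 1.0, 5.0, 0.5))  # True

# The same charger with a device that draws 2A: not enough current.
print(supply_is_adequate(5.0, 1.0, 5.0, 2.0))  # False
```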

Wattage
Wattage is the unit used to measure power. People often incorrectly call current or voltage "power," but really it is the voltage multiplied by the current that dictates the power used. The equation is watts = volts * amps. This means that a circuit powered at 1V that draws 1A consumes 1W. Another circuit powered at 5V that draws only 200mA consumes the same amount of power: 5V * 0.2A = 1W.
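The equation is simple enough to check directly. A minimal sketch, using the two examples above:

```python
def power_watts(volts, amps):
    # Power (watts) = voltage (volts) * current (amps)
    return volts * amps

print(power_watts(1.0, 1.0))  # 1.0  -- 1V at 1A is 1W
print(power_watts(5.0, 0.2))  # 1.0  -- 5V at 200mA is also 1W
```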

There are types of circuits that can manipulate the voltage/current ratio: they can increase the voltage at the expense of supplying a smaller current, or vice versa. This trick is actually used to power your home. The electricity that flows through long-distance power lines is at an incredibly high voltage in order to minimize the current, which helps with efficiency, as we will see in the next tutorial. When the power gets near your house, the voltage is stepped back down and the current increased. The devices that perform these voltage/current conversions are called transformers.
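Since power is volts times amps, an ideal transformer that multiplies the voltage by some factor must divide the current by the same factor. This sketch ignores real-world losses, and the numbers are purely illustrative:

```python
def transform(volts_in, amps_in, ratio):
    """Ideal transformer sketch: power is conserved, so multiplying
    the voltage by `ratio` divides the current by the same ratio."""
    return volts_in * ratio, amps_in / ratio

# 1200W of power as 120V at 10A, stepped up 100x for transmission:
# the same 1200W now flows as 12,000V at only 0.1A.
volts, amps = transform(120.0, 10.0, 100.0)
print(volts, amps)   # 12000.0 0.1
print(volts * amps)  # still 1200W -- power is unchanged
```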

Now you should have at least some idea of what electricity is. Be patient, as it will take some time to understand it all. I believe the best way to learn this stuff is to use it! Just dive right in and get your hands dirty. You will learn much faster than by reading page after page of abstract text.

The next step is to learn about some basic components. The resistor tutorial should be your first stop!