
How Many Amps Does a Television Use? (Cost, Power, & More)

Power consumption specifications for TVs can be a bit of a pain to come by these days. Lucky for you, I dug through the manuals of six of the most popular TV brands and summarized their power usage details below.

How many amps does a television use?

The average American TV is 50 inches and uses 0.95 amps at 120 volts. That works out to an average TV power consumption of 113 watts. In a given year, the average TV will use 142 kWh and cost a little over 17 dollars (assuming 5 hours of use per day).

| Brand (50”) | Amps | Watts | kWh per year | Cost per year |
|---|---|---|---|---|
| Toshiba (4K UHD) | 0.66 A | 79 W | 150 kWh | $18 |
| Samsung (7 series) | 1.13 A | 135 W | 120 kWh | $14 |
| Sony (X80J series) | 1.22 A | 146 W | 179 kWh | $22 |
| Vizio (M series) | 1.09 A | 131 W | 154 kWh | $19 |
| TCL (4 series) | 0.66 A | 79 W | 100 kWh | $12 |
| Hisense (A6G series) | 0.92 A | 110 W | 148 kWh | $18 |
| Average | 0.95 A | 113 W | 142 kWh | $17 |

Average American TV size

Amazingly, the average American TV size has more than doubled from 1998 to today, growing from 23 to 50 inches.


The average TV size is extremely important to know when trying to determine how many amps a TV uses. Smaller TVs will use far less amperage, and larger TVs will use more.

This is because larger TVs have higher power consumption requirements. For example, a typical 85” TV uses over 400 watts, whereas a 43” TV can use as few as 100 watts.

So larger TVs use more watts, but how do watts impact amperage, you ask?

Great question.

Amps, volts and watts

I won’t get into all the details about what exactly amps, volts and watts are (there are plenty of great resources for that).

But for our purposes, what’s important to know is that amps = watts/volts.

In the typical American home, the supply voltage at the power outlet is a consistent 120 volts.

So, since the volt portion of our equation stays constant at 120, it’s the watts that ultimately determine how many amps the TV uses.

The more watts, the greater the amperage.

Larger and less energy-efficient TVs use more watts and, in turn, more amperage.

Take, for example, the Sony 50” X80J series television. It uses 146 watts. Divide 146 W by 120 V and you get 1.22 amps. That’s almost double the TCL 50” 4 series, which uses only 79 watts!
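If you’d like to sanity-check these figures yourself, the math is a one-line division. Here’s a minimal Python sketch of the amps = watts/volts calculation (the wattage figures come from the table above; the helper name is just for illustration):

```python
# Minimal sketch: current draw from rated wattage, using amps = watts / volts.
US_OUTLET_VOLTS = 120  # standard American outlet voltage

def tv_amps(watts: float, volts: float = US_OUTLET_VOLTS) -> float:
    """Return how many amps a TV draws at the given wattage."""
    return watts / volts

# Wattage figures from the table above
for brand, watts in [("Sony X80J", 146), ("TCL 4 series", 79)]:
    print(f"{brand}: {watts} W / {US_OUTLET_VOLTS} V = {tv_amps(watts):.2f} A")
# Sony X80J: 146 W / 120 V = 1.22 A
# TCL 4 series: 79 W / 120 V = 0.66 A
```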

And, as you can imagine, you most certainly pay for this difference over the course of a year. That Sony 50” costs nearly twice as much as the TCL to power.

That said, TV power costs are extremely low in general, regardless of which TV you have, and there are many different factors that determine what you’ll actually pay.

What factors impact TV power costs?

If you’ve gone shopping for an appliance in the past decade, you’ve probably seen a yellow EnergyGuide sticker.


This sticker is required by law and it details just how much energy that exact appliance uses and how much you can expect to pay to power it for a full year.

EnergyGuide labels on TVs, like those on all appliances, are based solely on estimates. There are a number of factors that can increase or decrease your TV’s power costs.

For example:

  1. How many hours of TV you watch per day. TV energy cost estimates assume 5 hours of use per day. If you use your TV more or less than this, your costs will differ. That said, even if you use your TV for a whopping 10 hours a day, your annual costs will still be less than 100 dollars.
  2. Utility/power rates. The amount you pay for power depends largely on where you live and whether you have solar or other renewable energy sources. The EnergyGuide calculations here assume 12 cents per kWh (the rate behind the cost column in the table above). Your actual costs may be lower or higher. Mine, for example, are higher. See the sketch after this list for the math.
  3. TV settings: EnergyGuide costs are calculated using the TV’s default picture settings, which almost always use less energy than the settings people actually watch with.
    1. The number one TV setting that impacts power use is brightness/contrast. The brighter your TV, the more power it will use and the higher the annual cost.
    2. Power save mode is offered by some TVs and is worth turning on if your TV has it. It more or less automatically adjusts the TV’s brightness settings so they are optimal for both viewing and cost.
    3. Another option is to lower the volume or mute the TV during commercials. This way your TV isn’t using power for sound when you’re not really paying attention anyway. You might just find you actually prefer the peace and quiet!
    4. Finally, consider using the programmable timer (sleep timer). This is perfect for those of us who tend to fall asleep in the middle of a movie or show and end up leaving the TV on all night. With the sleep timer set, your TV will shut off automatically.
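To see how viewing hours and your local rate interact, here’s a minimal Python sketch of the annual cost math. It assumes the label’s kWh figure (which is based on 5 hours a day) scales linearly with viewing time; the 142 kWh average comes from the table above:

```python
# Sketch of the EnergyGuide-style annual cost math:
# yearly cost = annual kWh x electricity rate, where the label's
# kWh figure assumes 5 hours of viewing per day.
LABEL_HOURS_PER_DAY = 5

def annual_cost(label_kwh: float, hours_per_day: float, rate_per_kwh: float) -> float:
    """Scale the label's kWh figure to your viewing hours, then price it."""
    actual_kwh = label_kwh * (hours_per_day / LABEL_HOURS_PER_DAY)
    return actual_kwh * rate_per_kwh

AVG_50_INCH_KWH = 142  # average 50" TV from the table above

print(f"${annual_cost(AVG_50_INCH_KWH, 5, 0.12):.2f}")   # ~$17 per year
print(f"${annual_cost(AVG_50_INCH_KWH, 10, 0.12):.2f}")  # ~$34 per year
```

Note that this is only a rough estimate: standby power is ignored, and real-world settings (like brightness) shift the kWh figure up or down.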

Conclusion

The average TV size in America today is a whopping 50 inches, up from 23 inches back in 1998.

So how many amps does a television use if the average TV is 50”?

The average 50” TV uses about 0.95 amps at 120 volts. That works out to an average TV power consumption of 113 watts.

Over the course of a year, the typical 50” TV costs just 17 dollars to power (142 kWh per year at 12 cents per kWh).

Larger TVs use more watts and, in turn, more amperage. That’s because amps = watts/volts, and standard American outlets supply a constant 120 volts.

There are a handful of factors that influence TV power costs such as the number of hours you watch per day, your local power/utility rates, and the settings on your TV.

That’s why each individual’s annual TV costs will vary, but even if your costs are double the American average, you’ll only be paying about 34 dollars a year! Not bad.