That was my first guess as well. By default the TV is in an eco mode and would only use around 50 watts. But as soon as you make the TV actually usable it will double in power.
OP, if you want a worst-case estimate of your TV’s power consumption, just look at the power supply. If it’s an external brick, read the DC output on the label and multiply the voltage by the amperage. If you’re running it off a battery-powered inverter, then power factor and the efficiency of the brick come into play, but it shouldn’t be much worse than the absolute maximum the brick is rated to output.
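If it helps, here’s a rough sketch of that math in Python. The 19 V / 4.74 A brick rating, the brick efficiency, and the power factor below are made-up illustration numbers, so swap in whatever your own brick’s label actually says:

```python
# Worst-case power estimate from the brick's rated DC output.
# All figures below are hypothetical; read the real values off your brick's label.

rated_voltage_v = 19.0   # DC output voltage printed on the brick
rated_current_a = 4.74   # maximum DC output current printed on the brick

max_dc_watts = rated_voltage_v * rated_current_a
print(f"Worst-case DC draw: {max_dc_watts:.0f} W")   # ~90 W

# Feeding it from a battery-powered inverter: the brick's conversion
# efficiency and the load's power factor increase what the inverter
# actually has to supply. Assumed numbers, not measured ones.
brick_efficiency = 0.88   # DC out / AC in
power_factor = 0.95       # real power / apparent power at the brick's input

apparent_power_va = max_dc_watts / brick_efficiency / power_factor
print(f"Worst-case load on the inverter: {apparent_power_va:.0f} VA")
```

In practice the TV will sit well below the brick’s rated maximum most of the time, so treat this as an upper bound, not a typical draw.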
They usually test whatever the manufacturer says is the default, and that most likely happens to be the lowest power mode, which barely resembles reasonable usage.
I’m guessing it’s an EU model. They have all sorts of “eco” modes to pass environmental regulations, but you wouldn’t actually use them IRL.
So yes, it could, but fuck that, stick it on dynamic HDR and drive your eco-friendly-ish car to compensate lol
I get 22 mpg, is that good enough?
Is it possible that the local version of Energy Star for my TV used the Eco mode setting for the tests?