
Television TV


Tim Doran


I am thinking of buying an LCD TV, but I don't know what sort to go for: 12 V or 240 V. I currently have a good Mastervolt inverter which can supply ample power for all my needs, but which would be the most efficient? I realise that the cost of 12 V TVs is much higher than a normal 240 V TV, so the energy saving would have to be considerable. Apologies if this question has been posed a million times before.

 

Tim


It has indeed been asked a million times before (try using the forum's search facility, though it will point to a million different threads, so it's not necessarily a lot of use).

 

It will not make much difference to energy consumption whether you use a 12 V TV or a 240 V one. If you already have a good 240 V supply from an inverter, go for the 240 V one and save yourself some money.


I am thinking of buying an LCD TV, but I don't know what sort to go for: 12 V or 240 V. I currently have a good Mastervolt inverter which can supply ample power for all my needs, but which would be the most efficient? I realise that the cost of 12 V TVs is much higher than a normal 240 V TV, so the energy saving would have to be considerable. Apologies if this question has been posed a million times before.

 

Tim

 

Most 15", and one or two 17", LCD TVs run on either 12 volts or 240 volts [via an external power supply]. The larger ones run on 240 volts only [usually].

 

It depends what size of TV you are looking for. Energy-wise, you will use the same amount on either system, e.g. 40 watts.

 

You will need to recharge your batteries whichever system .... battery or inverter .... you use, so there is little benefit one way or the other.

Edited by 1066

I am thinking of buying an LCD TV, but I don't know what sort to go for: 12 V or 240 V. I currently have a good Mastervolt inverter which can supply ample power for all my needs, but which would be the most efficient? I realise that the cost of 12 V TVs is much higher than a normal 240 V TV, so the energy saving would have to be considerable. Apologies if this question has been posed a million times before.

 

Tim

 

As you already have the inverter, it's no contest. The difference in price and range of 12 V versus 240 V models makes a standard 240 V TV the sensible option. We have a 19" LCD unit and the inverter doesn't even notice it's on. Evidently some cheaper inverters can give rise to interference, but that shouldn't be a problem with a Mastervolt.

 

Ken


Most 15", and one or two 17", LCD TVs run on either 12 volts or 240 volts [via an external power supply]. The larger ones run on 240 volts only [usually].

 

It depends what size of TV you are looking for. Energy-wise, you will use the same amount on either system, e.g. 40 watts.

 

You will need to recharge your batteries whichever system .... battery or inverter .... you use, so there is little benefit one way or the other.

I tend to disagree. I bought a 26" LCD HD-ready Samsung TV; it won't work off 12 V, only 240 V. Because of having to run it through the inverter it drains the batteries, taking 10 amps an hour! If I had a land line it would only take half an amp an hour to power it. I really wish I had bought a smaller LED TV that I could have run directly from 12 V DC.


I tend to disagree. I bought a 26" LCD HD-ready Samsung TV; it won't work off 12 V, only 240 V. Because of having to run it through the inverter it drains the batteries, taking 10 amps an hour! If I had a land line it would only take half an amp an hour to power it. I really wish I had bought a smaller LED TV that I could have run directly from 12 V DC.

 

So around 40 Watts then.

 

Ken


I tend to disagree. I bought a 26" LCD HD-ready Samsung TV; it won't work off 12 V, only 240 V. Because of having to run it through the inverter it drains the batteries, taking 10 amps an hour! If I had a land line it would only take half an amp an hour to power it. I really wish I had bought a smaller LED TV that I could have run directly from 12 V DC.

 

 

I think a lot depends on the efficiency of the inverter: with many of them you're lucky to get 40% efficiency, whilst others can go as high as 90%. The end result is a lot of wasted current.

 

10 amps an hour to power a 26" tv doesn't sound excessive, tho' :rolleyes:


I think a lot depends on the efficiency of the inverter: with many of them you're lucky to get 40% efficiency, whilst others can go as high as 90%. The end result is a lot of wasted current.

 

10 amps an hour to power a 26" tv doesn't sound excessive, tho' :rolleyes:

Could you tell me which inverter is only 40% efficient?


Could you tell me which inverter is only 40% efficient?

 

No specific make, but the cheaper ones often struggle to better this figure.

 

When you look at the waveform of some of the so-called "modified sinewave" inverters, they look more like a 50-50 duty cycle square wave, which theoretically would give 50% efficiency, but never does.

Edited by 1066

I do not think waveform has anything to do with efficiency.

 

Efficiency is to do with what goes in to what comes out.

 

It has a lot to do with it.

 

If you only use 50% of a waveform, you cannot expect the same efficiency as you would if you used 100% of it.


[Image: a square wave overlaid on a modified sinewave]

 

Sorry Les

 

I am not qualified or knowledgeable enough to disagree; maybe you could explain how only half the waveform is used.

 

I would be grateful.

 

When generating a sinewave from a squarewave, the squarewave does not exceed the limits of the sinewave .... so your diagram is very pretty, but not relevant.


When generating a sinewave from a squarewave, the squarewave does not exceed the limits of the sinewave .... so your diagram is very pretty, but not relevant.

 

Sorry, you have lost me; told you I was not qualified or knowledgeable.

 

Thanks for trying.


I tend to disagree. I bought a 26" LCD HD-ready Samsung TV; it won't work off 12 V, only 240 V. Because of having to run it through the inverter it drains the batteries, taking 10 amps an hour! If I had a land line it would only take half an amp an hour to power it. I really wish I had bought a smaller LED TV that I could have run directly from 12 V DC.

If you're running your big Samsung from a land line, it takes half an amp. That's fine: it means it's using 120 watts (240 volts times 0.5 amps equals 120 watts).

 

If you run it from an inverter it still uses 120 watts, by taking half an amp from the 240 volt output of the inverter. To provide this power, the inverter takes a little over 120 watts from the batteries, which is 10 amps (12 volts times 10 amps equals 120 watts).

 

If your TV had a 12 volt socket on the back, and you could run it from the batteries directly, you would find that it consumed 10 amps.

 

It is a shame that modern TVs, especially the digital and HD ones, take so much power. The old cathode-ray TVs were really just one big energy-saving lightbulb; ours, for example, takes only 35 watts.
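The arithmetic in the post above can be sketched as a quick sanity check; all of the figures come straight from the thread:

```python
# Quick check of the figures above: a TV drawing 0.5 A from a 240 V
# supply uses the same power as 10 A drawn from a 12 V battery.

MAINS_V = 240.0
BATTERY_V = 12.0

tv_watts = MAINS_V * 0.5          # 0.5 A at 240 V
print(tv_watts)                   # 120.0 watts

# The same 120 W supplied from the battery side of an inverter:
battery_amps = tv_watts / BATTERY_V
print(battery_amps)               # 10.0 amps, matching the figure quoted above
```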


Sorry you have lost me, told you I was not qualified or knowledgeable.

 

Thanks for trying.

 

OK. If you draw the squarewave inside the sinewave, you will see a lot of "empty" space which is not occupied .... this space corresponds to lost power.

 

In a cheap inverter, a simple squarewave is used to "approximate" a sinewave. This results in a lot of wasted 'space', therefore power, therefore efficiency.

 

In more expensive inverters, a more complex squarewave [similar to, but more involved than the modified squarewave in your diagram] is used, giving better efficiency.

 

I need a lie down now! :rolleyes:

Edited by 1066

OK. If you draw the squarewave inside the sinewave, you will see a lot of "empty" space which is not occupied .... this space corresponds to lost power.

 

Thanks can see it now, again thank you for your patience.


OK. If you draw the squarewave inside the sinewave, you will see a lot of "empty" space which is not occupied .... this space corresponds to lost power.

 

In a cheap inverter, a simple squarewave is used to "approximate" a sinewave. This results in a lot of wasted 'space', therefore power, therefore efficiency.

 

In more expensive inverters, a more complex squarewave [similar to, but more involved than the modified squarewave in your diagram] is used, giving better efficiency.

 

I need a lie down now! :rolleyes:

 

It doesn't work like that. I'm not qualified to explain why, but even the cheapest modern inverter is about 90% efficient. The nearer you use them to full capacity, the more efficient they are. I've got one that uses 5 watts to tick over, plus some losses depending on load, so if running something at 50 watts it uses about 58 watts, so it's about 86% efficient. If running flat out at 800 watts, it still only loses about 50 watts, so it's 750/800 x 100% = 94% efficient.
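The "tick-over plus load-dependent losses" behaviour described above can be sketched as a minimal model. The 5 W tick-over figure comes from the post; the 6% proportional loss fraction is an illustrative assumption, chosen because it reproduces the quoted efficiencies:

```python
# Simple inverter-efficiency model: a fixed tick-over draw plus a
# load-proportional loss. The 5 W idle figure follows the post above;
# the 6% proportional-loss fraction is an illustrative assumption.

IDLE_W = 5.0          # tick-over consumption with no load
LOSS_FRACTION = 0.06  # assumed load-dependent loss

def efficiency(load_w: float) -> float:
    """Output power divided by power drawn from the batteries."""
    input_w = load_w + IDLE_W + LOSS_FRACTION * load_w
    return load_w / input_w

print(round(efficiency(50) * 100))   # 86 (percent) at a 50 W load, as above
print(round(efficiency(800) * 100))  # 94 (percent) running flat out
```

This also shows why efficiency climbs toward full capacity: the fixed 5 W tick-over is a big fraction of a small load but a negligible fraction of a large one.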

 

I also think you'll find that a pure sinewave inverter is less efficient than a modified sinewave one, because the modified sinewave is easy to generate simply by electronic switches, whereas the pure sinewave is quite a hi-tech contraption.

 

BTW, I'm not arguing about how efficiently the device you are supplying uses the voltage that you feed it, just how efficiently the inverter inverts.

 

Go on now, shoot me down in flames.


It doesn't work like that. I'm not qualified to explain why, but even the cheapest modern inverter is about 90% efficient. The nearer you use them to full capacity, the more efficient they are. I've got one that uses 5 watts to tick over, plus some losses depending on load, so if running something at 50 watts it uses about 58 watts, so it's about 86% efficient. If running flat out at 800 watts, it still only loses about 50 watts, so it's 750/800 x 100% = 94% efficient.

 

<<<<<<<<<<<<<<<<<<<<<<SNIP>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

 

First of all sorry for hijacking this thread.

 

The above is how I understood it and why I said I did not think the wave made any difference.

 

But I don't know :rolleyes:

 

I can see 1066's explanation and thought I understood.


It doesn't work like that. I'm not qualified to explain why, but even the cheapest modern inverter is about 90% efficient. The nearer you use them to full capacity, the more efficient they are. I've got one that uses 5 watts to tick over, plus some losses depending on load, so if running something at 50 watts it uses about 58 watts, so it's about 86% efficient. If running flat out at 800 watts, it still only loses about 50 watts, so it's 750/800 x 100% = 94% efficient.

 

I also think you'll find that a pure sinewave inverter is less efficient than a modified sinewave one, because the modified sinewave is easy to generate simply by electronic switches, whereas the pure sinewave is quite a hi-tech contraption.

 

BTW, I'm not arguing about how efficiently the device you are supplying uses the voltage that you feed it, just how efficiently the inverter inverts.

 

Go on now, shoot me down in flames.

 

It's not my intention to shoot anybody down in flames - and modern inverters are more efficient than they used to be.

 

The following link shows a test which somebody undertook on various inverters.

It shows poor efficiency at low loads, increasing to the 90% region over quite a large range, and then deteriorating again, which is normal.

 

http://www.fieldlines.com/story/2004/10/2/145353/321

 

This is because most modern inverters use a modified squarewave that is much more efficient than used to be the case.

 

"Pure Sinewave" inverters generally contain more components, which reduces overall efficiency, but you can still expect efficiencies greater than 90% with them.


1. Modified sinewave inverters (MSW) have far more complex shapes than the one drawn above (unless it's a really cheap, low-wattage type).

 

2. The total harmonic distortion (THD) of a modified sinewave inverter (with the complex waveform) is only about 6% so it's a very good approximation to a sinewave. Even a pure sinewave inverter (PSW) has a THD of 2% so the difference between the two is not that great.

 

3. A pure sinewave inverter is less efficient than an MSW when the two are on standby (i.e. no load). An MSW may draw an amp on standby, a PSW 5 A. So unless you are always going to switch a PSW inverter off every time there is no load, an MSW is more efficient.

 

4. There is no such thing as "amps per hour", as someone stated above; there are just amps. If you use 10 A for 1 hour you have consumed 10 ampere-hours, not 10 amps per hour.

 

5. Allan is absolutely correct on the wattage calculations. The TV will use the same wattage (120 W) on either 240 V or 12 V. On 240 V there will also be the inverter's inefficiency to add on in terms of total consumed power.

 

6. Some devices (like motor speed control circuits, e.g. washing machines) often don't like MSWs because they specifically use the sinewave phase to control the speed.
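The numerical claims in points 2, 4 and 5 can be sketched in a few lines. The 90% inverter efficiency below is an illustrative assumption, and the square-wave THD figure is for the crudest possible case (an ideal plain square wave), computed from its Fourier series, in which the odd harmonics fall off as 1/n:

```python
import math

# Point 4: current sustained over time gives ampere-hours, not "amps per hour".
def ampere_hours(amps: float, hours: float) -> float:
    return amps * hours

print(ampere_hours(10, 1))   # prints 10: ampere-hours consumed after an hour at 10 A

# Point 5: the TV draws the same wattage either way; the 240 V route just
# adds the inverter's losses (the 90% efficiency is an assumed figure).
TV_WATTS = 120.0
INVERTER_EFFICIENCY = 0.90

amps_direct = TV_WATTS / 12.0
amps_via_inverter = TV_WATTS / INVERTER_EFFICIENCY / 12.0
print(round(amps_direct, 1))        # 10.0 A straight from the 12 V battery
print(round(amps_via_inverter, 1))  # 11.1 A once inverter losses are added

# Point 2, worst case: an ideal plain square wave has odd harmonics with
# amplitude 1/n relative to the fundamental, giving a THD near 48%, far
# worse than the ~6% quoted above for a good modified-sinewave inverter.
thd = math.sqrt(sum((1.0 / n) ** 2 for n in range(3, 10001, 2)))
print(round(thd * 100, 1))          # ~48.3 (percent)
```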

 

Chris

