AC charging (in)efficiency | Kia EV Forum

AC charging (in)efficiency

1.2K views 21 replies 9 participants last post by  EV- Technician  
#1 ·
I've wondered why most people install a 'home charger'. In the UK they are typically 7 kW, so only about three times faster than a 'granny cable'. If you can get all the daily range you need at 2.2 kW, why pay for a wall box?
Maybe the answer is the inefficiency of the EV6, and maybe of all EVs, in converting AC to DC. In my case, if I charge the car for 6 hours at 2.2 kW AC I 'input' 13.23 kWh (based on an actual meter reading). That 13.23 kWh gains me 12% charge at the car. If we assume the usual quoted figure of a usable 74 kWh battery for the UK 2024 EV6, that equates to only 8.88 kWh, which is 33% less than I'm putting in! I realise there are losses to heat etc., but the usual quoted figure is 10%.
So - is the EV6 spectacularly bad at converting AC to DC? What losses do others see when charging with AC? What about the majority of you who charge with DC? Has anyone with a home DC charger ever calculated how many kWh are actually going INTO the charger, using their electricity meter, to get 1 kWh of charge at the car?
If I get a similar 33% loss at a rapid public charger, for which I'd pay 75-85p per kWh in the UK, that would work out at about twice the cost per mile of a diesel car!
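For anyone who wants to check their own numbers, here's the calculation I'm doing, as a quick sketch. The 74 kWh usable capacity is the commonly quoted figure, not something I've verified from a spec sheet:

```python
# Rough AC charging loss estimate from a domestic meter reading.
metered_in_kwh = 13.23   # energy drawn from the wall over 6 h at 2.2 kW
soc_gained_pct = 12      # SOC added according to the car
usable_kwh = 74.0        # assumed usable battery capacity (oft-quoted figure)

stored_kwh = usable_kwh * soc_gained_pct / 100                   # ~8.88 kWh
loss_pct = (metered_in_kwh - stored_kwh) / metered_in_kwh * 100

print(f"stored: {stored_kwh:.2f} kWh, loss: {loss_pct:.0f}%")  # loss: 33%
```

Swap in your own meter reading, SOC delta and battery size to get your own loss figure.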
 
Discussion starter · #4 ·
It's not just about converting AC to DC. When you're charging, the car is effectively "on". The electronics are on, the coolant pump is on, the coolant fan might be on, and in extreme cases, the compressor could even be on. You can see some of this consumption via Car Scanner, by viewing the ICCU 12 V output amperage & voltage (but not the compressor or battery heater, as those are high voltage). If I remember correctly, it can typically be somewhere in the neighborhood of 250-300 W (in addition to the conversion losses). That overhead of keeping the car operating while charging is a much larger percentage when charging at a low rate than at a high rate. On the flip side, you get slightly higher resistance losses at higher currents, but in most cases that's probably trivial compared to the operating overhead.
My recent charging has been midnight to 6am in a garage in the UK in April and May. Temperature is probably around 8 to 10°C. Should I really expect a 33% inefficiency in these conditions?
 
Discussion starter · #6 ·
It's not just about converting AC to DC. When you're charging, the car is effectively "on". The electronics are on, the coolant pump is on, the coolant fan might be on, and in extreme cases, the compressor could even be on. You can see some of this consumption via Car Scanner, by viewing the ICCU 12 V output amperage & voltage (but not the compressor or battery heater, as those are high voltage). If I remember correctly, it can typically be somewhere in the neighborhood of 250-300 W (in addition to the conversion losses). That overhead of keeping the car operating while charging is a much larger percentage when charging at a low rate than at a high rate. On the flip side, you get slightly higher resistance losses at higher currents, but in most cases that's probably trivial compared to the operating overhead.
I'm 'losing' 4.5 kWh during my 6-hour charging window. Even a 'car operating cost' of 300 W would only account for 1.8 kWh of that. Where's the other 2.7 kWh going?
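Running that arithmetic the other way round: the gap only makes sense if the constant draw were around 750 W, not 300 W. A quick sketch using my figures:

```python
# Back-calculate what constant overhead draw would explain the missing energy.
missing_kwh = 4.5   # metered input minus energy stored, over the window
hours = 6.0         # charging window

implied_overhead_w = missing_kwh / hours * 1000
print(f"implied constant overhead: {implied_overhead_w:.0f} W")  # 750 W

# Conversely, a 300 W overhead over 6 h only accounts for:
overhead_kwh = 300 / 1000 * hours
print(f"300 W overhead accounts for {overhead_kwh:.1f} kWh")  # 1.8 kWh
```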
 
Discussion starter · #8 ·
There will always be energy loss when moving electrons (or vibrating them, in the case of AC) due to wire thickness, length, material and temperature. There will always be loss when converting AC to DC (heat). Kia seems over-cautious about keeping the ICCU cool: I find the fans running during an L2 charge session even on a mild day (70°F/21°C). Maybe the battery is being heated?

Some folks have graphed their Tesla's charging efficiency (this graph is from a UK owner, so similar to you, but I'm not sure how the higher kW rates were obtained). There have been posts about L1 barely charging at all due to the overhead of cooling/heating the ICCU/battery. I only charge at 32 A single-phase 240 V(ish) → 7.2 kW, so I don't have efficiency figures at varying kW. I do have a monitor on my circuit box and can see more kWh being used than stored. From a charge yesterday to 80%: 26.95 kWh used by the charger, 31% added, so 23.7 kWh stored → a 12.1% loss.

This graph shows lower efficiency at lower kW. What is your end SOC%? I would think efficiency drops as you get closer to 100%.

View attachment 26839
My figures are pretty constant over all SOCs. In this example I went from 62% to 74% SOC. I'm comfortable with an approx. 10% charging loss, as this is the oft-quoted figure, but I'm scratching my head at 33%. I'm not in the 80-100% SOC range and I'm not at the extreme ends of any temperature range. My question really is: would I see the same loss in the same conditions if I were charging with a DC charger?
 
Discussion starter · #9 ·
Does your charger get hot? How long is your charging cable? Cable get warm or hot? How about your circuit breaker, does it get hot?

I have an L1 travel charger that gets super hot, which tells me it's not efficient. Heat increases resistance, which in turn creates more heat → more resistance. Most folks don't get a true energy-use figure, since their charger reports its output, not the true input to the charger.
Nothing gets especially hot. I am measuring how many kWh I am putting IN from my domestic meter, not from whatever the car or charger are telling me. I have measured how many kWh the 'house' uses over 6 hours overnight when I am not charging the car, so I know how many kWh it actually 'costs' me to charge for 6 hours. The difference between kWh in and kWh out is 33%. My cable is 10 metres long, but I don't believe an extra 4 metres would cause the sort of losses I am seeing.
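To be explicit about the method: I subtract the house's normal overnight consumption from the overnight meter delta. As a sketch, with made-up meter readings for illustration (the register values below are hypothetical, not my actual ones):

```python
# Net charging energy = overnight meter delta minus the house's normal
# overnight consumption, measured on a night without the car plugged in.
# All readings below are illustrative placeholders.
meter_start = 41520.00     # kWh register at midnight (hypothetical)
meter_end = 41537.50       # kWh register at 6am (hypothetical)
house_baseline_kwh = 4.27  # typical 6 h overnight use with no car charging

charging_kwh = (meter_end - meter_start) - house_baseline_kwh
print(f"energy into the car: {charging_kwh:.2f} kWh")  # 13.23 kWh
```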
 
Discussion starter · #14 ·
For better estimates, you should know the voltage under load, since no-load and under-load voltages differ considerably when you're doing the calculations. Voltage drop (sag) is directly related to heat loss from wiring resistance.
My question was not the newbie 'how come 1 kWh in doesn't get me 1 kWh out?'. It was 'As I have measured that my home AC charging is 33% inefficient, could anyone tell me the efficiency of home DC charging please?'. I'd like to know if anyone has measured the kWh they need to put INTO a home DC charger to get 1 kWh at the car.
 
Discussion starter · #18 ·
As I indicated above, my e-Niro would get 10% charge per hour on a 7.2 kW charger. That was fairly consistent between 20-80%. It would therefore take 10 hours to charge from 0-100%, which equates to 72 kWh going into a 64 kWh battery. Ergo, approx. 11% losses.

When charging at 2.4 kW (10 A at 240 volts) I would get 2.5% per hour. Charging from 0-100% would take 40 hours: 40 × 2.4 = 96 kWh into the same 64 kWh battery. Ergo, 33% losses. Almost exactly what you are seeing.

There will be some margin of error in those calculations, but it's clear that there are far greater overhead losses when charging at 10 A than at 32 A.

BTW: there is no such thing as home DC charging. All home charging is AC charging. The home 'charger' is in fact not a charger at all, but more of a fancy on/off switch with some added intelligence that allows certain control. There are no electronics between the electricity wiring that goes into the charger and the plug that goes into your car. Therefore, all the losses you see come from what is happening inside the car, unless you really have bad wiring and lose energy there. But your socket would have burnt out by now if that were the case, so let's discard that.
That is really useful, thank you. So maybe there is a reason to install a 7 kW home supply after all.
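Those two data points fit a simple model: a roughly fixed overhead draw (electronics, pumps) plus a proportional conversion loss. A quick sketch; the overhead and conversion figures here are assumptions picked to loosely match the e-Niro numbers above, not measurements:

```python
# Toy model: wall power = fixed overhead + proportional conversion loss.
# overhead_kw and conversion_eff are assumed values, tuned loosely to the
# e-Niro figures quoted above; they are not measured specs.
def charge_efficiency(charge_kw, overhead_kw=0.7, conversion_eff=0.95):
    """Fraction of energy drawn from the wall that ends up in the battery."""
    into_battery_kw = (charge_kw - overhead_kw) * conversion_eff
    return into_battery_kw / charge_kw

for kw in (2.4, 7.2):
    print(f"{kw} kW: {charge_efficiency(kw):.0%} efficient")
# roughly 67% at 2.4 kW vs 86% at 7.2 kW
```

The exact numbers don't matter much; the point is that a fixed overhead is a far bigger fraction of a 2.4 kW feed than of a 7.2 kW one, because the slow charge pays that overhead for many more hours per kWh delivered.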
 
Discussion starter · #19 ·
As I indicated above, my e-Niro would get 10% charge per hour on a 7.2 kW charger. That was fairly consistent between 20-80%. It would therefore take 10 hours to charge from 0-100%, which equates to 72 kWh going into a 64 kWh battery. Ergo, approx. 11% losses.

When charging at 2.4 kW (10 A at 240 volts) I would get 2.5% per hour. Charging from 0-100% would take 40 hours: 40 × 2.4 = 96 kWh into the same 64 kWh battery. Ergo, 33% losses. Almost exactly what you are seeing.

There will be some margin of error in those calculations, but it's clear that there are far greater overhead losses when charging at 10 A than at 32 A.

BTW: there is no such thing as home DC charging. All home charging is AC charging. The home 'charger' is in fact not a charger at all, but more of a fancy on/off switch with some added intelligence that allows certain control. There are no electronics between the electricity wiring that goes into the charger and the plug that goes into your car. Therefore, all the losses you see come from what is happening inside the car, unless you really have bad wiring and lose energy there. But your socket would have burnt out by now if that were the case, so let's discard that.
So is a 7 kW wall charger putting AC into the car?