The Prius Prime is a dual-fuel vehicle: it can run 100% on electricity, 100% on gasoline, or a computer-controlled blend in between. This gives me a great opportunity to directly compare the EV drivetrain against the ICE engine in the same car.
- Toyota's computer claims 3.2 mi/kWh.
- The Kill-a-watt (https://en.wikipedia.org/wiki/Kill_A_Watt) claims 2.2 mi/kWh measured at the wall.
- Add another ~1.5% of losses in the wires if you wish: 120V drops to 118V during charging, and that 2V drop (2/120 ≈ 1.7%) is power dissipated in my home's wiring resistance.
- Level 1 charger at home (known to be less efficient).
- Toyota's computer claims 53 miles per gallon (US gallon).
- I have not independently verified the gallon usage of my car.
- 295 miles driven total: sometimes EV, sometimes gasoline, sometimes both.
- 30°F to 40°F (-1°C to 4.5°C) in my area this past week.
- Winter-blend fuel.
- 12.5 miles per electricity dollar (17.1¢/kWh home charging cost).
- 17.1 miles per gasoline dollar ($3.10 per gallon at my last fillup).
Happy to answer any questions about my tests. The main takeaway is that L1 charging is so low in efficiency that gasoline in my area is cheaper per mile than electricity. Obviously the prices of gasoline and electricity vary significantly from area to area, so feel free to use my numbers to calculate / simulate the costs in your own area.
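For the calculating / simulating, the arithmetic is just miles-per-unit divided by price-per-unit. A minimal sketch with my numbers plugged in (swap in your local prices; the electric figure comes out a hair above the 12.5 I quoted, presumably rounding in my logged numbers):

```python
# Miles per dollar for each fuel, using the wall-side (Kill-a-watt) figure.
miles_per_kwh = 2.2      # measured at the wall by the Kill-a-watt
price_per_kwh = 0.171    # $/kWh, my home electricity rate
miles_per_gallon = 53    # Toyota computer readout
price_per_gallon = 3.10  # $/gal at my last fillup

print("EV:  %.1f miles per dollar" % (miles_per_kwh / price_per_kwh))        # ~12.9
print("Gas: %.1f miles per dollar" % (miles_per_gallon / price_per_gallon))  # ~17.1
```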
There are also substantial efficiency losses in cold weather, which is well acknowledged by the EV community. The Prius Prime (and most other EVs) will run a heater to keep the battery conditioned in the winter, spending precious electricity on battery conditioning rather than on miles. Gasoline engines do not have this problem and remain as efficient in the winter.
It’s partially subtraction.
The 100W heater (or whatever it actually draws…) that conditions the battery in the winter is a subtraction problem, not a multiplication problem: it has to run whenever the car is charging or driving in the cold. Subtracting 100W from an 1100W L1 charger costs ~9% of the input power; subtracting the same 100W from a 3500W to 10,000W L2 charger costs only 1-3%.
Now there is another component of multiplicative losses (the inductor and/or voltage-conversion coil plus MOSFET switch resistance), so it's partially subtractive and partially multiplicative. Ultimately I'll need to just test the damn thing to find reality; we can't math this out on paper. (The 100W loss I assumed earlier was just that, an assumption. I have no idea how much the battery-conditioning circuits / pumps / etc. use in practice.)
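To make the shape of the model concrete, a toy sketch (the 100W heater and the 90% converter efficiency are both assumptions, per the above; only measurement will pin down the real values):

```python
def charge_efficiency(charger_watts, heater_watts=100, converter_eff=0.90):
    # Fixed conditioning draw is subtractive; converter losses are multiplicative.
    return (charger_watts - heater_watts) / charger_watts * converter_eff

for watts in (1100, 3500, 10000):
    print("%5d W charger -> %.1f%% of input reaches the battery"
          % (watts, 100 * charge_efficiency(watts)))
# 1100 W -> 81.8%, 3500 W -> 87.4%, 10000 W -> 89.1%
```

Even with identical multiplicative losses, the fixed overhead alone puts L1 several points behind L2.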
The real problem is that I'm not aware of any Kill-a-watt model for 240V circuits, so I'll have to rely on the charger to give me accurate readouts. But all the theory (and apparently some internet testing) suggests that L2 charging should come out well ahead.
You can look into a current clamp meter as a stand-in for the Kill-a-watt. If you're willing to tinker a bit and like automation, you can pair one with an ESP32 board to make a remote power-monitoring system for your 240V circuit.
If you're less willing to tinker, there are off-the-shelf units you run the wires through, or clamp-style variants, that do something similar.
(Note: I can't recommend any specific off-the-shelf products; I just bought the clamps and am still tinkering it all together.)
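For the tinker route, a minimal MicroPython sketch of the idea (the pin, the scaling constant, and the mid-rail bias circuit are all assumptions for a typical SCT-013-style clamp with a burden resistor; you'd calibrate against a known load):

```python
# MicroPython on ESP32: rough RMS current from a CT clamp on an ADC pin.
# Assumes the clamp output is biased to mid-rail and scaled into the ADC's
# ~0-3.3V range by a burden resistor + divider (hypothetical hardware).
import time
from machine import ADC, Pin

adc = ADC(Pin(34))           # any ADC1-capable pin
adc.atten(ADC.ATTN_11DB)     # full ~0-3.3V input range

AMPS_PER_COUNT = 0.02        # calibration constant: tune with a known load
LINE_VOLTS = 240             # nominal; a proper meter samples voltage too

def rms_amps(sample_ms=200):
    # Sample over several AC cycles, strip the DC bias, compute RMS.
    samples = []
    t0 = time.ticks_ms()
    while time.ticks_diff(time.ticks_ms(), t0) < sample_ms:
        samples.append(adc.read())
    mid = sum(samples) / len(samples)        # DC bias (mid-rail offset)
    sq = sum((s - mid) ** 2 for s in samples)
    return (sq / len(samples)) ** 0.5 * AMPS_PER_COUNT

while True:
    amps = rms_amps()
    # Current alone only gives apparent power (VA), not true watts.
    print("%.2f A  ~%.0f VA" % (amps, amps * LINE_VOLTS))
    time.sleep(5)
```

A clamp only sees current, so this reports apparent power; true energy accounting needs voltage sampled at the same time, which ties into the point below.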
Hmmm, current is the bulk of it, but voltage drop is also important, since that's how you'd measure the wire losses. (All wires have resistance, and it adds up over long runs: at the ~9A my 1100W L1 charger draws, a 2V droop is roughly 18W burning off in the walls.)
But yeah, good idea. I'm actually into electrical engineering, so the ESP32 is right up my alley. I prefer AVR, but any microcontroller can work for that.