The size of the battery varies from phone to phone, but let's choose 3000mAh as a typical value so the battery is enough for a full day.
If you want to charge it in "seconds", let's use 60 seconds as an upper bound.
So if the charger is 100% efficient, it has to provide 3000mAh / 60s = 3000mAh * (3600s/h) / 60s = 180,000mA = 180A.
A USB port can provide between 0.1A and 0.9A. For comparison, a typical plug in a home can provide 10A. So to charge the phone you will need some big connector, not a tiny microUSB-like connector.
But it's worse. From the article:
> "If they were to replace the batteries with these supercapacitors, you could charge your mobile phone in a few seconds and you wouldn't need to charge it again for over a week," said Nitin Choudhary, a postdoctoral associate who conducted much of the research published recently in the academic journal ACS Nano.
To recharge the phone once a week, I guess you will need something like a 20,000 mAh battery, and "a few seconds" means roughly 5, so the connector must survive on the order of 14,000A, which is a ridiculous current.
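(For reference, the same arithmetic as a quick Python sanity check, using the assumed numbers above and ignoring charger efficiency:)

```python
# Average current needed to deliver a full charge in a given time,
# ignoring losses and the battery/cap voltage curve.
def required_current_amps(capacity_mah, seconds):
    coulombs = capacity_mah / 1000 * 3600   # mAh -> Ah -> coulombs
    return coulombs / seconds

print(required_current_amps(3000, 60))   # ~180 A for 3000 mAh in one minute
print(required_current_amps(20000, 5))   # ~14,400 A for 20,000 mAh in 5 seconds
```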
You are confusing yourself and everybody else. The 10A your outlet provides is at 120 volts. The lithium battery in your phone is a little under 4 volts, so multiply all your numbers by 30.
Also: an Anker USB charger can do 2.4 amps max per port at 5V, not 0.9A. I don't believe any modern smartphone only accepts 0.9A. And a USB-C port can negotiate a higher voltage too. That's how 40W USB-C chargers were powering laptops a year ago, with higher wattages now. The new MacBook is pulling 87W over USB-C.
You are both right, in a way. The main limitation on power delivery through a conductor is current (resistive losses/heat per Ohm's law). To deliver more power through a smaller conductor requires raising the voltage, much as USB-C does, or as the high-voltage transmission lines used for long-distance power delivery do. But the challenge of delivering 2Ah (or let's call it 10Wh) in a few seconds to a phone is farcical, especially once you take into account the charge circuitry inside the phone.
Yeah, I see what you're saying. If you could magically plug your iPhone into your MacBook charger, it'd still take 290 seconds to deliver enough juice to charge the battery, ignoring heat and losses. Then there's the amount of copper and shielding it'd take inside the phone to carry that much current. An iPhone is about a quarter the volume of the Apple charger, but from the teardowns the DC side is about 20-25% of its size. I would find it reasonable if you wanted to assert that there would be no room for the 'iPhone' inside of the case. It'd be all battery and charge control circuits.
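(Rough check of that 290-second figure, assuming a ~7 Wh iPhone battery and the 87 W charger, ignoring conversion losses:)

```python
battery_wh = 7.0        # assumed iPhone battery energy, roughly 1960 mAh at ~3.8 V
charger_w = 87.0        # MacBook USB-C charger rating
print(battery_wh * 3600 / charger_w)   # ~290 seconds for a full charge
```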
Even if you allow the usual marketing bullshit where anything under 2 minutes is allowed to be called 'seconds', we're still off by a factor of 3. You'd have to go through the next set of mental gymnastics where you say 'well if the phone charges that fast, I can just top it up whenever so it doesn't need to last for 3 days at a time'. In other words you could charge it 'in seconds' but you'll be doing that a couple times a day.
You could also charge at something higher than 5V to keep the current manageable. Something like 1.8A/50V would deliver 90W, same as 18A/5V, still be within most of the safety guidelines, and allow a pretty similar connector bulk to microUSB.
Or looking at what's already popular, USB-C connectors are already specced to do 3A@5V while handling data, or in PD mode (re-purposing pins) can do 5A@20V (100W).
So the first case of charging 3000mAh in a minute doesn't seem completely ludicrous, though it's going to be challenging to make everything efficient enough that the heat isn't prohibitive.
Sorry, the high-voltage USB idea is no good: although 1.8A * 50V equals 18A * 5V in terms of power, 50V is no use to a 5V battery. You can't deliver power to a battery at a voltage higher than its charging voltage or it will burn.
A 90W transformer will not fit into a phone for the foreseeable future.
If it's really a capacitor, what you said does not apply. You can charge a capacitor at any voltage as long as you cut the current off when it acquires the desired charge / terminal voltage.
No. If you plug a capacitor charged to 1V into a voltage source at 100V, the 99V difference is going to be dropped somewhere and will dissipate 99% of your incoming power as heat. Averaged over a full charge (0-100V) you lose half your energy as heat. In a phone at these power levels, something will melt.
But this is all moot. The phone contains a DC-DC converter that changes the voltage. Capacitors aren't really any different from batteries in this regard except that they have a very different discharge curve.
P.S. 5V is already above your phone's battery voltage.
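A minimal sketch of the "lose half your energy as heat" point, for an ideal capacitor charged from empty by a fixed-voltage source through any series resistance (the values here are arbitrary):

```python
# Charging an ideal cap from 0 V to Vs through a resistor: the resistance sets
# how fast it charges, but not the losses -- the resistor always burns as much
# energy as the capacitor ends up storing.
Vs, C = 100.0, 1.0                     # arbitrary source voltage and capacitance
q = C * Vs                             # final charge on the cap
energy_from_source = q * Vs            # the source delivers Q * Vs
energy_stored = 0.5 * C * Vs**2        # the cap ends up with (1/2) C V^2
print(energy_stored, energy_from_source - energy_stored)   # 5000.0 J stored, 5000.0 J lost as heat
```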
1. The power is dissipated across the internal resistance of the power source. Just make sure you are using a power source which can handle the heat (and current) and you should be fine.
2. Charging circuits change all the time to accommodate new battery technology. It will change again for supercaps.
> 1. The power is dissipated across the internal resistance of the power source. Just make sure you are using a power source which can handle the heat (and current) and you should be fine.
This is totally wrong.
Suppose you approximate your power supply by its Thevenin equivalent circuit (the Thevenin equivalence is exact, but the power supply is probably nonlinear at some point). You have an open-circuit voltage V_s and internal resistance R_s. A good power supply will try to keep R_s small.
Now you plug this power supply into a battery that currently has voltage V_b and internal resistance R_b. You seem to be assuming that R_b=0, which is odd.
The current I = (V_s - V_b) / (R_s + R_b). For many power supplies, R_s and R_b are quite low, and if you have V_s = 20V and V_b = 3V (which would be the case for your average laptop power brick and a partly discharged single-cell or parallel-connected laptop battery), you get 17V / (R_s + R_b). For common power bricks and batteries, R_s + R_b is very small and I ends up being rather large. If you set R_b = 0 (as you did), then you're nearly short-circuiting your charger and expecting it to dissipate the resulting heat.
If you change your scenario a bit and imagine a tiny USB-C phone charger at 9V or 12V and a phone battery at 3V, you're going to melt your charger. And your hypothetical design completely ignores the battery's preferred charging current, which I can pretty much guarantee you'll exceed.
As a real-world example of how wrong you are, go get a 9V battery and a little 3V lightbulb. Connect them, cross your fingers, and imagine that light bulb will run correctly and all that excess voltage dissipates in the 9V battery. (And don't hold the light bulb with your bare hands while you do this, please.)
Fortunately, chargers don't work like this. But please never design a power electronic circuit.
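To put rough numbers on the "nearly short-circuiting your charger" point (R_s and R_b below are made-up small values, not from any datasheet):

```python
V_s, V_b = 20.0, 3.0          # charger open-circuit voltage, partly discharged cell
R_s, R_b = 0.1, 0.05          # assumed internal resistances, ohms
I = (V_s - V_b) / (R_s + R_b)
P_heat = I**2 * (R_s + R_b)   # dissipated inside the charger and the cell
print(I, P_heat)              # ~113 A and ~1.9 kW of heat -- something melts
```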
This is where you went wrong: a capacitor is not a light bulb. Please take a look at this circuit: http://imgur.com/gallery/sOZ1F
The first graph shows the voltage and current across the capacitor when it is charged using a 10V source. The second shows the same parameters with a 5V source. The voltage source is cut off by the zener diode at 4.7V in both cases.
The point I am trying to make is that the capacitor can be charged at any voltage as long as you cut off the supply when the voltage across the capacitor reaches a certain threshold. Using a higher initial voltage will only charge your capacitor faster; it will not necessarily destroy it.
Note1: Voltage across capacitor != supply voltage.
Note2: The above circuit was simulated using LTSpice.
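Roughly the same experiment in Python instead of LTspice (the component values here are assumed, not taken from the linked schematic): charge the cap through a series resistor and stop at 4.7 V.

```python
# Crude time-step simulation of an RC charge with a 4.7 V cutoff.
def charge_time(v_source, v_cutoff=4.7, R=1.0, C=1.0, dt=1e-4):
    v = t = 0.0
    while v < v_cutoff:
        i = (v_source - v) / R    # current through the series resistor
        v += i * dt / C           # dV = I * dt / C
        t += dt
    return t

print(charge_time(10.0))   # ~0.63 s with a 10 V source
print(charge_time(5.0))    # ~2.8 s with a 5 V source: the higher supply charges faster
```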
I never said you couldn't charge a capacitor like this. I said (a) that it was a bad idea to charge a capacitor living in a portable device using a small power brick, and (b) that it's extremely inefficient. (And (b) tends to imply (a) when the power brick is a tiny little thing with limited heat dissipation.) You also have the capacitor's leads to worry about if you scale to cell-phone-battery-sized capacitors.
> You can charge a capacitor at any voltage as long as you cut the current off when it acquires the desired charge / terminal voltage.
Capacitors aren't charged with a voltage, they're charged with a current. The voltage is irrelevant until breakdown occurs. More current, faster charge, less current, slower charge.
Also, there's the practical constraint that a capacitor's physical construction limits the amount of current that can be applied without overheating the capacitor's materials.
The TL;DR: one does not apply a voltage to a capacitor, one applies a current. If you doubt this, try applying a constant-voltage source to a capacitor. See what happens.
> one does not apply a voltage to a capacitor, one applies a current
This is meaningless. You have to apply an EMF (https://en.wikipedia.org/wiki/Electromotive_force) to get the current flowing in the first place. There is no current without EMF (measured in volts).
When a capacitor is directly connected to a real world voltage source (with finite internal resistance) it shorts the power source and causes a massive amount of current to flow. I = VEmf/Rbatt (but only at t=0).
This current causes charge to accumulate on the plates of the capacitor causing the voltage across the plates to rise while reducing the current flowing through it at the same time.
The voltage across the capacitor rises (asymptotically) until it matches the potential of the power source. The voltage across the plates at any given time t is given by the equation: V = VEmf(1-e^(-t/(Rbatt * C))). See: https://en.wikipedia.org/wiki/RC_time_constant
Also the current at any given time is: I = VEmf/Rbatt * (e^(-t/(Rbatt * C)))
Thus an ideal capacitor can be charged by a voltage source of any value (VEmf) to any voltage (less than or equal to VEmf) across its plates. There is no theoretical limit imposed by physics. Practical capacitors will experience a dielectric breakdown above their rated voltages: https://en.wikipedia.org/wiki/Electrical_breakdown
So all you have to do is remember to disconnect the power source before the voltage across the capacitor exceeds its safe operating area. So if a capacitor is rated at 50V max terminal voltage then you need to disconnect the power source when you get 50V across the cap. You will reach 50V earlier if you use a 200V power source instead of a 50V source. The only thing that changes is the time required to charge the cap (and the current, like you said). A higher voltage power source, or one with lower internal resistance, can drive larger currents.
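Solving the equation above for the time to hit a 50 V threshold makes the point directly (the R and C values here are arbitrary):

```python
import math

# From V = VEmf * (1 - e^(-t/RC)):  t = -R*C * ln(1 - V_target/VEmf)
def time_to_reach(v_emf, v_target, R, C):
    x = 1 - v_target / v_emf
    return float('inf') if x <= 0 else -R * C * math.log(x)

R, C = 0.5, 1.0
print(time_to_reach(200.0, 50.0, R, C))   # ~0.14 s: a 200 V source reaches 50 V quickly
print(time_to_reach(50.0, 50.0, R, C))    # inf: a 50 V source only approaches 50 V asymptotically
```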
In a circuit without resistance, current can flow with no potential difference, so the above claim is false. As to circuits other than ideal ones, I draw your attention to the behavior of superconductors, in which currents flow endlessly until interrupted.
> The voltage across the capacitor rises (asymptotically) until it matches the potential of the power source.
For a constant-current source, the rise is linear. For a constant-voltage source, the current at time zero is infinite, barring stray resistances.
> You will reach 50v earlier if you use a 200v power source instead of a 50v source.
False! The issue is current, not voltage. The voltage on a capacitor is the time integral of past applied currents.
Capacitors have an upper limit to inrush current, and to charge one, you stay below that limit. The ideal charging source for a capacitor is a constant-current source, and the voltage driving the source is irrelevant until the capacitor's voltage limit is reached.
Consider a capacitor that can tolerate 10 amperes of current and has a voltage limit of 50 volts. Let's say I want to charge it as quickly as possible. I will apply 10 amperes of constant current, and cut off the charge as the capacitor's voltage limit is approached. Will I be able to charge the capacitor faster from a source of 200 volts than from a source of 50 volts? Of course not.
> Also the current at any given time is: I = VEmf/Rbatt * (e^(-t/(Rbatt * C)))
I am mystified as to why you posted a battery equation, and/or reference to resistance, in this capacitor discussion, in particular when constant current sources are the current state of the art. Surely you realize one doesn't attach capacitors to voltage sources, or is this still not understood?
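(For the constant-current case, the charge time is just t = C * dV / I; a sketch with the 10 A / 50 V example, assuming a 1 F cap since no capacitance was specified:)

```python
# Constant-current charging: the voltage ramps linearly, so t = C * dV / I.
C_farads, delta_v, current_a = 1.0, 50.0, 10.0   # the 1 F value is assumed
print(C_farads * delta_v / current_a)            # 5.0 seconds to reach 50 V at 10 A
```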
> A constant-current power supply is the preferred way to charge a capacitor without destroying it.
Sure, I agree. I just wanted to use a simpler circuit to keep my explanation simple.
>As to circuits other than ideal ones, I draw your attention to the behavior of superconductors, in which currents flow endlessly until interrupted.
I think you have cause and effect mixed up. How do you get this current started in the superconductor? What is the magnitude of this current? Infinite amps? You will still need an initial jolt (voltage source) to get the current started.
>For a constant-current source, the rise is linear
You do realize there is no such thing as a practical constant current source? A current source is just a voltage source which adjusts its terminal voltage to keep its current output constant with a changing load. Like I said, you need to understand the cause and effect relationships here.
> Capacitors have an upper limit to inrush current
Sure, practical capacitors, conductors, resistors... everything has an Imax. I did say raise the voltage as long as you are within the safe operating area. SOAs are defined in terms of Imax, Vmax and Pmax.
>I will apply 10 amperes of constant current
As I have already explained, a constant current source is just a voltage source with current feedback. You are actually changing the terminal voltage to maintain your 10A. I agree charging at Imax is the fastest way of charging a capacitor, but my circuit was simpler (constant voltage source with varying current vs. your constant current source with varying terminal voltage).
> Will I be able to charge the capacitor faster from a source of 200 volts than from a source of 50 volts?
This is a resounding yes. A constant voltage source rated at 200V will charge a capacitor faster than a source rated at 50V by virtue of the capacitor charging equation (assuming internal resistances are comparable).
> Surely you realize one doesn't attach capacitors to voltage sources, or is this still not understood?
There is no reason why you cannot attach a voltage source to a capacitor. See: http://imgur.com/gallery/sOZ1F. Let me remind you that a so-called current source is also a voltage source. Current sources are a theoretical construct. Internal resistances have to be considered in a practical circuit, otherwise you will have to deal with infinite currents at t=0.
> You will still need an initial jolt (voltage source) to get the current started.
Yes, but that wasn't what I replied to. Your claim was this:
> There is no current without EMF (measured in volts).
That's false. There can be massive currents without any potential difference. Again, superconductors show this behavior.
>> For a constant-current source, the rise is linear
> You do realize there no such thing as a practical constant current source?
Oh, really? So all those power supplies I designed for NASA were not really what they seemed? Constant-current sources are a normal part of electrical engineering practice and have been for decades.
Also, by using a switching inverter and manipulating the relationship between voltage and current in reactive elements, a constant-current source can be made very efficient. This is the ideal way to charge a supercapacitor, and it is standard practice. But I think I already said that.
> I did say raise the voltage as long as you are within the safe operating area.
Yes, but that's not how you charge a capacitor -- you use a controlled current, not a voltage. The voltage follows the current, not the other way around.
>> Will I be able to charge the capacitor faster from a source of 200 volts than from a source of 50 volts?
> This is a resounding yes.
Quite false. For a given current that a capacitor can tolerate, the charging rate is the same regardless of the source voltage. The reason? You don't charge capacitors with voltage, you charge them with current.
The example I gave earlier was a capacitor rated at 50 volts and two current sources -- one that can deliver 200 volts and one that can deliver 50 volts. The charging rate is the same. The reason? You don't charge capacitors with voltage, you charge them with current.
> There is no reason why you cannot attach a voltage source to a capacitor.
No reason at all. But don't be in the same room with a large capacitor and a voltage source that will supply substantial current to maintain a specified voltage. But, you know what? I already said that.
I've been designing power supply circuits for 40 years. My man-rated designs flew on the NASA Space Shuttle. I hold several patents. You're arguing with the wrong person.
> Current sources are a theoretical construct.
Current sources are an everyday, trivial design task that all competent designers must learn to be regarded as employable. See the linked schematic above.
> Internal resistances have to be considered in a practical circuit otherwise you will have to deal with infinite currents at t=0.
That is true for a voltage source. It is not true for a current source, and this is by design.
> They're charged with power, which is current x voltage.
No, they are charged by currents. The normal power source for capacitor charging is a constant current source, one in which the voltage follows the desired current. The least desirable source is a constant-voltage supply, which (in the ideal case) will deliver infinite current at time zero.
The voltage on a capacitor is the time integral of past applied currents. The ideal charging scheme is a constant current source, which cuts off as the capacitor's voltage limit is reached.
I can't figure out the best way to explain this, but the power-voltage-current relationship is not being grokked well in this thread. Mismatched voltages between sources and components in circuits generally lead to pyrotechnics, never magical charging schemes.
I understand your conviction, but sorry, you will want to revise your outlook on this. Last attempt to explain:
Suppose you can charge a cap from a high-voltage source, but stop charging when it reaches a lower voltage. The cap itself is never subjected to a high voltage in this situation (this is certain). It will receive current at the voltage it is at, and the power transferred to it will be the current * its voltage. No fast-charging magic present.
You cannot charge a battery or capacitor (directly) at voltages higher than their own voltage.
If the source voltage is suitable, resistive current limiting is plausible, but power is dissipated by resistance (within the cap and the path to it) in proportion to the square of the current. This is why efficient voltage transformation is required if the source has a significantly higher potential.
OK, I see it, but this statement is just fundamentally wrong:
>capacitor can be charged at any voltage as long as you cut off the supply when the voltage across the capacitor reaches a certain threshold
The cap in your circuit is not charging at any voltage, it is charging at Vn001 - its own voltage (the electrical potential created by virtue of its capacitance). In your circuit the only modelled resistance (and the drop in voltage between supply and the cap) is across R1.
This is all to explain the point which you disagreed with but which is ironclad: you can't run power into a phone at 50 volts to fast-charge its 4-volt battery without an (as yet impossibly cool and small) voltage transformer inside the phone to make the idea work.
I'm not pointing this out to you to win an argument; I'm explaining something which I didn't understand either when I began experimenting with LTspice and making charge circuitry. Take the advice or leave it now, but if you keep on with electronics you will find out sooner or later: "power stores charge and discharge > directly < at their own voltages", and if you don't have enough resistance between different electric potentials, take heed, 'fun' will ensue of the dwarven kind ;)
To me the fundamentals have always been based on batteries being chemical reactions with their characteristic cell voltages. A cell definitely has its own discharge voltage, and can be used as a reference because the voltage remains relatively unchanged until the available chemicals are becoming effectively exhausted. It is an energy source based on the amount of chemicals available, plus a storage medium based on the reliable reversibility of the reaction.
In the case of rechargeable batteries, the desired reverse reaction can often be best obtained by carefully matching the recharge voltage & current dynamically, against the quantity of chemical you wish to react, including the amount undischarged and remaining to be reacted at any time. The recharge/maintenance voltages will need to be above the discharged/characteristic voltages any time that cell recharging is expected to occur. The upper limit on charging voltage is based on the cell's characteristic voltage & the ability of the particular chemical system to withstand overvoltage. This is an electrochemical limit. The electrodes are separated by an electrolyte where having a low DC resistance is a desired property. As to the polarity of the DC voltage being provided or stored, this should always be respected, at your peril.
The working voltage of a battery cell can not be adjusted to match a particular circuit, the circuits must instead be closely designed to match the cell's characteristic voltage and chemical behavior. Increased energy storage requires more voluminous chemicals at the same voltage.
A capacitor whose plates can store charge across a dielectric has no characteristic voltage.
A capacitor can be a power store but there's no electrochemistry required.
Semantically, you can not "recharge" a capacitor since there is no original charge to begin with. You must first charge it with an energy source, which a battery by definition contains but a capacitor does not. A capacitor stores energy but it is not an actual energy source.
When charging a simple solid-state capacitor the upper limit on charging voltage is based on the electrical resistance of the particular dielectric & its ability to withstand overvoltage. This is a physical limit. The electrodes are separated by a dielectric where having a high DC resistance is a desired property. A simple solid-state capacitor functions equally well for operation at reversed polarity.
For an electrolytic capacitor, it is polarized because the chemicals it contains provide an effective increase in storage per volume, at the disadvantage of having high DC resistance in only one direction. An additional limit in maximum voltage is imposed by the electrochemical nature of these type capacitors, but they are still rated as against a physical limit. Plus the polarity of the DC voltage being stored in them should always be respected, your peril is once again at stake.
But no capacitor has its own ideal voltage, they are not at all like batteries in this regard.
A battery's maximum charging voltage is usually limited to a range close to its nominal voltage, imposed by the natural scientific ability of the chemicals to withstand overvoltage while retaining composure. There is very little leeway to work with here. To change the working voltage of a battery storage bank requires addressing the granularity of a different number of matched cells in series, plus circuitry carefully re-optimized for the incremented working battery voltage.
A capacitor's maximum charging voltage is only limited to a range within its rated voltage,
which is an engineering value assigned based on the natural scientific ability of the dielectric to withstand overvoltage while retaining composure. The order(s) of magnitude more leeway should be easily recognized. To change the working voltage of a capacitor storage bank requires only addressing the dielectric strength, which would not need to be changed if it was over-specified to begin with.
The working voltage of capacitor storage can be any arbitrary voltage selected for various engineering reasons, the working voltage only needs to be selected below the electrical rating of the components. Increased energy storage requires more voluminous dielectric, whether more capacitors having the same voltage rating, or bigger packages handling higher chosen voltages.
One of the differences between electronics and electrochemistry, capacitors are nominally marked with a usually conservative rating, but some parts may often be capable of truly handling twice the rated voltage reliably. Batteries not so much.
As a natural scientist designing circuits primarily using natural intelligence through experimentation & discovery, key efforts in computer science seem to be 100% helpful when invested in calculations which are too heavy or numerous otherwise.
As a computer scientist designing circuits primarily using software simulation, when key efforts in physical prototype building fail, this might reveal the investment in computer science itself to be less than 100% helpful.
Don't get me started on electrochemistry . . . the potential for reaction could be unlimited ;-)
Thanks, from reading your comment I can see that I was a bit loose with the term 'charging voltage' - I was focusing on the internal electrical potential of the cap or battery (stabilised), but 'charging voltage' should pinpoint the external potential which is applied to the component.
In programdude's circuit, the internal and external voltage of the cap are the same, because the cap has no internal resistance modelled. But really the resistive and inductive functions of current stores like caps and batts are what moderates current flow into or out of them, and therefore what determines suitable charging voltages.
The resistive function of a cap is much simpler than a battery's, but the capacity of each to store and release current makes their voltage (the ~pressure of their current) persistent, in the respect that internal voltage will only change by releasing or storing current. There are caveats, of course, that chemical and other reactions may alter the store's potential over time, but current exchange is the fundamental cause of potential, and the purpose of electrical storage.
I'd like to edit this line but it's too late now:
> its own voltage (the electrical potential created by virtue of its capacitance)
It would be better to write "electric potential sustained by virtue of its capacity".
For caps vs. batteries, the only modelling difference I see between them is their resistance functions - that function in batteries tends to be very complex, approximated very roughly by charging curves. Caps tend to be much simpler, but can still involve significant internal resistance and inductance, especially in maximum-performance applications.
In summary, what I've struggled to explain for the record is: the charging (external) voltage of a battery or cap is only separated from its internal voltage by its internal resistance. A store which will charge moderately at +0.5V above its internal potential is liable to blow very quickly if 10 or 100 times that voltage is applied.
Then the capacitor would have to be able to withstand 50V. That usually means proportionally more spacing between electrodes.
There's something called "breakdown voltage" (or more precisely dielectric strength), which in air is about 30 kV/cm, and is the reason for those ceramic spacers that keep high voltage lines apart from grounded metal.
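(For scale, treating that 30 kV/cm figure as the dielectric strength of air; real capacitor dielectrics and safety margins differ:)

```python
breakdown_v_per_cm = 30_000.0          # ~30 kV/cm for air, as above
gap_cm = 50.0 / breakdown_v_per_cm     # minimum air gap to stand off 50 V
print(gap_cm * 10_000)                 # ~16.7 micrometres
```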
Current varies inversely with voltage for a given power. So the 3000mAh battery (typically at 3.6 volts) is 10.8 watt-hours. To charge that in a minute takes 648 watts, or 5.4 amps at 120 volts. So a typical household plug can handle it. Of course, the problem still remains when you step the voltage down to something appropriate to charge the battery.
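(The same working, spelled out:)

```python
battery_wh = 3.0 * 3.6         # 3000 mAh at 3.6 V nominal = 10.8 Wh
watts = battery_wh * 60        # delivering 10.8 Wh in 1/60 of an hour
print(watts, watts / 120.0)    # 648.0 W, and 5.4 A at 120 V
```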
What might work, is to have the phone battery (capacitor) removable, then snap it into a box that has a larger capacitor that is pre-charged. It can then dump its whole load into the phone's capacitor in a short amount of time (assuming large enough connectors that can handle the current).
And you'll find that a 1.5mm wide, 1mm thick trace can carry 18 amps. That's obviously pretty large for a PCB trace, but not atypical for a microUSB-like connector.
But... this is all assuming we're using caps at ~5v. Seems like you might want to use significantly higher voltages and buck converters to power the phone.
18A is not a problem in itself, because the voltage isn't that high. Your drone likely draws more than 18A.
Your car starter draws 10x that.
High current just needs thick wires. 120V wall power, with a proper power supply, needs 1A or less to generate 18A at 5V.
Switching DC converters are great!
What I'm most worried about is actual capacity. 1F supplies 1A for 1 second with a drop of 1V. Let's assume we need 3 amp-hours at a drop from 5V to 2V. That means 3600F of capacitance! We're not there yet.
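(Working for that figure:)

```python
charge_coulombs = 3.0 * 3600       # 3 Ah expressed in coulombs
delta_v = 5.0 - 2.0                # allowed sag from 5 V down to 2 V
print(charge_coulombs / delta_v)   # 3600.0 F, from C = Q / dV
```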
> a typical plug in a home can provide 10A. So to charge the phone you will need some big connector, not a tiny microUSB-like connector.
Current capacity for conductors depends on conductor length as well as conductor size. Short runs of a given size have a higher current capacity than longer runs, due to the larger total losses over longer lengths. Also, current capacity depends on the length of time the current has to be carried: a given length of a given conductor size will carry a larger current for a short amount of time compared to its carrying capacity for a longer duration. While a micro-USB connector will not suffice, the needed connector would not be that much bigger: a 3.5mm jack would suffice to carry 18A for a few seconds given a good socket. It would not be hard to design a plug/socket combination with a high enough capacity which would fit in a phone.
> To recharge the phone once a week, I guess you will need something like a 20,000 mAh battery
I recharge my phone (a Motorola Defy+) once a week. It has a ~1500 mAh battery, and the thing is 5 years old. I don't think you'd need a 20,000 mAh battery to be able to keep your phone running for a week unless you plan to use it as a hand warmer.
The original source is dead, so I can't tell whether these points are addressed or not.
Yep, you'd probably need a connector with a positive mechanical clamping action rather than just a spring connector (as in a standard 3.5 mm jack or USB port), but that seems doable.
Also, there's no reason you couldn't play internal series/parallel (or transformer, or DC-DC converter) games to charge the thing up at a higher voltage (and thus lower amperage) while discharging it at normal phone operating voltage. Nothing says the charger has to operate at USB voltage levels. Heck, why not charge it at full line voltage? (it goes without saying that you'd need to engineer it so there's no way that full line voltage could possibly reach the phone internals).
> Also, there's no reason you couldn't play internal series/parallel (or transformer, or DC-DC converter) games to charge the thing up at a higher voltage (and thus lower amperage) while discharging it at normal phone operating voltage.
Capacitors aren't charged by voltage, they're charged by current. One does not apply a voltage to a capacitor, one applies a current. If you doubt this, try connecting a capacitor to a high-powered constant-voltage power supply. But first, step back.
A capacitor's charging rate is determined by its value in farads and the applied current. Nothing else matters. One farad, one ampere, one second produces one volt and half a joule of stored energy. Nothing could be simpler.
If you raise the voltage expecting to see a faster charging rate, you will see the opposite, because a higher voltage for a given power rating means a lower current, and a lower current will require more time to charge the capacitor.
> Heck, why not charge it at full line voltage?
Why not indeed? But don't be in the same room when this idea collides with reality.
> What other laws of physics have changed since I was in school?
Sorry for the late reply -- I'm traveling. What I said in my original post is quite uncontroversial. When charging a capacitor one must not think in terms of voltage (which is an effect, not a cause of charging) but of current.
> The important factor for charging time is how many joules/second you can stuff in, not how much current (coulombs/second) is flowing.
You're confusing cause and effect. Capacitor charging is accomplished with current -- voltage is an effect, not a cause.
The voltage on a capacitor is the time integral of past applied currents. Want to change the charge level on a capacitor? Apply a current and let the voltage change in response.
> You can pull a helluva lot more joules/second out of a wall outlet than you can a micro-USB connector.
Non sequitur, the issue is how much current the capacitor can tolerate. And attaching a capacitor to a wall outlet will not work for multiple reasons.
The ideal charging source for a capacitor is a constant-current supply set to provide a high, but not damaging, level of current, then when the capacitor's voltage limit is approached, shut down the current supply.
The worst possible source for a capacitor is a constant voltage with substantial power available, which will destroy the device.
As I originally said, "Capacitors aren't charged by voltage, they're charged by current." Concise, and quite accurate.
How I wish there were a 4G phone that could do what you describe. My OnePlus One has double that battery capacity (3100mAh) but I can really only make it last two days on that charge at maximum. Typically it needs to be recharged every day, or my usage habits need to change.
So 20000mAh accurately describes my average weekly cellphone power consumption... and if I changed my habits a bit, I'd still expect to require 10000mAh.
Please, if you know of a 4G phone that can last for more than two days, no matter how "dumb" it is (it could even be a flip phone, just as long as it can tether) I'd love to know about it.
FWIW, I have a Samsung A6 at work that I almost never use. It lasts 5-7 days on one charge; but this only works when display and network are off 99.5% of the time.
I can get 3, maybe 4, out of a OnePlus Two (CyanogenMod 13) provided I run Greenify and only use it for text messages, emails (sync every 15 minutes), and occasional calls.
A Deans T connector is rated at 50A long term. For shorter times, it can carry more. (It's a matter of I^2*R heating versus thermal dissipation.)
The internal resistance of supercaps is higher than that; they cannot discharge in microseconds.
Think about it: 5V 3F into 10 microseconds would require a resistance of micro-ohms, which is less than the resistance of the connecting wires. That's not even counting the ESR (internal resistance.)
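(That figure follows from the RC time constant, treating 10 microseconds as roughly one time constant:)

```python
C = 3.0          # farads
tau = 10e-6      # seconds, the target discharge time
print(tau / C)   # ~3.3e-06 ohms: micro-ohms, below the wiring resistance, let alone the ESR
```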
TIL. Not sure where I heard that. I'm having trouble finding good information on caps vs. batteries in this respect but I was definitely wrong. Thanks!
You wouldn't use USB cables for that kind of current.
As for the power available from the charger, you'd probably want to build a LiPoly battery into the charger itself, and have the charger trickle-charge its internal battery and then dump-charge the phone's battery from the internal battery.
Yeah, and that would be plenty for a 'super fast' charge. You don't really need your phone to charge 'in seconds', this would get you there in around 5 minutes depending on battery size which is plenty quick enough.
(Working: ~3.6V * 3000mAh = 10.8Wh = 38.9kJ, at 100W that's 389 seconds for a full charge.)
That's "Low Power Device" to "High-power Superspeed device". The Battery Charging 1.2 specification increases that to 5A@5V, and Power Delivery 2.0 (over standard or type C connectors) to 5A@20V.
Because not everyone carries around a wireless charger in their pocket? Also, I'm fairly sure the lower wattage of a wireless charger would make charging slower.
Apparently they are on par with batteries on energy density and power density. They are way ahead on cycle stability (30k charges). So that pretty much leaves two things:
1. Charge stability. Does it leak like a sieve even without a load after being charged?
2. Manufacturability. I expect this is the big problem. It's a chemical engineering problem to scale up a "nano" process. The article says it's not ready, but doesn't say what the biggest challenge is going forward.
Anyone know this particular supercapacitor tech? Or supercaps in general?
A leaky capacitor could still be useful in combination with a battery - you can dump power into the capacitor for a super-fast "charge", then slowly move the energy into the battery. You don't get the capacitor's cycle stability, but at least you get the charging speed.
Maybe. Depends on the power density of supercapacitors, and also on whether you can increase the power density of the battery when you're not worried about making sure you can charge it as far as possible.
>2. Manufacturability. I expect this is the big problem. It's a chemical engineering problem to scale up a "nano" process. The article says it's not ready, but doesn't say what the biggest challenge is going forward.
Quoting from the article: "The team at UCF has experimented with applying newly discovered two-dimensional materials only a few atoms thick to supercapacitors. Other researchers have also tried formulations with graphene and other two-dimensional materials, but with limited success."
I'm not sure if they used graphene specifically (the only other 2D material I know is silicene), but there are manufacturing issues with graphene, especially around mass production, though these are being solved. Graphene used to be one of the most expensive materials in the world to produce (per unit weight). It is now (2015) "only" about $100/kg. Compare this with something like thermal coal (another carbon-based material) at under $100/t. There are still orders of magnitude in the price before the economics look good.
edit: That is not to say it is worthless; the price has fallen fast in the last 5 years, it just needs to keep on the same trajectory for a while longer. Graphene is, after all, an extremely new material. I don't think there is anything fundamentally impossible to solve manufacturing-wise, and economies of scale should kick in as demand for it increases.
There's also a third problem, which is that capacitors typically don't have constant voltage. Unless there's something special about these capacitors that they're not mentioning in the articles.
Nor do batteries (although batteries are far flatter than capacitors).
That problem of "sagging voltage" is solved with either a buck-boost converter, or a buck-boost inverting converter... or a SEPIC converter. Good ones get ~95% efficiency or so.
Well they are about an order of magnitude below state-of-the-art batteries in terms of energy density, and about the same distance above batteries in terms of power density. For phones this doesn't seem to make sense unless you want your iPhone to be 3cm thick. Maybe an application with greater power demands?
Energy density is the energy stored per unit volume or mass. Power density is the same thing, but for the rate at which that energy can be delivered per unit time. It is essentially a volumetric statement of the internal resistance of the battery or capacitor. Supercapacitors can be discharged very quickly compared to batteries of the same size, but the same-sized battery contains more energy.
I was working on a controls project where we had large banks of supercaps that would charge up to ~900V. You have to treat those with some respect. However, as long as they were kept cool, the caps had no trouble being charged and discharged on a couple-minute cycle, with each bank providing 2kWh of energy. In the particular application I was working on we actually had 17 2kWh banks providing ~34kWh of energy every couple of minutes. Would not fit in a cellphone though :)
Wow, it sounds like it would make Samsung's Note 7 battery problems seem elementary. Who would want to carry a device like that in their back pocket or hand it to kids to play with? Would a good drop on hard pavement or sheet metal cause this thing to go off?
They have some equivalent series resistance and internal resistance, but yeah, try touching the two terminals of a Li-ion cell together. Batteries and bombs are basically the same thing -- they store energy and release it very quickly.
The original article is
High-Performance One-Body Core/Shell Nanowire Supercapacitor Enabled by Conformal Growth of Capacitive 2D WS2 Layers
DOI: 10.1021/acsnano.6b06111
>>> Anyone with a smartphone knows the problem: After 18 months or so, it holds a charge for less and less time as the battery begins to degrade.
Really? That's still a thing? These aren't NiCads. I've found that my phone doesn't report full charge as often, but it still lasts for a similar amount of time. My 5+ year old netbook's battery is still reporting 80% of its design capacity.
Imho, such apparently dramatic falls in capacity often have more to do with running apps rather than physical degradation of the battery. Talk to me after a reset to factory settings.
If it's under warranty, that's worth taking in. When I got my MBP repaired under warranty they also noticed my battery was doing worse than it should for the number of cycles and replaced it, no questions asked, without me having to ask. I believe the rating is 80% health after 1000 cycles, and mine was somewhere around 75% after 800 or 900 -- you can check these values in System Information, I believe.
Warranty was only for a year. I had it repaired at some point for some heatsink issue, and they offered to swap out the battery while at it (i.e. no extra labor). But it would've cost me $300 CAD just for the battery, which seemed expensive at the time. Now I regret not paying for that...
No, it has to do with temperature. Li-ion charged when cold/hot will significantly degrade the battery. See Tesla (with liquid cooling) vs. Leaf degradation over the last couple of years.
Actually, it has to do with the oxide layer in the lithium battery. Good temperatures make it degrade slower, but it still degrades. LiFePO4 degrades slower than LiPo by a factor of 5 or better, but has lower energy density (3.2V nominal instead of 3.7V).
Yes, LiPos degrade with use. My five year old car has 80% capacity. My One Plus from beta has about 85% capacity.