

Stby
« on: 14/03/2008 23:53:54 »

Can I have some help understanding capacitors?

I have a charging and discharging circuit. 

I insert a 2000 µF cap into the circuit and record how long it takes to charge and discharge.

I then take the 2000 µF cap out of the circuit, put a 100 µF cap into the circuit, and record how long that one takes to charge and discharge.

How does the discharge time change with the size of the capacitor?




  • Guest
« Reply #1 on: 15/03/2008 02:54:42 »
It depends on the resistor you use with the cap.

The resistor limits the current that can flow for a given voltage, and current × time = charge accumulated by the capacitor. So the larger the capacitor value, the more charge it can hold, and the longer you can drive current in before it is fully charged.
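To put some rough numbers on that, here's a quick sketch. Only the two capacitor values come from the question above; the supply voltage and resistor value are made up for illustration:

```python
# Sketch: how much charge each cap holds, and the maximum charging current.
# V and R are assumed example values; the capacitances are from the question.
V = 9.0        # supply voltage in volts (assumed)
R = 10_000.0   # series resistor in ohms (assumed)

for C in (100e-6, 2000e-6):       # the 100 uF and 2000 uF caps
    Q_full = C * V                # charge when fully charged: Q = C * V (coulombs)
    I_initial = V / R             # initial (maximum) current the resistor allows
    print(f"C = {C*1e6:.0f} uF: full charge = {Q_full*1e3:.2f} mC, "
          f"initial current = {I_initial*1e3:.2f} mA")
```

Same resistor, same starting current, but the 2000 µF cap has 20× the charge to accumulate, hence the longer charge time.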

Have a quick read of



  • Guest
« Reply #2 on: 16/03/2008 21:17:24 »
Both the voltage across the capacitor and the current through the resistor decay exponentially. The 'time constant' is RC: every RC seconds, the value falls to 1/e (about 37%) of what it was. It's in all the textbooks.
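That decay curve is easy to tabulate. A small sketch, with R, C, and the starting voltage assumed for illustration (C is the 2000 µF cap from the question):

```python
import math

# Sketch of the exponential discharge V(t) = V0 * exp(-t / (R*C)).
# R and V0 are assumed example values.
R = 10_000.0    # ohms (assumed)
C = 2000e-6     # farads (the 2000 uF cap)
tau = R * C     # time constant RC, in seconds
V0 = 9.0        # starting voltage (assumed)

for t in (0.0, tau, 2 * tau, 3 * tau):
    v = V0 * math.exp(-t / tau)
    print(f"t = {t:5.1f} s: V = {v:.3f} V ({v / V0:.1%} of start)")
```

After one time constant the voltage is down to about 37%, after two about 13.5%, and so on.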


techmind
« Reply #3 on: 17/03/2008 19:00:28 »
In short, for a fixed resistor value, the charging time scales in proportion to C.
In other words, your 100uF cap will charge in 1/20th the time of the 2000uF.

Note that in the real world, low-cost electrolytic capacitors often have a tolerance as poor as -50%/+100% so your experimental results may deviate somewhat from the theory!
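You can check the 1/20th figure directly. If you call the cap "charged" once it reaches some fraction of the supply voltage, the time needed is t = -RC·ln(1 - fraction), so the ratio of the two times is just the ratio of the capacitances. A sketch (R and the target fraction are assumed for illustration):

```python
import math

# Sketch: time to charge to a chosen fraction of the supply voltage,
# t = -R * C * ln(1 - fraction). R and fraction are assumed values.
R = 10_000.0       # ohms (assumed)
fraction = 0.99    # treat 99% of supply voltage as "fully charged"

t_2000 = -R * 2000e-6 * math.log(1 - fraction)   # 2000 uF cap
t_100  = -R * 100e-6  * math.log(1 - fraction)   # 100 uF cap
print(f"2000 uF: {t_2000:.1f} s, 100 uF: {t_100:.2f} s, "
      f"ratio = {t_2000 / t_100:.0f}")
```

The ratio comes out as exactly 20 regardless of which fraction you pick, though with -50%/+100% tolerance parts your measured ratio could be off by a large factor.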
« Last Edit: 21/03/2008 11:04:46 by techmind »
"It has been said that the primary function of schools is to impart enough facts to make children stop asking questions. Some, with whom the schools do not succeed, become scientists." - Schmidt-Nielsen "Memoirs of a curious scientist"