Naked Science Forum

Non Life Sciences => Technology => Topic started by: Stby on 14/03/2008 23:53:54

Title: Capacitors
Post by: Stby on 14/03/2008 23:53:54
Hello,

Can I have some help understanding capacitors?

I have a charging and discharging circuit. 

I insert a 2000microF cap into the circuit and record how long it takes to charge and discharge.

I take the 2000microF cap out of the circuit, put a 100microF cap into the circuit, and record how long that one takes to charge and discharge.

How does the length of the discharge time change based on the size of the capacitor?

Thanks!
***Stby
Title: Capacitors
Post by: another_someone on 15/03/2008 02:54:42
It depends on the resistor you use to go with the cap.

The resistor limits the amount of current that can flow for a given voltage, and current x time = charge accumulated by the capacitor. The larger the capacitance, the more charge the capacitor can hold at a given voltage, so the longer you have to drive current in before it is fully charged up to the supply voltage.

Have a quick read of http://en.wikipedia.org/wiki/RC_circuit
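
For example, here's a rough Python sketch of the charging times for your two caps (the 1k resistor is just an example value I've picked; swap in whatever you actually have):

import math

R = 1000.0        # assumed series resistance, ohms (example value)

def time_to_reach(fraction, R, C):
    # Charging curve: V(t) = V_supply * (1 - exp(-t/(R*C)))
    # Solve fraction = 1 - exp(-t/(R*C)) for t; the supply voltage cancels out
    return -R * C * math.log(1.0 - fraction)

for C in (2000e-6, 100e-6):   # the two capacitors from the question
    print(f"C = {C*1e6:.0f} uF: time constant R*C = {R*C:.2f} s, "
          f"~99% charged after {time_to_reach(0.99, R, C):.2f} s")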
Title: Capacitors
Post by: lyner on 16/03/2008 21:17:24
During discharge, both the voltage across the capacitor and the current through the resistor decay exponentially. The 'time constant' is RC. This means that every (R times C) seconds, the value falls to 1/e (about 37%) of what it was. It's in all the textbooks.
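
As a rough illustration in Python (the 10V starting voltage and 1 second time constant are just made-up example values):

import math

V0 = 10.0   # assumed starting voltage on the cap, volts
tau = 1.0   # assumed time constant R*C, seconds

# During discharge V(t) = V0 * exp(-t/tau): every tau seconds the voltage
# drops to 1/e (roughly 37%) of its previous value.
for n in range(4):
    t = n * tau
    print(f"t = {t:.0f} s: V = {V0 * math.exp(-t / tau):.2f} V")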
Title: Capacitors
Post by: techmind on 17/03/2008 19:00:28
In short, for a fixed resistor value, the charging (and discharging) time scales in proportion to C.
In other words, your 100uF cap will charge in 1/20th the time of the 2000uF.
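
Quick check with the values in your question (the 1k resistor is just a placeholder; any fixed R gives the same ratio):

R = 1000.0               # assumed resistance, ohms (example value)
tau_2000 = R * 2000e-6   # time constant with the 2000 uF cap, about 2.0 s
tau_100  = R * 100e-6    # time constant with the 100 uF cap, about 0.1 s
print(tau_2000 / tau_100)   # about 20, so the 100 uF is roughly 20x quicker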

Note that in the real world, low-cost electrolytic capacitors often have a tolerance as poor as -50%/+100%, so your experimental results may deviate somewhat from the theory!
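
For instance (again with a made-up 1k resistor):

R = 1000.0                  # assumed resistance, ohms
C_nominal = 2000e-6         # nominal 2000 uF
C_low  = 0.5 * C_nominal    # -50% tolerance -> 1000 uF
C_high = 2.0 * C_nominal    # +100% tolerance -> 4000 uF
for C in (C_low, C_nominal, C_high):
    print(f"R*C = {R*C:.1f} s")   # time constant could be anywhere from 1.0 s to 4.0 s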