Diagram:

https://drive.google.com/file/d/1pqa8dq ... sp=sharing
Note: this problem requires calculus. (It's possible there's a particularly clever solution that bypasses the calculus, but I can't think of one.)

You are an electrical engineer tasked with charging a capacitor C, which is connected in series with resistor R1 and in parallel with resistor R2. You want to charge the capacitor to a final voltage V_f, and you have at your disposal a current source I. (See the diagram for the circuit configuration, and note that at time t = 0 the capacitor is fully discharged. Also note that I is constant; i.e., the current cannot vary with time.)
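For intuition, here's a minimal numerical sketch of the charging dynamics. It assumes the topology implied by the description (current source I feeds R1 in series, then the node where C sits in parallel with R2), and all component values are made up for illustration:

```python
# Sketch of the capacitor charging dynamics under the assumed topology:
# I flows through R1 to a node where C and R2 sit in parallel, so the
# capacitor sees dV/dt = (I - V/R2) / C. R1 carries the full source
# current but (with an ideal current source) doesn't affect V(t).
# All values below are arbitrary, for illustration only.

C = 1e-3    # farads
R2 = 100.0  # ohms (resistor in parallel with the capacitor)
I = 0.5     # amps (constant source current)
V_f = 10.0  # target voltage; must be below I*R2 = 50 V to be reachable

dt = 1e-4   # seconds, forward-Euler time step
V = 0.0     # capacitor starts fully discharged (t = 0)
t = 0.0

while V < V_f:
    # Current into the capacitor = source current minus what R2 shunts away
    dVdt = (I - V / R2) / C
    V += dVdt * dt
    t += dt

print(f"Reached {V_f} V at t = {t:.4f} s")
```

With these made-up values the simulated charge time lands close to the analytic exponential-charging result, which is a useful sanity check once you've derived part (a).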

In the aim of efficiency, you want to dissipate as little power as possible in the resistors as you charge the capacitor to voltage V_f. However, your colleague, who designed the circuit with C, R1, and R2 and chose the value of V_f, won't let you change any of those values. The only value you can play around with is the charging current I.

a) Find an equation, in terms of I, C, R1, R2, and V_f, for the time t_f at which the capacitor is fully charged to voltage V_f.

b) Find an equation, in terms of I, C, R1, R2, and V_f, for the total energy dissipated in the resistors in the course of charging the capacitor to voltage V_f.

c) Find an equation, in terms of C, R1, R2, and V_f, for the value of I that minimizes the dissipated energy. (I'm not convinced this part has a closed-form solution -- if not, do what you can.)
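If a closed form proves elusive, part (c) can at least be explored numerically: simulate the charge-up for a sweep of source currents and record the energy dissipated in both resistors. The function name and component values below are my own, for illustration, and the same assumed topology as in the description applies:

```python
# Numerical exploration of part (c): sweep the source current I and
# record total energy dissipated in R1 and R2 during the charge-up.
# Assumed topology: I flows through R1, then splits between C and R2.
# Component values are arbitrary, chosen only for illustration.

C = 1e-3    # farads
R1 = 10.0   # ohms (series resistor)
R2 = 100.0  # ohms (resistor in parallel with the capacitor)
V_f = 10.0  # volts, target voltage
dt = 1e-4   # seconds, Euler time step

def dissipated_energy(I):
    """Euler-integrate the charge-up; return total resistor energy (J)."""
    V, E = 0.0, 0.0
    while V < V_f:
        E += (I * I * R1 + V * V / R2) * dt  # P = I^2*R1 + V^2/R2
        V += (I - V / R2) / C * dt
    return E

# I must exceed V_f/R2 = 0.1 A, or the capacitor never reaches V_f.
best_I, best_E = None, float("inf")
for k in range(1, 200):
    I = 0.1 + 0.01 * k  # sweep from just above V_f/R2
    E = dissipated_energy(I)
    if E < best_E:
        best_I, best_E = I, E

print(f"Minimum dissipation of {best_E:.4f} J at I = {best_I:.2f} A")
```

The sweep shows an interior minimum: too small an I means a long charge time (and R2 bleeding charge the whole while), while too large an I wastes energy as I^2*R1 in the series resistor.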