1. The problem statement, all variables and given/known data

Calculate the expected input voltage for each set of resistances below. For the CMOS gate, assume the input resistance of the gate is infinite.

2. Relevant equations

i = v/R
Voltage divider rule: v1 = v*R1/(R1+R2)

3. The attempt at a solution

1) For R1 = 1K and R2 = infinity, would V simply be 0, since the resistance is infinite? I'm mostly confused about how the CMOS gate and the infinite resistance will affect the voltage.

2) For R1 = 1K and R2 = 10K:
v1 = 5V * 1K/(1K + 10K) ≈ 0.45V?

I assume that for all the remaining cases, I'd calculate the values involving infinity as in step 1, and the values with two finite resistances as in step 2?
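The divider arithmetic in the attempt can be sketched in a few lines of Python. This is only a check of the formula given in the relevant equations; the 5 V supply value is taken from step 2, and the function name `divider_v1` is my own, not from the problem:

```python
import math

def divider_v1(v, r1, r2):
    """Voltage across R1 in a series divider: v1 = v * R1 / (R1 + R2).

    math.inf for R2 models an infinite input resistance (open circuit):
    no current flows, so there is no drop across R1.
    """
    if math.isinf(r2):
        return 0.0  # limit of v*R1/(R1+R2) as R2 -> infinity
    return v * r1 / (r1 + r2)

print(divider_v1(5, 1e3, 10e3))     # ~0.4545 V, matching the 0.45 V in step 2
print(divider_v1(5, 1e3, math.inf)) # 0.0 V across R1 when R2 is infinite
```

Note this computes the drop across R1, as the stated divider rule does; whether the "input voltage" of the gate is the drop across R1 or across R2 depends on where the gate input node sits in the circuit, which is the crux of question 1.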