This isn't really a homework question. I was fiddling around with the virtual electronics lab at 123D Circuits to familiarize myself with the various electrical components used in circuits, and with building electronics in general, since I'm still very much a beginner. Here is a link to the circuit: https://123d.circuits.io/circuits/1414295-simple-led-adjuster#breadboard
9 V Battery
500 Ohm resistor
10 kOhm potentiometer
1 RGB LED - I'm not sure about its voltage or current rating, but I went off the basic fact that most LEDs can only take about 20 milliamps at most.
My question (mainly because I cannot remember exactly why, and I have been unlucky searching for the answer) is this: why do I still need the 500 Ohm resistor to power the LED, when the potentiometer already acts as a resistor itself? Shouldn't I be able to use just the potentiometer as the variable resistor, without an additional fixed resistor?
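To make the question concrete, here's a quick numeric sketch I tried of what happens when the potentiometer is turned all the way down to 0 Ohm, with and without the fixed resistor. I'm assuming a forward voltage drop of about 2 V for one LED channel, which I'm not sure is the real value for this LED:

```python
# Current through the LED at the pot's minimum setting.
# Assumed values: 2 V LED forward drop (not confirmed for this LED).
V_BAT = 9.0      # battery voltage
V_LED = 2.0      # assumed LED forward voltage drop
R_FIXED = 500.0  # the fixed series resistor

def led_current(r_pot, r_fixed):
    """Ohm's law applied to the voltage left over after the LED drop."""
    r_total = r_pot + r_fixed
    if r_total == 0:
        return float("inf")  # nothing left in the loop to limit current
    return (V_BAT - V_LED) / r_total

# Pot turned all the way down (0 Ohm):
print(led_current(0, R_FIXED))  # with the fixed resistor: 0.014 A (14 mA)
print(led_current(0, 0))        # without it: no current limit at all
```

So if this is right, the fixed resistor seems to be there to guarantee some minimum resistance even when the pot is at zero, keeping the current under the ~20 mA limit.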
For this one, I'm not really sure, but if someone can enlighten me on how to use calculations to test problems like this, that would be great, since I haven't really applied my general knowledge of the following equations to building circuits (besides Ohm's law, of course):
Ohm's Law: V (voltage) = I (current) × R (resistance)
I think Kirchhoff's laws also apply here, but I haven't practiced those concepts as much, so I'm still unsure how they apply to this case.
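My rough attempt at combining the two: Kirchhoff's voltage law around the single loop would give V_bat = V_led + I × (R_fixed + R_pot), and solving for I over the pot's full travel should show the current range. Again, the 2 V forward drop is just my assumption:

```python
# Sketch: KVL around the loop gives
#   V_bat = V_led + I * (R_fixed + R_pot)
# so  I = (V_bat - V_led) / (R_fixed + R_pot).
# Assumed values: 2 V LED forward drop (not confirmed).
V_BAT, V_LED = 9.0, 2.0
R_FIXED, R_POT_MAX = 500.0, 10_000.0  # 500 Ohm resistor, 10 kOhm pot

def current(r_pot):
    """Loop current for a given potentiometer setting."""
    return (V_BAT - V_LED) / (R_FIXED + r_pot)

i_max = current(0)          # pot at minimum resistance
i_min = current(R_POT_MAX)  # pot at maximum resistance
print(f"{i_min * 1000:.3f} mA to {i_max * 1000:.1f} mA")
```

If I did that right, the current ranges from about 0.7 mA up to 14 mA, always below 20 mA, which would explain why the 500 Ohm resistor was chosen.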
The Attempt at a Solution
I tried to think of why it wouldn't work, but I wasn't sure how to approach the issue. Sorry if this is not satisfactory.