You're misunderstanding the relationship between voltage, current, and resistance.
Take Ohm's law. It states that the current through a conductor between two points is directly proportional to the voltage across those points and inversely proportional to the resistance between them. In math terms: I = V/R, where I is current, V is voltage, and R is resistance.
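As a quick illustration, here's a minimal Python sketch (the 12 V / 24 ohm values are just made up for the example):

```python
def current(voltage_v: float, resistance_ohm: float) -> float:
    """Ohm's law: current in amps = voltage across the load / its resistance."""
    return voltage_v / resistance_ohm

# Example: 12 V applied across a 24-ohm load
print(current(12.0, 24.0))  # 0.5 (amps)
```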
Now, we cannot apply current. What we actually apply is a VOLTAGE. Voltage is the potential difference between two points, and it is the force that drives charges around a circuit. This makes sense once you realize that charges need a force to move, just like anything else. So we apply that force, the voltage, and depending on how much resistance the circuit has, some amount of current flows.
So, if your power supply applies 12 volts and can handle up to 6 amps, it pretty much doesn't matter whether the device you're powering needs 1 amp or 5: the device only draws as much current as its resistance lets through. (Note that this is only a very basic summary; the real world is always more complicated.)
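To make that concrete, here's a small sketch (the load resistances are invented for illustration) showing that the same 12 V supply drives a different current into each load, and the supply is fine as long as the draw stays under its 6 A rating:

```python
SUPPLY_VOLTAGE_V = 12.0     # voltage the supply applies
SUPPLY_MAX_CURRENT_A = 6.0  # maximum current it can handle

# Hypothetical loads: the supply doesn't "push" 6 A into them;
# each load draws only what its resistance allows.
for resistance_ohm in (12.0, 2.4, 1.5):
    draw_a = SUPPLY_VOLTAGE_V / resistance_ohm  # Ohm's law
    ok = draw_a <= SUPPLY_MAX_CURRENT_A
    print(f"{resistance_ohm:>4} ohm load draws {draw_a:.1f} A -> "
          f"{'fine' if ok else 'exceeds the supply rating!'}")
```

The first two loads draw 1 A and 5 A, both within the rating; the last would draw 8 A, which is where a real supply would sag, shut down, or overheat.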