Yes, terribly sorry, I really should have included the formula in my original question to avoid confusion: the formula in question is C = Q/V.
That said, I would like to know exactly how a change of voltage at constant charge affects the capacitance. I can see that the larger the charge, the more electrons there are to fit into the capacitor, and therefore the larger the capacitance would seem to be.
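Just to put concrete numbers on the formula (the values are made up, purely for illustration):

$$C = \frac{Q}{V} = \frac{1\,\mu\text{C}}{10\,\text{V}} = 0.1\,\mu\text{F}$$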
However, thinking it through, I would also assume (leaving aside that the calculations wouldn't work out; treat this more as a thought experiment) that the greater the voltage, the greater the capacitor's capacitance would be. Let me elaborate on why my mind is inclined to think that:
The greater the voltage, the greater the "electron surplus/shortage" difference between the power source's poles would essentially be. Since one pole of the power source is positively charged, the capacitor plate connected to it (assuming the capacitor consists of two plates with an insulator in between; it's easier for me to explain the situation with this type) would also be positively charged, and the other plate negatively charged. A greater difference would amount to a greater electric force, effectively "pulling" more electrons onto the negative plate, increasing the capacity and actually increasing the charge.
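To make my thought experiment concrete, here is a quick numerical sketch using the standard parallel-plate formula C = ε₀A/d (the plate area, separation, and voltages are made-up values, just for illustration). It tabulates the charge Q = C·V that the same capacitor holds at different voltages:

```python
# Sanity check of C = Q/V for an idealized parallel-plate capacitor.
# Plate area, separation, and the voltages below are made-up illustration values.

EPSILON_0 = 8.854e-12  # vacuum permittivity, in F/m

def parallel_plate_capacitance(area_m2: float, separation_m: float) -> float:
    """C = eps0 * A / d -- depends only on geometry, not on V or Q."""
    return EPSILON_0 * area_m2 / separation_m

C = parallel_plate_capacitance(area_m2=0.01, separation_m=1e-4)  # ~0.885 nF

for V in (1.0, 5.0, 10.0, 100.0):
    Q = C * V  # charge pulled onto the plates at this voltage
    print(f"V = {V:6.1f} V  ->  Q = {Q:.3e} C  ->  Q/V = {Q / V:.3e} F")
```

If my intuition were right, the Q/V column would grow with V; instead, in this idealized model the extra charge that a higher voltage pulls onto the plates scales exactly with the voltage, so the ratio stays fixed. That proportionality is precisely the part I am trying to understand on the microscale.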
I know my logic is terribly flawed, but this is the most reasonable explanation I can work out when imagining how this plays out on the microscale in a real circuit. I would really love to find out what exactly happens inside the capacitor, though. Thank you very much for your input!