I don't think this counts as homework; if it does, then apologies! I'm doing some simulation work on the structure of a CCD and noticed something peculiar. If I have an electrode, separated from a substrate material by a dielectric, and I bias it at 12 V, then I get a potential underneath it, as I expect. However, if I increase the size (i.e. area) of the electrode, the value of the maximum potential (VMAX) increases, and its position also moves further away from the electrode itself.

I've been trying to understand why. In theory the charge density remains constant, so although the spatial extent of the field would increase, I can't see why the value of the maximum should change. Since the field is the derivative of the potential, a change in VMAX would imply that the field itself is changing in magnitude (i.e. flux density) as well. Any ideas? Perhaps my logic is flawed; if so, please point it out, as I'd like to understand it.
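For what it's worth, the effect shows up even in a toy 2D model with no space charge at all, so it may just be geometry (fringing fields) rather than anything CCD-specific. The sketch below is only an illustration under made-up assumptions: it solves Laplace's equation by Jacobi relaxation on a small grid, with a finite electrode segment on the top boundary held at 12 V, everything else grounded, and no dielectric or depletion charge included. The grid sizes, electrode widths, and the `centerline_potential` helper are all hypothetical, not taken from my actual simulation.

```python
import numpy as np

def centerline_potential(width, nx=81, ny=41, v_gate=12.0, iters=5000):
    """Solve Laplace's equation on an nx-by-ny grid by Jacobi relaxation.

    Top boundary: an electrode `width` cells wide, centred, held at v_gate;
    the rest of the top edge and all other boundaries are grounded (0 V).
    Returns the potential down the column under the electrode centre.
    """
    v = np.zeros((ny, nx))
    lo = nx // 2 - width // 2
    hi = lo + width
    for _ in range(iters):
        v[0, lo:hi] = v_gate  # re-impose the electrode (Dirichlet) condition
        # average of the four neighbours -> discrete Laplace equation
        v[1:-1, 1:-1] = 0.25 * (v[:-2, 1:-1] + v[2:, 1:-1]
                                + v[1:-1, :-2] + v[1:-1, 2:])
    return v[:, nx // 2]

narrow = centerline_potential(width=5)
wide = centerline_potential(width=25)
# At the same depth below the electrode, the wider electrode gives a
# higher potential, because proportionally less of its flux fringes
# out sideways to the grounded surroundings.
print(narrow[10], wide[10])
```

If this toy model is anything to go by, the wider electrode raises the potential at a given depth simply because the structure looks "more like an infinite parallel plate" there, which might also explain why the position of VMAX shifts. But I'd still like to hear whether that reasoning holds up for the real device.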