I was wondering about the following problem: You are looking down on a single coil in a constant magnetic field B = 0.9 T which points directly into the screen. The dimensions of the coil change from a = 6 cm and b = 15 cm to a* = 20 cm and b* = 19 cm in t = 0.028 seconds. If the coil's resistance remains constant at 1.7 ohms, what is the magnitude of the induced current in amperes?

Now, I have the answer, and I was told how to get it. I used the formula

I = (B * ΔA)/(Δt * R)

What I was wondering is: what rule or law does this formula come from? I can't figure out how to derive it from any of the formulas given in this chapter. Thanks a lot.
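For reference, here's how I plugged the numbers in (a quick Python sketch, assuming the change in flux is just B times the change in the coil's rectangular area, with the coil lying in the plane of the screen):

```python
# Given values from the problem
B = 0.9                  # T, field into the screen
A_initial = 0.06 * 0.15  # m^2, initial area (a * b)
A_final = 0.20 * 0.19    # m^2, final area (a* * b*)
dt = 0.028               # s, time for the change
R = 1.7                  # ohms, coil resistance

dA = A_final - A_initial   # change in area, m^2
emf = B * dA / dt          # magnitude of the induced EMF, volts
I = emf / R                # induced current, amperes

print(round(I, 3))         # prints 0.548
```

So I get roughly 0.55 A, which matches the answer I was given.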