1. The problem statement, all variables and given/known data

One infinite line of charge lies along the x axis. A second line of charge is parallel to the first, at y = 0.4 m. The first line has charge density 4.8 * 10^(-6) C/m, the second -2.4 * 10^(-6) C/m. Find the net electric field at the following points: A) y = 0.2 m and B) y = 0.6 m.

2. Relevant equations

E = lambda / (2 * pi * epsilon_0 * r), where epsilon_0 = 8.85 * 10^(-12) C^2/(N*m^2).

3. The attempt at a solution

Well, since we are asked to find the net electric field, I assume we have to sum the two fields at the given point: find the field at the point due to the first line, then due to the second, and add them. Then do the same for the second point. However, the answer given in the back of the book (Young and Freedman, University Physics, 12th ed.) is 8.05 * 10^5 N/C, which is a single answer to a two-part question. I am at a total loss as to how they can get a single answer to a two-part question.

Anyhow, taking the first line (y = 0) as the origin, for the point at y = 0.2 m I did

(4.8 * 10^(-6)) / (2 * pi * 8.85 * 10^(-12) * 0.2) + (-2.4 * 10^(-6)) / (2 * pi * 8.85 * 10^(-12) * (-0.2)).

I tried all variations of this, including negating the 0.2 m in both terms and adding them. I did the same type of calculation for the y = 0.6 m point as well, except the distance to line one changes from 0.2 m to 0.6 m, while the distance to line two stays at 0.2 m.

Any help would be appreciated. Thank you!
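For reference, here is a minimal numeric sketch of the superposition approach described in the attempt. It assumes the usual sign convention: the field of each line has magnitude lambda / (2 * pi * epsilon_0 * r), points away from a positive line and toward a negative one, and +y is taken as positive. (It does not reproduce the book's 8.05 * 10^5 N/C at either point, which is just the superposition result under these assumptions, not a claim about the answer key.)

```python
import math

EPS0 = 8.85e-12  # C^2/(N*m^2), as given in the relevant equations


def line_field_y(lam, y_line, y_point):
    """Signed y-component of the field of an infinite line with linear
    density lam (C/m) at height y_line, evaluated at y_point.
    Positive lam -> field points away from the line."""
    d = y_point - y_line
    return lam / (2 * math.pi * EPS0 * abs(d)) * math.copysign(1.0, d)


def net_field(y):
    # line 1: +4.8e-6 C/m at y = 0; line 2: -2.4e-6 C/m at y = 0.4 m
    return line_field_y(4.8e-6, 0.0, y) + line_field_y(-2.4e-6, 0.4, y)


# A) between the lines both contributions point in +y, so they add;
# B) above both lines the contributions oppose, and the nearer
#    (negative) line wins, giving a net field in -y.
print(f"A) y = 0.2 m: {net_field(0.2):.3e} N/C")
print(f"B) y = 0.6 m: {net_field(0.6):.3e} N/C")
```

Under these assumptions the two parts give different answers (roughly 6.5 * 10^5 N/C in +y at y = 0.2 m, and about 7.2 * 10^4 N/C in -y at y = 0.6 m), so a single book answer for both parts does look odd.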