1. The problem statement, all variables and given/known data

The center of a 1.00 km diameter spherical pocket of oil is 1.00 km beneath the Earth's surface. Estimate by what percentage g directly above the pocket of oil would differ from the expected value of g for a uniform Earth. Assume the density of oil is 8.0×10^2 (800) kg/m^3.

2. Relevant equations

Law of Universal Gravitation: Fg = G*m1*m2/r^2
Density = mass/volume
Volume of a sphere: V = (4/3)πr^3
Gravitational field: g = G*m/r^2
g at the surface of the Earth: 9.80 m/s^2

3. The attempt at a solution

The first thing I did was visualize the scenario: we have a sphere of oil 1.00 km in diameter whose center is 1.00 km directly beneath the surface of the Earth. That means the top of the sphere is only 0.5 km (500 m) beneath the surface.

Next I found the relevant values for each sphere (Earth, oil).

Sphere of oil:
Radius: 500 m
Volume: 5.24×10^8 m^3
Density: 800 kg/m^3
Mass: 4.192×10^11 kg

Earth:
Radius: 6.380×10^6 m
Volume: 1.087×10^21 m^3
Density: 5540 kg/m^3
Mass: 5.97×10^24 kg

With all this information, I don't know what to do next to find g at the point directly above the sphere, or how to find the percentage by which it differs. I know that if I calculate the gravitational field of the Earth I'll get 9.80 m/s^2. If I do the same for the sphere of oil (at 500 m from its center) I get about 1.118×10^-4 m/s^2, which is a reasonably small field. I don't know whether I should subtract the two gravitational fields and then take the ratio of the result to 9.80 m/s^2 to get a percentage. Any help will be appreciated!
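On the question of subtracting the two fields: a common way to set this kind of problem up (a sketch of one possible approach, not necessarily the intended textbook solution) is superposition — treat the real situation as a uniform Earth plus a small sphere of "deficit mass" at the pocket, whose density is the oil density minus the average rock density. The field you lose at the surface is then G·ΔM/d², with d the 1.00 km distance from the pocket's center to the point directly above it. The value of G and the deficit-mass idea are assumptions introduced here, not stated in the original post; the densities and distances are from the post.

```python
import math

# Sketch of the percentage estimate via superposition:
# uniform Earth + a sphere of "deficit mass" (oil minus rock)
# centered 1.00 km below the observation point.
# The deficit-mass approach is an assumption, not from the post.

G = 6.674e-11          # gravitational constant, N m^2 / kg^2
g_surface = 9.80       # m/s^2, uniform-Earth surface value (from the post)
rho_rock = 5540.0      # kg/m^3, average Earth density (from the post)
rho_oil = 800.0        # kg/m^3, given
r_pocket = 500.0       # m, radius of the oil sphere
d = 1000.0             # m, pocket center to surface point

V = (4.0 / 3.0) * math.pi * r_pocket**3      # ~5.24e8 m^3, matches the post
delta_M = (rho_rock - rho_oil) * V           # mass "missing" relative to rock
delta_g = G * delta_M / d**2                 # field deficit at the surface
percent = delta_g / g_surface * 100.0

print(f"delta_g = {delta_g:.3e} m/s^2")
print(f"percent = {percent:.4f} %")
```

Note that this is slightly different from subtracting the oil sphere's field alone: the oil occupies space that would otherwise hold rock, so the perturbation depends on the density *difference*, not on the oil's mass by itself.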