Gbollag said:
I'm trying to determine how bright my projection image will be on a projected surface. <snip>
This type of question seems to pop up fairly regularly. The basic difficulty is that you don't have enough information to get an exact answer.
Start with the projector: 4500 lumens. That is a photometric unit (luminous flux), not a radiometric one. The radiometric counterpart is radiant flux, in watts, but your projector is not emitting 4500 W of light; you need to know the spectral distribution of the light in order to convert between the two.
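To put a rough number on that conversion, here's a minimal Python sketch. It assumes a flat spectral distribution between 450 and 650 nm and a crude Gaussian stand-in for the CIE photopic luminosity function V(λ); both are placeholders, and in practice you would use the projector's measured spectrum and the tabulated V(λ).

```python
import numpy as np

wavelengths = np.arange(380.0, 781.0, 1.0)  # nm, 1 nm steps

# Crude Gaussian stand-in for the CIE photopic curve V(lambda), peaking at 555 nm.
V = np.exp(-0.5 * ((wavelengths - 555.0) / 50.0) ** 2)

# Hypothetical projector spectrum: flat between 450 and 650 nm, zero elsewhere.
spd = np.where((wavelengths >= 450) & (wavelengths <= 650), 1.0, 0.0)

# Luminous efficacy of this source: 683 lm/W at 555 nm, weighted by V(lambda).
# The 1 nm bin width cancels out of the ratio.
efficacy = 683.0 * (V * spd).sum() / spd.sum()

lumens = 4500.0
watts = lumens / efficacy
print(f"{lumens:.0f} lm is roughly {watts:.1f} W of radiant flux "
      f"(efficacy {efficacy:.0f} lm/W for this assumed spectrum)")
```

A narrow green spectrum needs far fewer watts per lumen than a broad white one, which is exactly why the spectral distribution matters here.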
Then, you are collecting the light and projecting it onto a wall, so a better place to 'start' would be the illuminance on the wall (lux, i.e. lm/m^2); the radiometric counterpart is irradiance, in W/m^2. Again, unless you know the optical properties of the projector lens, you can't compute this exactly, but you can estimate it with simple geometry, since you know the size of the image. That gives you the lm/m^2 incident on the wall.
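As a back-of-envelope version of that estimate, here's a sketch assuming all 4500 lm land inside a hypothetical 2.0 m x 1.5 m image (the size is made up; plug in your own). Real projectors lose some light in the lens, so treat the result as an upper bound.

```python
luminous_flux = 4500.0                # lm, from the projector spec
image_width, image_height = 2.0, 1.5  # m, assumed image size
area = image_width * image_height     # m^2

illuminance = luminous_flux / area    # lm/m^2 = lux
print(f"Illuminance on the wall: {illuminance:.0f} lux")  # -> 1500 lux
```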
Now, the wall is scattering the light back toward your eye, and unless you know the reflectance properties of the wall, you can't estimate how much of it is scattered in your direction. A common simplification is to assume the wall is Lambertian: it scatters the incident flux into a hemisphere such that it looks equally bright from every viewing direction. Then you can calculate the luminance of the wall (radiometrically, its radiance, in W/(m^2*sr)), and, knowing how far from the wall you are and how large the pupil of your eye is, the flux of light entering your eye (lm). If your surface is not Lambertian, you need its bidirectional reflectance distribution function (BRDF), which describes how the reflectance varies as a function of incident angle, reflection angle, and wavelength.
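Here's a sketch of those last two steps for the Lambertian case, with assumed numbers for the wall reflectance (0.8), viewing distance (3 m), and pupil diameter (4 mm). For a Lambertian surface, luminance follows from illuminance as L = E*rho/pi.

```python
import math

illuminance = 1500.0  # lux on the wall, from the geometric estimate above
rho = 0.8             # assumed wall reflectance

# Lambertian surface: luminance L = E * rho / pi, in cd/m^2
luminance = illuminance * rho / math.pi
print(f"Wall luminance: {luminance:.0f} cd/m^2")

# Flux entering the eye from the whole image, small-angle approximation:
# flux = L * A_image * (A_pupil / d^2)
area_image = 2.0 * 1.5                           # m^2, same assumed image
d = 3.0                                          # m, assumed viewing distance
pupil_radius = 2.0e-3                            # m (4 mm pupil diameter)
solid_angle = math.pi * pupil_radius**2 / d**2   # sr subtended by the pupil
flux_eye = luminance * area_image * solid_angle  # lm
print(f"Flux entering the eye: {flux_eye * 1e3:.2f} mlm")
```

The resulting luminance (a few hundred cd/m^2) is in the ballpark of a bright computer monitor, which is a useful sanity check on the numbers.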
As you can guess, it's a lot easier to simply get a light meter and measure it.