So far there's been a lot of arm waving but no actual math. So here goes. Imagine a speck of rock, density 3000 kg/m^3. It's 1 mm across, with a volume of about 10^-9 m^3 and a mass of about 3 x 10^-6 kg. If it hits at Earth's orbital velocity, 30 km/sec (3 x 10^4 m/sec), then its kinetic energy is (1/2)mv^2 = (1/2) x (3 x 10^-6) x (3 x 10^4)^2 = 1.35 x 10^3 joules. Now, if it takes a second to flame out (typical for the small meteors I've seen), then it's radiating about 1350 joules/sec = 1350 watts. Crudely, auto headlights are around 50 watts and can be seen many miles away at night. So our meteor is putting out about 25 times that.

How do we convert our meteor's power to actual brightness? Incandescent lights are very inefficient. But then again, our meteor is also emitting due to incandescence, though more efficiently since it's a lot hotter; as a crude first pass, assume all the kinetic energy comes out as light. The Sun delivers 1361 watts per square meter at Earth, and its apparent magnitude is -26.7. Sirius (magnitude -1.46) is just about 25 magnitudes fainter, or 5 steps of 5 magnitudes, or one ten-billionth as bright, so its energy flux at Earth is about 1.36 x 10^-7 W/m^2. Now, if our meteor is 100 km away (slant range), it delivers 1350 W / (4 pi x (10^5 m)^2) = 1.1 x 10^-8 W/m^2 at the observer's location. That's about 1/13 as bright as Sirius, or about 2.8 magnitudes fainter, for an apparent magnitude of roughly 1.3. That's about as bright as the brighter stars of the Big Dipper.
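The whole chain of arithmetic can be checked with a short script. The inputs are the same assumptions used above (a 1 mm rocky cube, a 1-second flare, a 100 km slant range, isotropic emission, and all kinetic energy radiated as light), plus Sirius's apparent magnitude of -1.46:

```python
import math

# Meteoroid: 1 mm rocky cube (assumptions from the text)
density = 3000.0            # kg/m^3
volume = (1e-3) ** 3        # m^3, a 1 mm cube -> 1e-9 m^3
mass = density * volume     # kg -> 3e-6 kg

v = 30e3                    # m/s, Earth's orbital speed
ke = 0.5 * mass * v ** 2    # J -> 1.35e3 J

duration = 1.0              # s, time to flame out
power = ke / duration       # W, assuming all KE is radiated as light

# Flux at the observer: 100 km slant range, isotropic emission
r = 100e3                                      # m
meteor_flux = power / (4 * math.pi * r ** 2)   # W/m^2

# Sirius: ~25 magnitudes (a factor of 1e10) fainter than the Sun
sirius_flux = 1361.0 / 1e10    # W/m^2 -> 1.36e-7
m_sirius = -1.46               # apparent magnitude of Sirius

# Each factor of 10 in flux is 2.5 magnitudes
m_meteor = m_sirius + 2.5 * math.log10(sirius_flux / meteor_flux)

print(f"KE = {ke:.0f} J, power = {power:.0f} W")
print(f"meteor flux = {meteor_flux:.2e} W/m^2")
print(f"apparent magnitude ~ {m_meteor:.1f}")
```

Running it gives a kinetic energy of 1350 J and an apparent magnitude of about 1.3, matching the numbers above.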