I am trying to find the temperature of a star given its peak emission wavelength in micrometres, but I am not sure my unit conversion is right, so I don't know whether the answer is correct.

Star A has a maximum emission wavelength of 1 μm and a radius of 100 R_sun. What is its effective temperature and luminosity?

First the wavelength conversion, where 10⁹ is the nm → metre conversion factor: 1 μm = 1000 nm, so

λmax = 1000 nm / (10⁹ nm/m) = 1×10⁻⁶ m

Finding the temperature with Wien's displacement law:

T = 0.0029 K·m / λmax
T = 0.0029 K·m / (1×10⁻⁶ m)
T = 2.9×10³ K = 2900 K

(Dividing by 10⁻⁶ raises the exponent by 6, so the result is 2.9×10³ K, which written without scientific notation is just 2900 K.)

Given this result, and assuming it is correct, we find the luminosity from the surface area and the Stefan–Boltzmann law:

L = 4πR²σT⁴

where σ is the Stefan–Boltzmann constant. My confusion here is: how do I plug in the value of the radius, since I have it in solar radii?
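As a sanity check, the two steps (Wien's law, then the Stefan–Boltzmann law with the radius converted from solar radii to metres) can be sketched in Python. The constant values below (Wien's constant b ≈ 2.9×10⁻³ m·K, σ ≈ 5.67×10⁻⁸ W m⁻² K⁻⁴, R_sun ≈ 6.957×10⁸ m) are assumed, since the question does not state σ or R_sun explicitly:

```python
import math

# Assumed physical constants (not given in the question)
b = 2.9e-3        # Wien's displacement constant, m*K
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
R_sun = 6.957e8   # solar radius in metres

# Step 1: wavelength in metres, then Wien's law T = b / lambda_max
lam_max = 1e-6              # 1 micrometre = 1e-6 m
T = b / lam_max             # ~2900 K

# Step 2: convert 100 R_sun to metres, then L = 4*pi*R^2*sigma*T^4
R = 100 * R_sun
L = 4 * math.pi * R**2 * sigma * T**4

print(f"T = {T:,.0f} K")    # f-string with ',' gives 2,900 rather than 2.9e3
print(f"L = {L:.3e} W")     # luminosity stays in scientific notation
```

The `f"{T:,.0f}"` format spec is one way to print the temperature as a plain number with a thousands separator instead of scientific notation, and the multiplication `100 * R_sun` is exactly how the solar-radius issue is handled: convert R to metres before substituting into L = 4πR²σT⁴, so all quantities are in SI units.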