I want to know the relationship between watts and hertz in radio-wave frequency. I have searched and searched for it and found no satisfying results so far.

To be more specific: suppose we have an RF generator, which of course runs on AC power. We set it to 200 watts, and it gives us, for example, a radio-wave frequency of 14 MHz. What I want to know is: is there a definite rule relating watts to hertz, and what is the relation between them? If I increase the power to, say, 400 watts, how can I calculate the radio frequency (in hertz) that will be produced?

Note: there's something I don't understand. When I asked an engineer about this, he told me that if a certain wattage (X) produces a certain radio-wave frequency in hertz (Y) in one electrical appliance, the same wattage would not produce the same frequency in another appliance. This is really confusing for me, as I am only 16 and I'm kind of new to physics. So please only give me a response if you're 100% sure of it, and suggest related topics in the area of "radio-wave frequency and its relation to watts and electrical power" that I can search for and read about.