Cutting down rain forest can lead to desertification, and this is viewed as a disruption to the world's climate. What I'm wondering is whether any and all desert is basically bad, or whether some amount of desert somehow contributes to the health of the planet. In other words, if we exclude man-made deserts from consideration, would the natural deserts that exist be considered to have a positive effect on the earth somehow, or are they, too, symptoms of ill health, so to speak? The reason I ask is that you hear about environmentalists trying to protect deserts from all kinds of human activities that might upset their balance, but would the whole earth be better off if we tried to reclaim deserts and turn them into forest? Do the Sahara and Gobi deserts, for example, perform some important climatic function as is?