No, they don't. The net oxygen production of a rainforest is, on average, zero. Trees produce carbon dioxide at night, when they are not photosynthesizing. They lock up oxygen and carbon into sugars, yes, but when they die, they rot, and release carbon dioxide. Forests can remove carbon dioxide in the longer term only indirectly, by locking carbon away as coal or peat and leaving the corresponding oxygen behind in the atmosphere. Ironically, that's where a lot of the human production of carbon dioxide comes from: we dig it up and burn it again, using up the same amount of oxygen.
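To see why the books balance, it helps to write the chemistry down. Photosynthesis and the processes that undo it (respiration, rotting, burning) are, in net terms, the same reaction run in opposite directions, so every molecule of oxygen a forest gives out is reclaimed the moment its carbon goes back into the air. A sketch, ignoring all the intermediate biochemistry:

```latex
% Photosynthesis locks carbon dioxide and water into sugar and releases oxygen;
% respiration, rotting and burning run the same reaction in reverse.
\begin{align*}
  6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} &\xrightarrow{\text{light}} \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} \\
  \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} &\longrightarrow 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O}
\end{align*}
```

Only carbon that never gets to run the second line, because it is buried as coal or peat, leaves a lasting surplus of oxygen behind.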
If the theory that oil is the remains of plants from the Carboniferous period is true, then our cars are burning up carbon that was once laid down by plants. Even if an alternative theory, growing in popularity, is true and oil was produced by bacteria, the problem remains the same. Either way, if you burn a rainforest you add a one-off surplus of carbon dioxide to the atmosphere, but you do not also reduce the Earth's capacity to generate new oxygen. If you want to reduce atmospheric carbon dioxide permanently, and not just cut short-term emissions, the best bet is to build up a big library at home, locking carbon into paper, or to put plenty of asphalt on roads. These don't sound like 'green' activities, but they are. You can cycle on the roads if it makes you feel better.
Another important atmospheric component is nitrogen. It is a lot easier to keep track of the nitrogen budget. Organisms, plants especially, as every gardener knows, need nitrogen for growth, but they can't just absorb it from the air. It has to be 'fixed', that is, combined into compounds that organisms can use. Some of the fixed nitrogen is produced as nitric acid, which rains down after thunderstorms, but most nitrogen fixation is biological. Many simple lifeforms 'fix' nitrogen, using it as a component of their own amino-acids. These amino-acids can then be used in everybody else's proteins.
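Put as bare chemistry, with the enzymes and the energy bookkeeping left out, the two routes look something like this (the biological line is only the net effect of nitrogenase, not its actual mechanism):

```latex
% Lightning: nitrogen and oxygen combine; the oxides end up as dilute nitric acid in rain
\begin{align*}
  \mathrm{N_2} + \mathrm{O_2} &\longrightarrow 2\,\mathrm{NO}
      \longrightarrow \dots \longrightarrow \mathrm{HNO_3} \\
% Biological fixation (nitrogenase, ATP cost omitted): nitrogen reduced to ammonia,
% the raw material for amino acids
  \mathrm{N_2} + 8\,\mathrm{H^+} + 8\,e^- &\longrightarrow 2\,\mathrm{NH_3} + \mathrm{H_2}
\end{align*}
```

The ammonia, or rather the ammonium and nitrate that soil microbes make from it, is what plants actually take up.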
The Earth's oceans contain a huge quantity of water, about a third of a billion cubic miles (1.3 billion cubic km). How much water there was in the earliest stages of the Earth's evolution, and how it was distributed over the surface of the globe, we have little idea, but the existence of fossils from about 3.3 billion years ago shows that there must have been water around at that time, probably quite a lot. As we've already explained, the Earth, along with the rest of the solar system, Sun included, condensed from a vast cloud of gas and dust, whose main constituent was hydrogen. Hydrogen combines readily with oxygen to form water, but it also combines with carbon to form methane and with nitrogen to form ammonia.
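For the suspicious reader, the unit conversion is easy to check; 'a third of a billion' is itself a round figure, so only rough agreement is expected. A minimal sketch:

```python
# Rough check that "a third of a billion cubic miles" matches "1.3 billion cubic km".
MILE_IN_KM = 1.609344                    # exact definition of the international mile
CUBIC_MILE_IN_KM3 = MILE_IN_KM ** 3      # about 4.17 cubic km per cubic mile

ocean_volume_cubic_miles = 1e9 / 3       # a third of a billion cubic miles
ocean_volume_km3 = ocean_volume_cubic_miles * CUBIC_MILE_IN_KM3

print(f"{ocean_volume_km3:.2e} cubic km")  # about 1.39e+09, i.e. 1.3-1.4 billion
```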
The primitive Earth's atmosphere contained a lot of hydrogen and a fair quantity of water vapour, but initially the planet was too hot for liquid water to exist. As the planet slowly cooled, its surface passed a critical temperature, the boiling point of water. That temperature was probably not exactly the same as the one at which water boils now; in fact even today it's not one inflexible temperature, because the boiling point of water depends on pressure and other circumstances. Nor was it just a simple matter of the atmosphere's getting colder: its composition also changed because the Earth was spouting out gases from its interior through volcanic activity.
A crucial factor was the influence of sunlight, which split some of the atmospheric water vapour into oxygen and hydrogen. The hydrogen escaped from the Earth's relatively weak gravitational field, so the proportion of oxygen got bigger while that of water vapour got smaller. The effect of this was to increase the temperature at which the water vapour could condense. So as the temperature of the atmosphere slowly fell, the temperature at which water vapour would condense rose to meet it. Eventually the atmosphere going down passed the boiling point of water going up, and water vapour began to condense into liquid water ... and to fall as rain.
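That splitting, photodissociation, actually proceeds through short-lived fragments of water molecules, but its net effect is simple enough to write down:

```latex
% Ultraviolet sunlight splits water vapour; the lightweight hydrogen then leaks off into space
\begin{align*}
  2\,\mathrm{H_2O} \xrightarrow{\;\text{UV}\;} 2\,\mathrm{H_2} + \mathrm{O_2}
\end{align*}
```

Less vapour aloft, slightly more oxygen, and eventually: rain.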
It must have absolutely bucketed down.
When the rain hit the hot rocks beneath, it promptly evaporated back into vapour, but as it did so it cooled the rocks. Heat and temperature are not the same. Heat is equivalent to energy: when you heat something, you input extra energy. Temperature is one of the ways in which that energy can be expressed: it is the vibration of molecules. The faster those vibrations are, the higher the temperature. Ordinarily, the temperature of a substance goes up if you heat it: all the extra heat is expressed as more vibration of the molecules. However, at transitions from solid to liquid, or liquid to vapour or gas, the extra heat goes into changing the state of the substance, not into making its temperature higher. So you can throw in a lot of heat and, instead of the stuff getting hotter, it changes state: a so-called phase transition. Conversely, when a substance cools through a phase transition, it gives off a lot of heat. So the cooling water vapour put more heat back into the upper atmosphere, from which it could be radiated away into space and lost. When the hot rocks turned the water back into vapour, the rocks got a lot cooler very suddenly. In a geologically short space of time, the rocks had cooled below the boiling point of water, and now the falling rain no longer got turned back into vapour, at least, not much of it did.
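To put a hedged number on how good a coolant flash-boiling rain is, here is a back-of-the-envelope sum, using round textbook values for the latent heat of water and the specific heat of typical rock (both approximate, and both vary):

```python
# Back-of-the-envelope: how much hot rock can one kilogram of evaporating rain cool down?
LATENT_HEAT_WATER = 2.26e6    # J per kg to turn liquid water into steam (round value)
SPECIFIC_HEAT_ROCK = 840.0    # J per kg per degree C for typical rock (round value)

heat_taken_from_rock = LATENT_HEAT_WATER                   # joules removed by 1 kg of rain
rock_cooled_by_100_degrees = heat_taken_from_rock / (SPECIFIC_HEAT_ROCK * 100.0)

print(f"1 kg of rain flashing to steam cools roughly "
      f"{rock_cooled_by_100_degrees:.0f} kg of rock by 100 degrees C")  # about 27 kg
```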
It may well have rained for a million years. So it's not surprising that Rincewind noticed that it was a bit wet.
Thanks to gravity, water goes downhill, so all that rain accumulated in the lowest depressions in the Earth's irregular surface. Because the atmosphere had a lot of carbon dioxide in it, those early oceans contained a lot of dissolved carbon dioxide, making the water slightly acidic. There may have been hydrochloric and sulphuric acids too. The acid ate away at the surface rocks, causing minerals to dissolve in the oceans; the sea began to get salty.
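The acid in question is mostly carbonic acid, the same mild acid that makes rainwater and fizzy drinks slightly sour today:

```latex
% Dissolved carbon dioxide forms carbonic acid, which partly dissociates into ions
\begin{align*}
  \mathrm{CO_2} + \mathrm{H_2O} \rightleftharpoons \mathrm{H_2CO_3}
      \rightleftharpoons \mathrm{H^+} + \mathrm{HCO_3^-}
\end{align*}
```

Those hydrogen ions are what did the eating.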
At first the amount of oxygen in the atmosphere increased slowly, because the effect of incoming sunlight isn't particularly dramatic. But now life got in on the act, bubbling off oxygen as a byproduct of photosynthesis. The oxygen combined with any remaining hydrogen in the atmosphere, whether on its own or combined inside methane, to produce more water. This also fell as rain, and increased the amount of ocean, leading to more bacteria, more oxygen, and so it continued until the available hydrogen pretty much ran out.
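The mopping-up reactions, again as net chemistry only, were:

```latex
% Free hydrogen burns to water; hydrogen locked up in methane gives water plus carbon dioxide
\begin{align*}
  2\,\mathrm{H_2} + \mathrm{O_2} &\longrightarrow 2\,\mathrm{H_2O} \\
  \mathrm{CH_4} + 2\,\mathrm{O_2} &\longrightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O}
\end{align*}
```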
It used to be thought that the oceans just kept dissolving the rocks of the continents, accumulating more and more minerals, getting saltier and saltier until the amount of salt reached its current value of about 3.5%. The evidence offered for this was the percentage of salt in the blood of fishes and mammals, which is about 1%: the idea being that blood preserves the lower salinity the ocean had when our ancestors left it, and that the sea has gone on getting saltier ever since. In effect, it was believed that fish and mammal blood was 'fossilized' ocean. Today we are still often told that we have ancient seas in our blood. This is probably wrong, but the argument is far from settled. It is true that our blood is salty, and so is the sea, but there are plenty of ways for biology to adjust salt content. That 1% may just be whatever level of salt makes best sense for the creature whose blood it is. Salt, or more properly the sodium and chloride ions into which it dissociates, has many biological uses: our nervous systems, for instance, wouldn't work without them. So while it is entirely believable that evolution took advantage of the existence of salt in the sea, it need not be stuck with the same proportion. On the other hand, there is good reason to think that cells first evolved as tiny free-floating organisms in the oceans, and those early cells weren't sophisticated enough to maintain a difference in salt concentration between their insides and their outsides. They may well have settled on the same concentration as the surrounding sea because that was all they could initially manage, and having done so, they were rather stuck with it.