The following is some background on radiocarbon dating. Keep in mind that, for all practical purposes, the rate of radioactive decay is constant and cannot be increased or decreased.
The radiocarbon clock is based on the known decay rate of the unstable isotope of carbon, Carbon-14, which is formed when cosmic rays interact with nitrogen in the atmosphere. The radiocarbon combines with oxygen to form a radioactive form of carbon dioxide. In today's atmosphere roughly one atom of Carbon-14 exists for every million million atoms of the most abundant isotope of the element, Carbon-12.
Radiocarbon enters the food chain when it is absorbed by plants during photosynthesis. The Carbon-14 concentration of living tissue is fixed as the tissue forms. Thereafter the cells and bone carbonate in animals are renewed slowly by metabolic processes, while radioactive decay continuously lowers the fixed Carbon-14 from its initial level. The net result is that the Carbon-14 content lags the atmosphere by up to a few decades. In growing trees, cell formation happens only in a narrow zone under the bark, so the innermost wood may already be centuries old before the tree dies. In some, but not all, species there is a clear annual ring boundary. This forms the basis of dendrochronology and explains why wood is so widely employed for radiocarbon calibration studies. The situation is different, and dating is more complex, for organisms living in the ocean, because carbon dioxide exchanges only slowly between the atmosphere and the ocean.
When a living organism dies, the carbon exchange stops. Hence, by measuring the residual Carbon-14 concentration in organic samples, and provided they have not been contaminated by younger material (e.g. via bacterial action, soil organic acids) or by older material (e.g. geologic calcium carbonate), it is possible to calculate the time elapsed since the material was originally formed. It takes 5,730 years for half of the radiocarbon originally present to be lost by decay.
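The relationship above can be sketched numerically. This is a minimal illustration of the standard exponential-decay formula t = -ln(f) / lambda, with lambda derived from the 5,730-year half-life quoted above; the function name and the sample fractions are my own choices, not from the text.

```python
import math

HALF_LIFE_C14 = 5730.0  # years, as quoted above
DECAY_CONST = math.log(2) / HALF_LIFE_C14  # lambda, per year

def radiocarbon_age(fraction_remaining: float) -> float:
    """Years elapsed, given the fraction of the original C-14 still present."""
    if not 0.0 < fraction_remaining <= 1.0:
        raise ValueError("fraction must be in (0, 1]")
    return -math.log(fraction_remaining) / DECAY_CONST

# Half of the original C-14 remaining corresponds to one half-life:
print(round(radiocarbon_age(0.5)))   # 5730
# A quarter remaining corresponds to two half-lives:
print(round(radiocarbon_age(0.25)))  # 11460
```

In practice, of course, laboratories also correct for contamination and calibrate against tree-ring records rather than applying the raw formula.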
The use of radiocarbon for dating began some 50 years ago and was based on the detection of the decay of the isotope: nuclear particle counting techniques determine the Carbon-14 activity in a sample. In the last 20 years radiocarbon dating by accelerator mass spectrometry (AMS) has become the preferred method. AMS counts the atoms of the different carbon isotopes directly and is far more sensitive than the decay counting method; it can be used on samples as small as 50 mg.
Looking at LtCmdrLore's comment:
IF there was a canopy of water covering the earth interfering with the radiation... wouldn't it REDUCE the amount of radiation reaching the surface, Therefore making the carbon atoms degradate at a SLOWER pace instead of faster? And that therefore would make things seem YOUNGER from carbon dating not older....Right?
If a "canopy of water" covered the earth, yes, this would reduce the amount of radiation available to interact with the atmospheric nitrogen - so you would have less Carbon-14 formed, and material dated from that time would appear to be older by virtue of the lower Carbon-14 content. BUT this effect would only last for as long as the "canopy of water" existed. This would mean that the available records from trees (dendrochronology) would show a period when there was a lower Carbon-14 abundance relative to the period before and after the "canopy" event. To my knowledge, such an event has not been reported. By the way, it is individual tree rings that are measured, so a year-by-year Carbon-14 record can be built up.
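The size of the effect described above is easy to quantify. As a sketch: if the atmosphere at the time of death held some fraction of the standard Carbon-14 concentration, the sample would look older by mean-life times ln(1/fraction). The 90% figure below is a purely hypothetical illustration, not a claim about any real event.

```python
import math

HALF_LIFE_C14 = 5730.0  # years
MEAN_LIFE = HALF_LIFE_C14 / math.log(2)  # about 8,267 years

def apparent_age_offset(initial_ratio: float) -> float:
    """Extra apparent age (years) if the atmosphere at death held
    initial_ratio times the assumed standard C-14 concentration."""
    return MEAN_LIFE * math.log(1.0 / initial_ratio)

# Hypothetical: if the initial C-14 level were only 90% of the standard,
# the sample would date roughly 870 years too old:
print(round(apparent_age_offset(0.90)))  # 871
# With the standard initial level, there is no offset:
print(round(apparent_age_offset(1.0)))   # 0
```

This is exactly why a real canopy-style depression in Carbon-14 production would leave a visible anomaly in the tree-ring record, as argued above.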