Naturally occurring radionuclides (NOR) such as 238U, 232Th and their decay products are abundant in the environment. Elevated environmental NOR levels can arise from various anthropogenic activities that exploit raw materials for commercial purposes (e.g. metal mining and smelting, the phosphate industry).
The ability to estimate the activity concentration of NOR in biosphere components is a key step in evaluating the long-term impacts of these contaminants on human health and the environment. Radiological assessment models include modules that predict NOR accumulation in soil and vegetation. These modules are often formulated in terms of simple parameters such as the equilibrium solid-liquid distribution coefficient (Kd) and the soil-to-plant transfer factor (TF).

Although simple to measure and to use in radiological impact assessments, Kd and TF are highly variable. This variability can be attributed to several sources, including soil and plant properties, radionuclide speciation, time after contamination, land management practices and the experimental procedure used to determine these parameters.

Several approaches have been proposed to reduce the variability in the Kd and TF data, with varying degrees of success. One approach is to group the available data by soil texture (i.e. sand, silt and clay), organic matter content and crop type. Another is to relate Kd to physicochemical parameters that serve as surrogates for the underlying mechanisms governing radionuclide retention in soils (e.g. pH, cation exchange capacity, organic matter content). This cofactor approach has shown greater potential to reduce the variability in the Kd data for some radionuclides.
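As an aside, the two parameters discussed above are both simple concentration ratios, which is what makes them easy to measure yet sensitive to the conditions under which the measurement is made. The sketch below illustrates the conventional definitions (Kd as the ratio of the activity concentration sorbed on the solid phase to that remaining in solution, in L kg-1; TF as the ratio of plant to soil activity concentration on a dry-mass basis). The function names and the numerical values are illustrative assumptions, not data from this work.

```python
def kd(c_solid_bq_per_kg: float, c_liquid_bq_per_l: float) -> float:
    """Equilibrium solid-liquid distribution coefficient (L/kg).

    Ratio of the activity concentration sorbed on the soil solid
    phase (Bq/kg) to that in the soil solution (Bq/L) at equilibrium.
    """
    return c_solid_bq_per_kg / c_liquid_bq_per_l


def tf(c_plant_bq_per_kg: float, c_soil_bq_per_kg: float) -> float:
    """Soil-to-plant transfer factor (dimensionless, dry-mass basis).

    Ratio of the activity concentration in the plant (Bq/kg dry mass)
    to that in the root-zone soil (Bq/kg dry mass).
    """
    return c_plant_bq_per_kg / c_soil_bq_per_kg


# Hypothetical 238U measurements (values invented for illustration only)
print(kd(500.0, 2.5))  # -> 200.0 L/kg
print(tf(0.8, 40.0))   # -> 0.02
```

Because both quantities span orders of magnitude across soils and crops, compilations typically report them on a logarithmic scale, which is also why grouping by soil texture or regressing log Kd on cofactors such as pH can narrow the spread.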