Measurement of the Bioavailable Fraction

Only a tiny fraction of the total content of the macronutrients in soil is available for absorption by plant roots, and the soil-to-soil variability in available macronutrients is far greater than the variability in total concentrations. This is less true, to some extent, for the micronutrients, where analyses of total copper or zinc, for example, can be used to assess the likelihood of plant deficiencies or toxicities occurring. In practice, however, the assessment of plant-available levels using soil extractants is more useful. These are generally dilute (< 1 M) solutions of mineral or organic acids, simple salts, or organic and inorganic complexing agents (Jones and Benton, 1990; Peck, 1990).

Extractants are used in two ways. Historically the earliest, and still the most common, use is to identify soils where a yield response may be expected if a micronutrient fertiliser is applied, or to confirm a micronutrient deficiency following field observations of growth abnormalities. Thus, Junus and Cox (1987) began their paper dealing with a zinc soil test by saying 'A soil test is desirable to determine whether or not fertilisation with Zn is required. Calibration of a Zn soil test would best be based on yield response to applied Zn'. The second use is to attempt to predict crop uptake of trace elements; this is especially important where plants are growing on soils contaminated by elements such as lead or arsenic which are toxic to animals. A third use, already described, is in the chemical fractionation of soil in order to identify and evaluate the major chemical pools of micronutrients and toxic metals (see Table 1-6). The same extractants are used for all three purposes and none is based on any firm theoretical foundation. Yet, despite the empiricism involved in their use, they have proved of great practical assistance both in advisory work and in laboratory studies. There are many published reports on the use of extractants to diagnose micronutrient-responsive soils. The reader is advised to consult Shuman (1991) and Sims and Johnson (1991) for recent reviews and general details of methods. For specific information alluded to in the following paragraphs the reader should refer either to the papers quoted or to Needham (1983).

Soil extractants have proved least successful for those elements whose oxidation state is easily altered in the soil environment. There is no reliable soil test for iron, and a determination of soil pH combined with a knowledge of the susceptibility of a particular crop to either iron deficiency or toxicity is generally more helpful. In the case of manganese, deficiencies are possible where soil pH is greater than 6.0 and humus levels are moderate to high. However, a determination of 'active' or easily reducible manganese has proved valuable on some occasions. This is done by adding a mild reducing agent to an extractant which accesses the readily exchangeable soil fraction. Hydroquinone (0.2%) in neutral molar ammonium acetate is widely used, with <25 mg Mn per kg taken as the critical level, but the widely differing susceptibilities of crops to manganese deficiency make the method at best only a rough guide. Where micronutrients are less liable to changes in their oxidation state, or exist in only one valency, extractants have proved more helpful.
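By way of illustration only, the two pieces of information above (the pH and humus risk factors, and the 'active' manganese critical level) could be combined into a simple advisory screen, as in the sketch below. The function name, the use of a yes/no humus flag and the decision to require both conditions together are assumptions made for this example, not part of any published method.

```python
# Illustrative sketch only: screen for possible manganese deficiency using the
# risk factors and the 'active' Mn critical level (25 mg Mn/kg) quoted in the text.
def mn_deficiency_risk(ph: float, humus_moderate_or_high: bool,
                       active_mn_mg_per_kg: float) -> bool:
    """True if the soil matches the deficiency conditions described above."""
    risk_factors = ph > 6.0 and humus_moderate_or_high   # pH/humus conditions
    low_active_mn = active_mn_mg_per_kg < 25.0           # 'active' Mn below the critical level
    return risk_factors and low_active_mn

print(mn_deficiency_risk(6.5, True, 18.0))   # -> True
print(mn_deficiency_risk(5.5, True, 18.0))   # -> False (pH below 6.0)
```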

The strong association of copper and zinc with soil humus has stimulated wide interest in solutions of chelating agents (e.g., Viro, 1955; Tills and Alloway, 1983). These are supposed to act by mimicking the activity of plant roots in decomplexing metals from the organic matter, although the quantities of trace metals which they liberate are far higher than those mobilised by roots. In England and Wales the National Agricultural Advisory Service presently uses disodium EDTA (ethylenediaminetetraacetic acid) for copper: soil concentrations in the range 0-1.6 mg Cu L⁻¹ indicate that deficiencies are likely, while concentrations between 1.7 and 2.4 mg L⁻¹ suggest that deficiencies are possible. In the USA a mixed reagent containing 0.005 M DTPA (diethylenetriaminepentaacetic acid) with 0.1 M TEA (triethanolamine) and 0.01 M calcium chloride (CaCl₂) at pH 7.3 has been widely adopted (Lindsay and Norvell, 1978). Copper deficiencies are associated with soil levels <0.2 mg kg⁻¹ and zinc deficiencies with concentrations <0.5 mg kg⁻¹, while 0.5-1 mg Zn kg⁻¹ is regarded as marginal. The same reagent is used to test for manganese, for which the critical level is 1 mg kg⁻¹. Trierweiler and Lindsay (1969) have proposed a mixture of EDTA with ammonium carbonate as a test for zinc and established 0.4 mg Zn kg⁻¹ as the critical soil level.
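As a rough illustration of how the DTPA critical levels quoted above translate into advisory categories, the sketch below classifies a soil-test result against them. The function name and the 'marginal'/'adequate' labels are illustrative assumptions, not terminology from Lindsay and Norvell (1978).

```python
# Illustrative only: interpret DTPA-extractable micronutrient levels (mg/kg soil)
# against the critical values quoted in the text (Lindsay and Norvell, 1978).
DTPA_CRITICAL = {
    "Cu": 0.2,   # deficiency associated with levels below 0.2 mg Cu/kg
    "Mn": 1.0,   # critical level of 1 mg Mn/kg
}

def interpret_dtpa(element: str, mg_per_kg: float) -> str:
    """Return an advisory category for a DTPA soil-test value."""
    if element == "Zn":
        if mg_per_kg < 0.5:
            return "deficient"
        if mg_per_kg <= 1.0:
            return "marginal"          # 0.5-1 mg Zn/kg regarded as marginal
        return "adequate"
    critical = DTPA_CRITICAL[element]
    return "deficient" if mg_per_kg < critical else "adequate"

print(interpret_dtpa("Zn", 0.7))   # -> marginal
print(interpret_dtpa("Cu", 0.15))  # -> deficient
```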

Although cobalt is involved in nitrogen fixation in legumes, there is no evidence that low soil cobalt levels are limiting crop yields anywhere. However, the widespread occurrence of cobalt deficiency in cattle and sheep (pining) has stimulated a search for a soil extractant (Mitchell, 1974). Both in Britain and the USA it is accepted that herbage cobalt contents of less than 0.07-0.08 mg kg⁻¹ dry matter indicate the likelihood of pining. The most useful extractant appears to be 2.5% acetic acid: soil concentrations <0.25 mg Co kg⁻¹ imply the probability of low-cobalt herbage, and 0.25-0.35 mg Co kg⁻¹ is marginal. Molybdenum(IV) has a charge/radius ratio of 57, which indicates its tendency to accumulate in weathering residues together with iron and manganese, and it is interesting that the most widely used extractants involve low-pH oxalate solutions which dissolve this fraction. The results need interpreting carefully, particularly with regard to soil pH: as a rough guide, for soils of pH 5 an oxalate-extractable value of 0.2 mg Mo kg⁻¹ indicates the possibility of plant deficiencies, while at soil pH 6.5 the critical value falls to 0.05 mg Mo kg⁻¹, since the availability of molybdenum is pH dependent. Total molybdenum concentrations can be used to identify soils where cattle may suffer from molybdenosis. Most normal soils tend to have 1-3 mg Mo kg⁻¹, and a value >20 mg kg⁻¹ indicates the need for veterinary investigation.
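The pH dependence of the oxalate-extractable molybdenum guideline can be illustrated by interpolating between the two quoted guide values (0.2 mg Mo kg⁻¹ at pH 5 and 0.05 mg Mo kg⁻¹ at pH 6.5). The linear form used in the sketch below is an assumption for illustration only, not a published calibration.

```python
# Illustrative sketch: pH-dependent critical level for oxalate-extractable Mo.
# Linear interpolation between the two guide values quoted in the text is an
# assumption made here purely for illustration.
def critical_mo(ph: float) -> float:
    """Approximate critical oxalate-extractable Mo (mg/kg) for a given soil pH."""
    ph = min(max(ph, 5.0), 6.5)                    # clamp to the quoted pH range
    return 0.2 + (ph - 5.0) * (0.05 - 0.2) / (6.5 - 5.0)

def mo_deficiency_possible(extractable_mo: float, ph: float) -> bool:
    return extractable_mo < critical_mo(ph)

print(round(critical_mo(5.75), 3))            # midpoint of the quoted range -> 0.125
print(mo_deficiency_possible(0.1, 5.0))       # True: below 0.2 mg/kg at pH 5
```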

Finally, the most popular extractant for boron is boiling water (e.g., McGeehan et al., 1989), and it is agreed in many countries that concentrations below 1 mg B kg⁻¹ soil imply the likelihood of deficiencies in susceptible crops. In England and Wales, the National Agricultural Advisory Service classifies boron levels as follows: 0-0.5 mg L⁻¹ indicates severe deficiency, 0.6-1.0 mg L⁻¹ implies a likelihood of deficiency, and values >4.1 mg L⁻¹ indicate the possibility of boron toxicity.
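A compact way to summarise these advisory bands is as a lookup over concentration ranges, as in the sketch below. The band labels follow the text, while values between 1.0 and 4.1 mg L⁻¹, which the text does not classify, are returned as unclassified rather than assumed to be satisfactory.

```python
# Illustrative sketch of the hot-water-soluble boron bands quoted in the text
# (National Agricultural Advisory Service, England and Wales). Values in mg B/L.
def classify_boron(mg_per_litre: float) -> str:
    if mg_per_litre <= 0.5:
        return "severe deficiency"
    if mg_per_litre <= 1.0:
        return "deficiency likely"
    if mg_per_litre >= 4.1:
        return "possible boron toxicity"
    return "not classified in the quoted bands"   # 1.1-4.0 mg/L not given in the text

print(classify_boron(0.3))   # -> severe deficiency
print(classify_boron(5.2))   # -> possible boron toxicity
```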
