Farmers rely on irrigation, but water is a limited resource and little is known about how to manage it best. NIWA has developed a model combining hydrology and soil science to show how much to water, and when, to get the best results.
Key findings
- Increasing demands – now and in the future – threaten the sustainability of irrigation for agriculture.
- The TopNet hydrology model uses ‘scenarios’ to enable experimentation with water supply and demand.
- With the monitor-match-manage, or 3M, scenario we were able to conserve water and improve the effectiveness of irrigation.
The problem
For farmers, irrigation provides insurance against weather failure. It also improves the marketability and market value of produce. However, the reliability of the water supply can substantially affect the likelihood of a crop's success. This reliability is influenced not only by the availability of water within a catchment but also by the demands for its use.
Many of the current demands on water resources result directly from changes in agricultural activities. For example, converting a farm from sheep to dairy, or increasing herd size to increase returns, can significantly alter irrigation requirements. In addition to these demands, predicted changes in rainfall associated with climate change may influence the sustainability of irrigation. Predictions of warmer conditions and reduced rainfall during future growing seasons suggest that meeting the demands for irrigation will become even more challenging.
In this project, we addressed these issues using a modelling approach which combines an understanding of hydrology and soil science.
The solution
For efficient and effective irrigation, farmers need to know when and how much water to apply to a field. This decision is often based simply on the availability of water for irrigation. To calculate the water available in various locations – soil, lakes, streams – over time, we used the TopNet hydrology model. In simple terms, TopNet is a catchment water-balance calculator. Using this computer-based model we can test scenarios based on specific ‘variables’, such as size of reservoir, amount and timing of rainfall, size of the catchment, amount and timing of irrigation, and so forth. Our model allows us to calculate how these variables will interact over time.
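The idea of a catchment water-balance calculator can be sketched as a simple daily bookkeeping exercise. The snippet below is a minimal illustration, not TopNet itself: the function name, the daily time step, and all parameter values are assumptions made for the example.

```python
# Minimal daily water-balance sketch (illustrative only, not TopNet).
# Tracks a reservoir and the soilwater in an irrigated field; any soil
# water above capacity is "lost" to streams and groundwater.
# All quantities are in mm of water over the field.

def step(reservoir, soil, rain, evapotranspiration, irrigation,
         soil_capacity, reservoir_capacity):
    """Advance the balance by one day; returns (reservoir, soil, losses)."""
    # Rain tops up the reservoir; irrigation draws it down.
    reservoir = min(reservoir + rain - irrigation, reservoir_capacity)
    # The field gains rain and irrigation, loses evapotranspiration.
    soil = soil + rain + irrigation - evapotranspiration
    # Soilwater beyond capacity drains away and is lost to the crop.
    losses = max(soil - soil_capacity, 0.0)
    soil = min(max(soil, 0.0), soil_capacity)
    return max(reservoir, 0.0), soil, losses

res, soil, lost = step(reservoir=100.0, soil=40.0, rain=5.0,
                       evapotranspiration=3.0, irrigation=10.0,
                       soil_capacity=50.0, reservoir_capacity=200.0)
# → res=95.0, soil=50.0, lost=2.0 (2 mm drained past field capacity)
```

Running the step repeatedly over a season, with rainfall and irrigation series as inputs, is the kind of continuous simulation the scenarios below rely on.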
We tested two scenarios in which we continuously simulated the water storage status of an irrigation reservoir, the soilwater (moisture) in the irrigated fields, and the fate of the irrigation water supplied. We set our two scenarios in a South Island agricultural catchment and ran the model over four irrigation seasons, 2000 to 2004.
In Scenario 1 (‘unmanaged’ irrigation), water is applied in a fixed cycle and in a fixed amount. This is typical of flood irrigation, where water is supplied to the farmer every two or three weeks through irrigation races. This scenario investigates the uncertainty in the timing and amount of irrigation that arises from variable water availability. In Scenario 2 (‘managed’ irrigation), we introduced a management variable – crop-available soilwater – to control the timing of irrigation and to match supply to demand.
The result
Under the ‘unmanaged’ conditions of Scenario 1, we found that only a quarter of the water supplied early in the irrigation cycle, during the wet spring season, was stored in the soil for crop use; the rest was lost to streams and groundwater. As the irrigation season progressed, there was less and less water available in the reservoir for irrigation; by summer there was less than a tenth of the water that the modelled agricultural system demanded. This led to extreme dry conditions, where the soilwater available for crop use was reduced to zero. This severe condition recurred every summer in the four years that we modelled.
In Scenario 2 (‘managed’ irrigation) we used the 3M approach – monitor-match-manage – to better manage and conserve water. Irrigation supply was triggered only when the crop-available soilwater fell below 30% of its capacity, and irrigation was stopped when soilwater reached 80% of capacity. We found that this approach reduced irrigation frequency by a third compared with the unmanaged scenario, and completely eliminated irrigation losses to streams and groundwater. All irrigation demands were met. Reducing irrigation frequency not only meant that the reservoir was not depleted, but also conserved power where irrigation water had to be pumped. This approach saves money as well as ensuring crops get enough water.
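The 3M trigger rule can be sketched as a simple threshold controller. The 30% start and 80% stop thresholds come from the scenario above; the function name and the example numbers are hypothetical.

```python
# Sketch of the 3M (monitor-match-manage) irrigation trigger.
# Start irrigating when crop-available soilwater drops below 30% of
# capacity; keep irrigating until it reaches 80% of capacity.

def irrigate_today(soilwater, capacity, irrigating,
                   start_frac=0.30, stop_frac=0.80):
    """Decide whether to irrigate today, given whether we irrigated yesterday."""
    fraction = soilwater / capacity
    if irrigating:
        return fraction < stop_frac   # continue until soilwater hits 80%
    return fraction < start_frac      # only start once it falls below 30%

# Dry soil triggers irrigation; moist soil does not start a new cycle,
# but an already-running cycle continues until the 80% target.
print(irrigate_today(soilwater=12.0, capacity=50.0, irrigating=False))  # True  (24%)
print(irrigate_today(soilwater=30.0, capacity=50.0, irrigating=False))  # False (60%)
print(irrigate_today(soilwater=30.0, capacity=50.0, irrigating=True))   # True  (60% < 80%)
```

The two thresholds create a band in which irrigation neither starts nor overfills the soil, which is what keeps water from draining past field capacity into streams and groundwater.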