The limitations of using fresh water for irrigation in arid and semiarid zones of the United States become increasingly apparent with continued rapid population growth and urban development. Ultimately, finding new sources of water for irrigation would allow potable water to be saved for human consumption. According to Lensford et al. (1990), only a fourth of the approximately 24.7 Mm3 (20 billion acre-feet) of groundwater reserves in New Mexico can be classified as “fresh” or “slightly saline”. About 18.53 Mm3 (15 billion acre-feet) is classified as moderately saline to very saline, and whether this huge reservoir of water can be used for irrigation needs to be determined (Lensford et al., 1990). Some efforts have been made to use these aquifers for irrigation in the wake of growing restrictions on allocations of potable water. In addition to saline groundwater, several other alternative water sources, such as recycled water (also referred to as effluent or reclaimed water) and gray water, are being used for irrigation.
In the United States, golf is a major revenue-producing industry that accounted for $33.2 billion in goods and services in 2002 and generated 483,649 jobs nationwide (Haydu et al., 2008). In New Mexico, the turfgrass and golf sector contributed a total of $975 million in revenues to the state's economy during the 2004–2005 fiscal year (Diemer, 2006) and represents a sizeable portion of tourism in the state. Despite the economic importance and growing public demand for these and other green areas, turfgrass water consumption has been a major point of political debate. Many water rights activists insist that golf courses are strictly for recreation and serve no other purpose. Restrictions on the amount of potable water allocated for irrigation are often the result of such debate (City of Albuquerque, NM, 2000). Consequently, attention is being focused on using saline water for turf irrigation. The latest survey by the Golf Course Superintendents Association of America (Throssell et al., 2009) indicates that 37% of all golf courses in the southwestern United States are irrigated with saline recycled water.
The feasibility of using saline water for turfgrass irrigation has been studied intensively. Various experiments have determined salinity thresholds for warm- and cool-season grasses. Generally speaking, cool-season grasses (C3) produce lower quality turf than most warm-season grasses (C4) when irrigated with saline water. The results of studies conducted by Dean et al. (1996) supported the selection of bermudagrass [Cynodon dactylon (L.)] over tall fescue in arid climates under saline irrigation. Alshammary et al. (2004) showed that salinity tolerance of saltgrass [Distichlis spicata (L.)] was greater than that of three cool-season species.
The species within the C3 group also show a wide range in salinity tolerance, as indicated by the ranking of these grasses for salinity tolerance reported by Carrow and Duncan (1998). Based on the results of greenhouse container and hydroponic experiments comparing three cool-season grasses, Alshammary et al. (2004) ranked alkaligrass [Puccinellia distans (L.) Parl] most salinity tolerant, followed by tall fescue [Festuca arundinacea (Schreb.)], and then Kentucky bluegrass (Poa pratensis L.). Salinity tolerances among cultivars within a given species can also vary widely. Suplick-Ploense et al. (2002) studied five Kentucky bluegrass cultivars to determine the variability in salt tolerance within and among two Poa species and their hybrids. The authors found greatest salt tolerance in the aggressive and compact ecotypes, but Kentucky bluegrass was still considered a salt-sensitive turfgrass species when compared with other species, such as tall fescue.
One limitation of many of these salinity tolerance studies is that they were conducted under controlled-environment greenhouse conditions (e.g., Nabati et al., 1994; Suplick-Ploense et al., 2002). Additional environmental stresses such as drought, cold, or heat, all of which typify arid and semiarid regions and can exacerbate the effects of salt stress, were therefore not considered. The high-altitude desert Southwest is characterized by extreme diurnal and seasonal weather conditions, and its high summer temperatures make cool-season grasses difficult to sustain because of heat stress. Exposure to the added stresses of a harsh climate in a field trial might change the outcome of a salinity trial and lead to decreased survival of a turfgrass at salinity levels it would otherwise tolerate.
In addition to using nonpotable water and selecting salt-tolerant species, a third strategy to conserve potable water is to optimize irrigation efficiency. Disadvantages of sprinkler irrigation, such as sprinkler overlap, wind drift, and evaporation losses during the irrigation process, increase overall water consumption. Alternative irrigation systems, such as subsurface drip irrigation, avoid these disadvantages and may provide higher efficiencies. The benefits of subsurface irrigation have been extensively studied in agriculture (e.g., Camp et al., 1993; Schwankl et al., 1990; Malash et al., 2008). While some authors have suggested superior uniformity and efficiency of drip irrigation in turf (Beard, 1973; Leinauer, 1998; Duncan et al., 2009), only a few published studies have investigated the performance of cool-season turf under drip irrigation in field settings. Schiavon et al. (2010) investigated turf quality of several cool-season grasses over a 4-yr period and reported no decline in either the performance of the drip irrigation systems or the quality of the tested grasses. Contrary to these findings, Gibeault et al. (1985) reported a significant reduction in quality of Kentucky bluegrass, perennial ryegrass, and tall fescue when drip irrigated as opposed to sprinkler irrigated. Moreover, some limitations may exist when saline water is used in combination with subsurface drip irrigation on cool-season grasses. In a study by Palacios-Diaz et al. (2009), irrigation with reclaimed municipal wastewater (electrical conductivity [EC] 2.4 dS m−1) resulted in salt accumulation between the irrigation lines for alfalfa [Medicago sativa (L.)], causing plant mortality. Drip irrigation with saline water can lead to nonuniform salt distribution and accumulation in soil (Bernstein and Francois, 1975; West et al., 1979).
As the demand for potable water and the need for its conservation continue to grow, it is imperative that efforts be made to use nonpotable, recycled, or other impaired water sources and to increase irrigation efficiency to sustain quality and functionality of turfgrass areas. Although some studies have examined survival and performance of warm-season grasses under drip irrigation with saline water, similar information is scarce for cool-season grasses. New Mexico's climate is characterized as transitional semiarid to arid with wide seasonal and diurnal temperature fluctuations. Because of the cold winters, cool-season grasses are widely grown and are commonly found on residential turf areas and athletic fields in New Mexico. Furthermore, almost all golf courses grow cool-season turf on greens, tees, and fairways. A study was conducted at New Mexico State University to assess the effects of water quality and type of irrigation on root zone salinity and turf quality of several cool-season grasses in the arid Southwest. Moreover, we investigated whether or not salinity accumulation in the root zone can be used to predict turfgrass quality for several turfgrass species and varieties.
MATERIALS AND METHODS
The study was performed at the University's golf course in Las Cruces, NM (USDA Plant Hardiness Zone 8) from 2005 to 2007. Monthly average temperature, precipitation, and reference evapotranspiration during the research period are listed in Table 1. Grasses were established in 2004 and included tall fescue cultivars Southeast and Tar Heel II; perennial ryegrass [Lolium perenne (L.)] cultivars Brightstar SLT and Catalina; alkaligrass cultivars Fults and Salty; and fine fescue [Festuca rubra (L.)] cultivar Dawson. Plots were irrigated with potable, moderately saline, or saline water. Saline water was pumped from a nearby saline aquifer to the research site. Moderately saline water was prepared by mixing municipal water with the saline groundwater to an EC of 2.0 dS m−1. According to the U.S. Salinity Laboratory (U.S. Salinity Laboratory Staff, 1954), the moderately saline water is classified as high in salinity and low in sodium hazard (C3-S1) and the saline irrigation water as very high in salinity and medium in sodium hazard (C4-S2). Ion concentrations in the irrigation waters are listed in Table 2.
|  | Jan | Feb | Mar | Apr | May | June | July | Aug | Sept | Oct | Nov | Dec |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 30-yr mean precipitation, mm | 10 | 10 | 8 | 6 | 8 | 15 | 39 | 48 | 32 | 20 | 11 | 14 |
| Parameter | Potable | Moderately saline | Saline |
| --- | --- | --- | --- |
| Electrical conductivity, dS m−1 | 0.6 | 2.0 | 3.5 |
| Total dissolved solids, mg L−1 | 400 | 1300 | 2200 |
| Magnesium, meq L−1 | 0.8 | 1.68 | 2.52 |
| Calcium, meq L−1 | 2.8 | 3.19 | 5.05 |
| Sodium, mg L−1 | 48 | 230 | 400 |
| Sodium adsorption ratio (SAR) | 1.55 | 6.41 | 8.94 |
| Potassium, mg L−1 | 4.6 | 28.0 | 51.2 |
| Carbonate, meq L−1 | 0.00 | 0.00 | 0.00 |
| Bicarbonate, meq L−1 | 2.84 | 6.43 | 9.95 |
| Residual sodium carbonate, meq L−1 | not detected | 1.56 | 2.38 |
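The blending of municipal and saline groundwater described above can be sketched as a simple mixing calculation, assuming EC blends linearly with volume fraction (a first-order approximation; actual blending is close to, but not perfectly, linear):

```python
# Estimate the volume fraction of saline groundwater to blend with
# municipal water to reach a target EC, assuming linear EC mixing.
def blend_fraction(ec_potable, ec_saline, ec_target):
    """Return the volume fraction of saline water in the blend."""
    if not ec_potable <= ec_target <= ec_saline:
        raise ValueError("target EC must lie between the source ECs")
    return (ec_target - ec_potable) / (ec_saline - ec_potable)

# Municipal water at 0.6 dS/m blended with saline groundwater at 3.5 dS/m
# to reach the moderately saline target of 2.0 dS/m used in this study:
f = blend_fraction(0.6, 3.5, 2.0)
print(f"{f:.2f}")  # 0.48, i.e., roughly half saline groundwater by volume
```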
Grasses were irrigated with either a sprinkler or a subsurface drip system. From February to November, irrigation was scheduled daily at 120% of reference evapotranspiration (ETo) (Allen et al., 2005) using the irrigation software (Nimbus II Central Control System, Rainbird Corp., Tucson, AZ) that also scheduled the golf course irrigation system. During December and January, irrigation was scheduled manually twice weekly for approximately 10 min. Climate data used to calculate ETo were collected at a weather station located on the golf course in close proximity to the study site. Irrigation for each sprinkler and subsurface drip main block was regulated by a separate solenoid valve and pressure regulator. The sprinkler system consisted of eight Walla Walla MP2000 Rotators (Walla Walla Sprinkler Company, Walla Walla, WA) operated at 200 kPa and spaced 3.8 m apart to allow for uniform irrigation. Irrigation audits conducted bimonthly during each year ensured that distribution uniformity (DU) never fell below 0.7 and provided the data necessary to compare actual water delivery rates with computer settings. The subsurface drip system consisted of porous emitterless line source pipes (Precision Porous Pipe, McKenzie, TN) with a diameter of 1.27 cm operated at 200 kPa. Each subsurface drip irrigated block had a flush valve installed to prevent sediments from clogging the drip lines. The flush valve was located at the corner opposite the water inlet and allowed for a 10- to 15-s flush cycle at the beginning of each irrigation cycle. The pipes were installed at a soil depth of 7.5 cm and spaced 30 cm apart. Irrigation water use on subsurface drip irrigated blocks was recorded by means of a water meter (Invensys Process Systems Inc., Plano, TX), and run times were calculated from recorded water delivery rates minus the amounts lost in the flush cycles.
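The run-time calculation for the drip blocks can be sketched as follows. The flow rate and plot area are illustrative assumptions, not values from the study; only the 120% ETo replacement fraction comes from the text:

```python
# Daily irrigation run time that replaces 120% of reference ET (ETo),
# as scheduled in this study. Plot area and delivery rate are assumed.
def run_time_minutes(eto_mm, area_m2, flow_lpm, fraction=1.2):
    """Run time (min) to apply fraction * ETo over an area at a given flow."""
    volume_l = fraction * eto_mm * area_m2  # 1 mm depth over 1 m2 = 1 L
    return volume_l / flow_lpm

# Example: 6 mm ETo day, 14-m2 plot, measured delivery rate of 12 L/min
minutes = run_time_minutes(eto_mm=6.0, area_m2=14.0, flow_lpm=12.0)
print(round(minutes, 1))  # 8.4
```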
Uniform water distribution on subsurface drip irrigated blocks was monitored three times over each growing season by taking 24 volumetric soil moisture readings at depths of 0 to 6 cm with a hand-held ThetaProbe soil moisture sensor (Delta-T Devices Ltd., Cambridge, England) 24 h after an irrigation cycle. Soil moisture values were subsequently analyzed for distribution uniformity, similarly to DU calculations on sprinkler irrigated blocks.
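The DU calculation applied to the soil moisture readings follows the standard lower-quarter formula (mean of the lowest 25% of readings divided by the overall mean). The readings below are made-up illustrative values, not data from the study:

```python
# Lower-quarter distribution uniformity: mean of the lowest 25% of
# readings divided by the mean of all readings (DU >= 0.7 was the
# threshold maintained in this study).
def distribution_uniformity(readings):
    ordered = sorted(readings)
    quarter = max(1, len(ordered) // 4)
    low_quarter_mean = sum(ordered[:quarter]) / quarter
    overall_mean = sum(ordered) / len(ordered)
    return low_quarter_mean / overall_mean

# Illustrative volumetric soil moisture readings (%), not study data:
vwc = [18, 20, 21, 19, 22, 23, 20, 21, 19, 18, 22, 20]
du = distribution_uniformity(vwc)
print(round(du, 2))  # 0.91, above the 0.7 threshold
```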
The soil at the site was a sandy loam (sandy, skeletal, mixed, thermic Typic Torriorthent), an Entisol typical of arid regions. Chemical properties of the soil before turfgrass establishment and irrigation are listed in Table 3. During the growing season (March–November), plots were mowed biweekly at a height of 7.5 cm and clippings were collected. Plots were fertilized at a rate of 5 g N m−2 with a 15–15–15 quick-release fertilizer in April, June, August, and October. A micronutrient fertilizer (Pro-Mate, Helena Chemical Company, Collierville, TN) containing Ca (1.0%), Mg (4.3%), S (18.2%), Cu (0.3%), Fe (14.3%), and Mn (2.6%) was applied in the summer at a rate of 10 g m−2. The pre-emergent herbicide Pendulum (active ingredient 37.4% pendimethalin) was applied at a rate of 0.65 mL m−2 in April to prevent weed germination. The systemic insecticide Merit (active ingredient 75% imidacloprid) was applied at 0.5 g m−2 in June and August to prevent grub damage.
| Property | 0–10 cm | 10–20 cm | 50–60 cm |
| --- | --- | --- | --- |
| EC†, dS m−1 | 0.22 | 0.21 | 0.23 |
| Mg, meq L−1 | 0.4 | 0.4 | 0.4 |
| Ca, meq L−1 | 2.4 | 2.2 | 1.7 |
| Cl, meq L−1 | 0.2 | 0.14 | 0.1 |
| K, meq L−1 | 0.3 | 0.1 | 0.1 |
| Sodium adsorption ratio | 0.59 | 0.53 | 1.17 |
| Organic matter, % | 0.4 | 0.4 | 0.4 |
Turfgrass color and quality were assessed using the visual rating scale recommended by the National Turfgrass Evaluation Program (Krans and Morris, 2007). Turfgrass quality was determined monthly from March to November on a scale of 1 to 9, with 1 = dead turf and 9 = dark green, uniform turf. The monthly ratings were averaged every 3 mo (March–May, June–August, and September–November) and analyzed as three seasons. Photographic images were taken monthly from March to November in full sunlight, from 1 h before until 1 h after solar noon, with a digital camera. The camera was mounted on a frame at a height of 150 cm, which allowed the same 1.5- by 1.5-m area in the center of each plot to be captured every time photographs were taken. Digital images were analyzed for percent green coverage (SigmaScan Pro, Systat Software Inc., San Jose, CA) (Karcher and Richardson, 2003). Coverage data were averaged every 3 mo and correlated with visual quality. Normalized difference vegetation index (NDVI) readings were collected with a GreenSeeker (NTech, Ukiah, CA) from March to October in 2007. Correlations between coverage and NDVI data and visual quality were subsequently computed.
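The image analysis step can be sketched as hue-saturation-brightness thresholding in the spirit of Karcher and Richardson (2003): pixels are classified as green turf when their hue falls within a green band and saturation and brightness exceed minimum thresholds. The threshold values below are illustrative assumptions, not the settings used in the study:

```python
import colorsys

# Percent green coverage: count pixels whose HSB values fall in an
# assumed "turf green" band, then express them as a share of all pixels.
def percent_green(pixels, hue_range=(60/360, 180/360), min_sat=0.10, min_val=0.10):
    """pixels: iterable of (r, g, b) tuples with components in 0..1."""
    green = total = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        total += 1
        if hue_range[0] <= h <= hue_range[1] and s >= min_sat and v >= min_val:
            green += 1
    return 100.0 * green / total if total else 0.0

# Toy example: three turf-green pixels and one bare-soil (brown) pixel.
img = [(0.1, 0.6, 0.2)] * 3 + [(0.5, 0.35, 0.2)]
print(percent_green(img))  # 75.0
```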
Composite soil samples were collected twice annually, in mid-June and mid-November, from depths of 0 to 10 cm, 10 to 20 cm, and 50 to 60 cm using a 4.5-cm-diam. soil auger. The mid-June sampling date was deemed appropriate because it falls halfway through the growing period and historically marks the beginning of the rainy season; root zone salt accumulation from saline irrigation was therefore expected to be highest in June, before the onset of the rains. The November sampling date was selected because it generally marks the end of the active growing period of cool-season grasses. Chemical analysis of the soil samples was conducted at a commercial soil testing laboratory (AgSource Cooperative Services, Lincoln, NE). Solutions were extracted with distilled water from the saturated soil paste and analyzed for EC (conductivity bridge) and Ca, Mg, and Na (plasma emission spectroscopy) (Franson, 1989).
The research area was 36 by 70 m and was designed as a randomized complete block. Combinations of irrigation system and water quality served as whole-plot treatments, and grasses and soil depths served as subplot treatments. All treatment factors were replicated three times. To test the effects of water salinity level and irrigation system, root zone salinity, turfgrass quality, percent cover, and NDVI data were subjected to a repeated measures analysis using Proc Mixed in SAS (Ver. 9.2, SAS Institute, Inc., Cary, NC). Fisher's LSD test at the 0.05 probability level was used to identify significant differences among means. Proc Corr and Proc Reg were used to correlate visual quality ratings with NDVI and percent coverage. Stepwise linear regression (Proc Reg) was used to investigate the relationship between summer turf quality and EC, Na, and sodium adsorption ratio (SAR) in the top 10 cm of the root zone.
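The mean separation step reduces to a simple formula. The error mean square, critical t value, and replicate count below are illustrative assumptions, not values from this analysis:

```python
import math

# Fisher's LSD: two treatment means differ at level alpha if their
# absolute difference exceeds t(alpha/2, df_error) * sqrt(2 * MSE / n).
def fisher_lsd(t_crit, mse, n_per_mean):
    return t_crit * math.sqrt(2.0 * mse / n_per_mean)

# Assumed values: t(0.025, 12 df) = 2.179, MSE = 0.40, n = 9 per mean
lsd = fisher_lsd(2.179, 0.40, 9)
print(round(lsd, 2))  # 0.65
mean_a, mean_b = 6.4, 5.6
print(abs(mean_a - mean_b) > lsd)  # True: these two means differ
```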
RESULTS

Root Zone Salinity at Depths of 0 to 20 cm
The ANOVA (Table 4) revealed that the three-way interaction between type of irrigation, sampling depth, and sampling date had a significant effect on Na content. The ANOVA further revealed that the two-way interaction between water quality and sampling date had a significant effect on EC, Na content, and SAR, and the interactions between irrigation type and sampling date, and between irrigation type and sampling depth had significant effects on EC and SAR, and on EC, respectively (Table 4). Root zone EC, Na, and SAR data were subsequently pooled over sampling depths and irrigation systems and are displayed separately for each water quality at each sampling date (Fig. 1). Data were also pooled over depths and are shown separately for each irrigation system at each sampling date for EC and SAR (Fig. 2).
| Source of variation | EC (0–20 cm) | Na (0–20 cm) | SAR (0–20 cm) | EC (50–60 cm) | Na (50–60 cm) | SAR (50–60 cm) |
| --- | --- | --- | --- | --- | --- | --- |
| Water quality (W) | *** | *** | *** | *** | ** | *** |
| I × W | ns | ns | ns | ns | ns | ns |
| Sampling date (S) | *** | *** | *** | *** | *** | *** |
| I × S | * | ns | * | ns | ns | * |
| W × S | *** | *** | *** | ns | ns | *** |
| I × W × S | ns | ns | ns | ns | ns | ns |
| I × D | *** | *** | ** | – | – | – |
| W × D | ns | ns | ns | – | – | – |
| I × W × D | ns | ns | ns | – | – | – |
| D × S | ns | * | * | – | – | – |
| I × D × S | ns | * | ns | – | – | – |
| W × D × S | ns | ns | ns | – | – | – |
| I × W × D × S | ns | ns | ns | – | – | – |
When data were pooled over both sampling depths and irrigation systems, soil salinity, Na content, and SAR values exhibited a peak-and-decline pattern during 2005 and 2006. Peaks in June reflected salt accumulation from March to June from irrigation and minimal natural precipitation, and declines in salinity values in late summer and early fall were due to leaching of salts from the root zone as a result of the rainy season (Table 1, Fig. 1). A similar trend was observed for EC in plots that were irrigated with potable water (Fig. 1) and plots irrigated with saline water between June and November 2007. Over the 3-yr research period, root zone salinity was at its highest in June 2006, with EC, Na, and SAR values reaching 4.8 dS m−1, 1247 mg L−1 and 20.7, respectively. Salinity values did not change between November 2006 and November 2007 on plots irrigated with potable or moderately saline water. Salinity values within the root zone generally reflected the values of the irrigation water, with highest values measured in plots irrigated with saline water and lowest values in plots irrigated with potable water (Fig. 1). Sodium and SAR values of plots irrigated with moderately saline water fell between those measured in potable and saline irrigated plots from June 2005 to June 2006 but dropped to levels observed in plots irrigated with potable water from November 2006 to November 2007 (Fig. 1). The EC values did not differ between plots irrigated with moderately saline water and those irrigated with potable water on four out of six sampling dates. High amounts of precipitation during spring and summer 2007 (Table 1) resulted in lower peaks of EC, Na, and SAR in summer 2007, and none of the three measured parameters differed over time on plots irrigated with potable or moderately saline water.
When EC and SAR data were pooled over all water qualities and depths but analyzed separately by sampling date and irrigation type, EC was highest in drip irrigated plots on four of the six sampling dates (Fig. 2). Type of irrigation system did not affect EC in June 2005 or November 2006. Sodium adsorption ratio values were higher in sprinkler irrigated plots than in drip irrigated plots on the first sampling date but did not differ between the two irrigation systems from November 2005 to November 2007. When data were averaged over all water qualities and sampling dates and analyzed separately for the two depths and irrigation systems, EC at depths of 0 to 10 cm was higher under drip irrigation than under sprinkler irrigation. At a depth of 10 to 20 cm, EC did not differ between sprinkler and drip irrigated plots (Fig. 3). In contrast, irrigation system did not affect SAR at 0 to 10 cm, but at 10- to 20-cm depths values were higher on sprinkler irrigated plots than on drip irrigated plots (Fig. 3).
Water quality and type of irrigation system affected Na content in the top 20 cm of the root zone differently than they affected SAR and EC (Table 4, Fig. 4). When Na data were pooled over all three water qualities and analyzed separately by root zone depth, irrigation system, and sampling date, Na values were highest in drip irrigated plots at 0 to 10 cm in June and November of 2005 and in June of 2006. Sodium levels at the 10- to 20-cm depth of drip irrigated plots were either equal to (November 2005 and June 2006) or lower than (June 2005) those observed in sprinkler irrigated plots. Soil depth did not affect Na content in sprinkler irrigated plots throughout the research period or in plots irrigated with a drip system from November 2006 to November 2007.
Root Zone Salinity at Depths of 50 to 60 cm
The ANOVA (Table 4) revealed that the two-way interactions between water quality and sampling date and between type of irrigation and sampling date had a significant effect on SAR. Interactions between sampling date and water quality did not significantly affect EC or Na content. Sodium adsorption ratio values were therefore pooled over irrigation systems and are presented separately for each water quality (Fig. 5), and pooled over water qualities and presented separately for each irrigation type (Fig. 6) at each sampling date. Electrical conductivity and Na content were not affected by irrigation type but differed significantly among sampling dates and water qualities (Table 4). Generally, changes in EC and Na values at 50- to 60-cm depths followed the same irrigation and precipitation pattern as changes at 0- to 20-cm depths. Electrical conductivity and Na were highest in June of 2005 and 2006 and dropped to lower levels in November of both years. Salinity levels (EC and Na) stayed consistently low from November 2006 to November 2007 (Fig. 7). When EC and Na values were pooled over irrigation systems and sampling dates, the highest EC and Na content were measured in plots irrigated with saline water (Table 5). Electrical conductivity and Na did not differ between plots irrigated with potable and moderately saline water (Table 5).
[Table 5: root zone EC (dS m−1) and Na content (mg L−1) at depths of 50 to 60 cm for each water quality]
Soil SAR at soil depths of 50 to 60 cm mirrored the SAR of the irrigation water on each of the sampling dates (Fig. 6). The values were greatest in plots irrigated with saline water, and lowest in plots irrigated with potable water. With the exception of June 2007, SAR in plots irrigated with moderately saline water fell between those measured in plots irrigated with saline and potable water on all other sampling dates (Fig. 6). When SAR data were averaged over water qualities and displayed separately for each sampling date and irrigation system, sprinkler irrigation resulted in higher SAR than drip irrigation on all but the last sampling date (Fig. 7). Moreover, changes in SAR did not follow the same seasonal peak-and-decline pattern that was observed for EC or Na. Under plots that received sprinkler irrigation, SAR was highest in November 2006 and lowest in June and November 2007 (Fig. 7). June and November SAR values in drip irrigated plots did not differ in 2005 and 2007, but lower levels were found in November of 2006 when compared to June measurements (Fig. 7).
Turf Quality

The ANOVA revealed that the three-way interaction between irrigation type, water quality, and sampling date and the two-way interactions between cultivar and water quality and between cultivar and sampling date significantly affected turf quality (Table 6). Data were subsequently pooled over irrigation systems and water qualities and are presented separately for each sampling date (Table 7). Turf quality data were also pooled over irrigation systems and are presented separately for each cultivar for the three water qualities (Table 8).
| Source of variation | … | … | Quality | NDVI |
| --- | --- | --- | --- | --- |
| C × I | ns | ns | ns | ns |
| Water quality (W) | *** | ns | ** | ** |
| C × W | *** | ns | * | ns |
| I × W | ns | ns | ns | ns |
| C × I × W | ns | ns | ns | ns |
| Sampling date (S)‡ | *** | *** | *** | *** |
| C × S | *** | *** | * | ns |
| I × S | *** | * | *** | *** |
| C × I × S | ns | ns | ns | ns |
| W × S | *** | *** | ns | *** |
| C × W × S | ns | ns | ns | ns |
| I × W × S | *** | * | *** | ** |
| C × I × W × S | ns | ns | ns | ns |
| Cultivar | Spring 2005 | Summer 2005 | Fall 2005 | Spring 2006 | Summer 2006 | Fall 2006 | Spring 2007 | Summer 2007 | Fall 2007 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Tar Heel II | 5.6a | 6.6a | 7.3a | 6.3a | 6.8a | 7.2a | 5.8a | 5.9a | 6.0a | 6.4a |
| Cultivar | Potable | Moderately saline | Saline |
| --- | --- | --- | --- |
| Tar Heel II | 6.7aA | 6.8aA | 5.7aB |
Turfgrasses tested in this study exhibited highest quality during 2005 and in spring and summer 2006. From fall 2006 until the end of the investigative period in 2007, overall quality declined to an average of 4.7 when data were pooled over all grasses (Table 7). During spring of 2005, five grasses out of seven performed equally well, while at the end of the 2007 growing season only tall fescue cultivars Tar Heel II and Southeast exhibited the highest quality ratings. Alkaligrass Fults was among the grasses displaying the poorest quality throughout the research period. When data were averaged over all sampling dates, Tar Heel II and Southeast performed best and Salty and Fults poorest. Turf quality of Brightstar SLT, Dawson, and Catalina fell between those of the tall fescues and the alkaligrasses (Table 7).
When data were pooled over irrigation systems and analyzed separately for the three water qualities, overall turfgrass quality was lowest under saline irrigation and highest under irrigation with potable and moderately saline water (Table 8). Four grasses, Tar Heel II, Fults, Salty, and Brightstar SLT, exhibited the same quality under potable and moderately saline water, while the performance of Catalina and Dawson was affected by moderately saline irrigation water. Turf quality of Catalina declined further with increasing salinity in the irrigation water (Table 8). Salinity did not affect performance of Salty or Fults, but both grasses rated lowest in quality for each of the three water qualities (Table 8). Tar Heel II averaged a rating of 6.7 for quality under irrigation with potable water, followed by Brightstar SLT, Catalina, Dawson, and Southeast, with ratings of 5.8, 5.7, 5.9, and 5.7, respectively (Table 8). Fults displayed the poorest visual quality under irrigation with potable water, averaging 4.3. Tar Heel II and Southeast had the highest quality under saline irrigation during the investigative period, averaging 5.7 and 5.2, respectively. Brightstar SLT, Catalina, and Fults exhibited the poorest quality under saline irrigation. Type of irrigation system had no influence on turfgrass quality on plots irrigated with potable water, or on those irrigated with moderately saline or saline water for most dates (Fig. 8). Plots drip irrigated with moderately saline water rated lower in quality in summer and fall of 2007, and drip irrigated plots receiving saline water exhibited lower quality in spring and summer of 2006 (Fig. 8).
Correlations between Turf Quality, Cover, Normalized Difference Vegetation Indices, and Salinity
The correlation between visual turfgrass quality and NDVI was significant (P < 0.001), yielding a correlation coefficient of r = 0.57. When correlations were run separately for each cultivar, Dawson and Tar Heel II were most strongly associated with NDVI, yielding coefficients of 0.71 and 0.60, respectively. The correlation between quality and NDVI was poorest for Southeast, yielding a coefficient of 0.41. Despite the significant correlation between NDVI and turf quality, the different treatments did not always affect the two variables similarly. For example, the interaction between water quality and sampling date had a highly significant (P < 0.001) effect on NDVI, but not on visual quality (Table 6). Furthermore, the interactions between cultivar and sampling date and between cultivar and water quality significantly affected turf quality but not NDVI.
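The coefficients above are ordinary Pearson r values, computed here for illustrative paired ratings (made-up numbers, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative paired observations (visual quality, NDVI), not study data:
quality = [4.0, 5.0, 6.0, 7.0, 8.0]
ndvi = [0.55, 0.60, 0.58, 0.70, 0.74]
print(round(pearson_r(quality, ndvi), 2))  # 0.93
```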
Stepwise linear regression revealed that summer and fall values of EC, Na, and SAR in the top 10 cm of the root zone were significant (P < 0.05) predictors of turf quality for Brightstar SLT, Catalina, Fults, and Tar Heel II, although coefficients of determination were low, ranging from 0.18 (Tar Heel II) to 0.27 (Catalina). No significant relationship between any of the salinity parameters and turf quality could be established for Dawson, Salty, or Southeast. Generally, these results indicate that little of the variation in quality could be explained by variation in soil EC, Na, or SAR.
Whether pooled over all cultivars or analyzed separately, the linear regression between cover and quality revealed a significant relationship between the two variables (P < 0.001). High coefficients of determination ranging from 0.66 (Southeast) to 0.82 (Brightstar SLT) indicate that between 66 and 82% of the variation in quality could be explained by the variation in coverage. Despite a strong relationship between turf cover and visual quality, ANOVAs revealed that water quality and the interaction between cultivar and water quality affected turf quality differently than turf cover (Table 6).
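For a simple linear regression such as that of quality on cover, the coefficient of determination can be computed directly from the least-squares fit (for simple regression it equals the squared Pearson r). The paired values below are illustrative, not the study's data:

```python
def r_squared(x, y):
    """Coefficient of determination for a least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b = sxy / sxx              # slope
    a = my - b * mx            # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Illustrative paired observations (percent green cover, visual quality):
cover = [40.0, 55.0, 70.0, 85.0, 95.0]
quality = [3.0, 5.5, 4.5, 7.0, 8.0]
print(round(r_squared(cover, quality), 2))  # 0.83
```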
DISCUSSION

Irrigating cool-season turfgrasses with saline waters in a climate with limited rainfall necessitates adding a leaching fraction to the required irrigation amount to prevent detrimental levels of salt accumulation (Ayers and Westcot, 1985). In this study we irrigated at 120% ETo and relied on natural precipitation during the rainy season (June–September) to manage salinity in the top 20 cm of the root zone. Generally, changes in soil EC, Na content, and SAR reflected seasonal changes in irrigation and natural precipitation. Higher values for EC, Na content, and SAR were measured in summer of 2005 and 2006 before the onset of the monsoon season. These peak salinity levels were followed by lower values in the fall, following the rainy season, which typically begins in early July and continues into early fall (Fig. 1 and 2). These findings are in agreement with results of Choi and Suarez-Rey (2004), who demonstrated successful salt leaching in a desert Arizona soil with the help of monsoon rains. These findings are also similar to results obtained in a parallel study conducted by the authors on warm-season grasses, where seasonal changes in soil salinity concomitant with changes in natural precipitation were observed (Sevostianova et al., 2011).
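The leaching concept from Ayers and Westcot (1985) can be expressed with their steady-state leaching requirement formula. The soil salinity threshold below is an illustrative assumption for a moderately salt-tolerant cool-season grass, not a value from the study:

```python
# Steady-state leaching requirement (Ayers and Westcot, 1985):
# LR = ECw / (5 * ECe - ECw), where ECw is the irrigation water salinity
# and ECe is the tolerable soil saturation-extract salinity for the crop.
def leaching_requirement(ec_water, ec_soil_threshold):
    return ec_water / (5.0 * ec_soil_threshold - ec_water)

# Saline treatment water (3.5 dS/m) with an assumed ECe threshold of 6 dS/m:
lr = leaching_requirement(3.5, 6.0)
print(round(lr, 2))  # 0.13: roughly 13% extra water needed for leaching
```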
Type of irrigation system had a greater impact on Na content during the drier, first half of the study period (June 2005–June 2006) than during the wetter, second half (Fig. 2 and 4). Similarly, EC was affected by irrigation system in November 2005 and June 2006. Turf plots irrigated from a drip system exhibited greater Na content at a soil depth of 0 to 10 cm from June 2005 to June 2006, and greater EC in November 2005 and June 2006, than turf plots irrigated from a sprinkler system (Fig. 2 and 3). These findings corroborate our hypothesis that drip irrigation is less successful than sprinkler irrigation in leaching salts from depths above the drip lines. However, at depths below the drip lines (10–20 cm), EC, Na, and SAR were either lower than or similar to values measured on sprinkler irrigated plots (Fig. 2, 3, 4) throughout the study period. Our findings confirm those of Cote et al. (2003), who demonstrated that more water is distributed below than above the emitter plane in highly permeable, drip irrigated sand. Similarly, Hoffman (1975) reported a nonuniform distribution of salts under drip irrigation with saline water, with an accumulation of salts both at the surface and on the periphery of the wetting front. Precipitation during 2007 appeared to be responsible for successful leaching of Na from both drip and sprinkler irrigated plots. However, precipitation did not affect total salinity equally, as EC was again higher in drip irrigated plots than in those that were sprinkler irrigated.
During the course of the investigative period, the highest EC and Na values (6.1 dS m−1 and 943 mg L−1, respectively) were measured on drip irrigated plots at depths of 0 to 10 cm in June of 2006. The highest values recorded on warm-season grasses subjected to the same salinity treatments but irrigated at 110% ETo were 4.3 dS m−1 and 793 mg L−1, respectively (Sevostianova et al., 2011). Electrical conductivity and Na were thus approximately 30 and 20% lower on warm-season grasses than on cool-season grasses. A longer growing period, with a correspondingly longer irrigation period and higher total irrigation amounts, contributed to greater salt inputs from the irrigation water into the root zone of cool-season turf. However, the higher leaching fraction applied to cool-season grasses should have compensated for the greater salinity input. The greater accumulation of salts at depths of 0 to 20 cm under cool-season grasses could be due to their higher ET rates compared with warm-season grasses, which leave less water available to leach salts from the root zone.
Irrigation type and water quality did not affect EC and Na at soil depths of 50 to 60 cm on any of the sampling dates. These results differ from our findings on warm-season grasses (Sevostianova et al., 2011), which suggested that water quality affected EC and Na at these depths. However, both irrigation type and water quality affected SAR values. As was observed at root zone depths of 0 to 20 cm, SAR values at 50 to 60 cm reflected the quality of the irrigation water. It remains unclear why water quality influenced all measured salinity parameters at depths of 0 to 20 cm but not at 50 to 60 cm. Non-uniform water distribution in drip irrigated plots, indicated by slightly greener plants directly above the drip lines than between them, may have affected water movement into deeper profiles. Layering of different soil types at the research site could also have affected water movement and salt accumulation and may have contributed to our results. Further research that includes salinity measurements throughout the soil profile might help elucidate these differences in salt accumulation.
Among all cultivars included in our study, the two tall fescue cultivars exhibited the highest visual quality, whereas the alkaligrasses had the lowest (Tables 7 and 8). Our results differ from those of Lunt et al. (1961), Butler (1972), Torello and Symington (1984), and Alshammary et al. (2004), who all reported superior salinity tolerance of alkaligrass compared with other cool-season grasses. The low quality ratings of the alkaligrasses in this study may not be due to salt stress, as their quality was lowest even when irrigated with potable water (Table 8). Schiavon et al. (2010) and Leinauer (unpublished data, 2007) reported low turf quality of alkaligrass during summer months in southern New Mexico, even when irrigated with potable water in amounts sufficient to avoid drought stress. Therefore, the poor performance of alkaligrass during the 3-yr research period appears to be the result of inadequate heat tolerance and not necessarily of salt stress.
Tall fescue provided the highest quality among the cool-season grasses in our study. These findings support those of Lunt et al. (1961) and Harivandi et al. (1992), who rated tall fescue either moderately tolerant or tolerant to salinity. The superior salinity tolerance of tall fescue compared with other cool-season grasses may be the result of salinity avoidance, achieved by developing a deep root system that remains viable at depths below those at which salt accumulates. Although rooting depth was not measured in our study, tall fescue has been reported by other authors to be an excellent drought avoider on the basis of its deep and extensive root system (Jiang and Huang, 2001; Qian et al., 1997). Moreover, Alshammary et al. (2004) observed a high root/shoot ratio in salt-stressed tall fescue. In our study, tall fescue received an average quality rating of either 6 or 7 when irrigated with saline or moderately saline water (Table 8). Similar turf quality was observed by Sevostianova et al. (2011) for inland saltgrasses A138 and DT16 and bermudagrasses NuMex Sahara and Transcontinental, all of which are generally considered more salt tolerant than the cool-season tall fescue. However, more research is necessary to determine whether the quality of tall fescue grown in a saline environment can remain as high as that of salt-tolerant warm-season grasses on a long-term basis.
Visual quality ratings of Brightstar SLT, Catalina, and Dawson support the findings of Carrow and Duncan (1998) and Harivandi et al. (1992), who ranked the salinity tolerance of perennial ryegrass as similar to that of slender creeping red fescue cultivars. However, our findings do not concur with those of Schaan et al. (2003), who reported no significant loss in quality of the perennial ryegrass cultivar Champion during a 2-yr period of alternating irrigation between potable water and saline water of 3.3 dS m−1. In our study, the perennial ryegrasses Brightstar SLT and Catalina maintained acceptable quality only under irrigation with potable water (Table 8). Irrigation with moderately saline and saline water resulted in turf quality below the acceptable minimum of 6. The quality of Dawson under saline irrigation was higher than that of both perennial ryegrass cultivars. These findings agree with those of Marcum (1999), who reported greater salinity tolerance in accessions of strong and slender creeping red fescue than in perennial ryegrass. Torello and Symington (1984) reported higher NaCl tolerance in Dawson than in Fults. As with the alkaligrasses, the low quality we observed in both perennial ryegrasses and in creeping red fescue may have been due to high summer temperatures rather than salinity stress. This would explain why quality ratings for these three grasses never exceeded 6, even under potable irrigation (Table 8). Other studies have also found perennial ryegrass and creeping red fescue to be heat-sensitive species (McCarty, 2009; Christians, 2007). Moreover, no recovery was observed for Catalina plots after the summer of 2006, or for Dawson and Brightstar SLT after the winter of 2006, despite lower salinity levels in the root zone compared with previous years (Table 8, Fig. 1). Simultaneous heat and salt stress may have been detrimental to both species, resulting in little or no recovery.
In contrast, tall fescue recovered to acceptable quality levels by the end of the research period. Linear regression between soil salinity and turf quality further supports our hypothesis that more than one stressor affected the visual quality of the cool-season grasses in our study. Despite the wide range of salinities measured over the 3-yr research period, quality could be significantly predicted from soil salinity for only four cultivars. Furthermore, low coefficients of determination indicated that only 18 to 27% of the variation in quality could be explained by soil salinity. Further research in cooler climate zones is needed to investigate the respective roles of temperature and salinity in the turf quality of cool-season grasses.
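As a minimal illustration of the statistic referenced above, the coefficient of determination (r2) expresses the fraction of variation in visual quality explained by a linear fit on soil salinity. The sketch below uses hypothetical EC and quality values, not data from this study:

```python
# Illustrative sketch only: hypothetical EC/quality pairs, not data
# from this study. Computes the coefficient of determination (r^2)
# for a simple linear regression of visual quality on soil EC.
from statistics import mean

def r_squared(x, y):
    """Squared Pearson correlation: fraction of variation in y
    explained by a linear fit on x."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical observations: soil EC (dS/m) and 1-9 visual quality
ec = [1.2, 2.0, 2.8, 3.5, 4.1, 5.0, 5.6, 6.1]
quality = [7.5, 7.0, 6.8, 5.9, 6.3, 5.1, 5.6, 4.8]
print(f"r^2 = {r_squared(ec, quality):.2f}")
```

An r2 of 0.18 to 0.27, as reported here, means that 73 to 82% of the variation in quality remains unexplained by salinity alone, which is consistent with an additional stressor such as heat.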
Visual quality of the turfgrasses was not affected by the type of irrigation system when potable water was used (Fig. 8). These results differ from those of Gibeault et al. (1985), who reported a significant reduction in quality of Kentucky bluegrass, perennial ryegrass, and tall fescue when drip irrigated as opposed to sprinkler irrigated. However, our results agree with those of Schiavon et al. (2010) and Sevostianova et al. (2011), who reported no decline in turf quality of subsurface irrigated cool-season grasses during a 4-yr research period and no difference in quality between sprinkler and drip irrigated plots of several warm-season cultivars. Cool-season turfgrass plots irrigated with saline water from a sprinkler system exhibited higher quality than plots irrigated from the drip system in spring and summer of 2006. The higher turf quality of sprinkler irrigated plots may be due to their lower EC and Na content at root zone depths of 0 to 10 cm compared with drip irrigated plots (Fig. 3 and 4). During summer and fall of 2007, grasses drip irrigated with moderately saline water exhibited lower quality than those that were sprinkler irrigated. By the end of the research period, turf quality of plots irrigated with moderately saline and saline water from either system was below the acceptable minimum rating of 6 (Fig. 8), suggesting that cool-season grasses generally do not perform well when irrigated with more saline water on a long-term basis in a transitional desert climate. The high correlation between percent ground cover and quality suggests that reduced turf quality was mainly due to a lack of green cover. However, loss of green cover may not necessarily reflect a complete loss of plants, as leaf firing and loss of pigmentation during the early stages of salinity and heat stress have been reported in cool-season turfgrasses by several researchers (Harivandi et al., 1992; Nabati et al., 1994; Suplick-Ploense et al., 2002).
The weak correlation between visual turfgrass quality and NDVI is the result of a wide spread of NDVI values at each visual quality rating. Similar results have been reported for both cool- and warm-season grasses by Bunderson et al. (2009), Ghali (2011), Haendel and Wissemeier (2008), and Schiavon et al. (2010, 2011). Schiavon et al. (2011) suggested that differences in color and canopy structure between grasses that are readily detected by spectral reflectance may be less noticeable, and of lesser importance in assessing overall quality, to the person visually rating the plots. Ghali (2011) noted that comparing a discrete variable (quality ratings) with a continuous one (spectral reflectance values) results in a weak correlation. This weak correlation must be investigated further if NDVI measurements are to replace subjective visual quality ratings.
Our results indicate that most of the cool-season grasses included in this study could not be maintained at an acceptable quality level in a transition zone climate when irrigated with saline water, regardless of the irrigation system. Salinity levels in our irrigation water either exceeded or matched those of recycled water currently used to irrigate turf areas in the Southwest. Summer heat and exposure to salinity build-up in the root zone were deleterious to the growth and quality of all grasses except tall fescue. Over the course of the 3-yr study, quality was affected by both soil salinity and high temperatures, despite a semi-annual cyclic leaching pattern that reduced soil salinity in the second half of each calendar year. Based on these and earlier findings, warm-season grasses, with tall fescue as the only cool-season exception, appear to be the logical choice for turf areas irrigated with saline water from either a drip or a sprinkler system in transitional semiarid or arid climate zones.