Best management practices (BMPs) are land management approaches that aim to protect soil and water resources against degradation. In response to European legislative requirements to reduce the impacts of agricultural runoff on freshwater systems by 2015, an integrated approach is being developed that combines water and land management through programs of measures, including a suite of BMPs designed to reduce nonpoint-source pollution (Cherry et al., 2008). In the United States, a variety of BMPs have been developed to reduce sediment and nutrient loss from agricultural fields, including grassed waterways, buffer strips, reduced tillage, and wetlands. These practices are widely promoted through U.S. Department of Agriculture cost-share programs that provide funding and technical expertise for project implementation. However, there is increasing pressure to document water quality and wildlife benefits of federal and state programs that pay landowners to implement BMPs (Mulla et al., 2008). Numerous studies of the effectiveness of agricultural BMPs have been conducted at the field level (e.g., Dinnes et al., 2002), but detailed long-term data are needed to quantify the environmental benefits from specific conservation practices at the watershed scale (Mausbach and Dedrick, 2004).
One of the more rigorous approaches to evaluating the effectiveness of BMPs at the watershed scale involves paired watershed comparisons (King et al., 2008; Mulla et al., 2008). Paired watershed experiments involve two or more adjacent or nearby watersheds that are similar in soils, topography, land use, and climate. Best management practices are established in one of the watersheds, and the other is used as a reference watershed. Monitoring regimes are identical between the two watersheds, and differences in temporal trends between like response variables are compared. One advantage of a paired watershed approach is that watershed differences and year-to-year climatic differences can be accounted for in the experimental design, and therefore watersheds do not need to be identical (Mulla et al., 2008). Disadvantages include difficulty quantifying the number and location of BMPs installed in the treatment watershed, engaging a significant number of farmers in the BMP program, the expense of collecting long-term data, and maintaining minimal changes in the reference watershed (Clausen and Spooner, 1993; Mulla et al., 2008).
Using paired watershed experiments to evaluate BMPs is now a common approach, and previous paired watershed studies have evaluated the impacts of manure management systems, rotational grazing, reduced tillage practices, grass buffer strips, reduced N applications, and erosion control practices (Clausen et al., 1996; Udawatta et al., 2002; Jaynes et al., 2004; Bishop et al., 2005; Gassman et al., 2010). The watersheds in these studies ranged in size from 1.1 to 404 ha, with the exception of a water quality improvement project in Iowa that used watersheds of 9126 and 9804 ha (Gassman et al., 2010). The results of these studies have shown mixed effects of BMPs across a variety of systems, ranging from improved water quality to no improvements (Mulla et al., 2008). Multiple aspects of watershed studies can confound the ability to detect water quality changes, including watershed size, channel morphology, preexisting water quality conditions, and changes in land use. Gassman et al. (2010) reported difficulties in detecting water quality changes in two Iowa watersheds in part because of the large watershed areas and the existence of relatively good water quality before the study. Analyses of four subwatersheds by King et al. (2008) showed that changes resulting from BMPs aimed at reducing pollutant loadings would be easier to detect in channelized versus unchannelized watersheds. Here, we report on a paired watershed study of two channelized, 4000-ha subwatersheds in the headwaters of the Mackinaw River, Illinois, that are an order of magnitude larger than those of previous assessments. We conducted intensive outreach in one watershed (treatment) and no outreach in the other watershed (reference).
We hypothesized that (i) we would measure increased rates of agricultural BMP implementation in the treatment watershed that received outreach relative to the reference and (ii) increased BMP implementation would result in watershed-scale (4000 ha) changes in nutrients, suspended sediment concentrations, and hydrological measures in the treatment watershed relative to the reference watershed.
Materials and Methods
Our study was conducted in two adjacent subwatersheds near the headwaters of the Mackinaw River in McLean County, Illinois (Fig. 1A). The Mackinaw River watershed covers portions of six counties across central Illinois and is the fourth largest tributary to the Illinois River system. Approximately 90% of the land use in the 295,000-ha Mackinaw River watershed is agricultural, with row crop rotation for corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] production accounting for 75% of all land cover (Illinois Department of Natural Resources, 1997). Subwatersheds in the headwater areas have been most heavily converted to agriculture, with 80 to 93% of the land used for row crops. Similarly, land use in the two headwater subwatersheds of this study was extensively agricultural, with >90% in row crop agriculture for corn and soybeans.
Bray Creek, a 4046-ha watershed (40.54N, 88.63W), was randomly selected as the treatment watershed and received intensive outreach from 2000 to 2003. Frog Alley, a 4197-ha watershed (40.54N, 88.50W), served as the reference watershed and received no targeted outreach. Between 2000 and 2006, watershed area in corn and soybean production averaged 46 and 46% in Bray Creek, respectively, and 48 and 44% in Frog Alley, respectively. Soils throughout both watersheds were silt loams of the mesic Parr-Lisbon-Drummer association (USDA, 1998), with slopes ranging from 0 to 10%. This association consists of 40% well drained Parr soils located on side slopes and summits; 45% somewhat poorly drained to poorly drained Lisbon and Drummer soils on foot slopes, interfluves, and shallow depressions; and 15% soils of minor extent: La Rose (well drained), Peotone (very poorly drained), and Harpster (poorly drained).
Total stream lengths were similar for Bray Creek (19.8 km) and Frog Alley (19.6 km). The majority of stream length in each watershed was channelized with narrow stream buffers and very few trees. Channel maintenance in Bray Creek was primarily the responsibility of individual landowners, whereas channelization and additional channel maintenance were conducted within the framework of an organized drainage district in Frog Alley. Extensive agricultural subsurface tile drainage systems have been installed in Illinois, with an estimated 11.6 million tile-drained acres throughout the state (Sugg, 2007). These subsurface tiles are used to effectively remove excess water from the poorly drained soils and subsequently discharge field runoff directly into streams that traverse agricultural watersheds. Although comprehensive tile maps that designate specific locations of these tile systems within Bray Creek and Frog Alley watersheds are nonexistent, Sugg (2007) used GIS mapping systems and soil drainage class data to estimate that 52 to 82% of the total land area in McLean County is tile drained. According to these estimates, McLean County is among the most heavily tile-drained counties in the United States (Sugg, 2007).
A collaborative team consisting of The Nature Conservancy (Conservancy), McLean County Soil and Water Conservation District, and McLean County Natural Resources Conservation Service conducted outreach to landowners and farmers in Bray Creek (treatment watershed) from 2000 to 2003. A local resident and farmer in McLean County (T. Lindenbaum) coordinated the outreach program. In 1999, an introductory package describing the project goals and the location of the study sites was sent to landowners and producers in the treatment watershed. Annual newsletters were sent to these recipients from 2000 to 2003 that featured articles on soil and water conservation, project updates, and profiles of local conservation-oriented farmers. Promotional flyers for incentive programs, workshops, and tours were sent to landowners and farm operators in the treatment watershed throughout the outreach study period. Between 2000 and 2002, 117 landowner and farm operator visits were conducted in the treatment watershed by the outreach coordinator, representing >90% of those that owned and/or operated farmland in the watershed.
Five workshops were conducted from 2000 to 2003 that promoted conservation farming practices and habitat restoration for wildlife. Two county-wide workshops promoted strip-till farming practices, and two workshops conducted in the treatment watershed demonstrated the required management techniques and equipment for no-till farming. Strip-till and no-till farming are conservation tillage practices that leave a minimum of 30% of crop residue after planting. The environmental benefits of these reduced tillage practices include increased organic matter content of soils, reduced erosion, and subsequent reduced runoff of soil-adsorbed chemicals and P (Fawcett et al., 1994; Yates et al., 2006). An additional county-wide workshop promoted prairie restoration, wildlife food plots, and prescribed burning. Four tours were conducted from 2000 to 2003 to demonstrate cost-share opportunities available through the Environmental Quality Incentive Program and to introduce how constructed wetlands can be used to reduce agricultural runoff from subsurface tiles into adjacent waterways. Two county-wide tours were organized for producers using Environmental Quality Incentive Program education funds that highlighted habitat restoration (e.g., restored wetlands, tree plantings, and native prairie restoration) and agricultural conservation practices (e.g., buffer strips, farm ponds, and no-till management). Buffer strips are generally set aside land along the perimeter of cropland and streams that are planted with grasses or other vegetation to reduce sediment and chemical transport from agricultural fields. Two tours were organized for producers in the treatment watershed that demonstrated the use of constructed wetlands for reducing nutrient export from agricultural tile drainage to adjacent streams and rivers.
In 2000, the Conservancy sponsored a county-wide open house at a local USDA Service Center to promote federal and state-funded cost-share programs. Two incentive programs were available through the Conservation Practices Program (CPP) for strip-till farming and grassed waterway construction from 2000 to 2002. Grassed waterways are strips of grass seeded in fields in areas of high erosion potential in an effort to prevent gully erosion and to entrap sediments and nutrients in surface water runoff.
During 2000 and 2001, CPP funded a county-wide program that paid $25 ha−1 to producers that adopted strip-till farming. These payments were limited to a maximum of 16-ha (2000) or 32-ha (2001) parcels of land that had not previously been farmed using strip-till practices. In 2001, the Conservancy offered an additional $25 ha−1 to producers in the treatment watershed to promote adoption of strip-till practices on 32-ha parcels that had not previously been farmed using strip-till practices. In 2001 and 2002, CPP funded a county-wide program that provided producers with 60% cost-share for construction of grassed waterways.
Water Quality and Hydrology
Upstream and downstream monitoring sites were established within each watershed located approximately 4.8 and 1.6 km upstream, respectively, from the confluence of each stream with the Mackinaw River (Fig. 1B). Water samples were collected biweekly for water year (WY) 2000 through WY2006 (1 Oct.–30 Sept.) from each site and analyzed for nitrate (as NO3−–N), total phosphorus (TP), dissolved reactive phosphorus (DRP), and total suspended sediment (TSS) using standard methods (American Public Health Association, 1998). Phosphorus analyses were restricted to WY2004 through WY2006 because higher detection limits in previous years limited the amount of usable data. On each sampling date, grab samples of stream water were collected from a single point in the center of the stream at 50% depth using a 1-L bottle. Collected samples were stored on ice before transport to the laboratory. Samples were filtered immediately on arrival to the laboratory using Whatman 0.7-μm glass microfiber filters for NO3−–N analyses and 0.45-μm membrane filters for DRP analyses. Filters from these samples were retained and used for TSS analyses; unfiltered water was retained for TP analyses. Refrigerated samples were then analyzed within 24 h of arrival at the laboratory or frozen for future analyses.
Water samples were also collected at 45-min intervals during storm events (WY2000–WY2006) at downstream monitoring sites using Sigma programmable water samplers (900 MAX Portable Sampler; American Sigma, Loveland, CO). Sigma samplers were set to begin collecting samples on a water level increase of 10 to 15 cm (4–6 inches) and were fitted with 24 bottles in which individual samples were collected over an 18-h period during storm events. If the storm event continued past the 18-h period, samplers were reset to sample an additional 18-h period. Water quality analyses were conducted on the first samples collected during an event, as well as selected samples from the rising limb, the peak, and the falling limb of each storm event. Of the 56 storm events that occurred in each watershed from WY2000 to WY2006, 34 and 48% were sampled in Frog Alley and Bray Creek, respectively.
Nitrate analyses were conducted using ion chromatography fitted with an IonPac AS4A-SC anion exchange column at the Illinois Natural History Survey for WY2000 to WY2003 with a minimum detection limit of 0.01 mg L−1 (DX-120; Dionex, Sunnyvale, CA). For WY2004 to WY2006, NO3−–N was analyzed at Illinois State University using ion chromatography with a minimum detection limit of 0.01 mg L−1 as NO3−–N (DX-120; Dionex). Phosphorus analyses for WY2004 to WY2006 were conducted at Illinois State University using a PerkinElmer Lambda 35 dual beam spectrophotometer (PerkinElmer, Waltham, MA) with a minimum detection limit of 0.005 mg L−1. The ascorbic acid colorimetric method was used to determine DRP of filtered water samples (Method 4500-P; American Public Health Association, Baltimore, MD). Unfiltered water samples were digested using the persulfate oxidation technique and analyzed using the ascorbic acid colorimetric method to determine TP. Nutrient (NO3−–N, TP, DRP) and TSS concentrations from samples collected on the same day at upstream and downstream stations were significantly correlated (r = 0.44–0.89; all P < 0.001) (Lemke et al., unpublished data); thus, we restrict the analyses presented here to data from the downstream stations.
Water levels were recorded every 15 min at the downstream stations using Campbell data loggers (Model CR510; Campbell Scientific Inc., Logan, UT) connected to CS420-L pressure transducers (Model PDCR 1830–8388; Druck Inc., Houston, TX) that were installed in stilling wells. Pressure transducers had an accuracy of ±0.1% full scale range (combined nonlinearity, hysteresis, and repeatability) with 10 psig range. There were several instances (<10% of possible data points) in which one of the dataloggers failed to record water level due to battery depletion or other technical malfunction. On these occasions, Bray Creek water levels were estimated using regressions from a nearby automatic water sampler that was fitted with a pressure transducer (n = 21,080; R2 = 0.96), and Frog Alley was estimated from Bray Creek gauge data (n = 62,617; R2 = 0.97). Discharge rating curves were developed to convert water level readings to discharge for WY2000 through WY2006. To develop the rating curves, discharge (Q) was calculated as Q = VA using transect surveys in which measurements of cross-sectional area (A, m2) and mean flow velocity (V, m s−1) (FLO-MATE Portable Flow Meter Model 2000; Marsh-McBirney Inc., Frederick, MD) were conducted at a range of representative depths and flows in each watershed (Gordon et al., 1992). Log-transformed Q values from these transect surveys were regressed against respective log-transformed pressure transducer water level readings to develop discharge rating curve equations for the treatment (log treatment Q [m3 s−1] = 0.4396 + 2.5198*log treatment stage [m]; R2 = 0.89; P < 0.001; n = 24) and reference (log reference Q [m3 s−1] = 0.4069 + 2.3926*log reference stage [m]; R2 = 0.71; P < 0.001; n = 17) watersheds (Fig. 2).
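Applying the fitted rating curves amounts to back-transforming the log-log regression at a given stage reading. A minimal sketch (Python; the function name is illustrative, not from the original analysis):

```python
import math

def rating_curve_discharge(stage_m, intercept, slope):
    """Convert a stage reading (m) to discharge (m^3 s^-1) using a
    log-log rating curve: log10(Q) = intercept + slope * log10(stage)."""
    return 10 ** (intercept + slope * math.log10(stage_m))

# Coefficients of the rating curves reported in the text.
TREATMENT = (0.4396, 2.5198)  # Bray Creek
REFERENCE = (0.4069, 2.3926)  # Frog Alley

# Example: discharge in the treatment watershed at 0.5 m stage.
q_treatment = rating_curve_discharge(0.5, *TREATMENT)
```

Because the relationship is a power law, discharge increases steeply and monotonically with stage, as expected from the channelized cross-sections.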
Hourly nutrient export was calculated as the product of water discharge (L h−1) and nutrient concentration (mg L−1) for that hour. If an hour contained a measured nutrient concentration, that measurement was applied to the entire hour. Concentrations for hours during which nutrient samples were not analyzed were interpolated from nearby sample points. Two interpolation methods were used according to the procedures described by Vanni et al. (2001). Simple linear interpolation was used for NO3−–N as

Ch = Cprev + (Cnext − Cprev) × (h − hprev)/(hnext − hprev)

using the linear spline interpolation procedure in Statistix 9 (Analytical Software, Tallahassee, FL), where Ch is concentration for hour h, and Cprev, Cnext, hprev, and hnext are the concentration (C) and time (h) of the previous and next samples, respectively. Interpolations for TP and TSS were adjusted for variations in discharge following the slope of the logQ-logC regression, with residuals from that regression linearly interpolated through time and applied to the estimated concentrations:

logĈh = B0 + B1 × logQh

and

Ch = 10^(logĈh + Rh)

using the linear spline interpolation procedure in Statistix 9 (Analytical Software), where B0 and B1 are the intercept and slope of the logQ-logC regression, Qh is discharge for hour h, and Rh is the interpolated residual from the regression. The Q-proportionate method is very similar to the standard technique that has been used for decades at sediment monitoring stations where samples are collected frequently (Porterfield, 1972). Hourly nutrient fluxes were summed to estimate daily export, and daily values were summed to obtain annual export. Flow-weighted mean concentrations were estimated as total annual export divided by total annual discharge. Annual nutrient export per unit watershed area (kg ha−1 yr−1) was estimated by dividing total annual export by watershed area.
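The two interpolation rules and the hourly flux calculation can be sketched as follows (Python; function names are illustrative and the original analysis used Statistix 9, not this code):

```python
import math

def interp_conc(h, h_prev, c_prev, h_next, c_next):
    """Simple linear interpolation of concentration for hour h between
    the previous and next measured samples (used here for NO3-N)."""
    return c_prev + (c_next - c_prev) * (h - h_prev) / (h_next - h_prev)

def q_adjusted_conc(q, b0, b1, resid):
    """Discharge-adjusted estimate for TP and TSS: follow the logQ-logC
    regression (intercept b0, slope b1), add the time-interpolated
    residual, and back-transform from log10 space."""
    return 10 ** (b0 + b1 * math.log10(q) + resid)

def hourly_export_kg(q_l_per_h, conc_mg_per_l):
    """Hourly export in kg: discharge (L h^-1) times concentration
    (mg L^-1), converted from mg to kg."""
    return q_l_per_h * conc_mg_per_l / 1e6
```

Hourly fluxes computed this way are summed to daily and annual totals; dividing an annual total by annual discharge gives the flow-weighted mean concentration.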
Separation of Nutrient Export from Baseflow versus Stormflow
To estimate the proportions of nutrients exported in baseflow and stormflow, we first estimated the relative contributions of baseflow and stormflow to total discharge using methods described by Vanni et al. (2001). Using this method, daily discharge for each water year was first divided into nonoverlapping 5-d blocks. The minimum discharge was obtained for each block and compared with the minimum of the 5-d blocks before and after. If 0.9 times the minimum was less than the neighboring minima, the minimum was considered to be a measure of the baseflow. Baseflow on intervening dates was obtained by linear interpolation. On dates when actual discharge was less than the interpolated baseflow, the actual discharge was used as a measure of baseflow for that date. A baseflow index was calculated as the sum of the daily baseflows divided by the total discharge for each water year.
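The block-minimum filter described above can be expressed in code. The sketch below (Python) is a simplified reading of the Vanni et al. (2001) procedure; the handling of record endpoints is an assumption added so the interpolation covers the whole series:

```python
def baseflow_series(daily_q, block=5):
    """Estimate daily baseflow from a daily discharge series using
    non-overlapping block minima (cf. Vanni et al., 2001)."""
    n = len(daily_q)
    # Minimum of each non-overlapping block, with the day it occurs on.
    mins = []
    for start in range(0, n, block):
        chunk = daily_q[start:start + block]
        m = min(chunk)
        mins.append((start + chunk.index(m), m))
    # Keep minima satisfying the 0.9 * min < neighboring-minima test.
    turning = []
    for j, (day, m) in enumerate(mins):
        prev_m = mins[j - 1][1] if j > 0 else float("inf")
        next_m = mins[j + 1][1] if j < len(mins) - 1 else float("inf")
        if 0.9 * m < prev_m and 0.9 * m < next_m:
            turning.append((day, m))
    # Endpoint handling (assumption): pin the series ends so that
    # interpolation is defined on every day of the record.
    if not turning or turning[0][0] != 0:
        turning.insert(0, (0, daily_q[0]))
    if turning[-1][0] != n - 1:
        turning.append((n - 1, daily_q[-1]))
    # Linear interpolation between turning points, capped at observed flow.
    base = [0.0] * n
    for (d0, q0), (d1, q1) in zip(turning, turning[1:]):
        for d in range(d0, d1 + 1):
            interp = q0 + (q1 - q0) * (d - d0) / (d1 - d0) if d1 > d0 else q0
            base[d] = min(interp, daily_q[d])
    return base

def baseflow_index(daily_q):
    """Sum of daily baseflows divided by total discharge."""
    return sum(baseflow_series(daily_q)) / sum(daily_q)
```

For a record with no storm peaks the index is 1.0; storm days raise total discharge above the interpolated baseflow and pull the index below 1.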
To estimate the proportions of nutrients exported in baseflow and stormflow, we first obtained the concentrations before and after each storm event that could be considered to be baseflow concentrations. Baseflow concentrations were defined as those occurring when daily discharge was comprised of ≥80% baseflow. Baseflow concentrations were estimated for each date during storm events using linear interpolation. Then, on each date, we estimated baseflow nutrient export by multiplying baseflow discharge (L d−1) and baseflow concentrations (mg L−1). Nutrient export from stormflows was then estimated as the observed export (daily discharge multiplied by daily concentrations) minus the estimated baseflow export.
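The daily partition of export into baseflow and stormflow components follows directly from the definitions above. A minimal sketch (Python; the function name is illustrative):

```python
def partition_daily_export(q_total, q_base, c_total, c_base):
    """Split one day's export (kg) into baseflow and stormflow parts.
    Discharges are in L d^-1, concentrations in mg L^-1; division by
    1e6 converts mg to kg. Stormflow export is observed export minus
    estimated baseflow export (floored at zero as a safeguard)."""
    total = q_total * c_total / 1e6
    base = q_base * c_base / 1e6
    return base, max(total - base, 0.0)
```

Summing the two components over a water year gives the annual baseflow and stormflow export totals reported in the Results.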
Statistical Comparisons of Watersheds
Best Management Practice Implementation
Participation in cost-share programs and total numbers and types of BMPs implemented in the treatment and reference watersheds were tracked by McLean County Soil and Water Conservation District. The numbers reported represent the cumulative implementation for each year (1999–2003) and do not reflect unknown “background” acres implemented before 1999. The comparison of BMP implementation rates between the treatment and reference watersheds during the outreach period (2000–2003) was conducted with regression analyses, in which BMP implementation was regressed on year. Follow-up tests as described by Snedecor and Cochran (1980) and performed by Statistix 9 were then used to compare the slopes of the regression lines between the two watersheds. If implementation rates were similar between watersheds, the slopes of the two regression lines would not differ.
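The slope-comparison test amounts to asking whether a common-slope fit explains the two watersheds' time trends as well as separate-slope fits. A sketch of that F test (Python; written from the Snedecor and Cochran description, with illustrative function names):

```python
def _stats(x, y):
    """Means, sum of squares of x, and cross sum of products."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return mx, my, sxx, sxy

def slope_equality_f(x1, y1, x2, y2):
    """F statistic (1, n1 + n2 - 4 df) for equality of two regression
    slopes: extra residual SS of the common-slope fit relative to the
    separate-slope fits, scaled by the pooled error mean square."""
    def rss(x, y, b):
        # Residual SS for slope b with the optimal intercept my - b*mx.
        mx, my, _, _ = _stats(x, y)
        return sum((yi - (my + b * (xi - mx))) ** 2 for xi, yi in zip(x, y))
    _, _, sxx1, sxy1 = _stats(x1, y1)
    _, _, sxx2, sxy2 = _stats(x2, y2)
    b1, b2 = sxy1 / sxx1, sxy2 / sxx2      # separate slopes
    bc = (sxy1 + sxy2) / (sxx1 + sxx2)     # common (pooled) slope
    rss_sep = rss(x1, y1, b1) + rss(x2, y2, b2)
    rss_com = rss(x1, y1, bc) + rss(x2, y2, bc)
    df = len(x1) + len(x2) - 4
    return (rss_com - rss_sep) / (rss_sep / df)
```

With two watersheds observed over five years each, the test has 1 and 6 denominator degrees of freedom, matching the F statistics reported in the Results.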
Aerial photographs collected during spring 2005 from both watersheds were georeferenced and rectified to document the total area (ha), lengths (km), and widths (m) of grassed waterways and stream buffers that were present in each watershed by 2005. The aerial imagery was obtained from the Illinois Department of Natural Resources-Illinois State Geological Survey website (http://www.isgs.uiuc.edu/nsdihome/ISGSindex.html). Aerial imagery was created by Surdex Corporation and Science Applications International Corporation and published by the United States Geological Survey. Details of the aerial imagery include: black and white panchromatic, 0.5 m resolution, and Universal Transverse Mercator projection on the North American Datum of 1983. Georeferenced data from 2005 were spatially digitized as polygons and lines for both watersheds to calculate and locate total area, lengths, and widths of grassed waterways and stream buffers.
Biweekly nutrient and TSS data were analyzed using standard paired watershed methods described by Grabow et al. (1998) that are designed to decrease variability due to annual or seasonal effects (Stewart-Oaten et al., 1986; Grabow et al., 1998). Regression analyses were used to test for significant trends over time in the relationship between the treatment and reference watersheds for nutrient (NO3−–N, TP, and DRP) and TSS concentrations. In these analyses, biweekly concentrations of nutrients and TSS estimated for the reference watershed were subtracted from those of the treatment watershed and fitted to a linear regression model. Because stream monitoring data and residuals are often correlated, we included an autoregressive lag term in our regression models similar to that described by Jaynes et al. (2004). Inclusion of the autoregressive term removed the residual autocorrelation that was present in preliminary analyses conducted without the autoregressive term. The statistical significance of the date term in the full regression model (the model including the autoregressive term) was used to test for significant trends over time for the differences (treatment minus reference) between the two watersheds. If nutrient or TSS concentrations decreased in the treatment watershed relative to the reference watershed, then the difference between the biweekly samples of the two watersheds should become greater over time, resulting in a negative trend line. Power analyses were conducted following methods by Zar (1984) to ensure that our biweekly sampling frequency was sufficient to detect significant changes over time, should they exist, using this regression technique. Results from these analyses showed that the power of the test (1 – β) was well above the suggested 0.80 value for NO3−–N (0.94; n = 158; df = 156), TP (0.95; n = 70; df = 68), DRP (0.99; n = 88; df = 86), and TSS (0.94; n = 150; df = 148). 
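The difference regression with a lag-1 autoregressive term can be sketched as follows (Python with NumPy; the original analysis used Statistix 9, and the function name is illustrative):

```python
import numpy as np

def paired_watershed_trend(dates, treatment, reference):
    """Regress the biweekly difference (treatment - reference) on date,
    including a lag-1 autoregressive term to absorb serial correlation
    (cf. Jaynes et al., 2004). Returns the date coefficient and its
    t statistic; a significant negative coefficient indicates relative
    improvement in the treatment watershed."""
    d = np.asarray(treatment, float) - np.asarray(reference, float)
    t = np.asarray(dates, float)
    y = d[1:]                                              # current difference
    X = np.column_stack([np.ones(y.size), t[1:], d[:-1]])  # intercept, date, lag-1
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = y.size - X.shape[1]
    sigma2 = resid @ resid / dof
    se_date = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1], beta[1] / se_date
```

In practice the t statistic on the date term is compared against a t distribution with n − 3 degrees of freedom, as in the results reported below for NO3−–N, TP, DRP, and TSS.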
We also used pairwise t tests to quantify differences in annual nutrient export per unit watershed area between the treatment and reference watershed. In these analyses, we paired observations according to years to control for interannual differences in export related to precipitation (Vanni et al., 2001).
Analyses were also run to determine if the treatment watershed differed relative to the reference watershed in terms of peak flow, low flow, and base flow. All analyses were run on data summarized by 2-wk periods to remain consistent with the water quality analyses. For peak flow, the highest daily average discharge within the 2-wk interval was used as the unit of analysis. For low flow the lowest daily average discharge within the 2-wk interval was calculated, and for base flow the mean daily discharge for the full 2-wk period was used. The differences in these parameters for each 2-wk period were calculated in a manner similar to the nutrient analyses and regressed against date to check for significant changes in hydrology after the implementation of conservation practices within the treatment watershed.
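Summarizing the daily record into 2-wk hydrologic statistics can be sketched as (Python; illustrative function name, non-overlapping windows assumed):

```python
def biweekly_summaries(daily_q, window=14):
    """For each non-overlapping 2-wk window of a daily discharge series,
    return (peak, low, mean): the highest and lowest daily average
    discharge and the mean daily discharge over the full window."""
    out = []
    for i in range(0, len(daily_q) - window + 1, window):
        w = daily_q[i:i + window]
        out.append((max(w), min(w), sum(w) / len(w)))
    return out
```

Each statistic is then differenced between treatment and reference watersheds and regressed against date, in the same way as the nutrient concentration differences.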
Results

Best Management Practice Implementation
During the outreach period of 2000 to 2003, implementation of strip-till, grassed waterways, and stream buffers increased in both watersheds (Fig. 3). During this time, the percentage of watershed area farmed using strip-till practices increased in the treatment watershed from 1.5 to 12.8%, compared with an increase of 0 to 4.5% in the reference watershed (Fig. 3A). Analyses showed that the rate of strip-till implementation (i.e., the slope of the watershed regression lines) differed significantly between the treatment and reference watersheds (F = 13.69; df = 1,6; P = 0.01). In the treatment watershed, implementation of grassed waterways increased from 1.1 to 15.4 ha (Fig. 3B), and stream buffers increased from 4.6 to 48.6 ha (Fig. 3C), compared with increases of 0 to 4.0 ha for grassed waterways and 8.7 to 16.1 ha for stream buffers in the reference watershed. Implementation rates for both grassed waterways (F = 6.77; df = 1,6; P = 0.04) and stream buffers (F = 22.27; df = 1,6; P = 0.003) were significantly higher in the treatment watershed than in the reference watershed.
Although more new grassed waterways were established within the treatment watershed than the reference watershed from 2000 to 2003, digital analyses of aerial photos showed that the total area (ha) and length (km) of grassed waterways were only slightly higher in the treatment watershed by 2005 than in the reference watershed (Table 1). Waterways were distributed throughout each watershed along the main tributaries and where open water channels ended (Fig. 4A, B). Differences were greater between the two watersheds in terms of stream buffers. Digital analyses of aerial photos showed that 95 and 99% of stream lengths had stream buffers in the treatment and reference watersheds, respectively; however, the treatment watershed had more than twice the stream buffer area (100.6 ha) of the reference watershed (49.4 ha) (Table 1). This difference was due to a much greater average width of buffers in the treatment than the reference watershed (Table 1, Fig. 4A). A majority of the stream length in the treatment watershed (58%) was buffered at or above the minimum Conservation Reserve Program (CRP) enrollment width of 30.5 m, compared with only 6% of stream length in the reference watershed (Table 1).
Table 1. Grassed waterways and stream buffers present in the treatment and reference watersheds by 2005, from digitized aerial photographs.

| | Treatment (Bray Creek) | Reference (Frog Alley) |
| --- | --- | --- |
| Stream length, km | 19.8 | 19.6 |
| Grassed waterways | | |
| Total area, ha | 54.9 | 52.2 |
| Total length, km | 48.1 | 41.3 |
| <9 m width, km | 9.8 | 6.3 |
| ≥9 m width, km | 37.2 | 35.2 |
| No. <9 m width | 47 | 22 |
| No. ≥9 m width | 61 | 40 |
| Stream buffers | | |
| Total area, ha | 100.6 | 49.4 |
| Total length, km | 18.8 | 19.5 |
| <9 m width, km | 0.3 | 8.5 |
| 9–15 m width, km | 5.1 | 4.8 |
| 15–30.5 m width, km | 1.9 | 4.8 |
| ≥30.5 m width, km | 11.4 | 1.1 |
Nutrient Concentrations, Export, and Hydrology
Nitrate concentrations varied with season, ranging from <1 to 30 mg L−1 (Fig. 5A). Minimum NO3−–N levels typically occurred during the summer months of August and September. Nitrate concentrations increased during the fall and remained high throughout the winter and spring, with maximum NO3−–N levels typically occurring from April through June. Biweekly concentrations for total P ranged from <0.01 to 0.98 mg P L−1 (Fig. 5B) and TSS concentrations ranged from 0 to 278 mg DM L−1 (Fig. 5C). Concentrations of TP and TSS were generally low except during high discharge events (Fig. 5D) in which concentrations rose rapidly with increasing discharge and subsequently recovered rapidly to pre-storm conditions.
Regression analyses did not reveal any significant changes in the treatment watershed relative to the reference watershed for biweekly NO3−–N concentrations during this 7-yr study (significance of the date term, t = 0.46; df = 135; P = 0.648) (Fig. 6A). There were no significant reductions in TP concentrations (t = 1.40; df = 61; P = 0.167) at the treatment sites relative to the reference watershed sites for water samples collected August 2003 through September 2006 (Fig. 6B). Although not statistically significant, a marginal trend suggested a possible reduction in DRP for biweekly water samples collected from the treatment watershed relative to the reference (t = 1.86; df = 67; P = 0.067) (Fig. 6C). No significant reduction in TSS concentrations was observed in the treatment watershed relative to the reference watershed during the study (t = 0.97; df = 154; P = 0.333) (Fig. 6D).
Flow-weighted mean nutrient concentrations varied among years and between watersheds (Table 2). Flow-weighted mean NO3−–N concentrations (mg L−1) ranged from 5.8 to 12.3 and from 7.0 to 14.5 in the treatment and reference watersheds, respectively, and were generally slightly lower in the treatment watershed than the reference. Flow-weighted mean TP concentrations were also variable among years and watersheds, ranging from 0.06 to 0.57 mg L−1 between the two watersheds. Flow-weighted mean TSS concentrations were consistently higher in the treatment watershed (24.6–343.2 mg L−1) than the reference watershed (13.1–147.3 mg L−1).
Table 2. Flow-weighted mean nutrient concentrations by water year in the treatment and reference watersheds.

| Water year | Treatment (Bray Creek) | Reference (Frog Alley) |
| --- | --- | --- |
Annual export of NO3−–N from the two watersheds into the Mackinaw River ranged from 43 to 211 Mg yr−1 and 39 to 299 Mg yr−1 in the treatment and reference watersheds, respectively (Table 3). Annual NO3−–N export was significantly related to mean annual discharge in the treatment (R2 = 0.75; df = 6; P = 0.007) and reference (R2 = 0.94; df = 6; P = 0.0002) watershed, with total export divided between baseflow (47–48%) and stormflow (52–53%) sources. Mean annual export of TP from WY2004 to WY2006 was similar between watersheds (3.8 Mg), ranging from 0.9 to 6.6 Mg yr−1 (Table 3). Mean annual export of TSS was almost 2-fold higher in the treatment watershed than in the reference watershed over the course of the 7-yr study, ranging from 355 to 4003 Mg yr−1 in the treatment watershed and 176 to 2244 Mg yr−1 in the reference watershed. Annual export of TSS and TP was not significantly related to mean annual discharge (P > 0.05); rather, the majority of export (>85%) occurred during storm events. Analysis of flow regimes from WY2000 to WY2006 showed that baseflow comprised approximately 50% of total discharge in both the treatment and reference watersheds (Table 3). The baseflow indices (i.e., the sum of daily baseflows divided by the sum of daily total flows) were 0.515 and 0.501 in the treatment and reference watersheds, respectively. Average discharge over the course of the study did not differ between watersheds (T = −1.66; df = 6; P = 0.15), with mean annual discharges ranging from 0.213 to 0.623 m3 s−1 in the treatment watershed and from 0.176 to 0.786 m3 s−1 in the reference watershed.
Table 3. Mean (±1 SE) annual export, percentage of export occurring as stormflow, baseflow index, and mean annual discharge for the treatment (Bray Creek) and reference (Frog Alley) watersheds, WY2000–WY2006.

| Watershed | NO3−–N export (Mg yr−1) | TP export (Mg yr−1) | TSS export (Mg yr−1) | NO3−–N stormflow (%) | TP stormflow (%) | TSS stormflow (%) | Baseflow index | Mean annual discharge (m3 s−1) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Treatment (Bray Creek) | 136 (24) | 3.8 (1.6) | 1626 (497) | 52.4 (4.7) | 89.2 (1.8) | 85.7 (6.3) | 0.515 (0.062) | 0.428 (0.063) |
| Reference (Frog Alley) | 193 (39) | 3.8 (1.4) | 940 (327) | 53.1 (4.6) | 86.0 (4.7) | 85.4 (4.8) | 0.501 (0.056) | 0.513 (0.081) |
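The baseflow index used above (sum of daily baseflows divided by the sum of daily total flows) requires a hydrograph-separation step. The separation method is not restated in this section, so the sketch below uses a common one-parameter recursive digital filter (Lyne–Hollick type) purely as an illustrative assumption; the filter parameter `alpha` and the example flow series are hypothetical.

```python
def separate_baseflow(flows, alpha=0.925):
    """Split a daily flow series into baseflow using a one-parameter
    recursive digital filter (Lyne-Hollick type). The filter tracks
    the quickflow component; baseflow is the remainder, clamped to
    the interval [0, q]. NOTE: both the method and alpha are
    illustrative assumptions, not necessarily those of the study."""
    quick = 0.0
    prev_q = flows[0]
    baseflow = []
    for q in flows:
        quick = alpha * quick + 0.5 * (1.0 + alpha) * (q - prev_q)
        quick = max(quick, 0.0)
        baseflow.append(min(max(q - quick, 0.0), q))
        prev_q = q
    return baseflow

def baseflow_index(flows, alpha=0.925):
    """BFI = sum of daily baseflows / sum of daily total flows."""
    return sum(separate_baseflow(flows, alpha)) / sum(flows)

# A steady hydrograph has BFI = 1; storm peaks pull the index below 1.
print(round(baseflow_index([1.0, 1.0, 5.0, 3.0, 2.0, 1.5, 1.2, 1.0]), 3))
```

A BFI near 0.5, as reported for both watersheds, indicates that roughly half of the total discharge moved through the slower subsurface pathway rather than as storm runoff.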
Nutrient export per unit land area (kg ha−1 yr−1) varied between the two watersheds and across years (Table 4). Nitrate export was generally higher from the reference watershed in any given year and was significantly higher over the course of the study (T = −2.43; df = 6; p = 0.05) in the reference watershed (46.0 kg ha−1 yr−1) than in the treatment watershed (33.6 kg ha−1 yr−1). Differences in export of TP between the two watersheds were not significant, with annual export estimates ranging from 0.2 to 1.6 kg ha−1 yr−1. Mean annual export of TSS was significantly higher (T = 3.08; df = 6; p = 0.02) in the treatment watershed (400.8 kg ha−1 yr−1) than in the reference watershed (223.9 kg ha−1 yr−1) over the 7-yr study.
Table 4. Mean (±1 SE) annual nutrient export per unit land area (kg ha−1 yr−1) from the treatment (Bray Creek) and reference (Frog Alley) watersheds.

| Watershed | NO3−–N | TP | TSS |
| --- | --- | --- | --- |
| Treatment (Bray Creek) | 33.6 (5.8) | 0.9 (0.4) | 400.8 (122.6) |
| Reference (Frog Alley) | 46.0 (9.4) | 0.9 (0.3) | 223.9 (77.9) |
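The per-unit-area figures in Table 4 follow from dividing total annual export (Table 3, in Mg yr−1) by watershed area. Neither watershed's area is restated in this section, so the sketch below uses a round, hypothetical 4,000-ha area purely to illustrate the unit conversion.

```python
def areal_export(export_mg_per_yr, area_ha):
    """Convert total watershed export (Mg yr^-1) to areal export
    (kg ha^-1 yr^-1). 1 Mg = 1000 kg."""
    return export_mg_per_yr * 1000.0 / area_ha

# 136 Mg NO3-N yr^-1 spread over a hypothetical 4000-ha watershed:
print(areal_export(136.0, 4000.0))  # → 34.0
```

Expressing export per hectare allows comparisons across watersheds of different sizes, which is how the study's loads are compared with other tile-drained watersheds in Table 5.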
Regression analyses did not show any significant changes in peak flow (Fig. 7A), low flow (Fig. 7B), or base flow (Fig. 7C) in the treatment watershed relative to the reference watershed during the study period (R2 = 0.01, p = 0.18, df = 170; R2 = 0.006, p = 0.32, df = 170; and R2 = 0.0001, p = 0.89, df = 168, respectively).
Intensive outreach conducted during the first 3 yr of this study successfully increased the adoption of several conservation practices, including significant increases in grassed waterways, stream buffers, and strip-till farming in the treatment watershed relative to the reference watershed. Although more new grassed waterways were established within the treatment watershed than the reference watershed during the outreach period, aerial photo analyses showed that the total length and area of grassed waterways were only slightly higher in the treatment than in the reference watershed by 2005. Thus, the main differences between these two watersheds after our outreach efforts were the width of stream buffers and the area of strip-till adoption. Much of the nitrogen removal in stream buffers occurs as subsurface flows come into contact with plant root zones and denitrifying soil conditions (Hill, 1996; Lowrance et al., 1997). Data from meta-analyses suggest that this subsurface removal is not necessarily a function of buffer width but is more likely related to site-specific hydrology and biogeochemistry (Leeds-Harrison et al., 1999; Sabater et al., 2003; Mayer et al., 2007). In contrast, differences in stream buffer width have been shown to influence effectiveness at removing sediments and associated surface water nutrients. For example, in a recent meta-analysis of 45 peer-reviewed studies, Mayer et al. (2007) found that, although the nitrogen removal effectiveness (primarily NO3−–N) of buffers varied widely, wide buffers (>50 m) more consistently removed significant portions of nitrogen entering the riparian zone from surface flow than narrow buffers (0–25 m). Additional long-term studies suggest that a 30-m buffer is sufficiently wide to trap surface sediments under most circumstances, with an absolute minimum width of 9 m to be effective (Wenger, 1999).
By 2005, 61% of stream buffers in the treatment watershed were ≥30.5 m in width, whereas 44% of stream buffers in the reference watershed remained <9 m. Reduced tillage practices have been shown to substantially reduce the amount of sediment and nutrients transported in surface runoff from agricultural farmland (Chichester and Richardson, 1992; Seta et al., 1993; Yates et al., 2006); however, export of dissolved nutrients may increase relative to conventionally farmed systems when reduced tillage is combined with subsurface drainage (McIsaac et al., 1995; Tan et al., 1998; Yates et al., 2006). Yates et al. (2006) reported lower amounts of suspended sediments and total phosphorus, yet greater NO3−–N, with increased no-till across 32 subwatersheds within the Upper Thames River Watershed of Ontario, Canada. Similarly, Tan et al. (1998) reported improved soil organic matter and structure with no-till farming but increased tile drainage volume and associated nitrate loss, which was attributed to increased soil macropores from earthworm activity.
After 7 yr of stream monitoring, our results showed no significant improvements in water quality in terms of suspended solids, total or dissolved phosphorus, or nitrate concentrations. Multiple studies have shown that subsurface drainage tiles are primary pathways for nutrients, pesticides, and herbicides to surface waters, bypassing the remediation benefits of surface-water BMPs such as grassed waterways and stream buffers (e.g., Logan et al., 1994; David et al., 1997; Xue et al., 1998; Hatfield et al., 1998, 1999; Gentry et al., 2000; Royer et al., 2006). With an estimated 52 to 82% of the total land area in McLean County tile drained (Sugg, 2007), it is likely that subsurface drainage tiles running unimpeded throughout the treatment watershed served as direct pathways of nutrient delivery to Bray Creek, thus bypassing the potential benefits of BMPs that were established within the watershed (e.g., Osborne and Kovacic, 1993). Although some nutrients are likely transported to these watersheds through groundwater, slow seepage pathways through the poorly drained soils of these watersheds are likely rerouted to a large extent by tile drainage that redirects dissolved phosphorus and nitrogen directly to stream surface waters (Gentry et al., 1998; Xue et al., 1998). Xue et al. (1998) reported that drainage tiles contributed between 65 and 69% of the total dissolved phosphorus exported from a row-cropped agricultural watershed in Illinois in which 75 to 80% of the watershed was estimated to be drained by tiles. Nitrate transport amounts from the treatment and reference watersheds in this study are comparable to those reported from other tile-drained agricultural watersheds by Royer et al. (2006) in Illinois and Schilling (2002) in Iowa (Table 5).
Given that McLean County, Illinois, has been designated as one of the most heavily tile-drained counties in the United States (Sugg, 2007; David et al., 2010), we submit that significant improvements in water quality of tile-drained agricultural watersheds such as the Mackinaw River will require additional conservation practices that intercept and retain tile-drained runoff.
Table 5. Annual nutrient export from tile-drained agricultural watersheds.

| Study site | Nitrogen export (kg NO3−–N ha−1 yr−1) | Phosphorus export (kg TP ha−1 yr−1) | Reference |
| --- | --- | --- | --- |
| Treatment subwatershed (Mackinaw River, Illinois) | 10.7–52.0 | 0.3–1.6 | This study |
| Reference subwatershed (Mackinaw River, Illinois) | 9.2–83.6 | 0.2–1.3 | This study |
| Embarras River, Illinois | 8.9–56.7 | 0.2–2.1 | Royer et al., 2006 |
| Kaskaskia River, Illinois | 7.6–57.6 | 0.1–1.2 | Royer et al., 2006 |
| Sangamon River, Illinois | 9.0–46.8 | 0.3–0.8 | Royer et al., 2006 |
| Walnut Creek, Iowa | 10.4–43.6 | – | Schilling, 2002 |
| Squaw Creek, Iowa | 13.0–56.3 | – | Schilling, 2002 |
Previous studies have shown that controlled water management and constructed wetlands have high potential for protecting and restoring water quality in agricultural watersheds. Controlled drainage has been shown to reduce total drain outflow by 79 to 94% and NO3−–N concentrations by 62 to 96% as compared with conventional subsurface drainage (Lalonde et al., 1996; Wesström et al., 2001). Well designed wetlands can effectively intercept and retain tile drainage, removing 46 to 90% of inflowing NO3−–N (Crumpton et al., 1993; Kovacic et al., 2000; Mitsch and Day, 2006; Iovanna et al., 2008). The Mackinaw River in Illinois falls directly within the “corn belt” region of the midwestern United States, which encompasses the areas of highest NO3−–N flux to the Mississippi River Basin (Goolsby et al., 1999; Mitsch and Day, 2006). Wetland studies conducted in central Illinois indicate that a 5% wetland-to-watershed ratio would be required to achieve N retention within the ranges needed to reduce Gulf hypoxia (Kovacic et al., 2000). These studies present a compelling case for the need for, and potential effectiveness of, controlled drainage and wetlands at improving water quality at the project site level, although the effectiveness of tile-retention practices at the larger watershed scale has yet to be demonstrated.
A variety of BMPs, including wetlands, were promoted to farmers in the treatment watershed. Although farmers were not willing to take land out of production to construct wetlands, they were amenable to installing well established surface-water practices in cropland areas, such as grassed waterways and stream buffers. These experiences suggest that large-scale implementation of BMPs that effectively reduce nutrient export from tile drainage will require aggressive promotion throughout agricultural watersheds. In the United States, NO3−–N fluxes into the Mississippi River Basin are transported primarily from areas within five midwestern states that correspond closely to the highest densities of tile-drained farmland (Mitsch and Day, 2006; Sugg, 2007). Illinois has a higher estimated total area of subsurface drainage than any other state in the basin (4.7 million ha) (Sugg, 2007) and contributes among the highest total nitrogen (16.8%) and phosphorus (12.9%) fluxes delivered to the Gulf of Mexico (Alexander et al., 2008). Approximately 30 million ha of land have been drained in the larger Mississippi River Basin, during which 14 million ha of wetlands have been lost (Mitsch and Day, 2006). Our study suggests that taxpayer dollars supporting BMPs that are commonly implemented in midwestern tile-drained agricultural watersheds are not solving water quality problems and that tile drainage systems need to be better integrated into future implementation efforts. Increased awareness among policymakers is evident in the most recent Farm Bill, which promotes wetland creation for nitrogen reduction purposes under the Conservation Reserve Program. However, policies and incentive programs that address nutrient reduction specifically from tile drainage sources are still needed.
Outreach is essential for increasing BMP implementation within large agricultural landscapes and for targeting BMP placement to landscape areas that contribute most to water quality impairments and where BMPs will be most effective (Walter et al., 2007; Lemke et al., 2010). Even with 95% of the stream length protected by buffers, our results showed consistently higher export of TSS in the treatment watershed than in the reference watershed. These results suggest that localized areas remain within the treatment watershed that continue to contribute high sediment inputs from surface runoff or instream bank erosion, emphasizing the need to identify and target these high-impact areas. Models that integrate key watershed components, such as land use, hydrology, climate, and soil types, are one way to approach targeting areas for nutrient reduction (e.g., Arheimer et al., 2005; Leone et al., 2008; Jha et al., 2010). Targeting can also be considered in terms of the timing of nutrient export associated with high-discharge events. The majority of TP export in our study occurred during storm events (86–89%), as has been reported in similar watershed studies by Royer et al. (2006) (84% dissolved phosphorus [DP]), Vanni et al. (2001) (84–99% DP and particulate P), Pionke et al. (1999) (66% DP), and Sharpley et al. (2008) (80% TP). Information on the timing and frequency of storm events could improve the ability to target watershed conservation efforts that reduce P export during high-discharge events (e.g., Sharpley et al., 2008). In contrast, storm event export of NO3−–N is generally lower, ranging from 44 to 75% of annual export (this study; Royer et al., 2006; Vanni et al., 2001; Pionke et al., 1999). Thus, NO3−–N might be better mitigated using nutrient management to reduce inputs onto the landscape in addition to wetland systems that would capture storm flow as well as the large amounts of NO3−–N exported through baseflow.
Given the voluntary nature of conservation practice adoption, it is probably not feasible for state agencies or private organizations to directly conduct intensive, targeted, large-scale outreach, as in this study. As a cost-effective alternative, several studies suggest using landowners to conduct the outreach (Upadhyay et al., 2003; Habron, 2004; Prokopy et al., 2008), and perhaps the role of agencies and organizations should be to serve as coordinators of local outreach efforts and networking opportunities.
Economic incentives are a critical component of BMP implementation (e.g., Napier, 2001; Wossink and Osmond, 2002; Baerenklau, 2005; Kurkalova et al., 2006; Doll and Jackson, 2009). Collectively, these studies show that adoption premiums play a significant role in farmers’ adoption decisions and that some nonadopters will choose not to use conservation practices because the expected profit gain alone does not fully compensate them for the perceived risk associated with changing from current practices. In our study, the years with the greatest increase in adoption of strip-till (2000–2002) were the years when incentives were highest. Even though a substantial outreach effort targeted increased awareness of the effectiveness of constructed wetlands for retaining tile-drain runoff, land managers were not willing to invest in wetlands as a BMP during this study. Perhaps the economic incentives for wetlands were not great enough, or perhaps landowners viewed wetland creation as a more permanent change in land use that could not easily be returned to cropland if desired.
Overall, this study emphasizes the need to measure conservation outcomes, not just whether conservation practices have been implemented. Our outreach efforts were successful at promoting implementation of more traditional conservation practices (i.e., buffer strips and grassed waterways), yet water quality was relatively unchanged. Even with 95% cover of relatively wide buffer strips within our treatment watershed, NO3−–N concentrations regularly exceeded the USEPA standard of 10 mg L−1. This emphasizes the importance of quantifying the effect of BMPs before wide-scale implementation efforts. Clearly there is a need for watershed plans that combine conservation practices that target agricultural runoff from tile drains with more traditional surface water practices. Potential benefits that conservation practices may provide to aquatic and terrestrial biota, as well as water quality benefits, should also be considered during watershed scale planning and implementation. Riparian buffers can benefit instream habitat quality for aquatic species (e.g., Teels et al., 2006; Heatherly et al., 2007) and can provide habitat for terrestrial species. Field borders of appropriate widths (30 m) have been shown to provide habitat for a number of overwintering bird species in the southern United States (Conover et al., 2007), and small set-aside grassland areas (<150 ha) within cropland landscapes can be nesting habitats for certain declining grassland birds in areas of the midwestern United States where large patches of grassland cover no longer exist (Walk et al., 2010). In southern Appalachian streams of the United States, research recommends riparian buffer widths of 92.6 m to protect stream-breeding salamander species (Crawford and Semlitsch, 2007).
Our study also demonstrates the importance of watershed scale studies. There is an abundance of data from small research plots showing the effectiveness of various BMPs. Skepticism about the relevance of research from these plots has led to increasing use of on-farm research and even a few studies from small watersheds (Mulla et al., 2008). However, the question of how much implementation it will take to affect water quality at larger watershed scales remains unanswered. Most management practices achieve only sparse, nontargeted implementation at the watershed scale (Mulla et al., 2008). Effective large-scale conservation may require the identification of critical areas within the watershed that generate the largest proportion of nutrient and sediment export, followed by targeted implementation of BMPs appropriate to the identified sources of agricultural pollution. Headwater areas have been shown to have profound influences on downstream water quantity and quality (Alexander et al., 2007) and are effective target areas for restoration to reduce nutrient loads (Mitsch and Day, 2006; Craig et al., 2008). Long-term data from watershed studies, such as this one, can then be used to develop watershed models that target areas of high pollution potential, project watershed responses to BMP implementation, and help guide watershed scale conservation management.