***NOTE ABOUT THE UNBUFFERED VALIDATION ACCURACY TABLES BEGINNING IN 2016: The training and validation data used to create and assess the accuracy of the CDL have traditionally been based on ground truth data that is buffered inward by 30 meters. This was done 1) because satellite imagery (as well as the polygon reference data) was not georeferenced in the past to the same precision as it is now (i.e., everything "stacked" less perfectly), 2) to eliminate spectrally mixed pixels at land cover boundaries from training, and 3) to be spatially conservative during the era when coarser 56 meter AWiFS satellite imagery was incorporated. All of these scenarios created "blurry" edge pixels through the seasonal time series, and excluding those pixels from training was found to improve the quality of the CDL. However, the accuracy assessment portion of the analysis also used buffered data, meaning those same edge pixels were not assessed with the rest of the classification. This would be inconsequential if the edge pixels were similar in nature to the rest of the scene, but they are not: they tend to be more difficult to classify correctly. Thus, the accuracy assessments as previously presented are somewhat inflated. Beginning with the 2016 CDL season we are creating CDL accuracy assessments using unbuffered validation data. These "unbuffered" accuracy metrics reflect the accuracy of field edges, which have not been represented previously. Beginning with the 2016 CDLs we published both the traditional "buffered" accuracy metrics and the new "unbuffered" accuracy assessments; the purpose of publishing both versions is to provide a benchmark for users interested in comparing the two validation methods. For the 2019 CDL season we are publishing only the unbuffered accuracy assessments within the official metadata files and offering the full "unbuffered" error matrices for download on the FAQs webpage. Both the metadata and FAQs are accessible at <https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php>. We plan to continue producing these unbuffered accuracy assessments for future CDLs; however, there are no plans to create them for past years. It should be noted that accuracy assessment is challenging, and the CDL group has always strived to provide robust metrics of usability to the land cover community. This admission of modestly inflated accuracy measures does not render past assessments useless: they were all done consistently, so comparison across years and/or states is still valid, and providing both scenarios for 2016 gives guidance on the bias. If the following table does not display properly, please visit <https://www.nass.usda.gov/Research_and_Science/Cropland/metadata/meta.php> to view the original metadata file.
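For illustration only, the inward buffering described above can be reproduced with standard GIS tooling. The following Python sketch is not the NASS production workflow; the file names and the attribute columns are hypothetical, and it assumes the reference polygons are already in a projected coordinate system whose units are meters.

import geopandas as gpd

# Hypothetical reference polygons (e.g., field boundaries with crop attributes)
# in a projected CRS with meter units; file names are placeholders.
fields = gpd.read_file("reference_polygons.shp")

# Buffer each polygon inward by 30 meters so spectrally mixed edge pixels are
# excluded, mirroring the traditional "buffered" ground truth described above.
buffered = fields.copy()
buffered["geometry"] = fields.geometry.buffer(-30)

# Small or narrow fields can collapse to empty geometries after a negative
# buffer; drop them rather than keep degenerate polygons.
buffered = buffered[~buffered.geometry.is_empty]

buffered.to_file("reference_polygons_buffered.shp")

The unbuffered assessments beginning in 2016 simply skip the negative buffer step for the validation data, so field-edge pixels are scored along with the rest of the scene.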
USDA, National Agricultural Statistics Service, 2019 Washington Cropland Data Layer
STATEWIDE AGRICULTURAL ACCURACY REPORT
Crop-specific covers only *Correct Accuracy Error Kappa
------------------------- ------- -------- ------ -----
OVERALL ACCURACY** 491,752 89.1% 10.9% 0.866
Cover Type   Attribute Code   *Correct Pixels   Producer's Accuracy   Omission Error   Kappa   User's Accuracy   Commission Error   Cond'l Kappa
----------   --------------   ---------------   -------------------   --------------   -----   ---------------   ----------------   ------------
Corn 1 14,756 86.9% 13.1% 0.866 87.7% 12.3% 0.875
Soybeans 5 0 n/a n/a n/a 0.0% 100.0% 0.000
Sunflower 6 136 34.3% 65.7% 0.343 76.8% 23.2% 0.768
Sweet Corn 12 1,049 48.0% 52.0% 0.479 72.3% 27.7% 0.723
Mint 14 1,696 76.3% 23.7% 0.762 85.4% 14.6% 0.854
Barley 21 5,158 63.9% 36.1% 0.637 81.5% 18.5% 0.814
Durum Wheat 22 0 0.0% 100.0% 0.000 n/a n/a n/a
Spring Wheat 23 45,082 90.5% 9.5% 0.900 88.8% 11.2% 0.882
Winter Wheat 24 180,151 96.8% 3.2% 0.961 96.2% 3.8% 0.953
Other Small Grains 25 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Rye 27 54 16.5% 83.5% 0.164 31.8% 68.2% 0.317
Oats 28 145 23.0% 77.0% 0.230 63.3% 36.7% 0.633
Canola 31 5,171 79.9% 20.1% 0.798 92.3% 7.7% 0.922
Flaxseed 32 0 0.0% 100.0% 0.000 n/a n/a n/a
Mustard 35 111 36.2% 63.8% 0.361 72.5% 27.5% 0.725
Alfalfa 36 25,535 82.0% 18.0% 0.814 82.8% 17.2% 0.823
Other Hay/Non Alfalfa 37 13,984 58.8% 41.2% 0.580 74.5% 25.5% 0.738
Buckwheat 39 90 66.7% 33.3% 0.667 51.4% 48.6% 0.514
Sugarbeets 41 180 70.9% 29.1% 0.709 83.7% 16.3% 0.837
Dry Beans 42 1,540 64.3% 35.7% 0.642 72.3% 27.7% 0.722
Potatoes 43 8,930 86.5% 13.5% 0.864 86.4% 13.6% 0.863
Other Crops 44 71 37.8% 62.2% 0.378 46.4% 53.6% 0.464
Misc Vegs & Fruits 47 31 49.2% 50.8% 0.492 77.5% 22.5% 0.775
Watermelons 48 16 66.7% 33.3% 0.667 57.1% 42.9% 0.571
Onions 49 1,471 87.0% 13.0% 0.870 83.8% 16.2% 0.838
Cucumbers 50 3 15.0% 85.0% 0.150 18.8% 81.3% 0.187
Chick Peas 51 8,834 88.4% 11.6% 0.883 91.6% 8.4% 0.915
Lentils 52 5,576 89.3% 10.7% 0.892 91.8% 8.2% 0.918
Peas 53 5,843 75.2% 24.8% 0.750 83.5% 16.5% 0.834
Tomatoes 54 0 n/a n/a n/a 0.0% 100.0% 0.000
Caneberries 55 634 66.8% 33.2% 0.668 60.0% 40.0% 0.600
Hops 56 4,207 95.8% 4.2% 0.958 93.6% 6.4% 0.936
Herbs 57 19 43.2% 56.8% 0.432 65.5% 34.5% 0.655
Clover/Wildflowers 58 10 7.5% 92.5% 0.075 27.8% 72.2% 0.278
Sod/Grass Seed 59 4,233 70.8% 29.2% 0.706 78.4% 21.6% 0.783
Fallow/Idle Cropland 61 125,214 95.6% 4.4% 0.949 96.9% 3.1% 0.964
Cherries 66 3,643 80.3% 19.7% 0.803 84.4% 15.6% 0.843
Peaches 67 84 40.2% 59.8% 0.402 65.1% 34.9% 0.651
Apples 68 16,382 93.3% 6.7% 0.931 89.2% 10.8% 0.889
Grapes 69 7,022 94.5% 5.5% 0.944 90.3% 9.7% 0.902
Christmas Trees 70 410 64.5% 35.5% 0.644 75.6% 24.4% 0.756
Other Tree Crops 71 27 8.3% 91.7% 0.083 14.1% 85.9% 0.141
Walnuts 76 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Pears 77 1,176 74.3% 25.7% 0.742 80.9% 19.1% 0.808
Triticale 205 407 32.1% 67.9% 0.321 70.8% 29.2% 0.707
Carrots 206 269 56.0% 44.0% 0.560 79.8% 20.2% 0.798
Asparagus 207 36 63.2% 36.8% 0.632 48.0% 52.0% 0.480
Cantaloupes 209 0 n/a n/a n/a 0.0% 100.0% 0.000
Broccoli 214 27 93.1% 6.9% 0.931 58.7% 41.3% 0.587
Peppers 216 1 1.1% 98.9% 0.011 14.3% 85.7% 0.143
Greens 219 222 54.4% 45.6% 0.544 73.5% 26.5% 0.735
Plums 220 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Strawberries 221 8 20.0% 80.0% 0.200 40.0% 60.0% 0.400
Squash 222 1 2.7% 97.3% 0.027 20.0% 80.0% 0.200
Apricots 223 24 38.1% 61.9% 0.381 28.2% 71.8% 0.282
Vetch 224 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Dbl Crop WinWht/Corn 225 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Dbl Crop Triticale/Corn 228 692 66.9% 33.1% 0.668 63.5% 36.5% 0.634
Pumpkins 229 41 60.3% 39.7% 0.603 65.1% 34.9% 0.651
Blueberries 242 1,303 77.2% 22.8% 0.772 79.5% 20.5% 0.794
Cabbage 243 26 36.1% 63.9% 0.361 44.1% 55.9% 0.441
Cauliflower 244 3 8.6% 91.4% 0.086 27.3% 72.7% 0.273
Radishes 246 0 n/a n/a n/a 0.0% 100.0% 0.000
Cranberries 250 18 52.9% 47.1% 0.529 56.3% 43.8% 0.562
*Correct Pixels represents the total number of independent validation pixels correctly identified in the error matrix.
**The Overall Accuracy represents only the FSA row crops and annual fruits and vegetables (codes 1-61, 66-80, 92 and 200-255).
FSA-sampled grass and pasture, non-agricultural, and NLCD-sampled categories (codes 62-65, 81-91 and 93-199) are not included in the Overall Accuracy.
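Note that the table reports correct pixel counts and rounded percentages but not the underlying validation totals. As an arithmetic illustration only, the total behind the overall figure can be approximated from the published values; the estimate below is implied by rounding and is not a published NASS figure.

# Overall Accuracy = correctly classified validation pixels / total validation pixels
correct_pixels = 491_752        # published *Correct Pixels for the overall row
overall_accuracy = 0.891        # published Overall Accuracy (89.1%, rounded)

# Implied total of unbuffered validation pixels (approximate because the
# published percentage is rounded to one decimal place).
estimated_total = correct_pixels / overall_accuracy
print(round(estimated_total))   # roughly 552,000 pixels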
The accuracy of the non-agricultural land cover classes within the Cropland Data Layer is entirely dependent upon the USGS, National Land Cover Database (NLCD 2016). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover. For more information on the accuracy of the NLCD please reference <https://www.mrlc.gov/>.
Attribute_Accuracy_Value:
Classification accuracy is generally 85% to 95% correct for the major crop-specific land cover categories. See the 'Attribute Accuracy Report' section of this metadata file for the detailed accuracy report.
Attribute_Accuracy_Explanation:
The strength and emphasis of the CDL is crop-specific land cover categories. The accuracy of the CDL non-agricultural land cover classes is entirely dependent upon the USGS, National Land Cover Database (NLCD 2016). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover.
These definitions of accuracy statistics were derived from the following book: Congalton, Russell G. and Kass Green. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, Florida: CRC Press, Inc. 1999. The 'Producer's Accuracy' is calculated for each cover type in the ground truth, indicates the probability that a ground truth pixel will be correctly mapped (across all cover types), and measures 'errors of omission'. An 'Omission Error' occurs when a pixel is excluded from the category to which it belongs in the validation dataset. The 'User's Accuracy' indicates the probability that a pixel from the CDL classification actually matches the ground truth data and measures 'errors of commission'. A 'Commission Error' occurs when a pixel is included in an incorrect category according to the validation data. It is important to take into consideration both errors of omission and commission. For example, if you classify every pixel in a scene as 'wheat', then you have 100% Producer's Accuracy for the wheat category and 0% Omission Error. However, you would also have a very high error of commission, as all other crop types would be included in the incorrect category. The 'Kappa' is a measure of agreement based on the difference between the actual agreement in the error matrix (i.e., the agreement between the remotely sensed classification and the reference data as indicated by the major diagonal) and the chance agreement which is indicated by the row and column totals. The 'Conditional Kappa Coefficient' is the agreement for an individual category within the entire error matrix.
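For readers who want to reproduce these per-class statistics from an error matrix, the following is a minimal Python/numpy sketch under the Congalton and Green convention (rows as the CDL classification, columns as the reference data). It is illustrative only, not the NASS production code; the two conditional kappa variants shown are the standard column- and row-conditioned forms, which are assumed here to correspond to the table's 'Kappa' and 'Cond'l Kappa' columns, respectively.

import numpy as np

def accuracy_metrics(cm):
    """Accuracy statistics from an error (confusion) matrix.

    Assumes rows are the map (CDL) classification and columns are the
    reference (validation) data, following Congalton and Green (1999).
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()                 # total validation pixels
    diag = np.diag(cm)           # correctly classified pixels per class
    row = cm.sum(axis=1)         # classification (map) totals
    col = cm.sum(axis=0)         # reference (ground truth) totals

    producers = diag / col       # producer's accuracy; omission error = 1 - producers
    users = diag / row           # user's accuracy; commission error = 1 - users
    overall = diag.sum() / n

    chance = (row * col).sum()   # chance-agreement term from row/column totals
    kappa = (n * diag.sum() - chance) / (n ** 2 - chance)

    # Per-class (conditional) kappa, conditioned on the reference columns
    # (producer's side) and on the map rows (user's side).
    cond_kappa_producer = (n * diag - row * col) / (n * col - row * col)
    cond_kappa_user = (n * diag - row * col) / (n * row - row * col)

    return {
        "overall_accuracy": overall,
        "kappa": kappa,
        "producers_accuracy": producers,
        "omission_error": 1 - producers,
        "users_accuracy": users,
        "commission_error": 1 - users,
        "conditional_kappa_producer": cond_kappa_producer,
        "conditional_kappa_user": cond_kappa_user,
    }

# Tiny hypothetical 3-class example (not CDL data):
example = [[50, 3, 2],
           [4, 60, 6],
           [1, 2, 40]]
print(accuracy_metrics(example))

In the hypothetical wheat example above, such a calculation would show the 100% producer's accuracy alongside a very low user's accuracy, which is why both perspectives are reported for every cover type in the table.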