After you export your data from Didger, you can use the exported data to produce a map. You can set your own accuracy standards for your projects, or you can use official standards. For example, the 1947 revision of the United States National Map Accuracy Standards states that no more than 10 percent of tested points may be in error by more than 0.02 inches (1/50 inch) on a map at a scale of 1:20,000 or smaller, or by more than 0.033 inches (1/30 inch) on a map at a scale larger than 1:20,000.
Let’s say you want to set your own standards using a 1:24,000 scale base map, and you want the digitized points to be within 0.05 inches of their true map position. Your standards require 95 percent confidence that your points are within this limit. In other words, when you produce a 1:24,000 map of the digitized data, you can be 95 percent sure that the plotted positions are within 0.05 inches of their true locations. With this information, you can determine the acceptable RMS error for your project.
If you calibrate the map using feet as the calibration point units, the RMS value is also in feet. To calculate an acceptable RMS value, you must translate the acceptable map error (0.05 inches) into acceptable error on the ground (the error at true scale). Because ground error is simply map error scaled up to true distance, you can convert the required map error into a required ground error using the formula
Acceptable Ground Error = Acceptable Map Error * Scale * Units Conversion
so
Acceptable Ground Error = 0.05 inch * 24,000 * 0.083 feet/inch = 99.6 feet
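As a quick check of this arithmetic, the following short Python sketch performs the same conversion. The function name and the 1/12 feet-per-inch default are illustrative assumptions for this example, not part of Didger:

def acceptable_ground_error(map_error_inches, scale, units_per_inch=1.0 / 12.0):
    """Convert acceptable error on the printed map to acceptable error on the ground.

    map_error_inches -- acceptable error measured on the map, in inches
    scale            -- denominator of the map scale (24,000 for a 1:24,000 map)
    units_per_inch   -- conversion from inches to the calibration units
                        (1/12, about 0.083, feet per inch when calibrating in feet)
    """
    return map_error_inches * scale * units_per_inch

# 0.05 inches of map error on a 1:24,000 map calibrated in feet
print(acceptable_ground_error(0.05, 24000))  # ~100 feet (99.6 feet using the rounded 0.083 factor)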
From this you can determine the acceptable RMS. Remember that you want to be 95 percent sure that your digitized information is within the limit, which corresponds to about two RMS distances (two standard deviations).
so

Acceptable RMS Error = Acceptable Ground Error / 2 = 99.6 feet / 2 = 49.8 feet

If the RMS error reported after calibration is 49.8 feet (roughly 50 feet) or less, the digitized data meets this standard.
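Continuing the same hypothetical sketch, the final step halves the ground error to obtain the acceptable RMS for roughly 95 percent confidence (two RMS distances). Again, the function name is illustrative rather than part of Didger:

def acceptable_rms(ground_error, confidence_multiplier=2.0):
    """Acceptable RMS error for about 95 percent confidence (two RMS distances)."""
    return ground_error / confidence_multiplier

ground_error = 0.05 * 24000 * 0.083   # 99.6 feet, as computed above
print(acceptable_rms(ground_error))   # 49.8 feet; compare against the RMS value reported after calibration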
See Also
Digitizing Accuracy and Acceptable Error