Calculating Allowable or Acceptable RMS Error

 

Allowable RMS error defines the accuracy standard you want to apply to your Didger project. It is a measure of the level of confidence you can place in the data you digitize. Before you begin a project, you should determine the level of acceptable error. In addition, regardless of the standards you apply, it is a good idea to check the RMS value before proceeding with your project. For example, if you inadvertently enter incorrect coordinates for a calibration point, the mistake is reflected in an inflated RMS value.

 

Different projects require different allowable RMS errors.

§   Some projects require a high level of confidence in the accuracy of the digitized data. For example, you might need to record the positions of underground utility lines before excavating in an area, and you would need a high degree of confidence in your data before proceeding. In this case, you would want a relatively low allowable RMS value for your project.

§   Some projects do not require very stringent error parameters. For example, if you are digitizing sample points you recorded by hand on a topographic map, the map positions were only approximate to begin with, so the digitizing accuracy matters less. Under these conditions, the actual RMS value is not as important, although you should at least look at it to determine whether it appears realistic for your project.

 

The key point with RMS values is to establish the acceptable error for your project and make sure the calibration RMS error falls within it. If your company uses established standards, they can be applied to your Didger project. If you have no established standards, you can define them any way you want: decide how accurate you want your data to be, and follow the guidelines discussed below to achieve that accuracy.

 

You can think of allowable RMS error as acceptable error on the ground. If you were to locate a digitized point on the ground, how far off could you afford to be, and with what certainty? This acceptable error on the ground determines the allowable RMS. Allowable RMS error is defined by the formula

 

Allowable RMS = Acceptable Error on the Ground / RMS Factor

where

§   Allowable RMS is the RMS value that meets the standards you set for your project. RMS is the standard deviation of the errors reported for all digitized calibration points.

§   Acceptable Error on the Ground is the maximum distance, at true scale (1:1), that a digitized point may lie from its true location. If you determine the exact real-world location a digitized point on your document represents, the acceptable error is how far that location may be from the point's true position.

§   The RMS Factor is the number of standard deviations (RMS distances) corresponding to an acceptable chance of occurrence. From the table shown previously, the RMS factors are 0.5 for 38.3 percent, 1.0 for 68.3 percent, 1.5 for 86.6 percent, 2.0 for 95.4 percent, 2.5 for 98.8 percent, and 3.0 for 99.7 percent. These values are equivalent to Z scores from a standard normal probability table. A short sketch of the computation follows this list.
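
 

As an illustration only, the following Python sketch computes the allowable RMS from an acceptable ground error and a confidence level. The function name, table name, and confidence-level keys are ours, not part of Didger.

# RMS factors (Z scores) keyed by confidence level in percent,
# taken from the table above. Illustrative helper, not a Didger feature.
RMS_FACTORS = {38.3: 0.5, 68.3: 1.0, 86.6: 1.5,
               95.4: 2.0, 98.8: 2.5, 99.7: 3.0}

def allowable_rms(acceptable_ground_error, confidence_percent):
    """Return the allowable RMS for an acceptable error on the ground
    (in real-world units at 1:1 scale) and a confidence level."""
    return acceptable_ground_error / RMS_FACTORS[confidence_percent]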

 

For example, suppose you want to be 95.4 percent sure (RMS factor = 2) that digitized points are within five units (acceptable error on the ground) of their true location. In this case, the allowable RMS would be

 

Allowable RMS = 5 units / 2 = 2.5 units

 

When calibrating in this example, you would need to make sure that your RMS value was at or below 2.5.
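
 

The same arithmetic, spelled out in Python (the variable names are illustrative):

# 95.4 percent confidence corresponds to an RMS factor of 2.0.
acceptable_error = 5.0   # acceptable error on the ground, in real-world units
rms_factor = 2.0
print(acceptable_error / rms_factor)   # prints 2.5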

 

 

See Also

Digitizing Accuracy and Acceptable Error

RMS Error Value

An Example of Allowable Error Based on Map Scale

An Example of Allowable Error Based on a Percentage Value