How is 3D Point Accuracy Calculated?
3D point accuracy in LiDAR systems, including the Leica RTC360 3D Laser Scanner, is calculated based on several factors that influence the precision of the measured points in a three-dimensional space. These factors include the range accuracy, angular accuracy, and the overall system performance. Here’s a detailed breakdown of how 3D point accuracy is typically calculated:

Factors Influencing 3D Point Accuracy

1. Range Accuracy: The accuracy of the distance measurement from the scanner to a point in space, often specified as a constant term plus a term proportional to the distance (e.g., 1.0 mm + 10 ppm).
2. Angular Accuracy: The precision of the angular measurements in both the horizontal and vertical directions, usually expressed in arcseconds (″) or degrees.
3. Laser Beam Divergence: The spread of the laser beam as it travels, which increases the size of the spot on the target and can reduce the accuracy of the measured point.
4. Environmental Factors: Conditions such as temperature, humidity, and atmospheric pressure can affect measurement accuracy.

Calculation Method

To calculate the 3D point accuracy, the following formula can be used:

3D Point Accuracy = √((Range Accuracy)² + (Angular Accuracy)² × (Distance)²)

where:

  1. Range Accuracy is the precision of the distance measurement at the given range.
  2. Angular Accuracy is the precision of the angular measurement, expressed in radians.
  3. Distance is the range from the scanner to the measured point.
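This combination of error terms can be sketched as a short Python function. This is a simplified model, not the scanner manufacturer's exact error budget: it assumes the angular error is small enough that the lateral error is approximately angle × distance, and it ignores beam divergence and environmental effects.

```python
import math

def point_accuracy_mm(range_acc_mm: float, angular_acc_rad: float, distance_mm: float) -> float:
    """Combine range and angular error terms in quadrature (simplified model).

    range_acc_mm:    range accuracy at this distance, in mm
    angular_acc_rad: angular accuracy, in radians
    distance_mm:     range to the measured point, in mm
    """
    # Small-angle approximation: lateral error ≈ angular error × distance
    lateral_error_mm = angular_acc_rad * distance_mm
    return math.sqrt(range_acc_mm**2 + lateral_error_mm**2)
```

With the example values used later in this article (1.1 mm range accuracy, 8.726 × 10⁻⁵ rad, 10,000 mm), the function returns approximately 1.4 mm.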

Example Calculation for the Leica RTC360

Given the following specifications from the Leica RTC360 datasheet:

  1. Range Accuracy: 1.0 mm + 10 ppm (parts per million)
  2. Angular Accuracy: 18 arcseconds (″)

Let's assume a distance of 10 meters.
Step 1 - Convert Angular Accuracy to Radians:

18″ = 18/3600 degrees = (18/3600) × (π/180) radians ≈ 8.726 × 10⁻⁵ radians
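The conversion above can be checked with Python's standard library (one degree is 3600 arcseconds):

```python
import math

arcsec = 18
angular_acc_rad = math.radians(arcsec / 3600)  # 1 degree = 3600 arcseconds
# angular_acc_rad ≈ 8.727e-05 rad
```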


Step 2 - Determine the Range Accuracy for a Specific Distance:
Range Accuracy = 1.0 mm + 10 × 10⁻⁶ × 10,000 mm = 1.1 mm

(Here the 10 m distance is expressed as 10,000 mm, so the 10 ppm term contributes 0.1 mm.)
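In code, scaling the ppm term by distance looks like this (1 ppm = 10⁻⁶ of the distance, so the distance must be in the same unit as the result):

```python
base_mm = 1.0          # constant range-accuracy term, in mm
ppm = 10               # distance-proportional term, in parts per million
distance_mm = 10_000   # 10 m expressed in mm

range_acc_mm = base_mm + ppm * 1e-6 * distance_mm  # = 1.1 mm
```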

Step 3 - Calculate the 3D Point Accuracy:

3D Point Accuracy = √((1.1 mm)² + (8.726 × 10⁻⁵ radians × 10,000 mm)²)

3D Point Accuracy = √((1.1)² + (0.8726)²)
3D Point Accuracy = √(1.21 + 0.7614)
3D Point Accuracy ≈ √(1.9714)
3D Point Accuracy ≈ 1.4 mm

Thus, the 3D point accuracy for a distance of 10 meters is approximately 1.4 mm.
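Putting the three steps together, a short script reproduces the worked result. This is illustrative only, using the datasheet-style figures assumed above and the same simplified quadrature model:

```python
import math

distance_mm = 10_000                          # 10 m
range_acc_mm = 1.0 + 10e-6 * distance_mm      # step 2: 1.0 mm + 10 ppm → 1.1 mm
angular_acc_rad = math.radians(18 / 3600)     # step 1: 18 arcseconds in radians
lateral_mm = angular_acc_rad * distance_mm    # ≈ 0.873 mm lateral error

# Step 3: combine the two error terms in quadrature
total_mm = math.sqrt(range_acc_mm**2 + lateral_mm**2)
print(round(total_mm, 1))  # 1.4
```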

This calculation illustrates how the combination of range and angular accuracy contributes to the overall 3D point accuracy in a LiDAR system. The actual values will vary based on the specific conditions and the specifications of the scanner being used.