Abstract
Objectives
To evaluate the diagnostic performance of periapical radiography under different brightness and contrast settings in the detection of simulated internal (IRR) and external (ERR) root resorption lesions. Additionally, observers' image-quality preferences for these diagnostic tasks were evaluated.
Methods
Thirty single-rooted teeth were divided into two groups (n = 15): IRR, in which lesions were simulated using mechanical and biochemical processes; and ERR, in which standardized cavities were prepared on the root surfaces with burs of different sizes. Digital radiographs were obtained and subsequently adjusted to four additional brightness/contrast combinations, resulting in five variations (V1–V5). Five radiologists evaluated the radiographs, and their preferences regarding image quality were also recorded.
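The abstract does not specify the exact adjustment algorithm used by the imaging software; a minimal sketch of one common linear brightness/contrast transform, assuming 8-bit grayscale pixels and a mid-gray (128) contrast pivot, would be:

```python
def adjust(pixels, brightness_pct, contrast_pct):
    """Apply a linear brightness/contrast change to 8-bit pixel values.

    brightness_pct shifts all values by a fraction of the full range;
    contrast_pct scales values about the mid-gray pivot (128).
    This is an illustrative model, not the study's actual software.
    """
    factor = 1 + contrast_pct / 100          # e.g. +15% contrast -> 1.15
    offset = 255 * brightness_pct / 100      # e.g. -15% brightness -> -38.25
    return [max(0, min(255, round(factor * (p - 128) + 128 + offset)))
            for p in pixels]

# V2-like setting (-15% brightness / +15% contrast): darker, steeper tone curve
dark_contrasty = adjust([0, 128, 255], -15, 15)
```

With these (assumed) conventions, mid-gray drops under V2-like settings while extremes are clipped to the valid 0–255 range, which is why such adjustments can change perceived lesion conspicuity without adding information to the image.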
Results
For both conditions, there were no differences in accuracy or specificity between the five brightness/contrast variations (p > 0.05), but sensitivity for ERR was significantly lower in V4 (+15% brightness/−15% contrast) for large lesions (p < 0.05). The observers classified V2 (−15% brightness/+15% contrast) as having the "best" image quality for IRR and ERR evaluation.
Conclusions
For IRR and ERR lesions, brightness and contrast variation does not affect the diagnostic performance of digital intraoral radiography within the tested range. Observers preferred images with a moderate decrease in brightness and increase in contrast.
Clinical relevance
Brightness and contrast enhancement tools are commonly applied in digital radiographic assessment. For the detection of root resorption, these tools can be used according to observer preference without affecting diagnostic accuracy.