Graduation Date

Spring 5-5-2023


Immediate Access

Degree Name
Biomedical Engineering

Rangos School of Health Sciences

Committee Chair

Bin Yang

Committee Member

Rana Zakerzadeh

Committee Member

Yvonne Weideman


Keywords
Burn, Segmentation, Structured Light Imaging, Face-ID


Skin lesions, including burn wounds, are characterized by several key physical properties, including the size, shape, and coloration of the wound. Assessing the state of the wound is critical to giving the patient the best chance of recovery. Initial assessments are typically performed by visual inspection, which relies on the human eye and is therefore subjective and difficult to reproduce.

In this study, we investigated the feasibility of characterizing skin wound properties, such as surface area, from a 3D model obtained with an iOS device, aiming to provide more accurate skin wound assessment than conventional 2D imaging-based methods. Because 2D images lack depth data, they cannot capture variable heights across a wound, nor can they account for curved or irregularly shaped body surfaces when calculating area.
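A simple geometric example illustrates why a 2D projection underestimates area on curved surfaces: a hemispherical bump has a true surface area of 2πr², while its top-down 2D projection is only a circle of area πr², a 50% underestimate. The sketch below is purely illustrative and is not drawn from the study's data:

```python
import math

def hemisphere_surface_area(r):
    """True 3D surface area of a hemispherical bump (curved part only)."""
    return 2 * math.pi * r ** 2

def projected_area(r):
    """Area of the hemisphere's top-down 2D projection (a circle)."""
    return math.pi * r ** 2

r = 1.0
# The 2D projection captures only half of the true surface area.
print(projected_area(r) / hemisphere_surface_area(r))  # → 0.5
```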

We achieved 3D scanning on an iOS device by repurposing its dot projector module, an essential component of the ‘Face ID’ function. We chose dot projector scanning over other 3D scanning methods, such as photogrammetry, because dot projection offers consistent scaling and faster scan speeds.
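Structured-light systems of this kind recover depth by triangulation: the lateral shift (disparity) of each projected dot between the projector and camera encodes distance via the classic relation z = f·b/d. The sketch below uses hypothetical parameter values; the actual Face ID depth pipeline is proprietary:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic structured-light/stereo triangulation: z = f * b / d.
    focal_px: focal length in pixels; baseline_m: projector-camera
    separation in meters; disparity_px: observed dot shift in pixels."""
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 600 px focal length, 2 cm baseline.
# A dot shifted by 30 px corresponds to a surface 0.4 m away.
print(depth_from_disparity(600, 0.02, 30))  # → 0.4
```

Note the inverse relationship: nearer surfaces produce larger disparities, which is consistent with depth accuracy degrading at longer working distances.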

We developed a custom tool to accurately segment the affected skin regions in a scanned model based on user inputs and color data. Segmentation was accomplished with global color thresholding, which objectively separates damaged skin from healthy tissue. Geometric information such as surface area, the heights of specific deformations, and the perimeters of specific sections could then be quantified.
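The segmentation and area steps described above can be sketched as follows: select mesh vertices whose color passes a global threshold, then sum the areas of triangles lying entirely inside the selected region. The redness criterion and threshold value here are illustrative assumptions, not the thesis tool's actual rule:

```python
import numpy as np

def segment_and_measure(vertices, faces, colors, red_thresh=0.5):
    """Global color thresholding on a triangle mesh.
    vertices: (N, 3) float positions; faces: (M, 3) int vertex indices;
    colors: (N, 3) RGB values in [0, 1]. Returns the total area of
    faces whose vertices all pass the threshold."""
    # Global threshold: mark vertices whose red channel dominates.
    mask = (colors[:, 0] - colors[:, 1:].max(axis=1)) > red_thresh
    # Keep only triangles fully inside the segmented region.
    keep = mask[faces].all(axis=1)
    tri = vertices[faces[keep]]
    # Triangle area = half the norm of the edge cross product.
    cross = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    return 0.5 * np.linalg.norm(cross, axis=1).sum()
```

Because the area is summed over 3D triangles rather than projected pixels, curved or elevated wound surfaces contribute their true extent.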

We determined that a 3D model with high accuracy can be obtained when the distance between the iPad and the object is less than 18 inches. Segmentation of such models works consistently as long as the region is evenly lit and uniformly colored, though multistep segmentation can help in regions of varied color. We conducted a calibration study on objects with known surface areas to minimize the discrepancy between the ground truth and the quantified values. The error of the calibrated results was within 5%, and the variation was less than 3%, when quantifying objects with known flat surface areas. On the complex geometry of simulated skin burns, the coefficient of variation across multiple scans was about 10%.

Our study demonstrated that it is feasible to accurately characterize skin wounds using 3D models acquired with an iOS device. This technique can be applied in clinical settings to assess and document the severity of skin injury and to track recovery and response to treatment. It also has the potential to facilitate telemedicine with high-fidelity 3D imagery.
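The calibration and repeatability measures reported above can be expressed compactly: a single multiplicative scale factor is fit from objects of known area (one simple way to do this, assumed here, is least squares), and scan-to-scan spread is summarized by the coefficient of variation (standard deviation divided by mean). The numbers below are made up for illustration, not the study's measurements:

```python
import statistics

def calibration_factor(measured_areas, true_areas):
    """Least-squares multiplicative factor mapping measured -> true
    areas (a single global correction, as a simplifying assumption)."""
    num = sum(m * t for m, t in zip(measured_areas, true_areas))
    den = sum(m * m for m in measured_areas)
    return num / den

def coefficient_of_variation(values):
    """CV = sample standard deviation divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical repeated scans of one reference object (cm^2):
scans = [10.1, 9.8, 10.3, 9.9]
cal = calibration_factor([10.1, 20.4], [10.0, 20.0])
cv = coefficient_of_variation(scans)
```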