Abstract
Three-dimensional stereophotogrammetry is commonly used to assess volumetric changes
after facial procedures. A lack of clear landmarks in aesthetic regions complicates
the reproduction of selected areas in sequential images. A three-dimensional volumetric
analysis was developed based on a personalized aesthetic template. The accuracy and
reproducibility of this method were assessed. Six female volunteers were photographed
using the 3dMDtrio system according to a clinical protocol, twice at baseline (T1)
and twice after 1 year (T2). A styrofoam head was used as a control. A standardized aesthetic template
was morphed over the baseline images of the volunteers using a coherent point drift
algorithm. The resulting personalized template was projected over all sequential images
to assess surface area differences, volume differences, and root mean square errors.
In 12 well-defined aesthetic areas, mean surface area and volume differences
between the two T1 images ranged from −7.6 mm² to 10.1 mm² and from −0.11 cm³ to 0.13 cm³, respectively. T1 root mean square errors ranged from 0.24 mm to 0.62 mm (standard deviation 0.18–0.73 mm). Comparable differences were found between the T2 images. An increase in volume
between T1 and T2 was observed only for volunteers who gained body weight. Personalized
aesthetic templates are an accurate and reproducible method to assess changes in aesthetic
areas.
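As an illustration of the registration and outcome measures described above, the sketch below morphs a generic template point cloud onto a baseline scan with coherent point drift (CPD) and then computes a root mean square (RMS) surface error against a sequential scan. The use of the pycpd and scipy libraries, the parameter values, and all variable names are assumptions for illustration only; this is not the authors' implementation, and the surface area and volume computations are not shown.

```python
# Hypothetical sketch: personalize a generic aesthetic template via CPD,
# then score a sequential scan with an RMS surface error.
import numpy as np
from pycpd import DeformableRegistration  # assumed CPD implementation
from scipy.spatial import cKDTree


def personalize_template(template_pts, baseline_pts, alpha=2.0, beta=2.0):
    """Non-rigidly morph the generic template points onto the baseline scan."""
    reg = DeformableRegistration(X=baseline_pts, Y=template_pts,
                                 alpha=alpha, beta=beta)
    morphed_pts, _ = reg.register()
    return morphed_pts  # personalized template vertices


def rms_surface_error(region_pts, followup_pts):
    """RMS of nearest-neighbour distances from a template region to a
    registered sequential scan (one of the reported outcome measures)."""
    dists, _ = cKDTree(followup_pts).query(region_pts)
    return float(np.sqrt(np.mean(dists ** 2)))


# Synthetic point clouds standing in for 3dMD surface data (arbitrary units).
rng = np.random.default_rng(0)
template = rng.normal(size=(500, 3))
baseline = template + rng.normal(scale=0.02, size=(500, 3))
followup = baseline + rng.normal(scale=0.01, size=(500, 3))

personalized = personalize_template(template, baseline)
print(f"RMS error for the template region: "
      f"{rms_surface_error(personalized, followup):.3f}")
```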
Article info
Publication history
Published online: February 18, 2020
Accepted: January 14, 2020
Copyright
© 2020 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.