Development and validation of a recommended checklist for assessment of surgical videos quality: the LAParoscopic surgery Video Educational GuidelineS (LAP-VEGaS) video assessment tool
Smart, Neil J.
© The Author(s) 2020 This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
CC0 1.0 Universal
Introduction: There has been a constant increase in the number of published surgical videos, with a preference for open-access sources, but the proportion of videos undergoing peer review prior to publication has markedly decreased, raising questions over the quality of the educational content presented. The aim of this study was the development and validation of a standard framework for the appraisal of surgical videos submitted for presentation and publication: the LAParoscopic surgery Video Educational GuidelineS (LAP-VEGaS) video assessment tool. Methods: An international committee identified items for inclusion in the LAP-VEGaS video assessment tool and finalised the marking score using Delphi methodology. The tool was then validated by anonymous evaluation of selected videos by a group of validators not involved in the tool's development. Results: Nine items were included in the LAP-VEGaS video assessment tool, with each item scored from 0 (item not presented in the video) to 2 (item extensively presented in the video), giving a total marking score ranging from 0 to 18. The LAP-VEGaS video assessment tool proved highly accurate in identifying and selecting videos for acceptance for conference presentation and publication, with a high level of internal consistency and generalisability. Conclusions: We propose that peer review in adherence to the LAP-VEGaS video assessment tool could enhance the overall quality of published video outputs.
Celentano V et al. Development and validation of a recommended checklist for assessment of surgical videos quality: the LAParoscopic surgery Video Educational GuidelineS (LAP-VEGaS) video assessment tool [published online ahead of print, 2020 Apr 6]. Surg Endosc. 2020;10.1007/s00464-020-07517-4. doi:10.1007/s00464-020-07517-4
This article is freely available via Open Access.