Tuesday, 23 June 2009 13:16
Over the last decade there has been increasing interest in quality issues in endoscopy. The GRS has served to drive up the quality of the service, and work is ongoing under the auspices of JAG to develop a framework for quality assurance of individual endoscopists. What has been lacking hitherto is an infrastructure for collecting and analysing data so that practitioners can compare themselves with their peers ("benchmarking").
Several pilot projects are now in progress internationally. One is the "ERCP Quality Network", initiated by Peter Cotton and supported by Olympus. Practitioners (or their staff) upload key data points for each case (indications, sedation/anesthesia, therapies, successes and adverse events) to a central website, without identifying the patients. The data are analysed and updated immediately to produce a "report card" that is available to the person who submitted it (and to no one else). That person can then compare their own data with the average of all other contributors (again without being able to identify them). As of August 2009, 81 endoscopists had entered over 12,000 ERCP procedures; eight UK endoscopists had contributed 1,000 of that total. The figure below shows an example: one's own biliary cannulation rate (in blue) compared with those of all the other contributors (in red, not individually identified).
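The comparison described above — each contributor seeing their own rate alongside the pooled rate of all other, anonymous contributors — can be sketched in a few lines of Python. This is purely illustrative: the field names and figures are invented for the example and are not the Network's actual data schema.

```python
# Illustrative sketch of the "report card" comparison: own biliary
# cannulation rate vs. the pooled rate of all other contributors.
# Field names ("cannulation_attempted", "cannulation_success") are
# hypothetical, not the ERCP Quality Network's real schema.

def cannulation_rate(cases):
    """Fraction of attempted cases in which biliary cannulation succeeded."""
    attempts = [c for c in cases if c["cannulation_attempted"]]
    if not attempts:
        return None
    return sum(c["cannulation_success"] for c in attempts) / len(attempts)

def benchmark(own_cases, all_other_cases):
    """Return (own rate, pooled rate of everyone else), as on a report card."""
    return cannulation_rate(own_cases), cannulation_rate(all_other_cases)

# Invented example data: 9/10 own successes vs. 85/100 pooled.
own = [{"cannulation_attempted": True, "cannulation_success": i < 9}
       for i in range(10)]
others = [{"cannulation_attempted": True, "cannulation_success": i < 85}
          for i in range(100)]
mine, pooled = benchmark(own, others)
print(f"own: {mine:.0%}  pooled: {pooled:.0%}")  # own: 90%  pooled: 85%
```

Because each submission is anonymous, the pooled figure can be published to every contributor without any individual being identifiable.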
Sceptics worry about the quality of self-reported data, and some are concerned that the data may show them in a bad light. One answer to these criticisms is that the data are submitted anonymously: the only person who could be deceived by submitting inaccurate data is the submitter. And, as the figure shows, some people are indeed reporting poor success rates. There is no intention to publish "league tables".