Purpose To assess the complementary natures of (a) a peer review (PR)-mandated database for physician review and discrepancy reporting and (b) a voluntary quality assurance (QA) system for anecdotal reporting.

Materials and Methods This study was institutional review board approved and HIPAA compliant; informed consent was waived. Submissions to the voluntary QA and mandatory PR databases were searched for obstetrics and gynecology-related keywords. Cases were graded independently by two radiologists, with final grades resolved by consensus. Errors were categorized as perceptional, interpretive, communication related, or procedural. The effect of errors was assessed in terms of clinical and radiologic follow-up.

Results There were 185 and 64 cases with issues attributed to 32 and 27 radiologists in the QA and PR databases, respectively; 23 and nine radiologists, respectively, had cases attributed only to them. Procedure-related entries were submitted almost exclusively through the QA database (62 of 64 [97%]). In the QA and PR databases, respectively, perceptional (47 of 185 [25%] and 27 of 64 [42%]) and interpretive (64 of 185 [34%] and 30 of 64 [47%]) issues constituted most errors. Most entries in both databases (104 of 185 [56%] in QA and 49 of 64 [76%] in PR) were considered minor events: wording in the report; findings already known from patient history, prior imaging, or concurrent follow-up imaging; or delay in diagnosing a benign finding. The databases had similar percentages of moderate events (28 of 185 [15%] in QA and nine of 64 [14%] in PR), such as recommending unnecessary follow-up imaging or exposing a patient to radiation when her pregnancy was not known. The PR database had fewer major events (one of 64 [1.6%]) than the QA database (32 of 185 [17%]).
Conclusion The two quality improvement systems are complementary: the QA database yields less frequent but more clinically important errors, while the PR database serves to establish benchmarks for error rates in radiologists' performance.

© RSNA, 2014. Online supplemental material is available for this article.