Background and Objective: Evaluation is a systematic process of gathering, analyzing, and interpreting data to make assessments and judgments. This study aimed to evaluate the effect of a test-development workshop on the quality of final written exams.

Materials and Methods: In this quasi-experimental study, the written final exams of all theoretical courses were collected before and after a workshop was held, and a booklet titled “Standard rules of questions” was distributed. The taxonomy, content relevance, and content coverage of the exams were assessed by three course lecturers from each department. Other indices, including the difficulty and discrimination coefficients, distractor effectiveness and borderline choices, the error coefficient, and adherence to structural rules, were calculated using Millman’s checklist and standard formulae. Data were analyzed using the independent-samples t-test, the Mann-Whitney U test, and ANOVA.

Results: Before the intervention, content relevance and the difficulty coefficient were at acceptable levels. After the intervention, the proportion of multiple-choice questions, content relevance, and the discrimination coefficient increased, whereas adherence to structural rules, the difficulty coefficient, the number of distractors requiring revision, the error coefficient, the number of questions requiring revision, and content coverage decreased. Overall, the only practically meaningful effect of the intervention was the decrease in the error coefficient.

Conclusion: A short-term course on standard rules of question development does not considerably affect the quality of exam questions; longer-term training is recommended instead.
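For readers unfamiliar with the item-analysis indices named in the Methods, the sketch below shows how the difficulty and discrimination coefficients are conventionally computed from exam responses. The function names, the upper/lower 27% grouping convention, and the acceptability bands in the comments are textbook assumptions for illustration, not details taken from this study.

```python
# Minimal sketch of the "standard formulae" for item analysis
# (grouping threshold and cutoffs are conventional assumptions,
# not values reported by the study).

def difficulty_index(correct_responses: int, total_examinees: int) -> float:
    """Difficulty coefficient P: proportion of examinees answering the item correctly."""
    return correct_responses / total_examinees

def discrimination_index(upper_correct: int, lower_correct: int, group_size: int) -> float:
    """Discrimination coefficient D: difference in correct answers between the
    high- and low-scoring groups (conventionally the top and bottom 27% of
    examinees), divided by the size of one group."""
    return (upper_correct - lower_correct) / group_size

# Example: 80 examinees, top/bottom 27% -> groups of 22
print(difficulty_index(48, 80))         # 0.60, within the commonly cited 0.3-0.7 band
print(discrimination_index(18, 7, 22))  # 0.50, usually read as good discrimination
```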