
Evaluation and classification of residency programs in Otorhinolaryngology

Abstracts

Residency training is defined in Brazil as a full-time, in-service learning process, carried out in public health institutions, private hospitals and clinics, under the supervision of qualified medical staff.

OBJECTIVE:

To present the development of a protocol for the evaluation of residency programs in otorhinolaryngology and their classification by quality.

METHOD:

Design of a comprehensive evaluation protocol covering the broad aspects of medical education in otorhinolaryngology; classification of the training programs by quality; evaluation of the board certification performance of their residents; analysis of the correlation between program classification and residents' scores on the board certification exam.

RESULTS:

Eighty-two residency programs were evaluated across the country in 2004 and classified as follows: level A (12.2%), B+ (7.3%), B (19.5%), C+ (20.7%), C (17.1%), D (20.7%) and E (2.4%); p < 0.005.

CONCLUSION:

The evaluation program was able to discriminate, quality-wise, the training programs. The grades of those who passed the ABORL-CCF Board's Exam follow the same ranking trend as that of their training institutions. There was an improvement in the ranking of training programs after the protocol was implemented.

education; education, medical, undergraduate; internship and residency; specialization


Medical residency is a graduate-level educational mode characterized as on-the-job training.

OBJECTIVE:

To present the ABORL-CCF protocol for the evaluation of residency and specialization programs.

METHOD:

Development of an evaluation protocol; classification of programs by quality; comparison of the performance of graduating students in the Board's Title of Specialist Exam according to the classification of their programs of origin.

RESULTS:

Eighty-two training programs were evaluated in 2004 and classified as A (12.2%), B+ (7.3%), B (19.5%), C+ (20.7%), C (17.1%), D (20.7%) and E (2.4%). There were discrepancies in quality and geographic distribution. Candidates from the better-ranked programs performed better in the Title of Specialist Exam. The curve of institution grades shows the same decreasing trend as that of their students' mean scores. Of the programs initially classified as E, D, C or C+, 77% improved their classification when reassessed after three years.

CONCLUSION:

The evaluation protocol was able to discriminate the training programs by quality. The grades of those approved in the Title of Specialist Exam follow the same trend as the classification of their institutions. There was an improvement in the classification of the training programs after the protocol was implemented.

education; higher education; specialization; internship and residency


INTRODUCTION

ENT training can take place in Otorhinolaryngology residency programs accredited by the Ministry of Education and Culture (MEC), which carry an automatic right to the Title of Specialist, or in Specialization Programs accredited by the Brazilian Association of Otorhinolaryngology and Neck and Facial Surgery (ABORL-CCF), which grants the Title of Specialist to those who pass its Board's Exam after graduating from the program1,2.

Specialization programs, as well as medical residencies, are defined as graduate programs for physicians, characterized as full-time theoretical and practical on-the-job training (OJT), offered in public or private healthcare institutions, linked to universities or not, under the guidance of highly qualified and ethical medical professionals1-3.

Program accreditation is conditioned on compliance with the minimum requirements established by MEC or by the ABORL-CCF, which is stricter about the syllabus. The criteria established by MEC are stricter concerning the legal aspects of teaching, such as the mandatory payment of the scholarship, compliance with the established workload, scheduled breaks and others3,4.

These requirements are intended to ensure the desired educational profile of an ENT specialist, a goal to be achieved at the end of training, involving the knowledge and skills that students must acquire on the job to perform properly in the specialty2.

The Coordination for the Improvement of Higher Education Personnel (CAPES) evaluates and ranks Master's and Doctoral programs through a rigorous and well-known process, which has continuously raised the quality of research in Brazil3.

The National Commission on Medical Residency (NCMR) has not developed such an encompassing evaluation system, limiting the accreditation of residency programs to compliance with "minimum criteria". The NCMR criteria level training programs off at minimum parameters, do not highlight their quality, do not foster improvements, and thus do not stimulate the qualitative improvement of medical residency programs5.

Openings for medical residency in otorhinolaryngology offered by universities or medical schools do not meet the demand for this specialty. Approximately 50% of those enrolled to take the ABORL-CCF Board's Title of Specialist Exam are alumni of training programs held in general hospitals or private clinics. These training programs are numerically significant in the education of Brazilian otorhinolaryngologists and, consequently, in the specialty's profile in Brazil. Of these, some are accredited by MEC and others by the ABORL-CCF. Because they lie outside academic settings and may be led by program coordinators without academic training, they may sometimes provide insufficient training to young specialists, without proper content oversight4-6.

It is likely that there are residency programs that are not fully integrated into educational activities and that focus instead on care practice and service delivery. In extreme conditions, perhaps not so uncommon, the residents act as mere low-cost workers in exchange for some poorly qualified learning activity.

There are programs whose preceptors are celebrated physicians working in only one field of Otorhinolaryngology, exposing their students to partial training with focal excellence, lacking instruction in the other fields of the specialty.

The new doctors, upon finishing their undergraduate programs, are usually unaware of the characteristics of the medical residency programs to which they are applying.

The reputation of an institution, or even that of its members, is not a full guarantee of the teaching provided there.

How do medical residency programs in Brazil compare to those developed in other countries? The lack of accurate data does not allow such a comparison, contrary to what happens with our Master's and Doctoral programs.

Just as there is a rampant opening of medical schools in Brazil, there is no policy controlling the supply of training openings against service demand, and even less so regarding their geographic distribution4,6,7.

Having an assessment process based on specific and well-defined criteria, considering comprehensive quantitative and qualitative aspects of technical, scientific and ethical training, would provide conditions for continuous self-assessment of the programs by their leaders and also by the students. Programs admittedly deficient in specific areas may establish cooperation agreements with others as a means of complementing their underperforming educational stages. Identifying shortcomings will provide subsidies for the improvement of weak programs, allowing for the establishment of quality goals and the possible disqualification of those training programs that cannot reach them8.

The harmonization of national programs, together with centralized control based on full knowledge of supply and demand, will enable the eventual adoption of a system to allocate program openings.

In this paper we present the steps of the development and application of the Protocol for Assessment and Classification of Medical Residency and Specialization Programs in Otorhinolaryngology (PACRE), adopted by the ABORL-CCF in 2004, and its results in the training programs all the way to 2007.

METHOD

From March to June of 2004, the author and the ABORL-CCF Teaching, Training and Residency Commission developed this Assessment Protocol, covering the following domains:

  • General Description of the educational program and academic environment;

  • Infrastructure;

  • Faculty;

  • Training activities;

  • Research and scientific production activities;

  • Student assessment;

  • Evaluation of the program by the student body;

  • Program self-assessment.

Each domain covers specific items that characterize it. The values or points assigned to each item were determined by consensus among the members of the Teaching, Training and Residency Commission, who considered the importance of each item in the training of medical residents, often in comparative terms. This comparison was based on the authors' experience in coordinating residency programs, heading services, and dedication to education, healthcare and scientific research.

The items assessed in domain I (General Description of the Program) include the institution where the teaching program is taken, the academic environment, the institution's level of complexity and geographic location (Chart 1).

Chart 1.
General Program Characterization (item I).

Residency and specialization programs in the Northern region received a value of 10, due to the lack of regional training opportunities. Conversely, programs in Southeastern states received no score due to the excess of specialists and training programs. The programs located in other regions were given intermediate scores, according to the same criterion.

The items assessed in domain II (Infrastructure) are depicted on Chart 2.

Chart 2.
Infrastructure (item II).

The items evaluated in domain III (Faculty) are presented on Chart 3.

Chart 3.
Faculty (item III).

Note: Institutional Bond - contractual relationship of the instructors as preceptors or staff physicians. All preceptors must be qualified as specialists in Otorhinolaryngology or in related fields registered with the Brazilian Medical Association or the Regional Board of Medicine, and 50% must have at least ten years of experience in their field of specialization.

To act as an instructor in surgery for diseases of the head and neck, skull base surgery, and in orthognathic, maxillofacial trauma, facial aesthetic and reconstructive surgeries, the preceptor must be an otorhinolaryngologist with five years of experience in the field, or a non-ENT specialist holding the Title of Specialist in the field who always works in a team with the ENT.

The score for the item "Faculty Title" (item 2 of Chart 3) is obtained from a formula: the percentage of faculty in each title category is multiplied by the corresponding weight, the products are summed, and the total is divided by 100, yielding a value between zero and eight.
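For illustration, the computation below is a minimal sketch of this formula, assuming hypothetical title categories, weights and percentages; the actual categories and weights are those defined in Chart 3.

```python
# Minimal sketch of the "Faculty Title" score (item 2 of Chart 3), assuming
# illustrative category weights; the actual weights are defined in Chart 3.

def faculty_title_score(percent_by_category, weight_by_category):
    """Sum of (percentage in each title category x its weight), divided by 100.

    With the highest weight equal to 8, the result ranges from 0 to 8.
    """
    total = sum(percent_by_category[cat] * weight_by_category[cat]
                for cat in percent_by_category)
    return total / 100.0

# Hypothetical example: weights and a faculty distribution (percentages sum to 100).
weights = {"PhD": 8, "MSc": 6, "Specialist": 4}           # illustrative only
faculty = {"PhD": 25.0, "MSc": 25.0, "Specialist": 50.0}  # % of preceptors per title

print(faculty_title_score(faculty, weights))  # -> 5.5, a value between 0 and 8
```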

Institutions with a graduate program receive a score of two. A score of one is assigned when the instructors only participate in graduate activities at other institutions.

The aspects assessed in item IV (Training Activities) are depicted on Chart 4A-C.

Chart 4A.
Training activities (item IV): lectures, meetings to discuss journal papers and regular discussion of clinical cases.
Chart 4B.
Medical care activities (item IV): residents' participation in the surgeries performed.
Chart 4C.
Training activities (item IV): activities in experimental surgery, research activities and participation in meetings.

Note: points are assigned for the activities involving lectures, presentation of journal papers and discussion of clinical cases only when such activities are offered regularly and in separate sessions.

Chart 4C shows the training activities involving experimental surgery, scientific production of the residents or trainees and participation in conferences.

Note: in the first part of item 7 of Chart 4C, we considered the total volume of resident-borne scientific production in relation to the number of residents. In the second, we considered how many residents participated in the production, in relation to the total number of residents. This distinction helps avoid the possibility of only one resident being responsible for the entire production and the entire student body receiving credit for it.
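As an illustration of this distinction, the sketch below computes both measures from a hypothetical distribution of publications among residents; the names and values are illustrative only.

```python
# Minimal sketch of the two measures described for item 7 of Chart 4C, using
# hypothetical data.

papers_by_resident = {"R1": 2, "R2": 0, "R3": 1, "R4": 0}  # papers per resident (hypothetical)

n_residents = len(papers_by_resident)

# First measure: total resident-borne production relative to the number of residents.
papers_per_resident = sum(papers_by_resident.values()) / n_residents

# Second measure: share of residents who took part in at least one publication.
participation_rate = sum(1 for n in papers_by_resident.values() if n > 0) / n_residents

print(papers_per_resident)   # 0.75 papers per resident
print(participation_rate)    # 0.5 -> half the residents participated
```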

The aspects assessed in Item V (Research Activities and Scientific Production at the Institution) are depicted on Chart 5.

Chart 5.
Research and scientific production activities (item V) of the institution where the residency or specialization program is held. The numbers in blue correspond to the weights assigned to each item analyzed.

Item VI (Student Body) evaluates the performance of graduates from residency or specialization programs at the Board's Title of Specialist in Otorhinolaryngology exam, held by the ABORL-CCF. It is presented on Chart 6.

Chart 6.
Student body (item VI): assessment of performance in the ABORL-CCF Board's Title of Specialist Exam.

Chart 7 presents the items considered in the evaluation of the training program, carried out by the residents or trainees.

Chart 7.
Assessment of the residency or specialization program made by the student body (item VII).

In item VIII (Training Program Self-assessment) there is room for the final remarks of the training program's preceptor (Chart 8).

Chart 8.
Program's self-assessment (item VIII).

(Does not score)


Items VII and VIII received no score. However, they provide an important set of information that allows examiners to analyze aspects of the residency or specialization program not covered in the completed protocol.

The information provided in the completed evaluation protocol must be substantiated. The training programs' preceptors receive in advance a list of documents to be submitted at the time of the evaluation visit. Unsubstantiated information is not considered during the visit (Chart 9).

Chart 9.
Documents to be shown during the visit.

Weights were assigned to each domain evaluated, as per shown on Chart 10.

Chart 10.
Aspects assessed in the Protocol for Assessment and Classification of Residency and Specialization Programs in Otorhinolaryngology in Brazil.

The points allocated to items contained in each domain are then summed up. The resulting value is converted into a score from zero to one hundred, applying the rule of three.

This value is then multiplied by its corresponding weight (Chart 10) to compose the score for each domain.

The training program's final evaluation score varies from zero to one hundred, and is obtained by the weighted average of the scores assigned in all domains.
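For illustration only, the sketch below reproduces this computation with hypothetical domains, point totals and weights; the actual maxima and weights are those defined in the protocol and in Chart 10.

```python
# Minimal sketch of the final grade computation, assuming illustrative domain
# weights and maximum point totals; the actual values are given in Chart 10
# and in the protocol itself.

def domain_score(points_obtained, max_points):
    """Rule of three: convert the raw point sum of a domain to a 0-100 score."""
    return points_obtained / max_points * 100.0

def final_grade(domain_points, domain_max, domain_weight):
    """Weighted average of the 0-100 domain scores; the result lies between 0 and 100."""
    weighted = sum(domain_score(domain_points[d], domain_max[d]) * domain_weight[d]
                   for d in domain_points)
    return weighted / sum(domain_weight.values())

# Hypothetical example with three of the eight domains (weights are illustrative).
points = {"Infrastructure": 30, "Faculty": 18, "Training": 45}
maxima = {"Infrastructure": 40, "Faculty": 24, "Training": 60}
weights = {"Infrastructure": 2, "Faculty": 3, "Training": 4}

print(round(final_grade(points, maxima, weights), 1))  # -> 75.0 in this example
```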

Classification of residency or specialization programs

Residency or specialization programs are classified into levels A, B+, B, C+, C, D and E, according to descending intervals of their final grades.

Institutionalization and process procedure - final stage

The implementation of the Protocol for Assessment and Classification of Residency and Specialization Programs followed a routine bureaucratic and policy procedure, which ensured its acceptance, execution and implementation in the ABORL-CCF's activities program:

Full presentation of the Protocol for Assessment and Classification of Residency and Specialization Programs to the ABORL-CCF Board for appreciation and approval;

Prior proposal of a reassessment, re-registration and reclassification schedule of the institutions according to their initial rank (Chart 11);

Chart 11.
Reassessment schedule of the training programs and sequential procedures.

Presentation at the specialty's Mini-Forum held on August 21, 2004 for appreciation, review and approval.

Note: The Mini Forum is a joint committee formed by the ABORL-CCF Board members, coordinators of all technical committees and guest opinion leaders. They meet regularly to present, discuss and work on the Association's activities program.

Institutional approval made the Protocol for Assessment and Classification of Residency and Specialization Programs in Otorhinolaryngology part of regular ABORL-CCF activities, rather than merely a project of the Teaching, Training and Residency Commission.

After final approval of the proposed methodology, the ABORL-CCF's Protocol for Assessment and Classification of Residency and Specialization Programs and its rules were widely publicized, with special emphasis on the method's impartiality. The purpose of this communication was to encourage its acceptance by the ENT community.

All preceptors of the training programs received a letter of invitation to enroll their residency or specialization programs in the ABORL-CCF's Protocol for Assessment and Classification.

Assessment Team and Visits Schedule

We set up a team of evaluators to inspect the teaching programs enrolled in the protocol: 40 otorhinolaryngologists carefully selected according to their teaching experience in Otorhinolaryngology, working in rotating pairs. The number of programs visited by each pair of evaluators varied according to their availability.

They were all trained simultaneously in a five-hour long program, for presentation and detailed discussion of all items in the protocol and its possible deviations and inaccuracies.

Within a three-week period in August of 2004, all teaching programs were visited by pairs of evaluators, fulfilling a predetermined schedule. The institutions were numbered between 1 and 85.

After the visits were completed, all evaluators were simultaneously gathered for criteria homogenization, presentation of general impressions and pointing out difficulties with the protocol. They all reported their experiences, set comparisons and leveled off differing interpretations.

Assignment of the A, B+, B, C+, C, D and E concepts

The three members of the ABORL-CCF Teaching, Training and Residency Commission reviewed all completed protocols, then compiled, digitized and processed the data in software developed for this purpose. By consensus, they then assigned the final grades for each program and the corresponding classification: A, B+, B, C+, C, D and E.

The programs are ranked by assigning concepts according to ranges of established grades (Table 1). The final classification included, in some cases, subjective aspects, mainly related to ethical issues in medical care and education, which are exclusive prerogatives of the ABORL-CCF.

Table 1.
Classification levels and corresponding grade ranges.

Intermediate or partial grades from each program were kept confidential, in the custody of the Teaching, Training and Residency Commission, and are not to be disclosed.

Institutionalization and process procedures - final stage

The overall results, without identifying the programs, were presented at the Professional Defense Forum held during the Brazilian Congress of Otorhinolaryngology in 2004.

Note: the decisions of this forum, which guide the ABORL-CCF actions, confirmed the Protocol for Assessment and Classification of Residency and Specialization Programs as a tool for defense against malpractice.

The protocol and its overall results, without identification of the participating institutions, were presented and approved at the ABORL-CCF General Assembly, held at the Brazilian Congress of Otorhinolaryngology in 2004. This official acceptance institutionalized the continuity of our Protocol for Assessment and Classification of Residency and Specialization Programs in Otorhinolaryngology in Brazil.

A letter was sent to all preceptors from participating institutions, containing the classification of their training program, their ranking in relation to the general classification, and information about how they ranked vis-à-vis the different items.

The preceptors of programs classified as D or E and all others who requested guidance and clarifications were summoned to a joint debate.

The programs classified as D and E were reassessed in 2006. Those which remained in this classification were reevaluated by the Teaching, Training and Residency Commission in 2007.

The programs initially classified as level C+ and C were reassessed in 2007.

Those ranked as A will be reassessed after five years, and those ranked as B+ or B after four years; their classifications thus remain unchanged for now.

The performance of these programs and their evolution in the ranking scale will be followed.

Rating the programs and their graduating students

The applicants' performance in the ABORL-CCF Board's Exam for the Title of Specialist in Otorhinolaryngology was compared to the ranking of their training programs in 2007. We analyzed the mean grades of those who passed the exam, the failure rate, and the candidates' rank order according to the classification level of their training programs.

Data Analysis

To compare the classification levels and assess whether there are differences between the mean scores of the programs ranked as A, B+, B, C+, C, D and E, we used the Student t-test for independent samples. We adopted a significance level of 5%.

Claim: The mean scores of A-ranked institutions are higher than the mean scores of B+ programs.

Test type: one-tailed.

Hypothesis H0: The mean value of A-ranked programs is equal to that of B+ programs.

Hypothesis H1: The mean value of A-ranked programs is higher than that of B+ programs.

H0 is rejected if the calculated p-value is less than 0.05.
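As an illustration of this procedure, the sketch below runs the same one-tailed comparison on hypothetical program grades, using SciPy's independent-samples t-test (the `alternative` argument requires SciPy 1.6 or later); for the two-tailed comparisons of applicants' scores described in the next paragraphs, the default `alternative="two-sided"` would be used instead.

```python
# Minimal sketch of the comparison described above, assuming hypothetical final
# grades per program; the data are illustrative only.

from scipy import stats

# Hypothetical final grades of A-ranked and B+-ranked programs.
grades_a  = [92.0, 88.5, 90.3, 94.1, 89.7]
grades_bp = [84.2, 86.1, 82.7, 85.5, 83.9, 81.8]

# One-tailed test of H1: mean(A) > mean(B+), independent samples.
t_stat, p_value = stats.ttest_ind(grades_a, grades_bp, alternative="greater")

# Reject H0 (equal means) when p < 0.05.
print(t_stat, p_value, p_value < 0.05)
```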

To compare the mean scores of the applicants to the ABORL-CCF Board's Title of Specialist Exam, and assess whether there are differences between them according to the classification of their training programs, we used the Student t-test for independent samples. We adopted the significance level of 5%.

Claim: The mean scores of alumni from A-ranked programs differ from those of alumni from B+-ranked programs.

Hypothesis H0: The mean scores of alumni from A-ranked programs are equal to those of alumni from B+ programs.

Hypothesis H1: The mean scores of alumni from A-ranked programs differ from those of alumni from B+ programs.

Level of significance: 0.05.

H0 is rejected if the calculated p-value is less than 0.05.

Comparing performances in the ABORL-CCF Board's Title of Specialist Exam, we calculated the correlation coefficient between the curve of the mean scores of the candidates and the curve of the mean scores of their training programs.
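For illustration, the sketch below computes a Pearson correlation coefficient between two hypothetical curves of mean scores ordered by classification level; the values are illustrative only and do not reproduce the study data.

```python
# Minimal sketch of the correlation between the two curves compared in the study,
# with hypothetical mean values per classification level (A, B+, B, C+, C, D, E).

from scipy import stats

# Hypothetical mean scores of the training programs (divided by ten, as in the
# paper) and of their successful alumni, ordered A, B+, B, C+, C, D, E.
program_means = [9.1, 8.6, 8.0, 7.4, 6.9, 6.1, 5.4]
alumni_means  = [7.8, 7.5, 7.3, 7.0, 6.9, 6.6, 6.4]

r, p = stats.pearsonr(program_means, alumni_means)
print(r)  # a coefficient close to 1 indicates the curves share the same trend
```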

The remaining data was analyzed in a descriptive fashion.

RESULTS

We found 85 residency or specialization programs in ENT in Brazil, of which 82 agreed to participate in the ABORL-CCF's Protocol for Assessment and Classification and were evaluated. Sixty were medical residency programs in Otolaryngology accredited by MEC, 25 were specialization programs accredited by the ABORL-CCF, and 19 were residency programs accredited by both. Three educational institutions (33, 40 and 43), accredited only by MEC, refused to participate in the evaluation.

In order to avoid misleading comparisons, the scores that determined the classification of each training program shall not be disclosed, by decision of the ABORL-CCF. The final grades of each training program were grouped by levels of classification.

The mean scores of the different levels of classification are presented on Chart 12.

Chart 12.
Mean values of the final grades according to the training program ranking in 2004.

The statistical analysis comparing the mean scores of the programs, according to their classification is presented on Table 2.

Table 2.
Statistical analysis of the comparison among the mean grades of programs ranked as A, B+, B, C+, C, D and E in 2004.

The significant differences found in these comparisons indicate that the classification system qualitatively discriminates among the residency and specialization programs evaluated.

Chart 13 shows the training programs classified according to the established criteria.

Chart 13.
Ranking of residency or specialization programs in Otorhinolaryngology according to the ABORL's Protocol for Assessment and Classification employed in 2004.

The distribution of the general qualitative classification of medical residency or specialization programs in Brazil is presented in Figure 1.

Figure 1.
Numerical and percentage distribution of the general qualitative classification of medical residency or specialization programs in ENT in Brazil in 2004.

The distribution of the qualitative classification of programs in Southeastern Brazil is presented in Figure 2.

Figure 2.
Numerical and percentage distribution of the qualitative classification of medical residency or specialization programs in ENT in Southeastern Brazil in 2004.

The classification of training programs in Southern Brazil is presented in Figure 3.

Figure 3.
Numerical and percentage distribution of the qualitative classification of medical residency or specialization programs in ENT in Southern Brazil in 2004.

The classification of programs in the Northeast of Brazil is presented in Figure 4.

Figure 4.
Numerical and percentage distribution of the qualitative classification of medical residency or specialization programs in ENT in Northeastern Brazil in 2004.

The distribution of the qualitative classification of training programs in the Brazilian Midwest is presented in Figure 5.

Figure 5.
Numerical and percentage distribution of the qualitative classification of residency or specialization programs in ENT in the Midwest region of Brazil in 2004.

The qualitative classification of the programs in the Northern region of Brazil is presented in Figure 6.

Figure 6.
Numerical and percentage distribution of the qualitative classification of medical residency or specialization programs in ENT in Northern Brazil in 2004.

The comparative percentage distribution by qualitative classification and geographic region is shown in Figure 7.

Figure 7.
Comparative percentage distribution of residency or specialization programs in Otolaryngology by region of Brazil, according to the qualitative classification in 2004.

The numerical and percentage distribution of programs by state, according to the ABORL-CCF's qualitative classification in 2004 is depicted on Table 3.

Table 3.
Numerical and percentage distribution per Brazilian state, in relation to the entire country, according to the qualitative ranking of residency or specialization programs in otorhinolaryngology in 2004.

The numerical and percentage distribution by state, of the qualitative ranking of the training programs is depicted on Table 4.

Table 4.
Rankings numerical and percentage distribution of residency and specialization programs in Otorhinolaryngology in 2004, per state in Brazil.

The state of São Paulo concentrates 60% of the country's A-ranked programs. Overall, 70.5% of its 34 training programs were ranked as A, B+, B or C+ in 2004.

Of the 13 accredited programs in the state of Rio de Janeiro, 69.3% were ranked as C or D.

The state of Minas Gerais has 11 ENT training programs, of which 63.7% were ranked as B+, B or C+.

In the state of Rio Grande do Sul, with four residency programs, 100% were ranked as A, B or C+.

Sixty percent of the five training programs in Paraná were ranked as A or B.

In Bahia, where there are five programs, 60% were classified as level D.

Qualitative, numerical and percentage distribution of residency or specialization programs in states with less than three accredited institutions can be seen on Tables 3 and 4.

The distribution of training programs exclusively accredited by MEC or by the ABORL-CCF in 2004, according to their rating, is presented on Figure 8.

Figure 8.
Numerical distribution of medical residency or specialization programs in Otolaryngology according to their accreditations from MEC or ABORL-CCF and qualitative classification in 2004.

The comparative distribution of the programs previously classified as E, D, C and C+ in 2004, three years after the implementation of the Protocol of Assessment, is presented on Figure 9.

Figure 9.
Comparative distribution of the classification of medical residency or specialization programs in ENT in 2007 after the reclassification of programs previously ranked as E, D, C and C+ in 2004.

Figure 10 shows the proportional distribution of the applicants ranked among the top 50% and bottom 50%, who passed the ABORL-CCF Board's Title of Specialist Exam, according to the qualitative classification of their training institutions in 2006.

Figure 10.
Proportional comparative distribution of the applicants ranked among the top 50% and the bottom 50% who passed the ABORL-CCF Board's Title of Specialist Exam, according to the qualitative classification of their training institution. N/A: programs not assessed.

Figure 11 shows the mean values of the grades obtained by successful applicants in the ABORL-CCF Board's Title of Specialist Exam in Otolaryngology in 2007, according to the qualitative classification of their training programs.

Figure 11.
Average grades in the ABORL-CCF Board's Title of Specialist Exam according to the qualitative ranking of the applicant's training institution, in 2006.

The comparison among the mean scores of the applicants, according to their training institution, was statistically analyzed and the results are shown on Table 5.

Table 5.
Statistical analysis of the comparison among the mean grades of the graduates at the 2006 Title of Specialist Exam, according to the ranking of their training program.

The mean scores of graduates from A-rated institutions are different from the mean scores of alumni from C- and D-rated programs. Likewise, the mean scores of alumni from B-rated programs are different from the scores of those from C-rated institutions.

There was no statistically significant difference between the means of successful candidates in the ABORL-CCF Board's Title of Specialist exam in 2007, when compared to the classification of their training programs, in the remaining comparisons.

Figure 12 illustrates the curve of mean scores from the successful candidates to the ABORL-CCF Board's Title of Specialist Exam in 2006 and the curve of the mean scores assigned to their training programs. The training programs' mean scores were divided by ten to make it easier to graphically compare them with the alumni mean scores.

Figure 12.
The mean grades curve of successful applicants to the ABORL-CCF Board’s Title of Specialist Exam in 2006 and curve of the mean scores assigned to their training programs.

In the graph, we can notice that the two curves have the same trend. As the institutions' ranks decrease, the mean scores of their alumni also decrease.

We calculated the correlation between the mean scores of the institutions and the mean scores of their alumni in the ABORL-CCF Board's Title of Specialist Exam in 2007 and obtained a coefficient of 0.92, which strongly indicates that the curves follow the same trend.

Figure 13 shows the number of applicants who failed and the percentage of failures in the Exam that granted the ABORL-CCF Board's Title of Specialist in 2006, distributed according to the qualitative classification of the training institution.

Figure 13
Numerical and percentage distribution of the applicants who failed the ABORL-CCF Board’s Title of Specialist Exam in 2006, according to the ranking of the applicant's training program. N/A = programs not assessed.

Graphically, we can see that the failure rate is lowest in A-ranked institutions and highest in those that refused to participate in the ABORL-CCF's Protocol for Assessment and Classification.

The qualitative evolution of the programs previously classified as E, D, C and C+ in 2004, and after their subsequent reclassifications, are presented on Chart 14.

Chart 14.
Qualitative evolution of residency and specialization programs ranked as E, D, C and C+, after the reclassifications in the corresponding periods.

Three institutions previously ranked as C or C+ in 2004 declined the reevaluation. Three other institutions, which had refused to participate in the program in 2004, remained outside the assessment protocol.

DISCUSSION

Any assessment process is subject to error; it must remain open to change and be continually improved. Although imperfect, it is better than no evaluation at all.

There are no published reports on methodical evaluation of medical residency or specialization programs prior to the Protocol for Assessment and Classification of Residency or Specialization Programs developed by the ABORL-CCF.

This process evaluated the training structure and the pedagogical proposal in its various aspects. Its objective was not to evaluate the quality of the service in which the program is embedded, but rather the medical residency program developed there. Likewise, this assessment does not cover the quality of the tutors, their reputations or their medical activities.

In otorhinolaryngology, the previous lack of evaluation criteria made earlier opinions and impressions mere speculation. These were established notions, without defined bases, influenced by the collective imagination formed from the projected image of the services that offered residency or specialization programs.

Briefly, they were just favorable or unfavorable prejudices.

Habituation to the spontaneously adopted modus operandi, added to the lack of quality metrics, were hurdles to the development of teaching programs in medical residency. The performance of the applicants in the ABORL-CCF Board's Title of Specialist exam was not considered an indicator of the residency programs' quality, despite the evidence. The success or failure of an applicant on these tests was intuitively interpreted as evidence of his/her own merits. Their training institutions went unnoticed as directly responsible for such performance.

One of the merits of the Protocol for Assessment and Classification of Residency and Specialization Programs in ENT was to provide information that may correct this imbalance. Shielded by the good reputation of their preceptors, training programs cropped up that prioritized the mere provision of medical care in ENT, unaccompanied by a pedagogical offering and lacking focus on specialized training. This reality sometimes concealed weaknesses induced by the injudicious and mercantile ordering of unnecessary complementary tests, which nonetheless functioned as cash-making machines. In specific circumstances, the students were trained in practices effectively aimed at extracting undue advantage from the Brazilian Public Healthcare System (SUS). By decree and by definition, medical residency is a mode of non-degree graduate education, characterized by on-the-job training on a full-time basis, in university settings or not, under the tutelage of highly qualified and ethical medical professionals.

The lack of information, the strenuous workload and the academic isolation of some training programs hindered their self-assessment. These conditions limited the critical observation, by residents and preceptors, of the appropriateness and adequacy of their training programs. In extreme cases, this adverse reality was often only perceived when their alumni failed the Title of Specialist Exam.

It is forbidden to use the term "Medical residency" to designate any medical training program that has not been approved by the National Commission on Medical Residency (NCMR). These are termed "Specialization Programs" and are accredited by their own Specialty Societies.

Only students coming from specialization programs are required to take the Board's Exam to obtain the Title of Specialist, conferred by the Specialty Societies. Graduates from residency programs automatically receive the Title of Specialist, conferred by MEC at the end of their training, without the need for a Board's Exam.

Historically, the ABORL-CCF, like other Specialty Societies, strengthened the reputation of its Exam as a means to recognize professional quality and good academic background. For several years now, resident physicians have also been taking the ABORL-CCF Board's Title of Specialist Exam. This practice enabled us to interpret the success and failure rates of residency programs as potential metrics of the quality of the training program the student attended.

Master's and Doctoral programs are periodically evaluated by CAPES with rigor and method, and they can move up the rankings that qualify them or be penalized with termination. As an education regulator, and despite the importance and complexity of training skilled professionals, the NCMR does not have a sensitive and effective assessment system that promotes qualitative improvement of the programs or penalizes poor performance. The Title of Specialist is granted automatically to graduates of the programs accredited by MEC, with no criteria to assess proficiency at the end of training. The NCMR certainly decertifies a residency program that does not provide the mandatory regular scholarship pay or does not respect working hours, vacation time and other formalities, but it hardly finds and punishes a program that does not develop an appropriate pedagogical program. It lacks structure, a method universal to the medical specialties, and enough personnel trained for this purpose.

We believe that the NCMR plays a fundamental regulatory role in ensuring and maintaining the minimal institutional conditions that guarantee the legal development of residency programs and adequate working conditions for medical residents. However, the scientific and technical contents of the training are not covered in depth by the general guidelines issued by the NCMR. This difficulty arises from the multiplicity of existing medical specialties and the lack of a compatible structure within the NCMR.

In July 2005, the NCMR published Resolution Nº. 2, which provides for its structure, organization and operation. It standardized the invitation of representatives of the Medical Societies to join its Technical Assistance Body. This participation is an open channel for the progressive contribution of the Specialty Societies. In the same resolution, it established the minimum requirements for the institution and for the residency program. These requirements are general guidelines essential to the functioning of the program, but they do not cover all the particulars of each medical specialty.

Through Resolution NCMR 01 of January 2006, the NCMR addressed the structure, organization and functioning of the State Commissions on Medical Residency (SCMR), created in 1987. It confirms that the SCMR are agencies that report to the NCMR, with decision-making power regarding medical residency issues within each state. The SCMR are responsible for guiding and analyzing the processes of accreditation and reaccreditation, requests for optional years, and increases in the number of openings; for suggesting measures to improve program performance; for monitoring the selection process; for managing transfers; and for conducting on-site inspections to fulfill these purposes. Additionally, they oversee compliance with the minimum criteria established by the NCMR, report irregularities, propose sanctions, judge the penalties imposed by the local medical residency committees (COREME) on the institutions that offer residency programs, hear appeals arising from the selection process, and perform other functions.

When we consider the number of existing specialties, each with its own distinct characteristics, and the multiplicity of residency programs, the inability of these agencies to assess the content of training and the practical aspects of the residents' daily activities becomes clear.

In December 2004, the National Commission on Medical Residency (NCMR), the Department of Medical Residency and Special Projects in Health at the Secretariat of Higher Education (SESu/MEC) and entities associated with medical residency programs organized the 1st National Forum on Medical Residency. Among other issues, there was an intention to revisit the situation of medical residency in Brazil and the criteria used to assess residency programs. At the time, only a very generalized assessment existed, which confirmed the need to establish criteria, their current absence, and the inability of MEC to perform assessments without the participation of the relevant specialty societies. Among the difficulties pointed out, one that stood out was the deep difference in development among the medical associations regarding the need to homogenize criteria and to develop and apply evaluation methods. This inequality prevented the widespread participation of these associations in a joint action with the NCMR.

In September of 2006, the NCMR issued the Resolution NCMR Nº. 06, which provides for the assessment of medical residency programs.

It confirmed that medical residency is a mode of graduate education, created and regulated by federal law, whose purpose is to train doctors in service under proper supervision. It aims to meet the country's need for the training of qualified professionals within the medical field. This educational mode should be regularly assessed through appropriate instruments in order to tailor and improve the content of educational and medical care programs. For this purpose, it must use qualifiers that ensure maximum reliability and minimal outside interference in the evaluation itself.

It establishes that residency programs should be evaluated at least every five years, in order to renew their accreditation. The five-year assessments will cover the following areas: infrastructure, the educational program, the faculty, the student body and the contribution to the development of the local healthcare system.

Evaluations shall be carried out in situ, by a visiting commission using assessment instruments approved by the NCMR.

The SCMR is responsible for appointing the Medical Residency Assessment Commission, which should preferably be composed of at least one member of the SCMR; one member appointed by the Specialty Society of the program, which must in turn be affiliated with the Brazilian Medical Association (AMB); a representative of the local public healthcare management system, appointed by the State Health Department; and one resident physician appointed by the National Association of Resident Physicians (ANMR).

In weighting the points awarded to the evaluated aspects, the program content and infrastructure account for 40%, the faculty for 30% and the student body for 30%.

Programs with a performance index greater than 50% will be reaccredited for five more years. Those with a performance index between 25% and 50% will be placed under diligence and must be reviewed within two years. If the performance index is less than 25%, the program will be terminated.
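As an illustration of this rule, the sketch below combines hypothetical component scores with the 40/30/30 weighting and applies the reaccreditation thresholds described above; the component names and scores are illustrative only.

```python
# Minimal sketch of the weighting and reaccreditation rule summarized above;
# component scores are hypothetical 0-100 values.

WEIGHTS = {"program_and_infrastructure": 0.40, "faculty": 0.30, "student_body": 0.30}

def performance_index(scores):
    """Weighted performance index (0-100) of a residency program."""
    return sum(scores[k] * w for k, w in WEIGHTS.items())

def reaccreditation_decision(index):
    """Apply the thresholds described in the text."""
    if index > 50:
        return "reaccredited for five more years"
    if index >= 25:
        return "placed under diligence; reviewed within two years"
    return "program terminated"

scores = {"program_and_infrastructure": 60, "faculty": 45, "student_body": 40}  # hypothetical
idx = performance_index(scores)
print(idx, reaccreditation_decision(idx))  # 49.5 -> diligence review
```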

As with the NCMR, the SCMR lack sufficient structure, a method universal to the medical specialties, and enough trained personnel.

This evaluation process is based on fulfilling the minimum requirements set by the NCMR. Being minimum requirements, they must be fully met. There is a clear leveling at minimum standards, which does not distinguish excellence from minimum sufficiency. Since it does not precisely identify the deficiencies of the weaker programs, it does not provide consistent guidance for improving the training program.

This commendable initiative, though belated, attests at most to the minimal proficiency of teaching programs; it does not rank them by quality as CAPES does for Master's and Doctoral programs.

The ABORL-CCF anticipated that resolution and developed its Protocol for Assessment and Classification of Medical Residency or Specialization Programs (PACRE). The development of this protocol included a series of technical and administrative procedures, all important to its implementation and effectiveness.

Initially, we created the protocol as an evaluation tool complementary to the ABORL-CCF "Minimum Requirements" for the accreditation of specialization programs. During the preparation of this assessment protocol, we agreed on the need to redefine the minimum requirements for accreditation, since most existing programs could not meet those conditions, which therefore could not be considered minimal.

The evaluation protocol was developed over three months, in weekly meetings of the ABORL-CCF Teaching, Training and Residency Commission, after rigorous and insightful discussions and evaluations. The members considered the various aspects that influence the quality of training programs across the three major dimensions of their scope: teaching, care and research.

Within item: "General Description of the Program", they also evaluated the academic environment.

Learning opportunities are different in a university hospital of tertiary or quaternary complexity, where residency programs in other specialties promote the exchange of knowledge, where ethics committees provide oversight, and where the continuous pursuit of qualification is a prerequisite of academic institutions. Even outside universities, which have teaching and research as their main reasons for existing, general hospitals can develop effective training programs by providing appropriate equipment and technology, excellence in their structures, and standardized protocols that make up for the lack of academic organization. However, these do not seem to be the teaching conditions offered to the majority of medical residents or trainees in Otolaryngology in Brazil. Many specialization programs are provided in smaller hospitals and private clinics, mainly geared to providing specialized medical care, without an environment that fosters learning beyond that generated by the vocation and dedication of their preceptors.

It is a fact that Brazilian universities, public and private alike, do not meet the demand for openings in otorhinolaryngology residency. Specialization programs developed in isolated hospitals and private clinics are numerically significant in the training of Brazilian ENTs. Accurate outside guidance, geared to the conceptual and practical aspects of professional training, is necessary even within the university environment and becomes essential outside of it, where issues related to education are not a priority. Adequate physical facilities and equipment for diagnosis, treatment and rehabilitation are essential to good medical care and are important in professional training. Libraries and access to the Internet and to scientific publications are essential in a specialty training program.

However, an exquisite infrastructure is no guarantee of effective training. There are situations in which equipment and technology are merely incentives for unnecessary complementary tests whose main purpose is to generate profit, creating a moral deviation in the exercise of professional training.

The greatest difficulty in creating the protocol lay in evaluating the faculty. The number of professors, their titles and the contractual time dedicated to the institution are inadequate indicators, for they do not capture the complex relationship established between student and professor, which often transcends the formal exchange of expertise. The solution adopted was to assess the available conventional indicators of quantity and titling, combined in a formula of proportionality.

Under "Training Activities" we assessed the syllabus, with the theoretical aspects of training and development of outpatient and surgical activities. The existence of regular lectures, clinical meetings and discussion of the scientific literature are elements of difficult qualitative and quantitative evaluations, except by relative opinion from the resident physicians.

In the evaluation of outpatient care activities, we sought to differentiate a minimum volume of patients that ensures learning from the undesirable condition in which the resident is exposed to intensive care delivery at the expense of the time available to develop other aspects of training. The protocol also differentiated the mode of participation in surgical activities, whether as primary surgeon or as a mere observer of procedures performed by others, in addition to training opportunities involving experimental surgery in a laboratory.

Participation in scientific research is not essential to learn the specialty; however, it helps develop observation skills, judgement and discipline. It teaches that learning does not end with the end of residency or specialization; it is an ongoing and never ending process in the practice of medicine. The protocol evaluated positively the institutions that offer students opportunities for participation, development and dissemination of scientific research, as well as participation in the main meetings of the specialty.

The generation of scientific knowledge enriches the academic environment and encourages the adoption of quality procedures and standardization. These aspects were evaluated under "Research Activities and Scientific Production of the Institution", favoring programs that develop them. Although considered for its relevance, this item had less weight in the balancing of the grades when compared with the other aspects involved in the specialist's training.

The students were heard separately, in direct interviews with the evaluators, and in most cases all aspects of the assessment protocol were freely discussed. At the end of the interview, they anonymously filled out the "Assessment of the Teaching Protocol by the Students" form. Interaction with the students provided fundamental subsidies for the evaluators to verify the reliability of the information presented. This procedure established a more direct relationship between the students and the ABORL-CCF Teaching, Training and Residency Commission, materialized in frequent exchanges of information and electronic mail.

The detailed and individual analysis of the metrics considered in the classification and assessment protocol, although important for finding shortcomings in training programs, is beyond the scope of this paper.

The protocol is not precise in the qualitative assessment of the teaching provided in the training programs. Ethical issues, important in medical training, are difficult to judge or document in an objective protocol. The interviews with the residents often revealed embarrassment, inhibition and even deliberate refusal to provide information that might be unfavorable to the program visited. However, in some programs, especially in the best-regarded university services, they seemed more at ease discussing the program. Interviews with the residents or interns were fundamental for perceiving the reality around them and, at times, determinant for detecting settings manipulated to make a good impression, masking the lack of equipment, a poor theoretical program, the unavailability of preceptors and even an insufficient number of patients to meet the minimum requirements set by the ABORL-CCF.

The lack of a suitable environment for training and learning sometimes escapes the confines of a protocol; however, it is readily perceived by evaluators trained for this purpose.

Entry into this particular and complex universe demanded extreme care in the choice of the evaluators who visited the services and their residency or specialization programs on site.

The evaluators were selected by appointment from the heads of the most recognized university services or directly invited by the SCMR. We looked for a specific profile of professional: ENT physicians participating in or already graduated from graduate programs, familiar with medical residency programs, with a recognized ability to work with discretion and objectivity, not overexposed in conferences and association activities, and available to travel around the country. With these features, we tried to avoid relationships of friendship or closeness between evaluators and the coordinators of the programs evaluated or their Department Heads. Much importance was assigned to these aspects to ensure objectivity, impartiality and procedural reliability, and to rule out possible favoritism or biased rejections. It was surprising to note the great involvement of the participants, their serious commitment and their desire to contribute, revealing true citizenship.

Differences in the rigor of the evaluating teams were indeed expected. We attempted to form pairs of evaluators with complementary profiles. All assessments were then discussed among all evaluators, called to a general meeting specifically for this purpose, in order to achieve the greatest possible equalization of criteria.

Submission to and approval by the specialty's Mini-Forum was a strategy that, in addition to being an exercise in democracy and institutional regularity, garnered the approval of the principal preceptors of the specialization and residency programs, present or represented at the forum. In that environment, favorable to the adoption of ideas and regulatory actions, it was possible to present in detail all of the protocol's actions and intentions.

The presentation given later at the Professional Defense Forum sealed the commitment previously agreed upon and approved at the specialty's Mini-Forum. Once again, it enabled the detailed presentation of the program to a larger number of leaders and opinion makers.

When the results of the Assessment Protocol were presented at the General Assembly of the ABORL-CCF, they met with no opponents or critics. The protocol was widely heeded and endorsed as a program of strategic importance for the future of the specialty.

In this protocol, the different aspects that influence the quality of the residency or specialization programs or internships are evaluated separately, which enables the identification of weaknesses and strengths in the system. Finding the problems is a fundamental step to solve the shortcomings. The programs that obtained the highest ratings also had shortcomings that can be corrected and improved.

Several programs at the other end of the scale were rated less favorably and classified as insufficient, despite the good impression previously created by the reputation of their preceptors. These programs received timely information about their weaknesses and guidance on the necessary remedial actions.

The wide acceptance and almost complete absence of criticism of the program reflected the understanding that the lack of prior criteria did not allow for a well-founded judgment, and that quality assessment and ratings are useful tools for those committed to remedying deficiencies.

This evaluation method is multifactorial in concept. Training programs should not be compared on single items; a service that offers polysomnography, for example, should not automatically be ranked above another that does not, because all features must be analyzed jointly.

The evaluation of a teaching program is often interpreted as an evaluation of the institution where it is taught. Likewise, it may be perceived as a professional assessment of the program's faculty or preceptors. These are, nonetheless, distinct concepts. Neither the institution nor its staff necessarily reflects the quality of the education it provides. Renowned institutions, as well as renowned and prestigious preceptors, may offer poor-quality programs. Conversely, lesser-known institutions can provide residency or specialization programs deeply committed to the scientific, technical, ethical and moral training of their students.

Medical schools and teaching hospitals have teaching as their main reason for existence. They maintain committees on education, medical residency, medical ethics, research and other matters, which serve as permanent forums for discussion, critique, evaluation and proposals. A completely different environment is found in private clinics and general hospitals, whose main focus is to provide medical care.

Medical schools do not provide enough openings to meet the demand generated by the large number of medical students graduating annually and seeking specialization through residency programs. Consequently, private clinics play an important role in the training of otorhinolaryngologists in Brazil.

In our country, 53 residency or specialization programs in Otorhinolaryngology are held at private clinics and general hospitals; only 32 are associated with medical schools and universities. The need for tools to assess the quality of the training provided, and for information to aid its improvement, is evident.

This need is often ignored by institutions whose primary goal is to provide medical care. The very definition of medical residency contributes to this: on-the-job training (OJT).

OJT must necessarily be complemented with theoretical teaching, training in the principles of good care, ethical use of the technology available for complementary tests and clinical diagnosis, and the development of critical judgment vis-à-vis the avalanche of scientific and pseudo-scientific knowledge produced every day.

Medical schools and universities are accustomed to teaching assessment processes, even if inadequate ones. Private clinics and general hospitals usually do not undergo this process and intuitively transfer their prestige to their training programs, without the support of adequate quality indicators.

When we designed the PACRE, our first concern was the rejection it might face. Institutions that agreed to participate could later discredit it when faced with an unfavorable rating. Imagine respected professionals from reputable academic and private services, with highly sought-after training programs, having poor-quality programs unveiled and finding themselves at a comparative disadvantage; it could certainly trigger a chain reaction.

As an essential part of the method used to develop and deploy the PACRE, we set up a strategy of progressive and successive approvals that secured the acceptance of the main political and scientific leaders of the Brazilian ENT community. The communication and deployment of this evaluation and classification methodology, based on clear and predetermined rules, paved a path of no return. Without this step, the others would not have been feasible.

Our PACRE was broadly accepted, with 96.5% participation of the Brazilian otorhinolaryngology teaching/training institutions.

Strict compliance with the established rules gave the protocol credibility and turned it into the reference for improving residency or specialization programs in Otorhinolaryngology throughout the country.

Teaching assessment is a complex topic, subject to wide reflection by theorists and educators. Two realities emerge from it: the lack of a single, universal, accurate and infallible method and the need for assessment as a tool for personal and institutional improvement.

Medical education covers rather complex aspects, including ethics, morals, philosophy, science, administration, and resource management.

The National Commission of Medical Residency was established more than 30 years ago, by Decree No. 80,281 of September 5, 1977, during the administration of President Ernesto Geisel. From the beginning, it was assigned the task of regularly assessing programs in view of their performance relative to training needs and health care at the national or regional level. In addition to recognizing the need for evaluation, the decree also established that a healthcare institution accredited to offer a residency program, but not linked to the organized education system, should be associated with a medical school or university, seeking mutual collaboration in the development of medical training programs. Even where direction was given, the syllabus was only vaguely addressed: "residency programs must include a minimum of 10% and a maximum of 20% of theoretical and practical activities in their workload, in the form of update sessions, seminars, clinicopathological correlations or others, according to predetermined programs".

In that same year, the NCMR standardized the minimum requirements for medical residency programs in 55 specialties and areas of expertise, establishing that in Otorhinolaryngology 15% of the annual workload should be assigned to an inpatient unit, a minimum of 25% to outpatient care, a minimum of 20% to emergency care and a minimum of 20% to the operating room. It also established required internships in buccopharyngology, stomatology, laryngology, otology, neurotology, rhinosinusology, tumors of the face, neck and skull base, trauma surgery and facial cosmetic surgery, and ENT urgencies and emergencies. It further defined the required facilities and equipment: audiometer, impedance meter, resources for conditioned audiometry, electronystagmography, auditory evoked potentials, 30-degree nasal telescope, 70-degree laryngeal telescope and flexible nasopharyngolaryngoscope.

These are generic requirements that do not ensure the quality of training and cannot be considered minimum requirements for a quality program.

In 2006, the Brazilian Federation of Gynecology and Obstetrics Associations (FEBRASGO) introduced an evaluation system for residency programs in obstetrics and gynecology analogous to the one implemented by the SBORL in 2004. The evaluation process rates several physical, educational, healthcare and human-resource aspects as excellent (E), good (B), regular (C) or insufficient (I). It establishes a periodicity for reassessments that varies with the qualification of the training program and, like the ABORL-CCF, proposes a strategy for program improvement and adaptation.

In April 2007, the NCMR constituted a Special Committee to prepare and propose the methodology and criteria to standardize and update the syllabi of medical residency programs, as well as to standardize the monitoring visits they should receive. This committee was to submit a final report within six months.

In the PACRE, created by the ABORL-CCF, the ranges of scores corresponding to the concepts A, B+, B, C+, C, D and E are unequal and were determined so as to establish differences between categories, starting from the extremes of the scale. Having seven categories, some with very subtle differences between them, serves as a stimulus for developing and improving the training programs. Grouping programs into fewer, wider intervals, with greater distance between categories, might encourage complacency.
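To make the idea of unequal score bands concrete, the sketch below maps a semiquantitative total score onto the seven PACRE concepts. The cut-off values and the 0-100 scale are hypothetical illustrations only; the actual PACRE score ranges are defined by the ABORL-CCF and are not reproduced in this article.

```python
# Hypothetical sketch: mapping a semiquantitative evaluation score (assumed 0-100 scale)
# onto the seven PACRE concepts. The cut-offs below are illustrative only; the real
# PACRE ranges are intentionally unequal and defined by the ABORL-CCF.

HYPOTHETICAL_CUTOFFS = [
    (90, "A"),   # narrow band at the top of the scale
    (85, "B+"),
    (75, "B"),
    (70, "C+"),
    (60, "C"),
    (45, "D"),
]  # scores below the last cut-off receive concept "E"


def classify(score: float) -> str:
    """Return the first concept whose (hypothetical) lower bound the score reaches."""
    for lower_bound, concept in HYPOTHETICAL_CUTOFFS:
        if score >= lower_bound:
            return concept
    return "E"


if __name__ == "__main__":
    for s in (95, 88, 72, 50, 30):
        print(s, "->", classify(s))  # e.g. 95 -> A, 88 -> B+, 72 -> C+, 50 -> D, 30 -> E
```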

There were statistically significant differences among all the categories, revealing the ability of the adopted method to discriminate differences between the training programs evaluated, even between categories that are very close to each other.

The State of São Paulo concentrates 42% of the training programs, followed by the states of Rio de Janeiro and Minas Gerais, in direct proportion to the size of their populations and concentration of otorhinolaryngologists.

The best residency programs are developed at universities in São Paulo. All geographic regions of the country have at least one level A program, except the Northern region (Figure 1, Tables 3 and 4). The general qualitative distribution reveals a predominance of programs in intermediate positions of the quality scale. The majority (60%) offer good or satisfactory training conditions (categories A, B+, B and C+). Nonetheless, 17% of the programs offer only regular conditions and 23% are in a critical condition, insufficient for the proper training of otolaryngologists. They are distributed throughout the country, except in the Midwest region (Figures 2-7).

In the absence of external evaluators and criteria for self-assessment, these institutions are not identified as insufficient and continue to attract recently graduated doctors seeking specialization. Applicants are unaware of the conditions of the training provided there; the prestige of the institution or of its preceptor prevails over that of the program, which does not ensure proper learning conditions.

There are factors beyond the scope of this study that call for reflection. While the state of São Paulo concentrates 71% of the well-qualified programs (A, B+, B and C+) and only 12% of the insufficient ones, Rio de Janeiro has 31% well-qualified programs (B or C+), 31% insufficient, 39% bordering on insufficiency and no A or B+ program (Table 3).

Despite the social and urban conditions that mark the city of Rio de Janeiro, favorable conditions for the development of good training programs do exist there. It is necessary to assess whether resident physicians and trainees are being used as cheap labor in exchange for insufficient training in some programs.

The programs initially classified as D and E were reassessed one year and six months later, respectively, as provided in the initial schedule.

Of the two programs classified as E, one was upgraded to D at its one-year reassessment; as it remained at level D, it was barred from opening new spots for students and was set to terminate after completing the training of its current students on January 31, 2008. The other is a residency program accredited by the MEC and therefore not obliged to abide by the sanctions established by the ABORL-CCF; it meets the minimum requirements of the NCMR and maintains its regular activities. This program is taught at a medical school, not in a private practice, and is thus subject to MEC and NCMR regulations.

Of the 17 programs initially evaluated as D, 77% were upgraded to C, and one of them to C+. Two requested early reevaluation within a year and earned upgrades to C+ and B, respectively (Chart 14).

Of the 31 training programs initially assessed as C or C+, four did not participate in the reassessment scheduled two years later. All four are accredited by the MEC and, therefore, were not required to participate.

Paradoxically, the MEC and the NCMR, the regulatory agencies responsible for maintaining the quality of specialist training in Brazil, are sometimes used as an alibi for the refusal of some institutions to modify training programs deemed insufficient.

Among the 30 participants in the reassessment, 11 (37%) moved up the rating scale, four (13%) moved down and 15 (50%) remained unchanged at C or C+ (Chart 14).

The evaluation protocol's ability to detect both the weak and the favorable aspects of residency programs fostered the search for improvement, with guidance on objective grounds. The frequency of assessments led some programs to address the weaknesses in their educational programs, physical structure and other items as a strategy to maintain a good classification or to progress in the overall ratings.

The evaluation protocol became a guide for the pursuit of excellence.

Assessing the performance of graduates in the ABORL-CCF Board's Title of Specialist Exam revealed that the ranking of those who passed the exam follows the ranking of their training institutions. Of the 146 candidates in 2006, the 73 top-ranked graduates were concentrated among the better qualified training programs, with their share progressively decreasing in the less qualified ones. Inversely, the bottom 73 approved candidates came predominantly from less qualified programs (Figure 10).
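The association described above, between the candidates' ranking in the exam and the classification of their training programs, is the kind of relationship usually summarized by a rank correlation. The sketch below computes a Spearman coefficient on invented data, since the individual scores are not reproduced here; it illustrates the type of analysis only and is not the actual PACRE computation.

```python
# Illustrative sketch (not the actual PACRE analysis): a Spearman rank correlation
# between program-quality scores and the mean exam scores of their graduates.
# All data below are made up purely to show the computation.

def ranks(values):
    """Assign 1-based ranks (highest value -> rank 1); ties receive the average rank."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    rank = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            rank[order[k]] = avg
        i = j + 1
    return rank


def spearman(x, y):
    """Spearman rho computed as the Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)


# Hypothetical example: program-quality score vs. mean exam score of its graduates.
program_quality = [92, 88, 76, 71, 63, 48]     # e.g. A, B+, B, C+, C, D programs
graduate_scores = [8.4, 8.1, 7.6, 7.5, 7.0, 6.2]
print(f"Spearman rho = {spearman(program_quality, graduate_scores):.2f}")
```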

A comparison of the mean scores of the successful applicants in the ABORL-CCF Title of Specialist exam revealed a significant difference only at the extremes. The small number of successful applicants coming from level-D programs hinders a proper statistical comparison. In addition, the range of score variation is small because the mean score also includes the grades from the curriculum evaluation and the practical test.

Figure 11 shows that the average grades of applicants who graduated from level-A programs are higher than those of applicants who graduated from lower-ranked institutions. These mean grades decrease in parallel with the average scores of the respective training institutions.

The failure rate of applicants in the ABORL-CCF Board's Title of Specialist Exam is higher among the less qualified training programs than among their level-A counterparts (Figure 13). The correlation between the applicants' performance in the exam and the classification of their training programs attests to the PACRE's validity.

These performance metrics reflect the direct relationship between the quality of the program and that of the graduating resident. They also show that the PACRE discriminates quality and properly ranks the programs evaluated, a direct and significant indicator of the method's effectiveness.

If we also consider that the best programs attract top students, the information provided by the method of classification will enable the applicants to better select their training program.

The programs that could not meet the minimum quality standard (level C) were barred from admitting new applicants and will be disqualified by the ABORL-CCF.

The classification of the programs was published on the ABORL-CCF website, where it can be consulted by newly graduated physicians looking for information on medical residency programs in Otorhinolaryngology in Brazil.

In July 2005, the NCMR published Resolution No. 4, which provides for inter-institutional exchange to support the creation and improvement of medical residency programs in priority specialties in underserved regions of the country. It established that the institutions concerned should enter into an agreement among themselves, subject to approval by the NCMR, primarily in the Brazilian Amazon and the Northeast. The ABORL-CCF classification protocol may serve as a reliable guide for training programs participating in the NCMR's inter-institutional exchange.

We found that the Southeast region concentrates a large number of residency or specialization programs. In parallel, this region also concentrates the most otorhinolaryngologists, making some places heavily saturated, with consequent deterioration of working conditions and employment, as reported by the SBORL 2000 census.

The ABORL-CCF deems it unnecessary to open new training centers in the Southeast; it has restricted the accreditation of new openings in existing programs and of new specialization courses to those previously assessed as level A, B+ or B. The granting of openings and new accreditations is guaranteed to good quality programs but denied to those deemed insufficient.

Since 2004, one hospital in the city of São Paulo and another in the city of Rio de Janeiro have requested accreditation of new specialization programs in Otorhinolaryngology. Both were evaluated using PACRE criteria and did not reach the B rating, the minimum condition for accreditation of new training programs in the states of São Paulo and Rio de Janeiro; they were not accredited and did not start training activities. One specialization program was started in Juiz de Fora, MG, with an initial rating of C+, because it is located outside the regions considered saturated.

One residency program (44), which had been disqualified by the MEC for not complying with the requirements, requested accreditation by the ABORL-CCF. It was evaluated as a new program and did not obtain the B rating required for accreditation; its activities were terminated.

It is empirically estimated that the volume of medical knowledge doubles within a few years.

In the residency program, the trainee must be encouraged to develop a method of continuous learning. It is not enough to simply absorb knowledge and develop skills. It is fundamental to have a tutor, a physical structure, adequate patient demand, an academic atmosphere, a pedagogical model and a suitable preceptorship.

Brazil is a nation under construction, and so are our institutions. This country is vast and unequal and does not stand out for the adoption of standardized and sustainable procedures, much less for their continuity and improvement.

The ABORL-CCF's Protocol for Assessment and Classification of Residency and Specializations Programs is being adopted as a model by the Brazilian Society of Orthopedics, and it is being considered for use by the Argentine Society of Otorhinolaryngology.

The continuous pursuit of qualification, perceived from North to South in the training programs, will certainly benefit the training of Brazilian otorhinolaryngologists, with positive impact on the future of this fascinating medical specialty.

CONCLUSIONS

  • The ABORL-CCF's Protocol for Assessment and Classification of Residency and Specialization Programs has been widely accepted, possibly thanks to its development strategy, characterized by multiple and progressive submissions for institutional approval and broad dissemination of the predefined rules; the Protocol is semiquantitative and capable of discriminating the residency or specialization programs in Otorhinolaryngology by quality;

  • ENT residency or specialization programs in Brazil differ in quality and geographical distribution. The better quality programs are concentrated in the universities of the State of São Paulo. The PACRE identified and located the weaker training programs, those classified as average and the top-quality ones in Brazil;

  • There was an improvement in the ranking of most training programs initially identified by PACRE as being weak;

  • The ranking order of the successful applicants to the ABORL-CCF Board's Title of Specialist exam tends to follow the classification order of their training institutions;

  • The average scores of successful applicants for the Title of Specialist showed the same classification trend as their training institutions;

  • The percentage of applicants who failed the ABORL-CCF Board's Title of Specialist exam is much lower among A-level programs;

  • The trend in the correlation between the applicants' performance in the ABORL-CCF Board's Title of Specialist Exam and the classification of their training programs indicates that the Protocol for Assessment and Classification of Residency and Specialization Programs of the ABORL-CCF was able to discriminate them by quality.

ACKNOWLEDGEMENTS

Dr. José Victor Maniglia; Dr. José Alexandre de Médicis; Dr. José Francisco Figueiredo (in memoriam); Dra. Shirley Pignatari and Dr. Almiro José Machado Júnior.

REFERENCES

  • 1
    Sociedade Brasileira de Otorrinolaringologia. Censo 2002. São Paulo: Sociedade Brasileira de Otorrinolaringologia; 2002. 87p.
  • 2
    Brasil. Coordenadoria de Aperfeiçoamento de Pessoal de nível Superior-CAPES - Resoluções [Acessado 19 outubro 2013]. Disponível em: http://www.capes.gov.br/sobre-a-capes/legislacao/2341-resolucoes
  • 3
    Comissão Estadual de Residência Médica - CEREM [Acessado em 19 de outubro de 2013]. Disponível em: http://www.cerem.org.br/
  • 4
    Dias Sobrinho J. Paradigmas e políticas de avaliação da educação superior. Autonomia e heteronomia. In: Universidad e investigación científica: convergências y tensiones. Vessuri H, org. Buenos Aires: CLACSO, Consejo Latinoamericano de Ciencias Sociales; 2006.
  • 5
    Dias Sobrinho J. Universidade e Avaliação Entre a ética e o mercado. Florianópolis: Insular; 1997. 54p.
  • 6
    Entrevista de Antonio Carlos Lopes. J Clin. 2004;66:1.
  • 7
    Brasil. Ministério da Educação. Resoluções e atas da Comissão Nacional de Residência Médica- CNRM [Acessado em 18 de outubro de 2013]. Disponível em: http://portal.mec.gov.br/index.php?option=com_content&view=article&id=12263&Itemid=507
  • 8
    Crespo A. The training of otorhinolaryngologists in Brazil. Braz J Otorhinolaryngol. 2008;74(6):802.

Publication Dates

  • Publication in this collection
    Sep-Oct 2013

History

  • Received
    12 Sept 2013
  • Accepted
    14 Oct 2013