Primary Health Care Planning workshops: construction and validation of an assessment instrument.

OBJECTIVE
to describe the stages of construction and content validation of an instrument to assess Primary Health Care Planning workshops.


METHODS
this methodological study focused on validating the instrument's content. The instrument developed was assessed by a committee of experts using the Delphi Technique, in two rounds. For the degree of agreement, percentage agreement and Content Validity Index (CVI) were used.


RESULTS
in the first round, six experts participated, and the degree of agreement was 87% for clarity and 94% for representativeness. In the second round, five experts participated, the CVI was 0.95 for clarity, 0.97 for representativeness and 0.96 total CVI. The final instrument had 42 items divided into three chunks.


CONCLUSION
the instrument has content validity to assess Primary Health Care Planning workshops, being a tool for the use of state and municipal administrations.


INTRODUCTION
A health system that aims to meet the needs of users must organize the offer of its services and practices to achieve this objective. However, more than 20 years after the implementation of the Brazilian Unified Health System (Sistema Único de Saúde, abbreviated SUS), sector analysis reveals low effectiveness in the provision of services and other limitations resulting from scarce public financing, persistence of segmentation in the system, organizational barriers to access, and incipient integration and coordination between levels of care (1)(2) .
Primary Health Care (PHC), the first level of care and the communication center of the system, must order the network and coordinate care. A health system based on PHC is a comprehensive organization strategy, ensuring that services are adjusted to the population's health needs and that care is coordinated across the other points of care, thereby strengthening Health Care Networks (Redes de Atenção à Saúde, abbreviated RAS) (1,3) .
Access to health services in relation to the needs of the population has been widely debated. However, insufficient coherence and coordination in health care are still considered the main causes of the lack of responses to the population (4) . Seeking institutional responses to the health needs faced in PHC, the Brazilian National Council of Health Officers (Conselho Nacional de Secretários de Saúde, abbreviated CONASS) has been developing Health Care Planning since 2007. The methodological proposal of CONASS, which manages and monitors the initiative, goes beyond professional training, as it contributes to the organization of services in RAS. Planning is being developed in 25 regions of eleven states in Brazil, which joined through institutional partnership with the entity (5) .
Planning is a health care process that aims to problematize and reflect on the role of PHC as the organizer of the network. It seeks to provide technical support to municipal management teams and workers in the field to qualify the organization of the RAS according to SUS principles and comprehensive care between the primary, secondary, and tertiary care levels. It consists of six Primary Health Care Planning workshops, tutorials from PHC, and tutorials from Specialized Outpatient Care.
The PHC Planning workshops methodology, the first stage of the process, provides for monthly meetings, with the participation of health team workers, managers, and state and municipal technicians (6) . Such theoretical workshops aim at the conceptual alignment of themes relevant to PHC and RAS, which are continued in on-site tutoring/supervision (5) . CONASS facilitators participate in the meetings by providing technical and operational support, aiming to link theory to practice, with a view to reorganizing the work processes in health services, defining flows in the RAS and making agreements for greater resolution, based on the reality of participants.
RAS require health assessment through tools that identify the critical nodes of the system. In Brazil, PHC assessment initiatives that contribute to the involvement of health teams in the process of improving the quality of services have been gaining prominence (6)(7) . Despite the relevance of assessment to measure the performance of the health system, there was a lack of instruments for assessing practices similar to Planning. We intend to present the construction and validation of an instrument to assess PHC Planning workshops, with a view to contributing to a systematic assessment of processes and to the development of tools that support the conduction of changes in health care practices.

OBJECTIVE
This study aims to describe the stages of construction and content validation of an instrument to assess Primary Health Care Planning workshops.

Ethical aspects
The Research Ethics Committee (REC) of the Universidade Federal de Ciências da Saúde de Porto Alegre (UFCSPA) approved the research, which complies with the ethical and legal precepts of Resolution 466/12, which deals with research involving human beings (8) .

Study design, period, and place
This methodological study aimed at the construction and content validation of an instrument to assess PHC Planning workshops. It is a strategy that systematically uses the existing knowledge to develop a reliable, accurate, and usable instrument that can be used by other researchers (9) . For content validation, the Delphi Technique was used, which consists of judging the instrument by experts with extensive experience in the subject in question, throughout rounds, aiming to ascertain trends in the object under study (10) .
The design used was a methodological study employing the Delphi Technique. The instrument development process comprised construction, pilot test, content validation, and assessment of the degree of agreement by experts. The two Delphi rounds were held between December 2017 and March 2018, in order to reach consensus on the instrument's items.

Population and sample; inclusion and exclusion criteria
For convenience, five professionals were selected for the pilot test: two representatives from state management, two representatives from universities and one representative from Primary Care assistance. The research population was composed of a group of experts, formed by health professionals with experience in the field of PHC and/or Planning, from different states of Brazil, chosen for convenience after analysis of the Currículo Lattes platform. The expected expert committee was 14 people; the professionals who performed the pilot test were not invited.
The criteria to select participants were: being a health professional with experience in PHC and/or Planning; being a professor in the field of PHC or RAS; being a professional in the state and/or municipal management of SUS, Primary Care and/or Specialized Care; or being a CONASS server/consultant; and accepting to participate in the study. Moreover, expertise (training, professional experience, knowledge production) in the requested field and, if possible, qualification (specialization/master's/doctorate) in the fields of PHC, public health, or health management were observed in the Currículo Lattes platform.
After construction, a self-administered, individual and electronic form was sent to five professionals for the pilot test. This aimed to observe the pertinence of the items, the understanding of the instrument and the feasibility of the electronic form. At this stage, the Delphi Technique methodology was not used. Three professionals responded, two from state management and one from PHC assistance. The two university representatives did not respond to the invitation for the pilot test. As for the items, no suggestions were made for inclusions, and 35 items (81%) were restructured as to the form and/or the score of the Likert scale (>90% with an opinion in favor of the modification). One item (2%) was completely rewritten and one item (2%) was deleted after being unified with another. The instrument totaled 42 items for content validation.
After the pilot test, the instrument was submitted to the selected experts via email, along with the link to the electronic form for content validation. The self-applicable form, individual and without nominal identification of the participants, was made available through the Google Forms tool, in order to judge its content. The form had, in its first section, the Informed Consent Term and clarifications in relation to ethical and methodological aspects, being an essential condition for completing the questionnaire.
The experts received specific instructions on content validation, in which they were instructed to assess the instrument in two different stages, based on the models by Coluci et al (2015) (12) . In the first round, the experts assessed each of the chunks and items of the instrument, determining the scope, that is, whether each chunk was adequately represented by the set of items and whether all dimensions were included. They also checked whether content was appropriate, whether structure was adequate and whether content was representative (12) . Only in this round could experts suggest inclusion and/or elimination of items, in addition to providing comments and suggestions.
In the second round, the experts assessed each item individually, in addition to the format, title and scores of the instrument, considering the clarity and/or relevance of each aspect. Regarding clarity, they assessed the wording of the items, if they were understandable and adequately expressed what is expected to be measured. As for pertinence or representativeness, they verified whether the items really reflected the concepts involved, whether they were relevant and adequate to achieve the proposed objectives (12) .

Analysis of results, and statistics
In the first round, the degree of agreement was verified through percentage agreement, calculated with the formula: % agreement = (number of participants who agreed × 100)/total number of participants. Items with a degree of agreement of at least 90% among experts were considered validated (13) . Items with agreement below this percentage were reworked or rewritten, according to the assessment, giving rise to the second version of the instrument.
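The percentage-agreement calculation above can be sketched in a few lines of Python (the function name and example values are illustrative only, not data from the study):

```python
def percent_agreement(n_agree, n_total):
    """Percentage agreement: (participants who agreed x 100) / total participants."""
    return n_agree * 100 / n_total

# Illustrative values only: suppose 5 of 6 experts agree on an item.
agreement = percent_agreement(5, 6)
print(round(agreement, 1))  # 83.3 -> below the 90% threshold, so the item is reworked
is_validated = agreement >= 90
```

An item is validated only when the result reaches the 90% cut-off; anything lower returns to the experts for reformulation.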
In the second round, the degree of agreement was verified through the Content Validity Index (CVI), which quantitatively measures the proportion of experts in agreement regarding the items and the general aspects of the instrument. The CVI of each item, each chunk and the instrument was calculated using a four-point Likert scale. To assess an item's relevance/representativeness, experts could choose between: 1 = not relevant or not representative; 2 = item needs major revision to be representative; 3 = item needs minor revision to be representative; 4 = relevant or representative item. Scope and clarity were assessed using the same type of scale: 1 = not clear; 2 = not very clear; 3 = clear; 4 = very clear. The calculation summed the "3" and "4" answers of each expert for each item and divided by the total number of answers. The formula used was: CVI = number of responses "3" or "4"/total number of responses. Items scored "1" or "2" were revised or eliminated. Items that obtained a CVI above 0.78 were considered validated (14) .
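The CVI formula described above can likewise be sketched in Python (the function name and the ratings are illustrative, not the study's actual expert responses):

```python
def cvi(ratings):
    """CVI: proportion of Likert ratings that are 3 or 4 on the 4-point scale."""
    return sum(1 for r in ratings if r in (3, 4)) / len(ratings)

# Illustrative ratings only: five experts score one item on the 4-point scale.
item_ratings = [4, 4, 3, 4, 2]
print(cvi(item_ratings))  # 0.8 -> above the 0.78 threshold, item is validated
```

The same calculation applies at the item, chunk, and whole-instrument level; the study's reported values (0.95 for clarity, 0.97 for representativeness, 0.96 total) result from aggregating such per-item proportions.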

RESULTS
In the first round, six experts responded to the form, most of whom were women (66.7%) with a predominant age range of 30 to 39 years old (50%). Most had 5 to 9 years of training (50%). The professional category that most appeared was psychology (33.3%). Most participants had a specialization (50%) in the health field and the majority were state/municipal managers (33.3%) and/or professors (33.3%). The majority (83.3%) reported experience with Health Care Planning. In this round, no suggestions were made for the inclusion of new items. However, 25 items (59.5%) were maintained according to the first version, six items (14.3%) were restructured regarding writing and 11 items (26.2%) were completely rewritten as to the form and/or punctuation of the Likert scale. In terms of clarity, the instrument presented an 87% degree of agreement among items (<90%). As for representativeness, the degree of agreement was 94% between items (>90%).
In the second round, five experts responded to the form, most of the participants were also women (80%), with predominant age groups from 30 to 39 years old (40%) and 40 to 49 years old (40%). Most had 5 to 9 years of training (60%). The predominant professional category was nursing (40%). Of the participants, most had a doctorate (60%) and were professors (60%). Most (60%) reported experience with Health Care Planning. In this round, no suggestions were made for inclusion or exclusion of items, all 42 being maintained. Of these, 40 items (95.2%) were maintained according to the second version and two (4.8%) were restructured as to writing. Moreover, two items were relocated to another chunk. The instrument reached CVI 0.95 for clarity and 0.97 for representativeness. This resulted in a total CVI of the instrument of 0.96 (> 0.78), characterizing the validation of its content.
As for the title, the CVI reached 0.80, with respect to layout and scores, the CVI was 1.00 for both. Table 1 shows in detail the degree of agreement in relation to clarity and representativeness of items, chunks, and the instrument in general, related to the two rounds of the Delphi Technique for content validation.
Based on the results obtained, the content validation process was completed, giving rise to the third version of the instrument, considered the final validated version (Chart 1).

DISCUSSION
Analyzing the assessment instrument in relation to structure, corresponding to Chunk 1 with nine items, four showed below-expected agreement (<90%). Difficulties were observed in the interpretations related to the writing of items; thus, they were rewritten to provide better clarity and understanding to the respondents. According to the literature, "clarity" must be intelligible to all strata of the target population because understanding phrases is more important than their artistic elegance (15) .
In the last round, all nine related items were validated above the expected agreement (CVI >0.78). The literature points to the
need to value the structural components in the assessment of health services and to discuss their relationship with the quality of work processes and the achievement of results, with respect to the health of the population. Findings reiterate the need for investments in the structure dimension aimed at the needs of professionals working in teams (16)(17) . As for content (process), relating to Chunk 2 with 26 items, 16 presented below-expected agreement among experts (<90%). Most of the items were rewritten, based on the suggestions of the experts who contributed qualitatively to their writing; however, some items were kept due to insufficient suggestions. A similar finding was found in another study, in which the process dimension was the one with the highest number of rectifications (15) .
It is noteworthy the fact that all 26 items related to content were validated with 100% agreement in the last round (CVI 1.00), which demonstrates stability in their clarity and representativeness. In another validation study, some items also showed full consensus of relevance among the participants, with maximum CVI, making it possible to verify credibility and transparency in these items of the instrument (18) .
The need to assess the understanding of contents of Planning workshops seeks to provide changes in the health care process offered by PHC teams. With regard to the workshops, it was considered relevant to assess, based on the proposed instrument, the level of understanding, professional training and integration among team members, because they are strategies that influence health care. Efforts must be made to improve the quality of care offered, based on assessment processes that allow monitoring the services' capacity to respond to health needs. It is necessary to expand the coverage of programs, motivate and train professionals, encourage teamwork, organize communication between levels of care and systematically assess the results obtained (15,19) .
Nevertheless, monitoring and assessment in PHC are relevant to understand the assessment processes as essential in guiding teams' health practices. The incorporation of assessment in the routine of health organizations can be carried out through instruments that integrate the planning process and the management of policies and programs. There is a long way to the institutionalization of the assessment, result of a joint work of parties and conquest of political space for autonomy of the necessary resources and strengthening of the evaluative capacity (20)(21)(22) .
Finally, with regard to the operational applicability of Planning (result), relating to Chunk 3 with seven items, four showed below-expected agreement among experts (<90%) and were restructured according to suggestions. In the last round, all seven items related to applicability were validated above the expected agreement (CVI >0.78). Assessment of the result dimension consists of verifying whether the results correspond to what was expected, i.e., to the objectives that the intervention proposed to achieve, which can influence the decision-making of managers and teams (23) .
The items in the applicability chunk listed issues that address the ability to put into practice what was learned. An instrument must become dynamic through the monitoring and assessment of the results obtained and the needs of the service, as well as by composing other indicators. When assessing the degree of integration of the RAS, it also makes it possible to assess the capacity of Primary Care, aiming to subsidize new strategies regarding the structuring and organization of the system (20) . Therefore, it is considered that the assessment of health services through instruments helps build a new perspective of care, since a valid judgment on the results of an intervention helps in the daily life of health services and management (18,23) .
Therefore, the study provides information on the construction, validation, representativeness and clarity of the proposed instrument, based on the collaboration of experts in the field. In this way, the instrument's final version allows the knowledge of the local reality, aiming to subsidize the assessment of health care processes, as well as serving as a basis for assessing future interventions.

Study limitations
As a limitation of the study, the restricted sample of participants is mentioned, in view of the refusal of some selected experts, delays in returning the material sent, and loss of participants between rounds.

Contributions to nursing, health, and public policies
Construction and validation of the instrument seek to contribute to assessing the Primary Health Care Planning process and to the use of instruments as tools that lead to changes in health care. Although the final instrument was assessed by judges from various professional categories with expertise in the subject, as it is a tool that assesses structure, process and results from the perspective of RAS, it can be applied by nurses to measure the effect of Planning workshops on training and on the work process. Concerning health assessment in PHC, institutionalization of this practice can support managers in decision-making, teams in health care planning, and the constant monitoring of results. The methodology applied in the study used parameters that showed the instrument's content validity, which can be replicated in other studies and similar initiatives.

CONCLUSION
The critical assessment carried out by the expert committee in the two content validation rounds allowed the improvement of the instrument to assess PHC Planning workshops, with questions reformulated to better suit its target audience. The use of the Delphi Technique, associated with methodologies to verify the degree of agreement, was effective in conducting the rounds and in establishing priorities in the approach of the workshops.
Other advantages of the Delphi Technique were observed, such as anonymity among participants, provision of equal conditions of participation, application of the form to geographically distant people and the quality of the expert committee's composition. Although electronic platforms are well accepted by study participants, some challenges were observed, such as difficulty in recruiting and securing responses from experts and loss of participants between rounds. Such challenges did not affect the validity or quality of the study.
The results contribute to the expansion of knowledge on the subject and to highlight the relevance of health assessment processes, through the use of instruments that measure the performance of practices. The instrument has content validity to assess Primary Health Care Planning workshops, being a tool for the use of state and municipal administrations.