
What Colors do Undergraduates Associate with Training Courses? Student Evaluations of the Applied Mathematics Educational Program through the Color Selection Method


Abstract

There is no doubt that the role of mathematics is increasing today and that mathematical education requires constant attention. Nevertheless, there are not many studies devoted to the problems of teaching university students who have chosen mathematics as their profession. The purpose of this article is to investigate the attitude of undergraduates toward the courses that make up the Applied Mathematics educational program implemented in one of the technical universities in Russia. The survey was conducted using the Color Selection Method, based on Max Lüscher's ideas: students associated each course with one of eight proposed colors. The outcomes of the survey were investigated through correlation and cluster analysis and compared with the results of another survey conducted with a verbal evaluation tool. This research has revealed that the student assessments obtained through the two methods (verbal and imaginative) do not contradict each other. The use of the Color Selection Method helps identify the problems that arise in the educational process and makes it possible to outline ways of improving teaching quality.

Keywords:
Student evaluation of teaching; Lüscher Test; Mathematics education; Higher education


1 Introduction

In a high-tech society, education becomes a valuable resource for sustainable development. Therefore, assessing the quality of education and searching for ways to improve it are essential tasks both at the national level and at the level of individual teaching staffs. In international educational practice, there are various approaches to assessing the quality of university work, and many publications are devoted to this problem (for example, GERRITSEN-VAN LEEUWENKAMP et al., 2017; HARVEY; GREEN, 1993; NOVIKOV, 2007; SCHINDLER et al., 2015; TAM, 2001). Essential aspects in determining the quality of university educational activities are assessments of the quality of training courses and of students' satisfaction with the quality of education. The analysis of publications shows that studies on student evaluation of teaching (SET) are conducted by scientists around the world, which is evidence of their relevance for education theorists and practitioners. Benton and Cashin (2014), Kulik (2001) and Richardson (2005) presented reviews of such studies.

In Russia, SET has not been widely adopted. During the period of social and economic transformations (the end of the 1980s), the Ministry of Higher and Secondary Education of the Russian Federation introduced the practice of student interviews using the questionnaire “The Teacher in the Eyes of Students”. The appearance of this questionnaire aroused criticism from scientists and educators (GORBATENKO, 1990; ZELENTSOV, 1999; LEVCHENKO, 1990). The publications of that time noted the lack of research on the psychometric properties of this questionnaire and the incorrect use of survey results for making important decisions. Regular surveys in higher education institutions gradually ceased. At present, in connection with Russia's accession to the Bologna process, the problem of assessing the quality of education has again become topical. However, the development of questionnaires and their application now take place at the level of individual higher education institutions (for example, ZELENEV; TUMANOV, 2012; KUZNETSOVA, 2019).

Lipetsk State Technical University also began to pay attention to students' satisfaction with the quality of education. The main problem of such monitoring was the choice of an evaluation tool: due to the lack of a broad practice of student evaluations, publications devoted to the analysis of evaluation tools used in Russia are scarce. To assess students' satisfaction with the quality of the educational program, we developed the questionnaire “The Learning Process in the Eyes of Students” (LPES) and surveyed senior students. Using a 100-point scale, students assessed the quality of the organization of the learning process, the quality of teaching, and their own training results for the 34 study courses that make up the Applied Mathematics educational program. The survey outcomes were investigated through correlation, factor, regression, and cluster analysis; the results are presented in Kuznetsova (2019). Next, a survey was conducted using the Color Selection Method (CSM). This method is based on Max Lüscher's ideas: students choose the color that they associate with each academic discipline of the educational program.

The purpose of this article is to study, using CSM, the students' attitude toward the courses that make up the Applied Mathematics curriculum, and to compare the two methods (the verbal LPES and the imaginative CSM) in order to reveal their ability to identify bottlenecks in learning and teaching. This article draws on previous research on the factors affecting undergraduates' satisfaction with the quality of the educational program in applied mathematics.

2 Literature review

It is well known that the practice of student surveys has become widespread since the late 1960s (DARWIN, 2016) and is seen as a means of improving teaching quality and involving students in perfecting the educational process (BENTON; CASHIN, 2014; HAMMONDS et al., 2017). As noted by Nilson (2012), students have changed noticeably in recent decades. Therefore, the questions “Are the students telling us the truth?” and “How reliable are students' evaluations of teaching quality?” continue to interest researchers (CLAYSON; HALEY, 2011; FEISTAUER; RICHTER, 2017; MCCLAIN et al., 2018). Indeed, SET outcomes depend not only on teachers and university administrations but also on the students themselves: their attitude to knowledge and the ways of obtaining it (O'DONOVAN, 2017), their notion of teaching and its forms (FEISTAUER; RICHTER, 2017), and their academic maturity and emotions (LYNAM; CACHIA, 2018). Therefore, it seems essential that, along with Likert-type evaluation tools (such as the Course Experience Questionnaire or the Students' Evaluation of Educational Quality Questionnaire), there are alternative approaches based on associations and imaginative thinking. For example, Gal and Ginsburg (1994) described an evaluation tool for measuring students' attitudes toward the study of mathematics: a set of 12-15 cards showing faces with different emotions (“anxious”, “puzzled”, “fearful”, “frozen”, “interested”, “indifferent”, “confused”, and so on). Students must choose the card which best reflects their feelings concerning an academic discipline or a situation in the learning process. According to the authors, despite its limitations, this technique can be useful in identifying bottlenecks and problem situations: “But it is useful to break out of the mold for perceiving students' attitudes as lying along linear paths, and for an ‘attitude change’ as moving students ‘higher’ or ‘lower’ along such paths, as is the case when five-point Likert scales are used” (GAL; GINSBURG, 1994).

This characterization may also apply to the Color Selection Method (CSM). The developer of the test's original version, Max Lüscher, postulated that the choice of color reflects the mood, the functional state, and the most enduring personality traits. The test's development was based on an empirical approach and was initially associated with studying a person's emotional and psychological state (LÜSCHER, 1990). The Russian psychologist Sobchik (2007) characterizes CSM as projective because, in her opinion, this technique reveals not so much the conscious, subjective attitude of the examinee to the color standards as their unconscious reactions.

The idea of diagnosis through color associations appeared in the middle of the 20th century. Since then, there have been numerous empirical studies on the effect of color on humans and on the possibilities of CSM for diagnosing a person's internal state. Many researchers believe that this method is not sufficiently reliable as a diagnostic tool in clinical practice (see CERNOVSKY; FERNANDO, 1988; HOLMES et al., 1985). At the same time, numerous studies show examples of the successful use of the test to describe personality and behavior (CARMER et al., 1974; COROTTO; HAFNER, 1980; LANGE; RENTFROW, 2007; NOLAN et al., 1995). Donnelly (1974), in his study of the color preferences of college students, concluded that the reliability of the Lüscher Color Test, “although somewhat low, appears comparable to that reported for other projective techniques”. Sobchik (2007) argued that the test based on Lüscher's ideas has the following essential characteristics: it does not provoke reactions of a protective nature (in contrast to other, especially verbal, tests), and it is consistent with the concept of a holistic, multi-level understanding of the individual.

Specialists of the psycho-diagnostics laboratory of Tomsk Polytechnic University have developed a particular modification of the color selection method in order to investigate students' attitudes toward teaching. The interpretation of the colors is presented in Table 1.

Table 1
Notation of attributes

This method was studied in the dissertation research of Marukhina (2003), a member of that university's staff. She saw CSM as an addition to a Likert-type questionnaire for investigating students' attitudes toward study courses. The joint application of the two methods allowed her to conclude that the students' color associations correspond to the interpretations presented in Table 1.

Given this fact, and the fact that many studies also support the validity of the Lüscher color theory (for example, CARMER et al., 1974; COROTTO; HAFNER, 1980), we hypothesized that CSM could be used as a stand-alone methodology for students' assessment of training courses. Since further investigation of CSM applications in education has not been carried out, we believe that our study will contribute to revealing CSM's capabilities as a substantive student teaching evaluation tool and will help expand the practice of its use.

3 Methodology

Sixty-six seniors of the Applied Mathematics undergraduate program took part in the survey: 39 men and 27 women. The survey took place in the last month of their university studies. These students were invited to participate because they could evaluate all the study courses that make up the educational program. As studies confirm, senior students have the academic maturity and experience to give an adequate assessment of teaching quality (LYNAM; CACHIA, 2018; THEALL; FRANKLIN, 2001).

Also, an essential factor in conducting student interviews is the students' willingness to participate in the survey. According to Hoshower and Chen (2003), McClain et al. (2018), and some other researchers, students give more honest assessments if they believe that their opinion will help improve the courses' content and teaching rather than serve administrative interests. Benton and Cashin (2014) argue that, in order to increase survey validity, “the instructor should take time to encourage students to take the process seriously”. Therefore, we not only discussed with the students the goals of the upcoming survey but also invited them to participate in discussing the questionnaire items and selecting the assessment scale (KUZNETSOVA, 2019).

This study consisted of two consecutive stages. At the first stage, a survey was conducted using the questionnaire “The Learning Process in the Eyes of Students” (LPES). The questionnaire, developed by us, consists of 10 items and reflects the students' opinions on the following aspects of the learning process: the quality of the organization of the educational process, the quality of teaching, and the results of studying the course. Using a 100-point scale familiar to them, the students evaluated the 34 courses they had studied since their first year. From the survey outcomes, we compiled a summary table in which each academic discipline corresponds to the average score for each of the ten items.

Table 2 presents the results of the internal consistency analysis of this questionnaire. Student evaluations were analyzed using correlation, factor, regression, and cluster analysis; the results are presented in Kuznetsova (2019).

Table 2
Results of evaluating the internal consistency of the questionnaire The Learning Process through the Eyes of Students
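The article does not name the internal-consistency index used; Cronbach's alpha is the usual choice for rating-scale questionnaires, so the following sketch shows how such a coefficient could be computed. The data here are a random placeholder standing in for a hypothetical table of 66 students' 0-100 ratings on the 10 LPES items for one course.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items table of ratings."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Placeholder data: 66 students rating the 10 LPES items on a 0-100 scale.
rng = np.random.default_rng(0)
lpes_ratings = pd.DataFrame(rng.uniform(40, 100, size=(66, 10)),
                            columns=[f"item_{i}" for i in range(1, 11)])
print(round(cronbach_alpha(lpes_ratings), 2))
```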

In the second stage, which took place two weeks later, we used a test based on Max Lüscher's ideas. The same group of students was asked to choose which of the colors presented in Table 1 (without being shown their interpretations) they associated with each course. The students evaluated the same 34 academic disciplines as in the first stage. Based on the survey outcomes, a summary table was compiled. Table 3 shows a fragment of it corresponding to the disciplines Discrete Math and Sociology.

Table 3
Fragment of the color selection method summary table

According to Table 3, the discipline Discrete Math is associated with blue by 14% of respondents (the value of the blue variable is 0.14), with green by 18% of the students surveyed (the value of the green variable is 0.18), and so on. In general, we can conclude that, for most of the 66 students, this course is associated with bright, optimistic colors. The students' attitude toward the Sociology course is different. For 23% of students, this course is associated with gray (the value of the gray variable is 0.23); following Table 1, gray means indifference. At the same time, only 9% associate it with green (the value of the green variable for Sociology is 0.09). It is evident that this course did not arouse the students' interest.
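As a minimal sketch of how such a summary table can be assembled (the raw data layout and column names here are assumptions, not the authors' actual files), each cell is simply the share of respondents who associated a given course with a given color:

```python
import pandas as pd

# Hypothetical raw responses: one row per (student, course) pair,
# with the chosen color recorded as a string.
raw = pd.DataFrame({
    "student": [1, 2, 3, 1, 2, 3],
    "course":  ["Discrete Math"] * 3 + ["Sociology"] * 3,
    "color":   ["blue", "green", "green", "gray", "gray", "green"],
})

colors = ["blue", "green", "red", "yellow", "violet", "brown", "black", "gray"]

# Share of respondents choosing each color, per course (rows sum to 1).
summary = (
    raw.groupby("course")["color"]
       .value_counts(normalize=True)
       .unstack(fill_value=0.0)
       .reindex(columns=colors, fill_value=0.0)
)
print(summary.round(2))
```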

The CSM-based summary table for the 34 academic disciplines was investigated through correlation and cluster analysis using the procedures implemented in the STATISTICA application package. The CSM results were then compared with the LPES results obtained in the first stage.

4 Results

4.1 Correlation analysis

First, we considered the correlation between the CSM questionnaire items. The results can be seen in Table 4.

Table 4
Correlation matrix I

For the interpretation, we rely on the decoding of the color associations presented in Table 1. The survey shows that the association with blue (thoroughness, reliability, durability) has a statistically significant negative correlation with violet (intuition, the search for something unusual in new information). The association with green (interest, the search for personal meaning, usefulness) has statistically significant negative correlations with brown (stability, conservatism, inflexibility of position, rigidity of thought patterns), black (denial, rejection, negative perceptions), and gray (indifference, uncertainty). The association with red (activity, initiative) has weakly significant (p < 0.10) negative correlations with yellow (comfort) and gray (indifference). Somewhat unexpected was the significant negative correlation between yellow (comfort) and brown (stability, conservatism, rigidity of thought patterns): conservatism and inflexibility of thinking in teaching and learning often cause discomfort and are therefore not as harmless as they had seemed to us earlier. Thus, the analysis shows that the survey results as a whole do not contradict the color interpretations presented in Table 1, which agrees with the conclusions of Marukhina (2003).
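Table 4 reports these pairwise correlations together with their significance levels. As a hedged sketch (the authors used STATISTICA; SciPy is used here instead, and `summary` is a random placeholder for the 34-courses-by-8-colors proportion table sketched earlier), such a matrix of correlations and p-values could be computed as follows:

```python
import numpy as np
import pandas as pd
from scipy import stats

def correlation_with_p(df: pd.DataFrame):
    """Pairwise Pearson correlations and p-values between the columns of df."""
    cols = df.columns
    r = pd.DataFrame(np.eye(len(cols)), index=cols, columns=cols)
    p = pd.DataFrame(np.ones((len(cols), len(cols))), index=cols, columns=cols)
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            r_ab, p_ab = stats.pearsonr(df[a], df[b])
            r.loc[a, b] = r.loc[b, a] = r_ab
            p.loc[a, b] = p.loc[b, a] = p_ab
    return r, p

# Random placeholder standing in for the real color-proportion table.
colors = ["blue", "green", "red", "yellow", "violet", "brown", "black", "gray"]
summary = pd.DataFrame(np.random.default_rng(0).dirichlet(np.ones(8), size=34),
                       columns=colors)

r, p = correlation_with_p(summary)
significant = p < 0.05         # correlations reported as significant in Table 4
weakly_significant = p < 0.10  # the p < 0.10 correlations mentioned in the text
```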

Next, we considered the correlations between the CSM outcomes and the LPES questionnaire items reflecting the shortcomings of the learning process organization and the teaching quality. The results can be found in Table 5. First of all, note that the colors blue and yellow do not have significant correlations with these items. However, green (cognitive activity), black (denial), and gray (indifference) are closely enough connected with the items describing the teaching quality: Teacher's knowledge of the subject, Teaching skills, and Impartial and fair assessment. In addition, the shortcomings in the learning process organization, although not provoking complete denial (there are no highly significant correlations of the items Lack of theory and Lack of practice with black), reduce cognitive activity (negative correlations of Lack of theory and Lack of practice with green) and contribute to the formation of indifference toward the studied subject (positive correlations of these two items with gray).

Table 5
Correlation matrix II

In Kuznetsova (2019), a factor analysis was carried out on the outcomes of the LPES questionnaire, and three factors were identified. Factor1 (shortcomings in course arrangement and gaps in teaching skills) accounted for 46.2% of the total observed data variance; Factor2 (favorable moral climate) accounted for 27.4%; and Factor3 (the intrinsic difficulty of the subject) accounted for 15.6%.
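The factor analysis itself was performed in the earlier study; as a hedged illustration of how three factors and their shares of explained variance could be extracted from a 34-courses-by-10-items LPES table, the sketch below uses scikit-learn with random placeholder data and the usual sum-of-squared-loadings approximation for the variance shares.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Placeholder for the real 34-courses-by-10-items LPES summary table.
rng = np.random.default_rng(0)
lpes = rng.uniform(40, 100, size=(34, 10))

X = StandardScaler().fit_transform(lpes)       # standardize the 10 LPES items
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
factor_scores = fa.fit_transform(X)            # per-course scores on the 3 factors

# Approximate share of total variance per factor: sum of squared loadings
# divided by the number of standardized variables.
loadings = fa.components_.T                    # items x factors
explained = (loadings ** 2).sum(axis=0) / X.shape[1]
print(explained.round(3))
```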

Consider the correlation between the color associations, on the one hand, and the items Need for a change, Knowledge level, Factor1, Factor2, and Factor3 on the other hand. The results are shown in Table 6.

Table 6
Correlation matrix III

The variable Need for a change is an indicator of the students' dissatisfaction with the studied course. Dissatisfaction has statistically significant connections with boredom (positive correlation with gray), lack of cognitive activity (negative correlation with green), and rejection (positive correlation with black). Courses which students marked as well known to them are associated with green and do not cause associations with brown (conservatism, inflexibility of thinking), black (negative perceptions), or gray (indifference). Shortcomings in course arrangement and gaps in teaching skills (Factor1) are associated with the formation of indifference (positive correlation with gray). A favorable moral climate (Factor2) is associated with creative activity (positive correlation with green) and with a lack of boredom, rejection, and inflexibility (negative correlations with gray, black, and brown). The intrinsic subject difficulty (Factor3) is associated with blue (thoroughness, reliability, and durability) and with a lack of easiness, comfort, and intuition (negative correlations with yellow and violet). The weakly significant (p < 0.1) correlations of Factor3 can be interpreted as the absence of indifference toward demanding disciplines (negative correlation with gray) and the presence of inflexibility of thinking and rejection (positive correlations with brown and black).

Thus, the correlation analysis indicates the consistency of the student evaluations obtained by the two methods (the verbal LPES and the imaginative CSM). The fact that the conclusions do not contradict the theory and practice of teaching and learning testifies to the content validity of the CSM. Therefore, we continue comparing the results of the two surveys using cluster analysis.

4.2 Cluster analysis

At first, we researched the outcomes of the LPES survey. Using the K-means method, four clusters were identified: Cluster1, Cluster2, Cluster3, and Cluster4. According to the results presented by Kuznetsova (2019), these groups have the following characteristics. Cluster1 (problem courses) comprises five disciplines related to programming which, in the students' opinion, are interesting but require teaching improvement. Cluster2 (severe problem courses) comprises four courses which students described as uninteresting and taught at a low level. Cluster3 (successful courses) comprises 18 courses that are interesting enough for students; the teachers of these courses received high marks, and students noted the absence of shortcomings in the organization of the educational process. Cluster4 unites difficult courses: seven courses covering abstract sections of pure mathematics, which the students rated as uninteresting, although the teachers received high marks.

Let us consider the clustering of the outcomes of the survey conducted with the CSM evaluation tool. The groups of courses obtained by applying the K-means method to the CSM outcomes are denoted Cluster1*, Cluster2*, Cluster3*, and Cluster4*. The mean values of the variables for each cluster are presented in Table 7. The differences in the mean values of the red and violet variables turned out to be statistically insignificant; therefore, these variables were excluded from the analysis. Consider the characteristics of the obtained clusters in more detail.
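A minimal sketch of this step, using scikit-learn's K-means in place of the STATISTICA procedure the authors used, and again with a random placeholder standing in for the real color-proportion table, might look like this:

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

colors = ["blue", "green", "red", "yellow", "violet", "brown", "black", "gray"]
rng = np.random.default_rng(0)
# Placeholder for the real 34-courses-by-8-colors proportion table.
summary = pd.DataFrame(rng.dirichlet(np.ones(8), size=34), columns=colors)

# Drop the variables whose between-cluster differences were insignificant.
features = summary.drop(columns=["red", "violet"])

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(features)          # one cluster label per course

# Mean value of each color variable within each cluster (cf. Table 7).
cluster_means = features.groupby(labels).mean()
print(cluster_means.round(2))
```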

Table 7
Average means of variables for each cluster

Cluster1* contains five courses: History, Russian Language, Metrology, Sociology, and Computer Network. These are boring, uninteresting academic disciplines that induce indifference (association with gray for 29% of the students surveyed) and rejection (association with black for 11%). However, studying these courses was comfortable for some students (association with yellow for 19%).

Cluster2* contains 17 courses (Algebra and Analytic Geometry, Mathematical Analysis, Discrete Math, Probability and Mathematical Statistics, Stochastic Processes, Econometrics, Differential Equations, Numerical Methods, Theory of Functions of a Complex Variable, Methods of Optimization, Algorithmic Languages and Programming, Computer Graphics, Database, Object-oriented Programming, Optimization of Computations, Algorithms of Optimization, and English) that students associate with bright, favorable colors. These can be demanding fundamental disciplines (18% association with blue), exciting (18% association with green), and comfortable (17% association with yellow).

Cluster3* contains five courses that are very interesting for students (44% association with green) and comfortable (12% association with yellow). These are Economics, Mathematical Methods and Models in Economics I, Mathematical Methods and Models in Economics II, Intelligent Systems, and Application Software. Probably, students associate their future professional activities with studying these courses.

Cluster4* brings together seven courses that cause discomfort when studied: Functional Analysis, Math Modeling, Differential Equations with Partial Derivatives, Mathematical Theory of Systems, Computer Architecture, Physics, and Philosophy. As is evident from Table 7, many students associate these courses with brown, black, and gray, colors that signal problems in the learning process. In order to understand the causes of the discomfort, let us examine in more detail the colors associated with these disciplines, using the fragment of the CSM survey summary table for the courses included in Cluster4* (see Table 8).

Table 8
Fragment of CSM summary table for CLUSTER4* courses

It is apparent that there are differences in the students' perception of the courses gathered in Cluster4*. For example, Philosophy, Physics, and Computer Architecture have little association with blue (fundamentality, complexity) and green (cognitive interest). At the same time, the students' perception of the Philosophy course displays conservatism and inflexibility (27% of students surveyed) and boredom (23% of respondents). The perception of Physics reveals little association with blue, green, yellow, or red, the colors reflecting a positive perception of the subject; at the same time, 60% of the students associated this course with brown, black, and gray. It seems that the teaching of these courses poses problems caused by the teaching methods chosen by the instructors. Some problems are also present in the teaching of Computer Architecture. Here, the main difference is that this course is almost never associated with brown (conservatism). Some students had no problems with its study (association with yellow for 14% and with red for 14%). However, a large proportion of students are not just indifferent (14% association with gray) but demonstrate complete denial (association with black for about 27% of respondents). This means that roughly every third or fourth student faced problems in this course that they could not overcome. Perhaps the reason is the lack of a student-centered approach, the teacher's lack of attention to the students' needs and capabilities.

The second group of uncomfortable disciplines comprises courses that cover the most abstract branches of mathematics; studying them requires a high level of theoretical thinking. Some students overcome this intrinsic difficulty of pure mathematics (association with green for Math Modeling and Differential Equations with Partial Derivatives, and association with red for Functional Analysis, Mathematical Theory of Systems, and Math Modeling). Some students see in it a fundamental essence (association with blue occurs for all disciplines of this group). However, almost every fourth student associated these courses with brown (conservatism, inflexibility of thinking); that is, these students could not see living meaning behind the complexity of mathematical structures and the rigor of the logical inference inherent in these disciplines. Therefore, the teaching of these courses also requires improvement in order to help students overcome the intrinsic difficulty of pure mathematics and to show its meaning, elegance, value, and connection with applied problems. The social and psychological conditions that contribute to solving this problem are the personal and cognitive maturity inherent to late adolescence and the mathematical giftedness of students who have chosen mathematics as their future professional activity (KUZNETSOVA; MATYTCINA, 2018).

According to the cluster analysis of the CSM outcomes, of the 34 courses that make up the educational program, three (Philosophy, Physics, and Computer Architecture) cause a negative attitude in students. The process of studying 22 disciplines (17 comfortable courses from Cluster2* and five very interesting courses from Cluster3*) is acceptable to the students. So, students are, in general, satisfied with the educational program in Applied Mathematics.

The cluster comparison for the two surveys (see Table 9) has revealed that none of the members of Cluster2 (severe problem courses) belongs to Cluster2* (comfortable courses) or Cluster3* (very interesting courses). None of the members of Cluster3 (successful courses) belongs to Cluster4* (uncomfortable courses). All members of Cluster3* (very interesting courses) are members of Cluster3 (successful courses).

That is, as a whole, the clustering outcomes of each of the two surveys do not contradict each other.

Table 9
Comparison of clustering for two surveys
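Table 9 is essentially a cross-tabulation of the two cluster labelings. A minimal sketch of such a comparison, with purely illustrative course names and labels (the study's actual per-course assignments are those described above and in Kuznetsova (2019)), could be:

```python
import pandas as pd

# Purely illustrative labels for four courses, not the study's actual data.
courses = ["Course A", "Course B", "Course C", "Course D"]
lpes_cluster = pd.Series(["Cluster1", "Cluster3", "Cluster3", "Cluster2"],
                         index=courses, name="LPES cluster")
csm_cluster = pd.Series(["Cluster2*", "Cluster3*", "Cluster2*", "Cluster1*"],
                        index=courses, name="CSM cluster")

# Number of courses in each pair of clusters (cf. Table 9).
print(pd.crosstab(lpes_cluster, csm_cluster))
```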

5 Conclusion

Supporting student feedback, analyzing survey outcomes, and respecting students' opinions are essential factors in the successful improvement of education quality. The application of CSM for monitoring the quality of the educational program in applied mathematics has shown its ability not only to reveal the existence of problems but also to define their causes. For example, as a result of the cluster analysis, such courses as Philosophy, Physics, Computer Architecture, and Functional Analysis were gathered into the group of disciplines uncomfortable for students. A detailed analysis of the color associations for each of these courses allows us to conclude that the causes of the discomfort are different, which will help us choose the correct strategy for improving teaching and learning.

The research has also revealed the courses most appealing to students: Economics, Mathematical Methods and Models in Economics I, Mathematical Methods and Models in Economics II, Intelligent Systems, and Application Software. The content of these disciplines is closely connected with the students' future professional activity as applied and industrial mathematicians. The fact that, on average, 44% of students associate these courses with green (interest, the search for personal meaning, usefulness, following Table 1) shows how important the understanding of the connection between learning and practice is for them.

The cluster analysis of the CSM-based survey outcomes has detected problems with the teaching of the social sciences and humanities. Such courses as History, Russian Language, Sociology, and Philosophy are not attractive to future mathematicians. Exceptions in this group are English (Cluster2*, comfortable courses) and Economics (Cluster3*, very interesting courses). At first sight, the social sciences and humanities have no connection with the future professional activity of bachelor mathematicians. However, mathematics is expanding its scope today, and mathematical methods are increasingly applied in the human sciences, for example, in sociology or psychology. Besides, the Russian mathematician Kolmogorov emphasized that developing creative abilities in mathematics requires going beyond mathematics and developing common cultural interests, in particular an interest in art and poetry (YURKEVICH, 2001). Therefore, the problem of improving the teaching of the social sciences and humanities within the Applied Mathematics educational program also requires care and attention.

We can note other advantages of CSM. First, conducting a survey using this tool does not demand much time from students. This is especially important when not one but several disciplines have to be evaluated: to characterize a course, a student has to choose one color among the eight offered instead of responding to numerous Likert-type items. Second, as our experience has shown, CSM did not cause rejection among students: none of them refused to participate in the survey, and the students were interested in the color associations. This is consistent with the results obtained by Sobchik (2007). However, the task did confuse three respondents at the beginning of the survey.

The analysis of the CSM outcomes and their comparison with the LPES outcomes allow us to conclude that the surveys conducted by the two methods do not contradict each other and that both questionnaires may therefore be considered robust. This confirms our hypothesis that CSM can be used as a full-fledged evaluation tool for identifying the problems arising in the process of teaching and learning.

In the future, it would be interesting to continue researching the properties of CSM. For example, further research should investigate the influence of temperament on its results in order to understand why the same situation in studying a course causes a feeling of comfort for some students (association with yellow), indifference for others (association with gray), and rejection for still others (association with black).

It should also be noted that the results of the CSM, as with the results of SET in general, require a balanced attitude: they are only one indicator of education quality and should be considered in conjunction with other indicators of learning effectiveness.

Acknowledgments

I would like to thank the editors and anonymous reviewers for considering this paper and for their helpful comments.

References

  • BENTON, S. L.; CASHIN, W. E. Student ratings of instruction in college and university courses. In: PAULSEN, M. B. (Ed.). Higher education: Handbook of theory and research. Dordrecht: Springer, 2014. p. 279-326.
  • BEDGGOOD, R. E.; DONOVAN, J. D. University performance evaluations: what are we really measuring? Studies in higher education, Oxford, v. 37, n. 7, p. 825-842, 2012.
  • CARMER, J. C.; CRADDICK, R. A.; SMITH, E.W. An investigation of the Luscher Color Test personality descriptions. International Journal of Symbology, Atlanta, v. 5, n. 2, p. 1-6, 1974.
  • CERNOVSKY, Z. Z.; FERNANDO, L. M. D. Color Preference of ICD-9 Schizophrenics and Normal Controls. Perceptional and Motor Skills, Eastern Virginia, v. 67, n. 1, p. 159-162, 1988.
  • CLAYSON, D. E.; HALEY, D. A. Are Students Telling Us the Truth? A Critical Look at the Student Evaluation of Teaching. Marketing Education Review, Philadelphia, v. 21, n. 2, p. 101-112, 2011.
  • CLAYSON, D. E. Student evaluation of teaching and matters of reliability. Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 43, n. 4, p. 666-681, 2018.
  • COROTTO, L. V.; HAFNER, J. L. The Luscher Color Test: Relationship between color preferences and behavior. Perceptional and Motor Skills, Eastern Virginia, v. 50, n. 3, p. 1066-1069, 1980.
  • DONNELLY, F. A. The Luscher Color Test: Reliability and Selection Preferences by College Students. Psychological Reports, Thousand Oaks, California, v. 34, n. 2, p. 635-638, 1974.
  • ELLIOT, K. M.; SHIN, D. Student Satisfaction: An Alternative Approach to Assessing this Important Concept. Journal of Higher Education Policy and Management, London, v. 24, n. 2, p. 197-209, 2002.
  • FEISTAUER, D.; RICHTER, T. How reliable are students’ evaluations of teaching quality? A variance components approach. Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 42, n. 8, p. 1263-1279, 2017.
  • GAL, I.; GINSBURG, L. The Role of Beliefs and Attitudes in Learning Statistics: Towards an Assessment Framework. Journal of Statistics Education, Abingdon-on-Thames, v. 2, n.2, p. 1-16, 1994.
  • GERRITSEN-VAN LEEUWENKAMP, K. J.; JOOSTEN-TEN BRINKE, D.; KESTER, L. Assessment quality in tertiary education: An integrative literature review. Studies in Educational Evaluation, Edinburgh, v. 55, p. 94-116, 2017.
  • GORBATENKO, А. S. About the questionnaire “the teacher through the eyes of students” through the eyes of a social psychologist, teacher of high school. Voprosy psihologii, Moscow, n. 1, p. 184-186, 1990.
  • HAMMONDS, F.; MARIANO, G. J.; AMMONS, G.; CHAMBERS, S. Student evaluations of teaching: improving teaching quality in higher education. Perspectives: Policy and Practice in Higher Education, New York, v. 21, n. 1, p. 26-33, 2017.
  • HARVEY, L.; GREEN, D. Defining quality. Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 18, n. 1, p. 9-34, 1993.
  • HOLMES, C. B.; FOUTY, H. E.; WURTZ, P. J.; BURDICK, B. M. The relationship between color preference and psychiatric disorders. Journal of Clinic Psychology, Hoboken, New Jersey, v. 41, n. 6, p. 746-749, 1985.
  • HOSHOWER, L. B.; CHEN, Y. Student Evaluation of Teaching Effectiveness: An Assessment of Student Perception and Motivation. Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 28, n.1, p. 71-88, 2003.
  • KUZNETSOVA, E.; MATYTCINA M. A multidimensional approach to training mathematics students at a university: improving the efficiency through the unity of social, psychological and pedagogical aspects. International Journal of Mathematical Education in Science and Technology. Abingdon-on-Thames, v. 49, n. 3, p. 401-416, 2018.
  • KUZNETSOVA, E. Evaluation and interpretation of student satisfaction with the quality of the university educational program in applied mathematics. Teaching Mathematics and its Applications: An International Journal of the IMA, Oxford, v. 38, n 2, p. 107-119, 2019.
  • KULIK, J. A. Student ratings: Validity, utility, and controversy. New Directions for Institutional Research, Corvallis, v. 2001, n. 109, p. 9-25, 2001.
  • LANGE, R.; RENTFROW, L. Color and personality: Strong's interest inventory and Cattell's 16PF. North American Journal of Psychology, Schellsburg, v. 9, n. 3, p. 423-438, 2007.
  • LEVCHENKO, Е. V. On the psychological problems encountered in the survey “Teacher through the eyes of students”. Voprosy psihologii, Moscow, n. 6, p. 181-182, 1990.
  • LÜSCHER, M. The Lüscher color test. New York: Washington Square Press, 1990.
  • LYNAM, S.; CACHIA, M. Students’ perceptions of the role of assessments at higher education. Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 43, n. 2, p. 223-234, 2018.
  • MCCLAIN, L.; GULBIS, A.; HAYS, D. Honesty on student evaluations of teaching: effectiveness, purpose, and timing matter! Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 43, n. 3, p. 369-385, 2018.
  • NILSON, L. B. 14: Time to raise questions about student ratings. To Improve the Academy, San Francisco, v. 31, n. 1, p. 213-227, 2012.
  • NOLAN, R. F.; DAI, Y.; STANLEY, P. D. An Investigation of the Relationship between Color Choice and Depression Measured by the Beck Depression Inventory. Perceptual and Motor Skills, Eastern Virginia, v. 81, n. 3, p. 1195–1200, 1995.
  • NOVIKOV, A. M. How to evaluate the quality of education? 2007. Available at: http://anovikov.ru/artikle/kacth_obr.htm. Access in: 21 mar. 2020.
  • O’DONOVAN, B. How student beliefs about knowledge and knowing influence their satisfaction with assessment and feedback. Higher education, Dordrecht, v. 74, n 4, p. 617-633, 2017.
  • POUNDER, J. Is student evaluation of teaching worthwhile? An analytical framework for answering the question. Quality Assurance in Education, Bingley, v. 15, n. 2, p. 178-191, 2007.
  • RICHARDSON, J.T.E. Instruments for obtaining student feedback: A review of the literature. Assessment & Evaluation in Higher Education, Abingdon-on-Thames, v. 30, n. 4, p. 387-415, 2005.
  • SCHINDLER, L.; PULS-ELVIDGE, S.; WELZANT, H.; CRAWFORD, L. Definitions of quality in higher education: A synthesis of the literature. Higher Learning Research Communications, Baltimore, v. 5, n. 3, p. 3-13, 2015.
  • SPOOREN, P.; BROCKX, B.; MORTELMANS, D. On the validity of student evaluation of teaching: the state of the art. Review of Educational Research, Thousand Oaks, California, v. 83, n. 4, p. 598-642, 2013.
  • SOBCHIK, L.N. The method of color choices - modification of eight-colors test of Luscher: A Practical Guide. St. Petersburg: Rech’, 2007.
  • TAM, M. Measuring Quality and Performance in Higher Education. Quality in Higher Education, London, v. 7, n. 1, p. 47-54, 2001.
  • YURKEVICH, V. S. A.N. Kolmogorov and the problem of the development of mathematical giftedness. Voprosy psihologii, Moscow, n. 3, p. 107-116, 2001.
  • ZELENEV, I. R.; TUMANOV, S. V. An estimate of the quality of teaching at the university in the context of the perception of the students of their teachers. Vysshee obrazovanije v Rossii, Moscow, n. 11, p. 99-105, 2012.
  • ZELENTSOV, B. Students’ assessment of teachers: a survey methodology. Vysshee obrazovanie v Rossii, Moscow, n. 6, p. 44-47, 1999.
