
Assessing the service quality in Software-as-a-Service from the customers’ perspective: a methodological approach and case of use

Abstract

Despite advances in the evaluation of the quality of Software-as-a-Service (SaaS), new studies seem necessary, given the existing criticisms of the SERVQUAL scale and the lack of studies on the use of SERVPERF for this purpose. This work aims to fill this gap by proposing a methodological approach to assess SaaS service quality by measuring SaaS customers’ satisfaction. Factor analysis is used to summarize the information contained in the original items into a smaller set of new dimensions, and quartile analysis is suggested to determine the most critical items. The study conducted indicates that the factors that most influence customer satisfaction are customer service, customer assistance and the reliability of SaaS. Most of the critical items are associated with transparency and accuracy in correcting errors, the company's interest in solving customer problems, the SaaS application's ability to meet business requirements, implemented updates and the regularity of service performance.

Keywords
SaaS service quality; Customer satisfaction; Service quality; Software as a service

1. Introduction

Cloud computing is a new paradigm that allows the number of network-based services to increase (Domínguez-Mayo et al., 2015), and it is probably the most significant new technology of the twenty-first century, one that is now having a great impact on society, especially on the business world (Wei, 2010). The term 'cloud' derives from the idea of businesses and users being able to access applications from anywhere in the world on demand (Low et al., 2011).

Several companies (e.g. Google, IBM, Microsoft, HP, Amazon, and Yahoo) have already made investments not only in cloud research but also in establishing cloud computing infrastructure services (Sahinoglu & Cueva-Parra, 2011). By 2018, more than 78 percent of workloads will be processed by cloud data centres (22 percent by traditional data centres), and 59 percent of total cloud workloads will be SaaS workloads, up from 41 percent in 2013 (Cisco, 2014).

In practice, the services offered by clouds can be grouped into three main categories (Zhang et al., 2010; Lian et al., 2014; Walz & Grier, 2010; Karadsheh, 2012). Infrastructure as a Service (IaaS) allows cloud clients to create various configurations of computer systems, from servers to complete clusters, and to host their own services and even complete software systems without having to worry about hardware costs (Goscinski & Brock, 2010). Platform as a Service (PaaS) provides developers with a platform, including systems and environments, comprising the entire life cycle of developing, testing, deploying and hosting web applications (Rimal et al., 2009). Software as a Service (SaaS) refers to the use of specific services and applications over the Internet (Goscinski & Brock, 2010; Zhang et al., 2010); it allows clients to use software without worrying about the cost and effort of keeping software licenses current or of handling software updates (Goscinski & Brock, 2010).

From a business perspective, SaaS is a newly emerging business model in the software industry, since SaaS vendors are responsible not only for developing the application but for the entire suite of services that make up the customer experience, including implementation, testing, training, troubleshooting, maintenance, hosting, upgrades and security (Ju et al., 2010).

Pricing in SaaS is generally done on a pay-per-user basis (Ju et al., 2010; Lee et al., 2009; Walz & Grier, 2010); some developers charge for a minimum number of users, and storage may be charged for separately (Ju et al., 2010). In this context, since no cost is necessary to purchase and maintain/update the application, and because companies purchase only the amount of computing services they need, SaaS radically reduces capital requirements for many companies.

However, despite the benefits provided by cloud computing and SaaS applications, attention should be paid to some related aspects. The lack of quality assurance in SaaS applications can cause access failures and problems with availability, reliability and data integrity (Duarte Filho et al., 2013); there are great risks concerning privacy and security associated with cloud computing (Walz & Grier, 2010); the intrinsic features of SaaS require more rigorous quality measurement, because conventional frameworks for measuring quality, such as ISO 9126, are limited in assessing the quality of SaaS (Lee et al., 2009); and, because of problems in IS, many organizations fail (partially or completely) to deliver the expected technical performance, functionality and business benefits within budget and schedule (Loukis & Charalabidis, 2011).

Several studies have been developed to help improve the quality of SaaS, addressing: a SaaS maturity model and SaaS architecture (Kang et al., 2010); a requirement elicitation technique for SaaS applications (Zhou et al., 2011); a methodology to evaluate SaaS vulnerability (Ganesan et al., 2012); a reference guide to evaluate the development process of SaaS systems (Cancian, 2009); methods to evaluate the quality of SaaS products and their characteristics (Duarte Filho et al., 2013; Gao et al., 2011; Godse & Mulik, 2009; Lee et al., 2009); and a quality model to measure the security, quality of service and software quality of a SaaS system from the perspectives of platform, provider and customer separately (Wen & Dong, 2013). In particular, in the Wen & Dong (2013) model, quality of service (QoS) metrics focus mainly on quality of platform (QoP), quality of application (QoA) and quality of experience (QoE). However, the proposed QoP metrics (e.g. data auditing, application isolation, service capability and penetration testing), QoA metrics (e.g. multi-tenancy, configuration, interoperability and data isolation) and QoE metrics (e.g. usability, response timeliness, total cost of ownership and return on investment) do not seem comprehensible enough for ordinary SaaS users; that is, such metrics seem adequate for evaluating SaaS service only from the perspective of SaaS experts.

However, an effective analysis of the services produced by the IT division for other organizational divisions, or IT client divisions, should take into consideration how these clients perceive IT services (Roses et al., 2009). Nowadays, clients not only demand quality products obtained from mature processes, but also require quality in the services they receive (Mesquida et al., 2012). According to these authors, while IT organizations have been deploying their software development processes, there has been an ongoing demand for better IT services. More than this, the client’s desire for a relationship with the service provider influences the client’s level of motivation to fully participate in the service and their level of confidence in their consumption choices. As a result, the greater the client’s involvement with the service (the more they feel comfortable with, a “part” of, and engaged in the service), the stronger the impact on relationship strength, satisfaction and retention (Raciti et al., 2013).

These issues can be extended to IT organizations, since they also need to satisfy their clients with the services and software provided. In this context, measuring the degree of customer satisfaction concerning a set of relevant criteria is one of the most widely used methods to assess the quality of services (Costa et al., 2007).

Some scales and instruments resulting from the adaptation of the SERVQUAL scale (Parasuraman et al., 1988) have been developed to investigate the service quality of online services. However, notwithstanding its growing popularity and widespread application, the SERVQUAL scale has been subjected to a number of theoretical and operational criticisms (Buttle, 1996), and most of the instruments resulting from its adaptation were designed to assess the quality of online services in a business-to-customer (B2C) and/or website research context.

Despite the importance of SaaS in helping firms gain benefits, and despite its popularity, the understanding of SaaS satisfaction is still in its infancy (Chou & Chiang, 2013), and the relative newness of SaaS means that there have been few empirically validated models of SaaS satisfaction from the client’s perspective, with little overlap among models (Yang et al., 2015).

Further, the service quality literature reveals that the evaluation of SaaS service quality has traditionally been conducted by measuring the gap between clients' expectations of SaaS and their perceptions of SaaS performance, or by means of perceptions-only measurement. It is also noted that the aforementioned studies are mostly devoted to assessing SaaS service quality using technical metrics, with the evaluation process conducted only from the perspective of SaaS experts. Studies evaluating SaaS service quality in terms of clients’ satisfaction are still incipient.

Aiming to contribute to the problem of evaluating SaaS service quality, this work proposes a methodological approach to assess the quality of SaaS from the clients' perspective. Based on the service evaluation literature, a questionnaire was developed to obtain the profile of clients and their perceived degree of satisfaction in relation to the evaluated items. More specifically, by conducting an exploratory study in a SaaS organization, this work aimed to answer the following research question: “What factors influence SaaS service quality from the perspective of the clients?”

This paper is organized as follows: Section 2 presents the theoretical concepts related to service quality and its relationship with customer satisfaction; Section 3 describes the steps for structuring the proposed methodological approach to assess SaaS service quality by measuring the satisfaction degree of users of SaaS applications, and presents the results of the study and some analyses; Sections 4 and 5 present, respectively, the contributions of the study and the limitations and proposals for future research.

2. Service quality and customer satisfaction

Three typical characteristics of services - intangibility, heterogeneity and inseparability - must be acknowledged for a full understanding of service quality (Parasuraman et al., 1985). Although services usually share these characteristics, IT services have peculiarities that make their measurement even more subjective (Freitas & Albernaz, 2012) and possibly more difficult to conduct.

According to Peppard (2003), IT services are more or less intangible, i.e., they are generally something one cannot touch or feel. Although IT services may be associated with something physical and may have a predominantly physical outcome - for example, the delivery and installation of a PC or the provision of a cable for network connection - other IT services can be entirely intangible, such as advice and support from a help desk, training, consultancy, the design of IT systems or a server software upgrade. The information services provided by computer applications are also intangible but require a physical platform to exist (inseparability).

Services are heterogeneous; their performance varies from producer to producer, from consumer to consumer, and from day to day. Consistency of behaviour from employees is difficult to assure, because what the company intends to provide can be totally different from what the client actually receives (Parasuraman et al., 1985). This condition is no different for IT services. For example, clients may have different perceptions of service quality regarding the attendance policy provided by a helpdesk: on the one hand, the procedure can be considered very 'bureaucratic' by clients who have some expertise on the subject; on the other hand, it can be positively considered very 'detailed and explanatory' by those who understand little or nothing about the matter.

Many IT services are produced and consumed simultaneously. For instance, support from a helpdesk is generally provided and utilized immediately. The consequence is that a bad service cannot be perceived and avoided before it has been received by the client, and a bad experience can affect the perception the client will have the next time they use the same service. Hence the need to evaluate service quality from the client's perspective (Freitas & Albernaz, 2012).

Despite the relevance of the studies reported previously, some limiting aspects reveal the existence of a gap concerning the problem in question, and these aspects are explored by the methodological approach proposed in this work.

There is still no consensus among researchers and practitioners concerning the relationship between service quality and customer satisfaction. Several studies have been conducted to investigate the direction of causality between customer satisfaction and service quality, although these constructs have been defined differently by researchers.

Service quality, as perceived by customers, stems from a comparison of what they feel service providers should offer (i.e., their expectations) with their perceptions of the performance of the service providers (Parasuraman et al., 1985). Based on this definition, Parasuraman et al. (1988) developed the SERVQUAL scale to assess service quality by computing the differences (gaps) between customers' expectations of service and the service performance perceived by them, regarding 22 items grouped in five dimensions (tangibles, responsiveness, reliability, assurance and empathy). Satisfaction is the customer’s evaluation of a product or service in terms of whether that product or service has met the customer’s needs and expectations. When needs and expectations are not met, the result is assumed to be customer dissatisfaction with the product or service.
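As an illustrative sketch (not part of this paper's method), the SERVQUAL gap computation described above can be expressed in a few lines of Python; the items, ratings and item-to-dimension mapping below are hypothetical and reduced to two dimensions for brevity:

```python
# Illustrative sketch only: SERVQUAL scores service quality as the gap
# (perception minus expectation) on rated items, averaged per dimension.
# All item names and ratings here are hypothetical.
from statistics import mean

# Hypothetical 1-7 Likert ratings from a single respondent.
expectations = {"tangibles_1": 6, "tangibles_2": 7, "reliability_1": 7, "reliability_2": 6}
perceptions = {"tangibles_1": 5, "tangibles_2": 6, "reliability_1": 4, "reliability_2": 6}

dimensions = {
    "tangibles": ["tangibles_1", "tangibles_2"],
    "reliability": ["reliability_1", "reliability_2"],
}

def gap_scores(expect, perceive, dims):
    """Average perception-minus-expectation gap per dimension;
    negative values mean performance falls short of expectations."""
    return {d: mean(perceive[i] - expect[i] for i in items)
            for d, items in dims.items()}

print(gap_scores(expectations, perceptions, dimensions))
```

Under a SERVPERF-style, perceptions-only measurement, one would average the perception ratings directly instead of the gaps.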

Conversely, customer satisfaction is a broader concept, influenced by perceptions of service quality, product quality and price, as well as by situational and personal factors (Zeithaml et al., 2006). The customer’s satisfaction with the services of an organization is based on (or is a function of) all of the customer’s encounters and experiences with that organization. Similar to service quality, customer satisfaction can occur at multiple levels in an organization (e.g. satisfaction with the attendant, satisfaction with a particular service and satisfaction with the organization as a whole) (Sureshchandar et al., 2002). Furthermore, different customers will express varying levels of satisfaction for the same service experience/encounter (Ueltschy et al., 2007).

Some studies conclude that satisfaction is an antecedent construct of perceived service quality (Bitner, 1990; Bolton & Drew, 1991; Parasuraman et al., 1988). However, other studies (Cronin & Taylor, 1992; Iacobucci et al., 1995; Kuo et al., 2009; Lee et al., 2000; Yee et al., 2010) suggest that service quality is an antecedent of customer satisfaction and, consequently, that service quality positively influences customer satisfaction, indicating that when the service company provides quality services, customer satisfaction is enhanced. Exceptionally, a company can provide high-quality services yet fail to satisfy its customers, if the customers are not taken into consideration (Iacobucci et al., 1995).

According to Lee et al. (2000), customer satisfaction exerts a stronger influence on purchase intention than service quality does and, perhaps, customers may not necessarily buy the highest-quality services, but they may buy services that provide greater satisfaction. In the same vein, Cronin & Taylor (1992) suggest that satisfaction has more influence than service quality on purchase intentions.

There is still a lack of consensus among researchers and managers concerning the most adequate way to measure SaaS service quality. In this context, SERVQUAL has been the most widely used scale to measure service quality, and it has been adapted to investigate the quality of online services, resulting in several instruments/scales, such as: WebQual (Loiacono et al., 2002), WebQual (Barnes & Vidgen, 2002), eTailQ (Wolfinbarger & Gilly, 2003), E-S-Qual (Parasuraman et al., 2005), eTransQual (Bauer et al., 2006), PeSQ (Cristobal et al., 2007), e-SELFQUAL (Ding et al., 2011) and SaaS-Qual (Benlian et al., 2012). Other than the SaaS-Qual scale, these studies were conducted to assess the quality of online services in a business-to-consumer (B2C) and/or website research context. However, the SERVQUAL scale has been the subject of criticism in many scientific studies, including debates over issues related to the measuring scale, measuring time, and service quality dimensions (Babakus & Mangold, 1992; Brown et al., 1993; Carman, 1990; Cronin & Taylor, 1992; Lee et al., 2000; Teas, 1993). For example, Cronin & Taylor (1992) argued that if “service quality is considered similar to an attitude”, its measure could be better represented by an attitude-based conceptualisation. They therefore suggested that the expectations-based scale (SERVQUAL) be replaced by a performance-only measure of service quality (the SERVPERF scale).

Despite the existing controversy, customer satisfaction and service quality are two distinct, although highly correlated, constructs (Bansal & Taylor, 1999; Dabholkar et al., 2000); the satisfaction construct can be positively or negatively associated with the performance of a service company (Bolton & Drew, 1994), and quality of service and customer satisfaction are interdependent, such that an increase in one is likely to lead to an increase in the other (Sureshchandar et al., 2002). Although there is no consensus in the scientific literature about the causal relationship between service quality and customer satisfaction (Parasuraman et al., 1994), the importance of measuring the degree of satisfaction of SaaS clients in order to improve the quality of services is unquestionable.

In this context, the efficient and effective measurement of SaaS quality may be even more complex, because it incorporates aspects of the quality of the 'software product' provided by the SaaS development company to the users, as well as the quality of the services associated with this provision, such as marketing (sales, contracting, payments, etc.), training, updates (upgrades), and other issues.

3. The methodological approach

The proposed approach focuses on assessing the SaaS service quality by measuring the degree of satisfaction of SaaS clients. Since the aim of this study is to gather preliminary information in order to provide insights and understanding about a problem, the proposed methodological approach can be characterized as exploratory research. The exploratory nature comes from the fact that scientific studies on the quality of SaaS concerning users’ degree of satisfaction in a business-to-business (B2B) context are still incipient.

3.1. Modelling the problem

Initially, the elements and procedures that make up the modelling of the problem are defined. This step defines the object of study and identifies some key research issues.

Object of the study: The exploratory study was conducted in a SaaS company located in the state of Rio de Janeiro, Brazil. The company has operated in the IT field for over 18 years and currently has more than 500 business customers (about 1,480 users) spread throughout the Brazilian territory and approximately 40 employees. The SaaS company develops and supports line-of-business services, since it is dedicated to providing IT solutions for legal departments to meet the interests of companies, law firms and universities of different sizes. More specifically, the SaaS company provides its clients with a SaaS application for the control of litigation and intellectual property processes. One of the main attractions of the SaaS application is the automatic registration of trademark and patent processes, since the system reads data obtained from the JIP (Journal of Industrial Property), which is published by the Brazilian National Institute of Industrial Property.

Questionnaire design: The questionnaire designed to assess the SaaS service quality was composed of three blocks of questions. In Block I, multi-category structured questions were defined to obtain responses concerning the characteristics and profile of the respondents, such as system expertise, length of relationship with the SaaS company and current education level. In Block II, the service quality and customer satisfaction literature were considered to design a 35-item questionnaire for measuring the SaaS service quality in terms of the customers’ degree of satisfaction. To ensure the content validity of the questionnaire, the suggestions and recommendations of three SaaS managers and three researchers in the service quality management area were considered; based on the results of this content adequacy assessment, minor adjustments were made to the items so that they would be more understandable to the respondents. In the evaluation process, the respondent describes his/her degree of satisfaction with each item using a non-comparative itemized rating scale ranging from 0 (very dissatisfied) to 10 (very satisfied). According to Parasuraman et al. (2004), an itemized rating scale is easier to respond to and more meaningful from the respondent’s perspective. The option ‘(N/A) – not applicable’ can be used by the respondent if the item is not relevant to him/her. Block III was composed of an open-ended qualitative question with a free text field in which respondents could record additional information (suggestions and/or criticisms) not covered in the questionnaire.

Methods: Factor analysis is used to identify relationships among the items and to summarize the information contained in the original items into a smaller set of new composite dimensions (factors) with a minimum loss of information. Cronbach’s alpha (Cronbach, 1951) and item-total correlations are used to measure the questionnaire reliability and to identify items that could be dropped to increase it. Quartile analysis (Freitas et al., 2006) is conducted to determine which questions are most critical according to their satisfaction averages. Quartile analysis is a ranking measure developed by Freitas et al. (2006) that classifies questions/items into four priority levels (critical, high, moderate, and low) based on the satisfaction averages of the questions, with the three quartiles taken as border values. For example, questions with satisfaction averages below the first quartile are designated as critical priority, and questions with satisfaction averages above the third quartile are designated as low priority.
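As an illustration of the quartile-based classification described above, the following Python sketch assigns a set of satisfaction averages (on the 0-10 scale used in the questionnaire) to the four priority levels. The item labels and averages are invented for illustration only; the real data are those of the study.

```python
import numpy as np

def quartile_priorities(averages):
    """Classify items into four priority levels using the three quartiles
    of their satisfaction averages as border values (after Freitas et al., 2006)."""
    q1, q2, q3 = np.percentile(list(averages.values()), [25, 50, 75])
    levels = {}
    for item, avg in averages.items():
        if avg < q1:
            levels[item] = "critical"   # below the first quartile
        elif avg < q2:
            levels[item] = "high"
        elif avg < q3:
            levels[item] = "moderate"
        else:
            levels[item] = "low"        # at or above the third quartile
    return levels

# Hypothetical satisfaction averages for eight items
avgs = {"Q1": 6.1, "Q2": 6.3, "Q15": 5.9, "Q18": 6.0,
        "Q4": 8.7, "Q12": 8.9, "Q13": 9.1, "Q27": 8.5}
print(quartile_priorities(avgs))
```

Under this toy input, items with the lowest averages fall below the first quartile and are flagged as critical, which mirrors the classification logic applied to the 35 questionnaire items.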

3.2. Data collection

Due to geographical restrictions and the costs of data collection, users evaluated the quality of the SaaS application by means of an online questionnaire made available to all users within the SaaS application itself. Data collection was conducted over 40 days, and 317 SaaS users completed the questionnaire.

3.3. Data analysis and results

Table 1 shows that approximately 70% of the respondents have more than two years of interaction with the SaaS company, and 82% of them consider themselves to have at least intermediate knowledge of the use of the SaaS application. Furthermore, in terms of educational background, 84% of the respondents hold or are pursuing a university degree.

Table 1
Respondents’ data profiles.

Some preliminary tests were conducted to verify the feasibility of factor analysis. Regarding the sample size, there is a ratio of 9 respondents for each variable; according to Hair et al. (2006), this ratio is appropriate for the calculation of correlations among variables. 'Blank' values account for 10.5% of the total data, and all missing values were substituted by the averages of the respective questions. Visual inspection of the correlation matrix reveals that 98% of the correlations are statistically significant at the 0.001 level. The Bartlett test of sphericity revealed non-zero correlation at a significance level of 0.0001. Finally, the Kaiser-Meyer-Olkin (KMO) test resulted in a measure of sampling adequacy (MSA) of 0.963. All these measures indicate that the set of variables is appropriate for factor analysis.
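The two feasibility tests mentioned above can be reproduced on any respondents-by-items matrix. The sketch below implements Bartlett's test of sphericity and the KMO measure from their standard textbook formulas (the partial correlations for KMO are obtained from the inverse of the correlation matrix) and applies them to simulated data, since the study's own 317 x 35 matrix is not available.

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Bartlett's test: H0 = the correlation matrix is an identity matrix."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, chi2.sf(stat, df)

def kmo(X):
    """Kaiser-Meyer-Olkin measure of sampling adequacy (MSA)."""
    R = np.corrcoef(X, rowvar=False)
    inv = np.linalg.inv(R)
    # Partial correlations from the inverse correlation matrix
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    P = -inv / d
    np.fill_diagonal(R, 0)
    np.fill_diagonal(P, 0)
    return (R ** 2).sum() / ((R ** 2).sum() + (P ** 2).sum())

# Simulated correlated ratings: one common factor plus noise
rng = np.random.default_rng(0)
common = rng.normal(size=(300, 1))
X = common + 0.5 * rng.normal(size=(300, 6))

stat, p_value = bartlett_sphericity(X)
print(f"Bartlett chi2={stat:.1f}, p={p_value:.2e}, KMO={kmo(X):.3f}")
```

With strongly correlated columns, Bartlett's p-value is essentially zero and KMO is well above the customary 0.5 threshold, the same qualitative pattern reported for the study's data.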

The Principal Component method with orthogonal Varimax rotation and Kaiser normalization was used for extracting the factors. After applying the latent root criterion (retaining factors with eigenvalues greater than 1) and the scree test (which added two factors beyond those retained by the latent root criterion), six factors (F1, F2, ..., F6) were extracted. All 35 variables (items) were included in the factor solution, considering 0.35 as the minimum factor loading for statistical significance. The items, previously distributed across 8 dimensions, were grouped into six factors which together account for 70.474% of the total variance. Table 2 shows the extracted factors, their conceptual definitions and variables (questions), the factor loadings, the eigenvalues and the percentage of variance accounted for by each factor.
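A minimal sketch of the extraction procedure (principal components on the correlation matrix, latent root criterion, orthogonal varimax rotation) is given below. It runs on simulated two-factor data, since the study's responses are not available; the varimax step follows Kaiser's classical SVD-based algorithm and is not the authors' own implementation.

```python
import numpy as np

def varimax(L, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a loading matrix (Kaiser, 1958)."""
    p, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p))
        R = u @ vt
        new_var = s.sum()
        if new_var - var < tol:
            break
        var = new_var
    return L @ R

# Simulated data: six items driven by two latent factors
rng = np.random.default_rng(1)
F = rng.normal(size=(200, 2))
X = np.hstack([F[:, [0]] + 0.4 * rng.normal(size=(200, 3)),
               F[:, [1]] + 0.4 * rng.normal(size=(200, 3))])

Rcorr = np.corrcoef(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(Rcorr)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
keep = eigval > 1                                    # latent root criterion
loadings = eigvec[:, keep] * np.sqrt(eigval[keep])   # unrotated PC loadings
rotated = varimax(loadings)
```

Because varimax is an orthogonal rotation, the communalities (row sums of squared loadings) are unchanged; only the distribution of the loadings across factors becomes simpler, which is what makes the rotated solution easier to label, as in Table 2.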

Table 2
Factor analysis.

Table 3 shows the average degree of satisfaction for each question (DS̄q), the average degree of satisfaction for each factor (DS̄F), and the general degree of satisfaction (DS̄G). In addition, it shows the Cronbach’s α values per factor (αF), the α value obtained if a particular question is excluded from the respective factor (αQe), and the item-total correlations (itc). Since the minimum acceptable alpha value for exploratory studies is 0.60 (Hair et al., 2006; Malhotra, 2007), the questionnaire was reliable in all factors. No question, once excluded from the questionnaire, would increase the reliability of the factor to which it belongs. The item-total correlations show that all questions are well correlated with the other questions of the factor to which they belong.
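Cronbach's alpha and the item-total correlations reported in Table 3 follow standard formulas, which the short Python illustration below applies to invented 0-10 scores (the matrix is hypothetical, not the study's data).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(items):
    """Correlation of each item with the sum of the remaining items."""
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Hypothetical 0-10 satisfaction scores for one factor (5 respondents x 4 items)
scores = np.array([[8, 7, 9, 8],
                   [5, 6, 5, 4],
                   [9, 9, 8, 9],
                   [3, 4, 2, 3],
                   [7, 6, 7, 8]], dtype=float)
print(f"alpha = {cronbach_alpha(scores):.3f}")
print(item_total_correlations(scores))
```

An alpha above 0.60 and uniformly positive item-total correlations, as in this toy example, correspond to the reliability pattern reported for all six factors.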

Table 3
Cronbach’s alpha values, item-total correlations and average performances.

The results of the quartile analysis are presented in Figure 1 and show that the most critical questions (items) in the perception of all levels of users are predominantly associated with the updates of the application (Q15), the activities and services performed (Q17), the regularity in the execution and delivery of services (Q18), the transparency and accuracy in error correction (Q1), the ability to meet business requirements (Q6), the explanation of the use of the application by means of manuals and help content (Q34), the effectiveness of communication through the available news (Q26), the features and management of the SaaS application (Q23), and the company's interest in solving customers’ problems (Q2). Such items are predominantly related to the reliability of the SaaS and to the customer service factors. Attention to customer service and reliability is important because ease of use and convenience are the factors most favourable to the usage and adoption of cloud computing by small and medium businesses (SMBs), while reliability tends to be disregarded because SMBs do not consider the cloud reliable (Gupta et al., 2013). Furthermore, these results also suggest the existence of communication problems between the SaaS development company and the users. Although the attention (Q4), patience (Q12) and politeness (Q13) of the professionals and the clarity of their language (Q14) when performing the services are considered low-priority items, users want more accurate information about the activities and services provided, the errors detected, and the new features of the SaaS application. Thus, SaaS managers should improve the quality of the processes regarding all these items and the interaction when performing the services.

Figure 1
Quartile analysis.

Conversely, tools and backup policies (Q27) and data security (Q21) are considered low-priority items, which is an intriguing result, since many studies (e.g. Kim, 2010; Mahesh et al., 2011; Wen & Dong, 2013; Lian et al., 2014) argue that security is the most critical concern for adopting cloud computing, and numerous conceptual frameworks incorporate security as an important component of customer management (Goode et al., 2015). Moreover, given the importance of security and privacy in SaaS (Gupta et al., 2013), motivating users requires the SaaS vendor to demonstrate its ability, goodwill, and contractual promise keeping (Chou & Chiang, 2013). Such issues are especially important since, in this study, the explanation about costs of service (Q24) and the clarity of the limitations of use and contracting of services (Q25) are considered high-priority items.

In order to verify whether the three user groups (basic, intermediate, and advanced) perceive and assess the quality of SaaS similarly or differently, the quartile analysis was also applied to the evaluations of the respondents belonging to each group (see Figure 1). As a result, it is possible to identify that the critical items for the group of all users are predominantly related to the operation of the SaaS application (Q17, Q18, Q34 and Q23). The critical items exclusive to intermediate-level users are associated with the reliability of the SaaS application (Q15 and Q16). The critical item exclusive to the perspective of advanced-level users is the interest of the SaaS development company in updating the application (Q3).

Supposedly less experienced in the use of the SaaS application, the basic-level users are exclusively less satisfied with the ease of use of the SaaS application (Q30), the explanation of the cost of the service (Q24), the clarity of the limitations of use and contracting of services (Q25) and the ease of deployment and upgrade of the SaaS application (Q22). Items Q30 and Q22 are particularly important because difficulties concerning them inhibit and reduce customers' interest in learning and using the software. If the managers of the companies that hired the SaaS development company fall into this profile, special attention should be directed to items Q24 and Q25 because they combine financial and professional subjects.

Questions with satisfaction averages below the first quartile were designated as critical priority and were analyzed first by the managers of the SaaS development company in order to identify possible improvements to the SaaS. A brief analysis of each critical item is presented as follows:

  • Q17 (Actions and services that the SaaS application performs) and Q23 (Functionality and control tools provided by the SaaS application) - Clients take the desktop system previously developed by the company, which had more functionalities and control tools than the SaaS application, as a parameter for their evaluations. There is an intention to immediately implement new features and improvements in the SaaS application, similar or superior to those of the old system, with results expected within a year;

  • Q18 (Regularity of service performance and of service delivery) - The company is not able to implement the features requested by clients quickly. It is important to make the “roadmap” clear to the clients, i.e., to show them which implementations have been planned and which are being developed first;

  • Q34 (Explanation of the use of the SaaS application by means of tutorials and help content) - Managers do not agree with the degree of criticality assigned to this item, since the tutorials and help content are very descriptive. Some improvement actions are considered: (i) to create links to access the online tutorial within each screen, with questions/tips written in dedicated business language; (ii) to have the support professionals ask the customers who attributed lower scores to this item for suggestions to improve the tutorials; and (iii) to develop tutorials focused on explaining the business process rather than guided by the software;

  • Q15 (Updates made to the SaaS application according to the business) - The managers did not identify reasons for the criticality of this item. They recommend contacting the most critical evaluators and asking them about the causes that led them to point out the problem;

  • Q6 (SaaS application's ability to meet business requirements) - The managers reported the absence of system features that are important to the business served by the SaaS application. Some actions are suggested: (i) to implement new features and improvements in the system, similar or superior to those of the old system; and (ii) to inform the customers about the queue of implementations under way;

  • Q1 (Transparency and accuracy in correcting identified errors) - This result is due to customers not reading the information on the changes made in each version of the system and to the lack of clarity in the information about the implemented corrections. It is necessary to promote and highlight notices of the most relevant adjustments;

  • Q2 (Interest of the SaaS development company in solving customers' problems) - It is not clear to customers which development activities are being conducted. There are simple implementations that affect many customers and yet remain at the end of the implementation queue because the company prioritizes the most critical features. It is necessary to review the priority list of the features to be developed and to improve the procedure for prioritizing customers' requests;

  • Q26 (Effectiveness of communication of the new features available in the SaaS application) - It is supposed that clients do not understand, do not notice, or simply ignore the information on the initial screen. It is necessary to improve the procedure for disseminating news, for example, by sending e-mails.

Besides identifying the most critical items, the main causes of problems regarding the SaaS service quality and the possible improvement actions, it is important to highlight that 136 respondents used the open-ended question to report criticisms and suggestions. In general, the criticisms predominantly referred to the SaaS application update schedule, which is conducted during business hours and requires the interruption of system operation, and to the absence of feedback about the implementation of requested new features. In addition, more than 50 customers suggested improvements and the creation of new features for the application.

4. Contributions of the study and managerial implications

The growth of SaaS applications in the IT market and the confidence that customers must have in SaaS development companies - since their data are stored in other companies’ infrastructures - are relevant and motivating aspects for the implementation of this study.

Over the years, the evaluation of SaaS service quality has traditionally been conducted by measuring technical metrics. A few studies have also been conducted to measure the gap resulting from the difference between users' expectations of SaaS and their perceptions of actual SaaS performance (the SERVQUAL scale), or by means of perceptions-only measurement (the SERVPERF scale). Differently from such evaluation models, this work proposes a methodological approach to assess the SaaS service quality by measuring the satisfaction of the users of SaaS applications. Thus, the primary contribution of this study lies in the development of an alternative evaluation approach, given the existing criticisms concerning the SERVQUAL scale (and probably its variations) and the lack of studies on the use of SERVPERF to assess the SaaS service quality. Additionally, our study aims to complement the studies and methodologies used to measure SaaS quality, which usually employ metrics to measure the performance and efficiency of SaaS, besides the use and allocation of resources in SaaS systems.

By means of an exploratory study, the 35 items were assigned to six factors (customer service, customer assistance, reliability, business process, accessibility and information about the use of SaaS), indicating that customer service, customer assistance and the reliability of SaaS are the three factors that most strongly influence the quality of SaaS. In view of these results, users seem to be more interested in the SaaS provider’s ability to solve operational problems and to perform support services helpfully and accurately. The reliability of the questionnaire was verified in all factors.

Our study also provides practical contributions to the area of service quality management. Based on the satisfaction averages of the 35 items, a ranking measure is used to classify the items into four priority levels (critical, high, moderate and low) as perceived by the user profiles (basic, intermediate, advanced and all users). As a result, our study reveals some critical issues which are particularly important because difficulties concerning these items inhibit and reduce users' interest in learning and using the SaaS application. Thus, specific actions to improve the quality of SaaS according to the perceptions of the users are identified. Conversely, this study reveals that the backup policies and the data security system used by the SaaS company are considered low-priority items. These results contradict the findings of many previous studies (e.g. Kim, 2010; Mahesh et al., 2011; Wen & Dong, 2013; Lian et al., 2014) which claim that security is the most critical issue for adopting cloud computing.
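The quartile-based priority classification described above can be sketched as follows: items whose satisfaction averages fall below the first quartile become critical priorities, and so on up the scale. The item names and averages below are hypothetical, chosen only to mirror the kind of result reported in the study.

```python
# Sketch of the quartile-based priority ranking (hypothetical data):
# item satisfaction averages are split at the quartiles of their
# distribution into four priority levels.
import statistics

def classify_priorities(item_means):
    """Map each item to a priority level using the quartiles of the means."""
    q1, q2, q3 = statistics.quantiles(sorted(item_means.values()), n=4)

    def level(m):
        if m <= q1:
            return "critical"  # lowest satisfaction -> highest priority
        if m <= q2:
            return "high"
        if m <= q3:
            return "moderate"
        return "low"

    return {item: level(m) for item, m in item_means.items()}

# Hypothetical satisfaction averages (5-point scale) for eight items
means = {"error correction": 3.1, "problem solving": 3.3, "updates": 3.4,
         "regularity": 3.6, "accessibility": 3.9, "usability": 4.1,
         "backup policy": 4.4, "data security": 4.6}
print(classify_priorities(means))
```

With these illustrative numbers, the low-satisfaction items (error correction, problem solving) come out as critical while backup policy and data security come out as low priority, mirroring the pattern observed in the study.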

In this study, the lowest-priority and highest-priority items were presented to the managers of the SaaS development company, who were asked to comment and to propose service improvements. Regarding the group of all users, most of the critical issues are associated with the following items: transparency and accuracy in correcting identified errors, the company's interest in solving customer problems, the SaaS application's ability to meet business requirements, implemented updates, and regularity of service performance and of service delivery.

The interpretations of the IT company managers concerning the critical items highlighted variables and technical IT elements that should be implemented to improve the quality of SaaS, since these variables are not necessarily known to most SaaS users. Based on these results, the main improvement actions are related to implementing new features in the SaaS application, creating a roadmap that shows users the sequence of the services, and improving the tools that explain the business process. For a better use of the SaaS application, it is also necessary to improve the procedure for disseminating news about the SaaS application and to promote and highlight warnings about the most relevant adjustments that were made.

Furthermore, the study also reveals some criticisms and suggestions highlighted by the SaaS users. Most of the criticisms are related to the SaaS application update schedule and to the absence of feedback about the implementation of new features. The main suggestions concern improvements to the application and the creation of new features. It is important to note that such suggestions are in agreement with the suggestions made by the IT managers.

Finally, knowing that an increase in service quality is likely to lead to an increase in customer satisfaction (and vice versa) and that customer satisfaction influences purchase intentions, all these results reinforce the need to assess SaaS quality by measuring users' satisfaction.

5. Limitations and future research suggestions

The study was restricted to a single Brazilian SaaS development company. A convenience sampling procedure was used to collect data and the evaluations were conducted by SaaS users. Although the number of respondents was representative for performing the analyses and the reliability of the questionnaire was verified, the generalization of the results must be conducted carefully, and the findings must be interpreted with caution when compared with the results of studies of other SaaS companies.

The questionnaire did not allow exploring the causal relationship between service quality and user satisfaction. Questions (items) should be further developed and incorporated into the instrument to enable analysis by structural equation modeling techniques. In addition, the user-level classification was self-reported by each respondent.

Finally, without wishing to exhaust the discussion of the problem of assessing the quality of SaaS from the users' perspective, which is broad and diverse, further studies can be directed to the following issues: (i) the use of the proposed approach to assess the quality of SaaS in other SaaS development companies; (ii) the use of the proposed approach to assess the quality of SaaS from the perspective of those who develop SaaS, as in a self-evaluation process; and (iii) the use of discriminant analysis and/or multi-criteria decision aid techniques to group SaaS users and developers into different profiles according to their individual assessments. Furthermore, the results of applying the proposed approach from the perspective of SaaS users and from the perspective of SaaS developers can be compared in order to verify the existence of gaps regarding each question/item, and several managerial implications can eventually be identified.

  • How to cite this article: Freitas, A. L. P., & Freitas Neto, M. M. (2017). Assessing the service quality in Software-as-a-Service from the customers’ perspective: a methodological approach and case of use. Production, v27, e20170020. http://dx.doi.org/10.1590/0103-6513.20170020.

References

  • Babakus, E., & Mangold, W. G. (1992). Adapting the SERVQUAL scale to hospital services: an empirical investigation. Health Services Research, 26(6), 767-786. PMid:1737708.
  • Bansal, H. S., & Taylor, S. F. (1999). The Service Provider Switching Model (SPSM): a model of consumer switching behavior in the services industry. Journal of Service Research, 2(2), 200-218. http://dx.doi.org/10.1177/109467059922007
  • Barnes, S. J., & Vidgen, R. T. (2002). An integrative approach to the assessment of e-commerce quality. Journal of Electronic Commerce Research, 3, 114-127.
  • Bauer, H. H., Falk, T., & Hammerschmidt, M. (2006). eTransQual: a transaction process-based approach for capturing service quality in online shopping. Journal of Business Research, 59(7), 866-875. http://dx.doi.org/10.1016/j.jbusres.2006.01.021
  • Benlian, A., Koufaris, M., & Hess, T. (2012). Service quality in Software-as-a-Service: developing the SaaS-Qual measure and examining its role in usage continuance. Journal of Management Information Systems, 28(3), 85-126. http://dx.doi.org/10.2753/MIS0742-1222280303
  • Bitner, M. J. (1990). Evaluating service encounters: the effects of physical surroundings and employee responses. Journal of Marketing, 54(2), 69-82. http://dx.doi.org/10.2307/1251871
  • Bolton, R. N., & Drew, J. H. (1991). A multistage model of customers’ assessments of service quality and value. The Journal of Consumer Research, 17(4), 375-384. http://dx.doi.org/10.1086/208564
  • Bolton, R. N., & Drew, J. H. (1994). Linking customer satisfaction to service operations and outcomes. In R. T. Rust & R. L. Oliver (Eds.), Service quality: new directions in theory and practice (pp. 173-200). Newbury Park, CA: Sage Publications Inc.
  • Brown, T. J., Churchill Junior, G. A., & Peter, J. P. (1993). Improving the measurement of service quality. Journal of Retailing, 69(1), 127-139. http://dx.doi.org/10.1016/S0022-4359(05)80006-5
  • Buttle, F. (1996). SERVQUAL: review, critique, research agenda. European Journal of Marketing, 30(1), 8-32. http://dx.doi.org/10.1108/03090569610105762
  • Cancian, M. H. (2009). Uma proposta de guia de referência para provedores de software como um serviço (dissertação de mestrado). Universidade Federal de Santa Catarina, Santa Catarina.
  • Carman, J. M. (1990). Consumer perceptions of service quality: an assessment of the SERVQUAL dimensions. Journal of Retailing, 66(1), 33-55.
  • Chou, S. W., & Chiang, C. H. (2013). Understanding the formation of software-as-a-service (SaaS) satisfaction from the perspective of service quality. Decision Support Systems, 56(1), 148-155. http://dx.doi.org/10.1016/j.dss.2013.05.013
  • Cisco. (2014). Cisco global cloud index: forecast and methodology, 2013-2018 (White Paper). Retrieved in 2017, March 27, from https://www.terena.org/mail-archives/storage/pdfVVqL9tLHLH.pdf
  • Costa, H. G., Mansur, A. F. U., Freitas, A. L. P., & Carvalho, R. A. (2007). ELECTRE TRI aplicado a avaliação da satisfação de consumidores. Produção, 17(2), 230-245. http://dx.doi.org/10.1590/S0103-65132007000200002
  • Cristobal, E., Flavián, C., & Guinalíu, M. (2007). Perceived e-service quality (PeSQ): measurement validation and effects on consumer satisfaction and web site loyalty. Managing Service Quality, 17(3), 317-340. http://dx.doi.org/10.1108/09604520710744326
  • Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297-334. http://dx.doi.org/10.1007/BF02310555
  • Cronin, J. J., & Taylor, S. (1992). Measuring service quality: a reexamination and extension. Journal of Marketing, 56(3), 55. http://dx.doi.org/10.2307/1252296
  • Dabholkar, P., Shepherd, C. D., & Thorpe, D. I. (2000). A comprehensive framework for service quality: an investigation of critical conceptual and measurement issues through a longitudinal study. Journal of Retailing, 76(2), 139-173. http://dx.doi.org/10.1016/S0022-4359(00)00029-4
  • Ding, D. X., Hu, P. J.-H., & Sheng, O. R. L. (2011). e-SELFQUAL: a scale for measuring online self-service quality. Journal of Business Research, 64(5), 508-515. http://dx.doi.org/10.1016/j.jbusres.2010.04.007
  • Domínguez-Mayo, F. J., García-García, J. A., Escalona, M. J., Mejías, M., Urbieta, M., & Rossi, G. (2015). A framework and tool to manage Cloud Computing service quality. Software Quality Journal, 23(4), 595-625. http://dx.doi.org/10.1007/s11219-014-9248-0
  • Duarte Filho, N. F., Padua, C. I. P. S., Bermejo, P. H. S., Zambalde, A. L., & Barros, U. S. (2013). Saasquality - a method for quality evaluation of software as a service (Saas). International Journal of Computer Science and Information Technology, 5(3), 101-117. http://dx.doi.org/10.5121/ijcsit.2013.5308
  • Freitas, A. L. P., & Albernaz, C. M. R. M. (2012). A multicriteria approach for assessing the quality of information technology support services. International Journal of Business and Social Science, 3(16), 59-72. Retrieved in 2017, March 27, from http://ijbssnet.com/journals/Vol_3_No_16_Special_Issue_August_2012/7.pdf
  • Freitas, A. L. P., Manhães, N. R. C., & Cozendey, M. I. (2006, October 9-11). Using SERVQUAL to evaluate the quality of information technology services: an experimental analysis. In Proceedings of the XII International Conference on Industrial Engineering and Operations Management (pp. 1-8). Fortaleza: ICIEOM.
  • Ganesan, R., Sarkar, S., & Tewari, N. (2012, June 25-28). An independent verification of errors and vulnerabilities in SaaS cloud. In Proceedings of the IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN 2012) (pp. 1-6). Boston, MA, USA: IEEE. http://doi.org/10.1109/DSNW.2012.6264695
  • Gao, J., Pattabhiraman, P., Bai, X., & Tsai, W. T. (2011, December 12-14). SaaS performance and scalability evaluation in clouds. In Proceedings of the 2011 IEEE 6th International Symposium on Service Oriented System (SOSE) (pp. 61-71). Irvine, CA: IEEE. http://doi.org/10.1109/SOSE.2011.6139093
  • Godse, M., & Mulik, S. (2009, September 21-25). An approach for selecting Software-as-a-Service (SaaS) Product. In Proceedings of the IEEE International Conference on Cloud Computing (pp. 155-158). Washington: IEEE Computer Society. http://dx.doi.org/10.1109/CLOUD.2009.74
  • Goode, S., Lin, C., Tsai, J. C., & Jiang, J. J. (2015). Rethinking the role of security in client satisfaction with Software-as-a-Service (SaaS) providers. Decision Support Systems, 70, 73-85. http://dx.doi.org/10.1016/j.dss.2014.12.005
  • Goscinski, A., & Brock, M. (2010). Toward dynamic and attribute based publication, discovery and selection for cloud computing. Future Generation Computer Systems, 26(7), 947-970. http://dx.doi.org/10.1016/j.future.2010.03.009
  • Gupta, P., Seetharaman, A., & Raj, J. R. (2013). The usage and adoption of cloud computing by small and medium businesses. International Journal of Information Management, 33(5), 861-874. http://dx.doi.org/10.1016/j.ijinfomgt.2013.07.001
  • Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis (6th ed.). New Jersey: Pearson Prentice Hall.
  • Iacobucci, D., Ostrom, A., & Grayson, K. (1995). Distinguishing service quality and customer satisfaction: the voice of the consumer. Journal of Consumer Psychology, 4(3), 277-303. http://dx.doi.org/10.1207/s15327663jcp0403_04
  • Ju, J., Wang, Y., Fu, J., Wu, J., & Lin, Z. (2010, June 22-23). Research on Key Technology in SaaS. In Proceedings of the 2010 International Conference on Intelligent Computing and Cognitive Informatics (pp. 384-387). Kuala Lumpur, Malaysia: IEEE. http://dx.doi.org/10.1109/ICICCI.2010.120
  • Kang, S., Myung, J., Yeon, J., Ha, S. W., Cho, T., Chung, J. M., & Lee, S. G. (2010). A general maturity model and reference architecture for SaaS service. In H. Kitagawa, Y. Ishikawa, Q. Li, & C. Watanabe (Eds.), DASFAA 2010: Database Systems for Advanced Applications (Lecture Notes in Computer Science, vol 5982, pp. 337-346). Heidelberg: Springer.
  • Karadsheh, L. (2012). Applying security policies and service level agreement to IaaS service model to enhance security and transition. Computers & Security, 31(3), 315-326. http://dx.doi.org/10.1016/j.cose.2012.01.003
  • Kim, D. (2010). An integrated framework of HoQ and AHP for the QOE improvement of network-based ASP services. Annales des Télécommunications, 65(1-2), 19-29. http://dx.doi.org/10.1007/s12243-009-0143-9
  • Kuo, Y.-F., Wu, C.-M., & Deng, W.-J. (2009). The relationships among service quality, perceived value, customer satisfaction, and post-purchase intention in mobile value-added services. Computers in Human Behavior, 25(4), 887-896. http://dx.doi.org/10.1016/j.chb.2009.03.003
  • Lee, H., Lee, Y., & Yoo, D. (2000). The determinants of perceived service quality and its relationship with satisfaction. Journal of Services Marketing, 14(3), 217-231. http://dx.doi.org/10.1108/08876040010327220
  • Lee, J. Y., Lee, J. W., Cheun, D. W., & Kim, S. D. (2009, December 2-4). A quality model for evaluating Software-as-a-service in cloud computing. In Proceedings of the 2009 Seventh ACIS International Conference on Software Engineering Research, Management and Applications (pp. 261-266). Haikou, China: IEEE.
  • Lian, J. W., Yen, D. C., & Wang, Y. T. (2014). An exploratory study to understand the critical factors affecting the decision to adopt cloud computing in Taiwan hospital. International Journal of Information Management, 34(1), 28-36. http://dx.doi.org/10.1016/j.ijinfomgt.2013.09.004
  • Loiacono, E. T., Watson, R. T., & Goodhue, D. L. (2002). WEBQUAL: a measure of website quality. In K. R. Evans & L. K. Scheer (Eds.), Proceedings of the American Marketing Association: Winter Marketing Educators’ Conference (pp. 432-438). Chicago: American Marketing Association.
  • Loukis, E., & Charalabidis, Y. (2011). Why do eGovernment projects fail? Risk factors of large information systems projects in the Greek public sector. International Journal of Electronic Government Research, 7(2), 59-77. http://dx.doi.org/10.4018/jegr.2011040104
  • Low, C., Chen, Y., & Wu, M. (2011). Understanding the determinants of cloud computing adoption. Industrial Management & Data Systems, 111(7), 1006-1023. http://dx.doi.org/10.1108/02635571111161262
  • Mahesh, S., Landry, B. J. L., Sridhar, T., & Walsh, K. R. (2011). A decision table for the cloud computing decision in small business. Information Resources Management Journal, 24(3), 9-25. http://dx.doi.org/10.4018/irmj.2011070102
  • Malhotra, N. K. (2007). Marketing research: an applied orientation (5th ed.). New Jersey: Pearson Prentice Hall.
  • Mesquida, A. L., Mas, A., Amengual, E., & Calvo-Manzano, J. A. (2012). IT service management process improvement based on ISO/IEC 15504: a systematic review. Information and Software Technology, 54(3), 239-247. http://dx.doi.org/10.1016/j.infsof.2011.11.002
  • Parasuraman, A., Grewal, D., & Krishnan, R. (2004). Marketing research. Boston: Houghton Mifflin Company.
  • Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1985). A conceptual model of service quality and its implications for future research. Journal of Marketing, 49(4), 41-50. http://dx.doi.org/10.2307/1251430
  • Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1988). SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality. Journal of Retailing, 64(1), 12-40.
  • Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1994). Reassessment of expectations as a comparison standard in measuring service quality: implications for further research. Journal of Marketing, 58(1), 111-124. http://dx.doi.org/10.2307/1252255
  • Parasuraman, A., Zeithaml, V. A., & Malhotra, A. (2005). E-S-QUAL: a multiple-item scale for assessing electronic service quality. Journal of Service Research, 7(Feb), 1-21.
  • Peppard, J. (2003). Managing IT as a portfolio of services. European Management Journal, 21(4), 467-483. http://dx.doi.org/10.1016/S0263-2373(03)00074-4
  • Raciti, M. M., Ward, T., & Dagger, T. S. (2013). The effect of relationship desire on consumer‐to‐business relationships. European Journal of Marketing, 47(3/4), 615-634. http://dx.doi.org/10.1108/03090561311297490
  • Rimal, B. P., Choi, E., & Lumb, I. (2009, August 25-27). A taxonomy and survey of cloud computing systems. In Proceedings of the Fifth International Joint Conference on INC, IMS and IDC (pp. 44-51). Washington: IEEE Computer Society.
  • Roses, L. K., Hoppen, N., & Henrique, J. L. (2009). Management of perceptions of information technology service quality. Journal of Business Research, 62(9), 876-882. http://dx.doi.org/10.1016/j.jbusres.2008.10.005
  • Sahinoglu, M., & Cueva-Parra, L. (2011). CLOUD computing. Wiley Interdisciplinary Reviews: Computational Statistics, 3(1), 47-68. http://dx.doi.org/10.1002/wics.139
  • Sureshchandar, G. S., Rajendran, C., & Anantharaman, R. N. (2002). The relationship between service quality and customer satisfaction – a factor specific approach. Journal of Services Marketing, 16(4), 363-379. http://dx.doi.org/10.1108/08876040210433248
  • Teas, R. K. (1993). Expectations, performance evaluation, and consumers’ perceptions of quality. Journal of Marketing, 57(4), 18. http://dx.doi.org/10.2307/1252216
  • Ueltschy, L. C., Laroche, M., Eggert, A., & Bindl, U. (2007). Service quality and satisfaction: an international comparison of professional services perceptions. Journal of Services Marketing, 21(6), 410-423. http://dx.doi.org/10.1108/08876040710818903
  • Walz, J., & Grier, D. A. (2010). Time to push the cloud. IT Professional, 12(5), 14-16. http://dx.doi.org/10.1109/MITP.2010.137
  • Wei, Z. W. Z. (2010, November 4-6). An initial review of cloud computing services research development. In Proceedings of the 2010 International Conference on Multimedia Information Networking and Security. Nanjing, Jiangsu: IEEE.
  • Wen, P. X., & Dong, L. (2013, September 9-11). Quality model for evaluating SaaS service. In Proceedings of the 4th International Conference on Emerging Intelligent Data and Web Technologies (pp. 83-87). Xi'an, China: EIDWT. http://dx.doi.org/10.1109/EIDWT.2013.19
  • Wolfinbarger, M., & Gilly, M. C. (2003). eTailQ: dimensionalizing, measuring and predicting etail quality. Journal of Retailing, 79(3), 183-198. http://dx.doi.org/10.1016/S0022-4359(03)00034-4
  • Yang, Z., Sun, J., Zhang, Y., & Wang, Y. (2015). Understanding SaaS adoption from the perspective of organizational users: a tripod readiness model. Computers in Human Behavior, 45, 254-264. http://dx.doi.org/10.1016/j.chb.2014.12.022
  • Yee, R. W. Y., Yeung, A. C. L., & Edwin Cheng, T. C. (2010). An empirical study of employee loyalty, service quality and firm performance in the service industry. International Journal of Production Economics, 124(1), 109-120. http://dx.doi.org/10.1016/j.ijpe.2009.10.015
  • Zeithaml, V. A., Bitner, M. J., & Gremler, D. D. (2006). Services marketing: integrating customer focus across the firm (4th ed.). Boston: McGraw-Hill/Irwin.
  • Zhang, Q., Cheng, L., & Boutaba, R. (2010). Cloud computing: state-of-the-art and research challenges. Journal of Internet Services and Applications, 1(1), 7-18. http://dx.doi.org/10.1007/s13174-010-0007-6
  • Zhou, X., Yi, L., & Liu, Y. (2011, July 10-12). A collaborative requirement elicitation technique for SaaS applications. In Proceedings of the 2011 IEEE International Conference on Service Operations, Logistics and Informatics (pp. 83-88). Beijing, China: IEEE. http://dx.doi.org/10.1109/SOLI.2011.5986533

Publication Dates

  • Publication in this collection
    2017

History

  • Received
    27 Mar 2017
  • Accepted
    16 Oct 2017