Follow Instructions and comments on document.
INSTRUCTIONS:
1. Complete and correct comments on Chapter 3: Research Method.
2. Follow the Dissertation TEMPLATE included.
3. Use the (3) attached Dissertation Examples. They serve as a template on how to write
and conceptualize the dissertation.
4. Change Table of contents for page number changes as needed.
5. ADD REFERENCES within text and in references section.
6. APA Guidelines, 7th edition
Remember, this is in preparation to conduct the interviews.
Please do not include your personal opinions and interpretations in your dissertation. You will
need to continue to conduct research and avoid overuse of the same authors. Cite.
Please see the attached documents regarding your Interview Protocol and Guide. Your interview
questions must align with your Research Questions. You can include demographic questions in the first
section. However, the remaining questions must align, address and answer the research questions.
Please see attached.
Please submit a “clean” version without highlighted text, track changes and balloon comments. it
should be a paper ready for submission as an assignment without comments or tracks.
Thank you
Knowledge Management Strategies on the Competitive Advantage of Medium-Sized Enterprises: A Qualitative Case Study
Dissertation Proposal
Submitted to Northcentral University
School of Business
In Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF PHILOSOPHY
by
San Diego, California
January 2023
Abstract
This is qualitative research study on the Impact of Organizational Culture on the Knowledge Management in medium-sized enterprises. The focus of this research is to determine the impact of knowledge management strategies on the competitive advantage of Medium-Sized Enterprises. The research problem for this study was why Medium-Sized Enterprises experience lowered competitive advantage when faced with the inability to utilize organizational cultural strategies that promote knowledge management. Medium-Sized Enterprises face resource constraints in terms of human resources, finances, and time. This inhibits their capability of taking advantage of knowledge management benefits that give them a competitive advantage in the market. The purpose of this qualitative study is to examine the impact of organizational cultural strategies that promote investment in knowledge management within Medium-Sized Enterprises. The guiding theoretical framework for this study is Ecological Knowledge Management Theory that comprises of the four elements knowledge distribution, knowledge competition, knowledge interaction, and knowledge evolution. The research methodology that will be applied in this research is qualitative research. The case study will be the research design that will be used for this research. The research instruments that will be used in this research include interviews, observation, reading, and document review.
Acknowledgments
I would like to express my gratitude to my professor Dr. Davis who guided me throughout this dissertation. I would also like to thank my friends and family who supported me and offered deep insight into the study.
Table of Contents
7
9
11
12
Research Methodology and Design
13
14
14
Definition of key terms 16
16
18
18
The Domains of Knowledge Management 19
23
Introduction 23
Research Methodology and Design
25
31
Data collection and analysis 36
Data Collection 37
Data Analysis 38
40
42
42
43
Summary 44
46
46
47
Research Question 1 47
48
48
Summary 49
APPENDIX: Sample Interview Questions 50
List of tables
List of figures
Chapter 1: Introduction
Knowledge management is crucial in developing and sustaining organizational strategies. Knowledge management involves the collection, analysis, classification, dissemination, and reuse of data to bolster business activities (Jones & Shideh, 2021). Organizations use knowledge management systems for various reasons. Some purposes of knowledge management are increasing revenues, expanding market shares, creating customer-specific products, targeting messaging and advertisements. Many large corporate organizations have successfully installed knowledge management systems within their operations and gained a competitive advantage within their specialization areas (Hussain et al., 2021). On the contrary, medium-sized enterprises continue to experience challenges of installing knowledge management systems to gain a competitive advantage, meet their strategies, and stay at the top of the pyramid (Mazorodze & Buckley, 2021).
Knowledge management is fundamental to all organizations regardless of the product or industry. These organizations rely on the knowledge and expertise of their employees and stakeholders for them to be successful (Mazorodze & Buckley, 2021). Knowledge is an essential asset for organizations. Organizations need to incorporate processes that grow, store, and share the knowledge between stakeholders to increase effective use of knowledge and stakeholder efficiency. According to Priya et al. (2019) an effective knowledge management system is dependent on employees and what they choose to share. Employees ensure a lasting benefit to the organization by implementing efficient knowledge management strategies. Knowledge management can present challenges to the business if the employees are not able to adequately apply knowledge management strategies. These challenges can be highlighted if the search mechanisms of knowledge management within the organization are not powerful and produce inaccurate results or the organization does not have up to date information (Priya et al., 2019).
Medium-Sized Enterprises encounter resource challenges as opposed to large organizations. These resource constraints hinder medium-sized enterprises from implementing knowledge management strategies in their business operations. Limited finances, human resources, infrastructure, and time characterize resource constraints for Medium-Sized Enterprises (Schropfer et al., 2017). This generally leads to knowledge loss and mismanagement of organizational information (Wei et al., 2017). These outcomes generate loopholes for Medium-Sized Enterprises and the inability to take advantage of information retention and analysis. Failure to implement organizational cultural norms that encourage knowledge management efficacy for Medium-Sized Enterprises minimizes their competitive advantage in the market (Mazorodze & Buckley, 2021).
This research topic is relevant because investment in knowledge management is an emergent business tactic that improves the competitive advantage of organizations in their respective industries (Rialti et al., 2020). This research will also help develop a detailed analysis of knowledge management, Medium-Sized Enterprises, and organizational culture. This research will enhance scholar knowledge on the benefits of knowledge management in Medium-Sized Enterprises. Knowledge management allows organizational stakeholders to stimulate cultural changes and innovation which helps the organization to evolve to the dynamic business need in their market.
The study of knowledge management impact on Medium-Sized Enterprises is crucial because there is an increasing number of Medium-Sized Enterprises embracing knowledge management strategies in their business operations. This study will provide information that can be used to assess the positive and negative impact of applying certain knowledge management strategies in Medium-Sized Enterprises. Additionally, scholars and researchers can utilize the findings of this study as a knowledge base for future research. This research is aimed at contributing to the field of business and organizational leadership that can be referenced by future scholars
There has been various research conducted on knowledge management. A study conducted on the impact of knowledge management in improving organizational effectiveness determined the link between organizational effectiveness and knowledge management and how competitive advantage is generated in the business world (Finn, 2013). Ngulube (2019) maps the methodological issues that arise during knowledge management research. Researchers have conducted studies to determine the factors that influence knowledge management in practice. Existent research by previous researchers will help to create a balance between individual work and collaborative work from the scholar community.
Statement of the Problem
The problem to be addressed in the study is why Medium-Sized Enterprises experience lower competitive advantage when faced with the inability to utilize organizational cultural strategies that promote knowledge management (Rialti et al., 2020). Medium-Sized Enterprises face financial and resource constraints to invest in business strategies like knowledge management. Few Medium-Sized Enterprises have calculated the cost of knowledge management. Rarely have they adopted the practices targeted at improving knowledge management (Castagna et al., 2020). Medium-Sized Enterprises experience knowledge loss because of financial and resource constraints during investment in knowledge management and failure to integrate organizational cultural strategies that foster knowledge management. Hence, Medium-Sized Enterprises miss out on the benefits of knowledge management in better decision making, improved organizational agility, increased rate of innovation, quick problem-solving, improved business processes, employee growth and development, better communication, and competitive advantage (Yekkeh et al., 2021).
Organizations that apply knowledge management tactics in their business strategies help maximize their gains in multiple ways (Przysucha, 2017). Medium-Sized Enterprise organizational culture is not focused on management investment, strategies, and benefits (Chen et al., 2010). According to Hussain et al. (2021), organizational culture is influential in promoting behaviors fundamental to knowledge management. These behaviors include sharing and creating knowledge and mediating the relationships between individual knowledge and organizational knowledge. Organizational culture shapes employee attitude, behavior, and identity. Knowledge is a fundamental resource for all organizations, including Medium-Sized Enterprises (Castagna et al., 2020). The increase in competition and advanced management strategies in companies has heightened the need for organizations to implement knowledge management strategies to gain a competitive edge.
Knowledge management is mostly referred to as a general improvement practice that is used to enhance the effectiveness of knowledge in organizations especially in intensive companies (Peter, 2002). Medium-Sized Enterprises face risks and problems due to immaturity of knowledge management practices and failure to integrate knowledge management in their organizational culture that will ensure consistent knowledge management practices for the organization. A lack of consistency in knowledge management practices for the organization gradually lowers the capability of Medium-Sized Enterprises to maintain a competitive edge in their industries. If this problem is not addressed, Medium-Sized Enterprises face the risk of instability and inability to foster rapid adaptation to the changing market demands and technology in the business environment (Peter, 2002).
Purpose of the Study
The purpose of this qualitative exploratory case study is to examine the impact of organizational culture norms that promote investment in knowledge management strategies in Medium-Sized Enterprises. The aim of this research is systematic management of Medium-Sized Enterprise knowledge assets to meet strategic and tactical requirements and creating value for the organization (Jonsson, 2015). By implementing knowledge management strategies in Medium-Sized Enterprises enhances competitive advantage and improves organizational success. This is possible through effective use of knowledge resources and assets to provide the ability to respond and innovate to changing market demands.
The target population for this research is a medium-sized information technology company located in the northeastern part of the United States. The organization employs at least 50 participants for it to run normally. A sample of 36 participants will be recruited from the target population to participate in the study because a number slightly above half the population will yield comprehensive results. A sample size is selected based on demographics like physical location, availability, and reliability, (Jenkins et al., 2020).
The research instruments that will be used to collect data from the research participants will include individual in-person and video-conferencing interviews. The interviews will take approximately thirty to forty-five minutes. Interviews will be conducted for data collection purposes. During the interviews, the researcher will describe the purpose of the research and inform the participants that they can voluntarily stop the interview process at any time. The qualitative data collected for this study will be analyzed using descriptive analysis. Descriptive analysis is the investigation of the distribution of complex and critical data into proper numbers and figures by identifying the association between various numerous and data on knowledge management in the Medium-Sized Enterprise.
The research process of this study will incorporate identifying an ideal sample from the target population at the Medium-Sized Enterprise, defining the sampling frame, data collection, data analysis, and the major processes of the research and the results. All participant information collected during this research will be kept confidential and securely stored. Inductive coding will be used to code the dataset used in this research. Thematic analysis will be used to analyze data collected from this research.
Introduction to Theoretical Framework
The theory that will be applicable for this study is the Ecological Knowledge Management theory. The Ecological Knowledge Management theory deals with people, relationships, and learning communities (Martins et al., 2019). Knowledge management research can be traced to the 1970s where the early work focused on sociology of knowledge around organizations and technical work in knowledge-based expert systems. Previous knowledge management frameworks focused on knowledge management from a process view. This includes activities like storage, transfer, retrieval, and application of knowledge from one generation to another. Ecology is used to analyze the relationship among members and how they interact with the environment (Martins et al., 2019).
The Ecological Knowledge Management Theory is a model that comprises knowledge interaction, knowledge distribution, knowledge evolution, and knowledge competition. This model is effective in determining the knowledge management strategies and how they are applied in organizations. The theory will be essential in explaining how the interaction of the human resource, clients, and technology can be used to establish knowledge management systems in organizations. The Ecological Knowledge Management Theory applies to this study because it consists of four elements that interact with each other to evolve and enhance healthy knowledge ecology within organizations (Raudeliuniene et al., 2018). The four elements are knowledge distribution, knowledge interaction, knowledge competition and knowledge evolution. According to Deng-Neng et al. (2010) maintaining effective knowledge ecology in organizations is fundamental for the success of knowledge management within the organization. The Ecological Knowledge Management Model will guide the researcher in identifying the impact of knowledge management strategies in Medium-Sized Enterprises.
Introduction to Research Methodology and Design
The research methodology applied in this study is qualitative research. Qualitative research is a social science research method used to collect data by working with non-numerical data and seeks to interpret meaning from the data collected. An exploratory case study was selected for this research because it demonstrates the significance of this study and provides factual evidence to persuade the reader (Rhee et al., 2015). Qualitative research methodology for this research is aimed at understanding the impact of knowledge management in Medium-Sized enterprises. The exploratory case study research design is fundamental to this research because it will demonstrate the significance of this research to the industry (Rhee et al., 2015).
This study will be conducted on Medium-Sized Enterprises. By implementing a qualitative research method will allow the researcher to analyze Medium-Sized Enterprises, organizational culture, and knowledge management amongst other major concepts in this study. The qualitative research method is applicable for this study because it provides the researcher with qualitative data that will be used to analyze the impact of knowledge management strategies on Medium-Sized Enterprises. The data collection process will characterize the use of research instruments like interviews, reading, and observation. The validity of this research will be determined by the appropriateness of the research instruments applied (Aithal, 2017).
This research will focus on how Medium-Sized Enterprises incorporate these knowledge management strategies into their organizational culture. Case study research design, the in-depth study of a phenomenon method, is pertinent for this study because it requires careful formulation, examination, and listing of assumptions of the research in open-ended problems, (Leung, 2015). The research methodology applied in this study will help identify the impact of knowledge management strategies on Medium-Sized Enterprises and how it affects their competitive capability in the industries that they operate in.
Research Questions
RQ1
How does organizational culture affect knowledge management within the medium-Sized Enterprise?
RQ2
How does investment in knowledge management improve the competitive advantage for the Medium-Sized Enterprise?
Significance of the study
The findings of this research will contribute to the success of Medium-Sized Enterprises because organizational culture is an essential component in all organizations. This study will aim to identify how organizational culture that promotes knowledge management in Medium-Sized Enterprises can increase competitive advantage. This research is highly significant because the competitive advantage is important to Medium-Sized Enterprises. If organizations generate higher benefits, then Medium-Sized Enterprises could help improve residual value for the same desired value. This will increase the competitive advantage for the enterprise (Jones et al., 2021).
The data collected in will help evaluate how the organizational culture can be used to improve the competitive advantage of Medium-Sized Enterprises. This study will prepare organizational leaders in dealing with competitive advantage issues that are brought about by an organizational culture that does not support knowledge management in Medium-Sized Enterprises. Also, the study will contribute to the body of knowledge in business administration and organizational leadership and business by investigating how the organizational culture of Medium-Sized Enterprises can be used to increase their competitive advantage. The findings of this study will highlight the aspects of knowledge management that enhance competitive advantage for Medium-Sized Enterprises in their industries. The aspects of knowledge management that will be studied include process, people, content information technology, and strategy. The aspects of knowledge management are vital in determining how knowledge is handled, shared, analyzed, and used to make decisions within organizations.
The study’s purpose is to explore and address the challenges that face Medium-sized Enterprises as they work towards establishing knowledge management systems. Medium-sized enterprises have failed to launch knowledge management systems successfully and this research could be a turning point (Hussain et al., 2021, Mazorodze & Buckley, 2021). The aim of the study is to highlight how the problems associated with implementing knowledge management systems could be solved by the relevant stakeholders. Solutions could include government interventions or a Medium-size Enterprise commitment to Knowledge Management Systems (Mazorodze & Buckley, 2021). Lastly, research on this topic could provide opportunities for future research by other scholars in the field of organizational culture and strategic management.
This research will also be significant to practice because it will enhance the development of organizational leadership. This study will foster a new understanding of knowledge management in Medium-Sized enterprises, enhance concepts, and add to the body of knowledge. The successful completion of this research will provide organizational leaders in Medium-Sized Enterprises with the knowledge management strategies that will lead to quicker problem-solving, improved organizational agility, better and faster decision making, increased rate of innovation, supported employee growth and development, improved business processes, and better communication (Mazorodze et al., 2019).
Medium-Sized Enterprises
Medium-Sized Enterprises are enterprises that employ 250 or fewer employees. These enterprises do not exceed an annual turnover of $50 million (Chen, 2006).
Knowledge Management
Knowledge management is the process of structuring, defining, sharing, and retaining knowledge and employee experience within an organization (Maier et al., 2011).
Summary
This research study will focus focused on how Medium-Sized Enterprises incorporate knowledge management in their organizational culture. Knowledge management helps organizations to expand their market share, increase revenues, help with target messaging, create customer specific products, and better organizational advertisements. The statement of the problem for this research is why Medium-Sized Enterprises face lower competitive advantage for their inability to utilize organizational cultural strategies that promote knowledge management. The purpose of this study is to contribute to the success of Medium-Sized enterprises in their specific industries by introducing effective knowledge management strategies in the organizational culture of Medium-Sized enterprises. The theoretical framework Ecological Knowledge Management Theory will guide the development of this research. Chapter 2 of this dissertation will focus on the literature review of this study. The next chapter will entail a discussion on the impact of knowledge management strategies and rationale for lower competitive advantage on medium-sized enterprises.
Chapter 2: Literature Review
In this chapter, relevant literature information related and consistent with the objectives of the study was reviewed. Important issues and practical problems were brought out and critically examined so as to determine the current situation. This section was vital as it determined the information that links the research with past studies and what future studies would need to explore so as to improve knowledge.
Conceptual Framework
The literature review of this study of knowledge management is segmented into four domains: leadership, culture, technology, and measurement. These domains are aligned with research conducted by the American Productivity and Quality Center (2001).
Leadership indicates the ability of the organization to align knowledge management behaviors with organizational strategies, identify opportunities, promote the value of knowledge management, communicate best strategies, facilitate organizational learning, and develop/create metrics for assessing the impact of knowledge. Examples of the outcome of these six processes are strategic planning, hiring knowledge workers, and evaluating human resources. The leadership role is pivotal because leaders convey the messages of organizational change, and they send the signals that portray the importance of adopting knowledge management across an organization.
Culture refers to the organizational climate or pattern of sharing knowledge as related to organizational members’ behaviors, perceptions, openness, and incentive. Various committees and training development programs are examples of the culture process. Shaping an adequate culture is the most significant and challenging obstacle to overcome for successful knowledge management (Davenport et al., 2008).
Technology refers to the infrastructure of devices and systems that enhance the development and distribution of knowledge across an organization. The literature review revealed that most knowledge management researchers address the significant impact of technology and its role in effective knowledge management. However, it is notable that an overemphasis on technology might cause conceptual confusion between information management 24 and knowledge management. Gold, Malhotra and Sedars (2011) stress that technology includes the structural dimensions necessary to mobilize social capital for the creation of new knowledge. The examples of this process are internal web-based networks, electronic databases, and so on.
Finally, measurement indicates the assessment methods of knowledge management and their relationships to organizational performance. Skyrme and Amidon (2008) suggest that knowledge management can be assessed in four dimensions: customer, internal process, innovation and learning, and financial. Although there has been skepticism regarding this type of measurement, they attempt to measure it in a way that includes benchmarking and allocating organizational resources.
The Domains of Knowledge Management
Leadership
The literature reviewed in this study affirms the pivotal role of leadership in driving organizational change and adopting and implementing knowledge management. Leadership is also essential for knowledge management systems in matters such as decision making, assigning tasks, and integrating and communicating with people. Desouza and Vanapalli (2005) assert that a leader as a knowledge champion initiates and promotes knowledge management. Seagren, Creswell, and Wheeler (2013) specifically stress that leaders need to address complicated and, yet, urgent issues through strategic planning processes that are needed to transform the institution to successfully respond to social demands. Developing quality leadership is critical at all levels 25 of an organization. Higher education leaders, in particular, must pay attention to human resources, the structure, and the cultural and political climate of the institution. Skyrme (2009) emphasizes the roles of leadership in knowledge management by delineating the work tasks of “Chief Knowledge Officer.” Leadership tasks of this role include: help the organization formulate strategy for development and exploitation of knowledge; support implementation by introducing knowledge management techniques; provide coordination for knowledge specialists; oversee the development of a knowledge infrastructure; and facilitate and support knowledge communities.
Strategies of Leadership
The literature review suggests four key characteristics of leadership that are vitally important to knowledge management: vision, motivation, value of learning, and strategic planning.
Vision
Vision is a leading factor in leadership that transforms organizations, both in terms of culture and structure. The leadership literature provides various perspectives about the concept and function of vision. Dierkes (2011) suggests that organizations in an uncertain environment require visionary leadership. In a knowledge-creating organization, Nonaka (2011) also points out that managers with vision provide a sense of direction that helps members of an organization create new knowledge. This literature review portrays vision as a characteristic that enables leaders to set a standard, facilitate the coordination of organizational activities and systems, and guide people to achieve goals. Visionary leaders address uncertainties that pose threats to an organization.
Motivation
A key to the success of knowledge management is to understand how members in an organization come to believe that they can better perform and contribute to continuous improvement. One of the contributing factors of visionary leadership is to motivate people (Dierkes, 2001). In this regard, motivation is a precondition to continuously justify the vision. Incentives designed to encourage people to share their knowledge seem to have a more positive relation with the cumulative nature of knowledge (Cohen and Levinthal, 2010; Organization for Economic Co-operation and Development [OECD], 2004). By offering vision and incentives, leadership can promote knowledge sharing and encourage people to participate in creating knowledge (Nonaka, 2011; Smith, McKeen and Singh, 2016).
Value of Learning
Learning is widely recognized as critical to the successful implementation of knowledge management strategies. Learning, or organizational learning, described in the literature converts individual, un-codified, irrelevant information or knowledge to organized, codified and, therefore, sharable and relevant knowledge (Dierkes, 2011; Nonaka and Takeuchi, 2005). Hamel (2011) posits that core competencies of organizations reside in collective learning. The development of technology reinforces innovation efforts such as facilitating collaboration as well as organizational learning (OECD, 2004).
Strategic Planning
In an uncertain environment, specific preferences for the future are difficult to predict. Sanchez (2001) stresses the importance of developing future scenarios and 26 preparing responses for them. In his view, organizational learning plays a pivotal role in identifying organizational capabilities, shaping effective strategies, and creating valued knowledge. Long-term, comprehensive strategic planning involves integrating expectations and technology into a vision that enables an organization to prepare for the future (Kermally, 2002). In summary, a number of factors contribute to the role of leadership in knowledge management practices. Based on this literature review, leadership refers to the ability that enables higher education leaders to align knowledge management behaviors with organizational strategies, offer an opportunity and a direction, identify and recognize best practices and performances, and facilitate organizational learning in order to achieve the established goals.
Culture
Based on the literature review, culture is defined as an organizational environment and a behavioral pattern that enables people to share their ideas and knowledge. According to Trice and Beyer (2013), culture is reflected in values, norms, and practices. Values are embedded, tactical in nature, and, therefore, difficult to articulate and change. Values inspire people to do something. Norms are formulated by values, but more visible than values. If members in an organization believe that sharing knowledge would benefit them, they are more likely to support the idea of sharing their skills and knowledge. Practices are the most tangible form of culture. These three forms of culture influence the behaviors of members in an organization. Organizational culture provides the context within which organizational strategies and policies are decided. A shift of organizational culture is a precondition to successfully implement knowledge management. Knowledge management must be integrated within an existing culture of an organization (Lam, 2005). Shaping a viable culture is vital to successful knowledge management (Davenport et al., 2018).
Chapter 3: Research Method
1. Begin with an introduction and restatement of the problem and purpose sentences verbatim.
2. Provide a brief overview of the contents of this chapter, including a statement that identifies the research methodology and design.
3. Only include the following content in this section:
The problem to be addressed in this study is why medium-sized enterprises experience lower competitive advantage when faced with the inability to utilize organizational cultural strategies that promote knowledge management. The current challenge is that Medium-sized Enterprise’s experience lower competitive advantage when faced with the inability to utilize organization culture strategies that promote knowledge and management (Li et al., 2022). The purpose of this study is to examine the impact of organizational culture norms on investment in knowledge management strategies in medium-sized enterprises. The study is important because it seeks to unravel the organizational factors that hinder the medium-sized enterprises from attaining the desired competitive advantage from implementation of knowledge management systems.
Comment by Dr. Deanna Davis: Begin Chapter 3 with these paragraphs.
Medium-Sized Enterprises face resource constraints in terms of human resources, finances, and time (Mustafa & Elliott, 2019). This inhibits their capability to take advantage of knowledge management benefits that provide them with a competitive advantage in the market (Golinska-Dawson et al., 2021). Knowledge management systems form concrete foundation for the establishment and growth of a company. Diverse systems of knowledge management exist in the market and deploying these systems depends on various factors existing in the environment (Yekkeh et al., 2021). However, the effectiveness of the deployment of these systems continues to infer mixed levels of advantages and disadvantages to the enterprises (Asada et al., 2020). The knowledge management systems have caused drastic development and growth of many current large-scale enterprises. The same could be the case if the medium-sized enterprises completely embraced the systems (Hussain et al., 2021).
The chapter will provide an overview of the various research approaches, followed by a discussion of the research design for case studies and the role they play in the investigation. The following section provides an overview of alternative research designs, discussing their advantages for application, as well as their limits and the reasons why they are not suitable for the study. The next section provides information regarding the participants, including the target demographic, the appropriateness of the target group, the sample size, the sampling technique, and the eligibility requirements. In the section on data collecting, specifics regarding the interview procedure, data gathering methods, and the type of analysis conducted are provided. The final section discusses the assumptions, limitations, delimitations and ethical considerations. This section also includes a summary section that concludes with a discussion of the chapter’s contents.
This chapter describes the qualitative research methodology and case study research design In qualitative research, non-numerical information, such as text, visual, or audio, is gathered and analyzed to understand better ideas, perspectives, or encounters (Tomaszewski et al., 2020). It can be used to gain an in-depth understanding of an issue or find innovative investigation solutions. To gain an understanding of how people see the world, qualitative research is conducted. Even though there are many methods for doing qualitative research, its defining characteristics are typically flexibility and an emphasis on preserving rich interpretation when analyzing data Methods such as grounded theory, ethnography, action research, phenomenological study, and narrative research are examples of common research methodologies. They emphasize diverse goals and viewpoints yet share significant commonalities in their overall structure (Stenfors et al., 2020). Each of the research strategies involves utilizing one or more procedures for data gathering, such as; observation, Interviews, Focus Groups, Polls, and Secondary Sources. The culture of medium-sized businesses is the subject of this study, and as a human element, it lends itself particularly well to examination through the qualitative research approach.
Medium-Sized Enterprises face resource constraints in terms of human resources, finances, and time (Mustafa & Elliott, 2019). This inhibits their capability to take advantage of knowledge management benefits that provide them with a competitive advantage in the market (Golinska-Dawson et al., 2021). Knowledge management systems form concrete foundation for the establishment and growth of a company. Diverse systems of knowledge management exist in the market and deploying these systems depends on various factors existing in the environment (Yekkeh et al., 2021). However, the effectiveness of the deployment of these systems continues to infer mixed levels of advantages and disadvantages to the enterprises (Asada et al., 2020). The knowledge management systems have caused drastic development and growth of many current large-scale enterprises. The same could be the case if the medium-sized enterprises completely embraced the systems (Hussain et al., 2021).
The problem to be addressed in this study is why medium-sized enterprises experience lower competitive advantage when faced with the inability to utilize organizational cultural strategies that promote knowledge management. The current challenge is that Medium-sized Enterprise’s experience lower competitive advantage when faced with the inability to utilize organization culture strategies that promote knowledge and management (Li et al., 2022). The purpose of this study is to examine the impact of organizational culture norms on investment in knowledge management strategies in medium-sized enterprises. The study is important because it seeks to unravel the organizational factors that hinder the medium-sized enterprises from attaining the desired competitive advantage from implementation of knowledge management systems.
The first part of this chapter provides an overview of the various research approaches, followed by a discussion of the research design for case studies and the role they play in the investigation. The following section provides an overview of alternative research designs, discussing their advantages for application, as well as their limits and the reasons why they are not suitable for the study. The next section provides information regarding the participants, including the target demographic, the appropriateness of the target group, the sample size, the sampling technique, and the eligibility requirements. In the section on data collecting, specifics regarding the interview procedure, data gathering methods, and the type of analysis conducted are provided. The final section discusses the assumptions, limits of the study, and ethical considerations. This section also includes a summary section that wraps up with a discussion of the chapter’s most important themes.
Research Methodology and Design
Research Methodology and Design
The studies that investigate social aspects of the population employ quantitative, qualitative, and mixed research methods to achieve their objectives (Trochim & Donnelly, 2008). The choice of the research methodology depends on the research problem under investigation (Schwardt, 2007; Creswell & Tashakkori, 2007; Teddlie & Tashakkori, 2007). Kothari (2004) explains what constitutes a researchable problem, testable hypotheses, and how to frame a problem in such a way that it can be investigated using particular designs and procedures. Teddlie and Tashakkori (2007) looked at how to select and develop appropriate means of collecting data. Based on the nature of the problem, it is important that after identifying an area of interest, the researcher should identify appropriate method(s) to approach the problem (Abbott & McKinney, 2012).
A common challenge that social scientists’ face is to choose between qualitative and quantitative research methods for conducting research to meet intended objectives (Goldschmidt & Matthews, 2022). Goldschmidt and Matthews (2022) states that research designs are developed from research questions and purposes. The research questions and purpose of the study play an essential role in justifying the most appropriate method and design for the research. Cronje (2020) states that every research methodology is good, but the design chosen must focus on the appropriateness of the method used to investigate the problem. An appropriate research design and method ensures that the data collected is ideal and relevant towards answering the research questions in place (Dina, 2012; Goldschmidt & Matthews, 2022). The determination of which research method to use and why fundamentally depends on the research goal (Abbott & McKinney, 2012).
To understand the impact of knowledge management strategies on the competitive advantage of medium-sized enterprises, qualitative research methodology was chosen. A qualitative research methodology is appropriate because it will collect descriptive data which include people’s opinion, experience, and observations. Qualitative research is defined as a research method that focuses on obtaining data through open-ended and conversational communication (Goldschmidt & Matthews, 2022). This method is not only about “what” people think but also “why” they think so. In qualitative research, non-numerical information, such as text, visual, or audio, is gathered and analyzed to understand better ideas, perspectives, or encounters (Tomaszewski et al., 2020). Qualitative methods can be used to gain an in-depth understanding of an issue or find innovative investigation solutions. To gain an understanding of how people see the world, qualitative research is conducted. Even though there are many methods for doing qualitative research, its defining characteristics are typically flexibility and an emphasis on preserving rich interpretation when analyzing data.
Remove extra space
The qualitative research design is effective in investigating the identified themes that are linked to the purpose, problem, and research questions derived from same participants (Mehrad & Zangeneh, 2019). The study may take parametric form by analyzing numerical effect on the financial matters, or non-parametric form by analyzing the observations, experiences and respondents’ opinions (Goldschmidt & Matthews, 2022). In analyzing a competitive advantage of an enterprise, aspects like market niche, profitability, expansion, information storage and retrieval, knowledge flow and usage, decision-making systems, and customer satisfaction can be considered. These aspects are non-parametric, and thus should be analyzed by collecting data using qualitative tools like questionnaires and interviews (Bergman et al., 2012).
A quantitative research method postulates a research hypothesis and uses numerical data to validate or reject the hypothesis, defines trends in the variables and predicts the future events (Shank, 2006; Mearsheimer & Walt, 2013). Quantitative methods encompass parametric variables that occur after a manipulation of the environment (Maxwell, 2012). Often, a control group is used to compare the effect of the treatment on the sample population. In most cases, experimental researchers use quantitative methods (Punch, 2013; Ormston et al., 2014). For instance, causal-comparative research design is a quantitative method that aims to identify cause-effect between identified variables without manipulating the dependent variable (Umstead & Mayton, 2018). Causal-comparative research design utilizes the scientific method to manipulate the independent variable to identify the outcome of the dependent variable in a controlled environment did not apply to this study. Quantitative research was inappropriate for this study because variables identified will not be manipulated to investigate the impact using numerical parameters. The current study will not investigate the effect of knowledge management systems on the medium-sized enterprises competitive advantage or the effect of organizational culture on adoption of knowledge management systems.
Qualitative research design consists of ethnography, phenomenology, narrative inquiry, grounded theory, and case-study research methods (Abbott & McKinney, 2012). It is critical to assess the nature of these methods in order to ascertain the preference of the case study technique to be applied in this research. An ethnographic approach requires the researcher to experience the culture firsthand, either as an integral role or a spectator., to understand the IT professionals’ customs, beliefs, behaviors, and reactions to various situations. Ethnography is a qualitative research design whereby researchers are allowed to interact with observers or participants who are taking part in the study in their real-life experience (Parker & Silva, 2013). This research design explores in detail on how complex interventions operate in the community (Jayathilaka, 2021). Aspers and Corte (2019) also noted that ethnography is effective because it exposes the researcher to high scope of data and pinpoints business needs while making accurate predictions. The method aims at investigating how things happen and explain why they happen (Bergold & Thomas, 2012). In this study, the researcher is not investigating an event but factors that inhibit adoption of knowledge management system and impact it might cause if implemented. The study would thereof not use ethnography method because the study is not focused on understanding the study participants’ customs, beliefs, behaviors, and reactions to various situations. there is not a particular event or happening that is under investigation. Comment by Dr. Deanna Davis: What is IT? Please do not begin a sentence with It. Eliminate vague pronoun references. A pronoun must refer to a specific word in the sentence. “It” is a vague pronoun reference because it does not refer to a specific word in the sentence. Comment by Dr. Deanna Davis: End of sentence
The phenomenology research design focuses on the experiences of the individuals in a population to explain particular events observed in the population in general (Aydoğdu and & Yüksel, 2019). The researcher observes the events in the natural setting and explains them based on their understanding and information from literature. Phenomenological study aims at establishing a social phenomenon in the population (Aydoğdu and & Yüksel, 2019). However, this study is not investigating any social phenomenon in the medium-sized enterprises but rather the impact of the organizational culture on knowledge systems adoption and the systems role on enterprises competitive advantage. Therefore, the phenomenology qualitative research design is inappropriate in the study of this topic.for this study.
A set of systematic inductive methods for methods to perform qualitative research aimed at theory development is referred to as a grounded theory (Leung, 2015). The grounded theory approach is also an effective research strategy, and it begins with formulating a query or gathering evidence (Tomaszewski et al., 2020). This method is made of flexible strategies that enhance the inquiry process which aims at establishing theories linking the data collected with applicable theories (Ormston et al., 2014). Inductive approach does not imply disregarding theories when formulating research questions and objectives. This approach aims to generate meanings from the data set collected in order to identify patterns and relationships to build a theory (O’Kane et al., 2019). However, the inductive approach does not prevent the researcher from using existing theory to formulate the research question to be explored. The methodology is less appropriate in this study because the study plans to focus on organizational issues impacting medium-sized enterprises to help make informed decision on employing strategies that promote knowledge management to edge competitive advantage, rather than developing a new theory to explain a social phenomenon.
A case study will help explore the topic’s key characteristics, meanings, and implications. The findings of this study will also help understand the general overview of the events happening in the entire medium-sized enterprise population due to organizational culture and the adoption of knowledge management systems (Aithal, 2017). An exploratory case study research design will be used to enhance data collection reliability. The single case exploratory study was purposively chosen to increase study validity and reliability while reducing sampling errors and inconveniences (Yin, 2009). This method is appropriate for this study because it investigates the issues organizations face while enhancing their competitive advantage in the market based on appropriate knowledge management strategies and the organization’s culture that hinder its implementation.
Case studies are the most applicable research designs used in qualitative research studies (Aithal, 2017). In a case study, the researcher focuses on the study of the complex and contemporary phenomena of the sample population so that the findings can be generalized on the entire population (Trochim & Donnelly, 2008; Yin, 2009). A case study is a systematic investigation of a particular group, community or unit to generate an in-depth understanding of a complex issue in a real-life context to generalize other units (Njie & Asimiran, 2014; Leung, 2015). Case studies were one of the first types of research to be used in the field of qualitative methodology. Today, they account for a large proportion of the research presented in books and articles in psychology, history, education, and medicine (Njie & Asimiran, 2014; Rhee et al., 2015). Much of what we know today about the empirical world has been produced by case study research, and many of the most treasured classics in each discipline are case studies (Flyvbjerg, 2011). Case study will helps explore key characteristics, meanings, and implications of the topic. The findings of this study will also help understand the general overview of the events happening in the entire medium-sized enterprise population due to organization culture and the adoption of knowledge management systems (Aithal, 2017).
To enhance the reliability of data collection, an exploratory case study research design will be used. The single case exploratory study was purposively chosen to increase study validity and reliability, while reducing sampling errors and inconveniences (Yin, 2009). This method is appropriate for this study because it involved investigation of the issues that organizations face while enhancing their competitive advantage in the market based on appropriate knowledge management strategies, and the organizations culture that hinder its implementation. The appropriateness of a case study is based on its ability to provide factual evidence to persuade during the research process (Rhee et al., 2015). In addition, a case study will enhance the understanding of the variables knowledge management systems and organizational culture norms in Medium-Sized Enterprise.
The appropriateness of a case study is based on its ability to provide factual evidence to persuade during the research process (Rhee et al., 2015). In addition, a case study will enhance understanding of the variables of knowledge management systems and organizational culture norms in Medium-Sized Enterprise. The current study investigates the effect of knowledge management systems on the medium-sized enterprises competitive advantage or the effect of organizational culture on adoption of knowledge management systems. A qualitative research methodology is appropriate because it will collect descriptive data which include people’s opinion, experience and observations. To enhance the reliability of data collection, an exploratory case study research design will be used. The single case exploratory study was purposively chosen to increase study validity and reliability, while reducing sampling errors and inconveniences (Yin, 2009). This method is appropriate for this study because it involved investigation of the issues that organizations face while enhancing their competitive advantage in the market based on appropriate knowledge management strategies, and the organizations culture that hinder its implementation.
This research aims to establish the effect of organizational culture on the adoption of knowledge management systems in medium-sized enterprises and the effect of knowledge management systems on the competitive advantage of the enterprises. The study investigates the participants’ opinions from a similar non-manipulated setting regarding the organization and knowledge management systems. The research questions are developed from these objectives. Considering the nature of the study questions, purpose, and problem, a design that investigates a particular entity to help understand its operations, culture, and overall effect on adopting knowledge management systems and outcomes in competitiveness is appropriate. A case study was, therefore, most relevant in accomplishing these objectives.
Population and Sample
There are 153 medium-sized IT Companies located in the northeastern United States. The sample population for this study will be approximately 10-12 employees from three medium-sized Information Technology (IT) companies. The researcher will purposively sample 30-36 participants from the identified 153 qualified knowledge management workers in the target companies. The nature of the study required a specific sample population that would generate desired data quickly, making the type of sampling effective (Serra et al., 2018). The sampling entailed setting aside the kind of characteristics desired among relevant participants that require examination associated with the topic of the research, knowledge-based, in order to come up with only those that fully cover the range of characteristics required (Etikan, 2016; Wu et al., 2016).
Although it is not the only industry that propels the economy of the Northeastern United States, the technology sector is among the most prominent industries. The research appears to be leaning toward choosing a state or state located in the northeastern United States due to variables like human resources, government subsidies, infrastructure, established IT facilities, and web servers. The location was selected because medium-sized information technology companies place a greater emphasis on bachelor’s degrees than they do on advanced engineering degrees (Burke, 2018). This is perhaps a more suitable fit with the research topic of this study, which focuses on knowledge-management techniques. Innovations in information and communications technology are entering the market in the Northeast at an ever-quickening rate (Burke, 2018). This study provides a summary of the cause-and-effect association between attributes of the three IT businesses that were chosen for analysis. With characteristics such as knowledge indispensability, higher growth rate, shortened life of brands, high importance of human understanding, and process approach, they all serve the same purpose: to generate new concepts and innovative expertise that will foster perseverance in an industry that is highly competitive.
The first company is regarded as a middle-sized information technology company due to the fact that its headquarters, located in the Northeastern region, has between 100 and 200 people (Forbes, 2022). With an industry-leading prediction performance of even more than 99.9 percent, The second company is the only cloud-native Statistics engine that gathers and transforms data recorded by paper documentation, including handwriting, into information suitable for business use. The company, which has a capacity of 32 employees, is one of the rapidly developing IT enterprises in the Northeastern region (Forbes, 2022). The third company is a privately held business that has been operating in this sector for the past decade. Despite this, its number of employees allows it to be classified as a medium-sized IT company. The firm employs between 100 and 250 people (Forbes, 2022).
The eligibility criteria for participants will include 1) individuals employed in a medium-sized IT company in the northeastern part of the US and 2) employed for 5 years or more in the company. Social media is a very effective recruitment tool, particularly in IT market research when a particular audience must be targeted. Utilizing a pre-recruited market analysis panel to locate high-quality respondents for this project is yet another guaranteed approach to this case study (Tracy et al., 2021). In addition, motivating participants to recommend a friend is an unexpectedly effective method for recruiting high-quality volunteers and gaining attention in this research. Permission to conduct the studies will be sought from the human resource personnel of each of the three companies. An email will be sent to the three companies, requesting their approval to use recruit some for their employeesemployees from their companies for this research. The email may be sent through the University’s admin to arouse thewill be sent to companies’ to obtain their interests and approval for their members to participate in this research. The participants will be provided with IRB approved consent form, which will clearly state the objective of the study, ethical issues addressed, outcomes, and terms of participation. Follow-up calls must be made to authenticate them and ensure that they are a better match this research project. The participants will not be presented with any incentives to participate in the study. Comment by Dr. Deanna Davis: Please delete this sentence because IRB will have an issue with this. Comment by Dr. Deanna Davis: You will need to include snowball sampling. When participants recommend others it is called snowball sampling. Please research this and include this information here.
Please do the following checklist items:
1. Explain the type of sampling used (Purposive Sampling AND Snowball Sampling) and why it is appropriate for the dissertation proposal methodology and design. Discuss how you will use Purposive Sampling and Snowball Sampling.
2. For qualitative studies, evidence must be presented that saturation will be (proposal) or was (manuscript) reached. You need to discuss how you will reach data saturation. Please research saturation, describe and discuss.
Instrumentation
Qualtrics will be used as the software application for designing, distributing, and analyzing the questionnaire. For status, Qualtrics software will help design the questionnaires. Qualtrics tools offers a comprehensive solution to questionnaire formulation, which has sample validated questions with guiding instructions that help make effective interview questions. The questionnaire will maintain the standard format, employing free-form text lines, boxes, and checkboxes for participants’ responses to structured questions. This questionnaire design will help the researcher identify consensus in response and thus use it to draw a definite theme. Comment by Dr. Deanna Davis: You will develop the interview questions and Qualtrics will design the interview questionnaire.
The interview questions were divided into three sections covering general, body and conclusion (Oprit-Maftei, 2019). Under general section, the interview will asked questions in relation to the organizational structure and trajectory over the last two years. This section helped the researcher understandwill decipher how different companies have been fairing and their Wstructural set up (Roberts, 2020). The body section covered questions in relation to the existing knowledge management strategies and how the enterprise has incorporated the same to its operations. This section was critical in establishing which knowledge management strategies different companies have put in place. The conclusion section asked questions to establish the participants own view of the strategy employed by the respective enterprise (Gilbert et al., 2018). In this, the researcher obtained information on the individual’s perspective on the different knowledge management strategies used by the companies they work for. This design of the questionnaire was ideal for capturing all the necessary information that would help the researcher to identify the perspectives of the employees working in the knowledge management departments, the management and other systems operating the medium-sized enterprises, on the knowledge management systems. Again, the questionnaire will be able to dig into the factors that enhance or inhibit management from embracing and implementing competitive knowledge management systems in their enterprises for edging competitive advantage in the market. Comment by Dr. Deanna Davis: You can divide the interview questions with demographic questions and general questions that answer the research questions. Comment by Dr. Deanna Davis: You have to use future tense. Comment by Dr. Deanna Davis: What do you mean by this? I just reviewed your interview questions and they do not discuss organizational structure and trajectory.
The interview questionnaires were structured to capture both researcher fixed points and participants opinions from observation and experiences (Walker, 2019). The introductory section of the questionnaires consisted of semi-structured questions, which sought participants’ position in the topic under investigation. The body section was made up of a mixture of closed andinterview questions will be open-ended questions. The researcher offered direction of answering questions and also gave the participants open air to give their opinions and experiences on the basic information on the strategy being used by the enterprise and the participants understanding of the same. Comment by Dr. Deanna Davis: You have not conducted the interviews yet so you cannot offer direction. You have to speak in Future tense because you have not conducted the study yet. Please ensure that you are using future tense.
In the introductory session of the interview, the researcher described will describe the purpose of the research and informed the participants’ voluntary participation. The ethical issues were will be addressed and benefits and expectations of the researcher study explained. The demographic data, information on the basic information on the strategy being used by the enterprise and the participants understanding of the same were will be asked. In the proceeding sessions of the interview, key components of the study were will be asked and participants wereto provide participants with given sufficient time to respond to each question conclusively. An interview protocol will be used to structure the way to conduct the research interviews (Ahmad, 2020). This will help the researcher to know what to ask about and in what order and it ensures a candidate experience that is the same for all applicants. The guide offeredwill offer direction on seven elements thus ensuring conclusiveness of the data collection process (Ahmad, 2020). These elements consistedinterview elements will consist of the invitation and briefing, setting the stage, welcoming participants, questions, candidates’ questions, wrap-up and scoring.
The checklistchecklist was largely used in preparation for collection of data on the respondents’ demographic data, enterprises location, and size of the enterprise. Again, aspects of the categories of the resource constraints were captured in the checklist. Sufficient note booksjournaling will be provided and will be labeled and dated well to ensure each correspondent’s information was distinct and separate from the others. In the note booksjournals, specific areas were set to collect data on the resourcesresource’s challenges that Medium-Sized Enterprises encountered were noted in the categories identified as finance, time, human resource, and infrastructure. An area to capture the nature of the challenge caused was extracted from the participants’ opinions. The nature of the hindrances that constraints caused were also to be recorded afterwards as the respondents’ identified them.
The information on the enterprises organizational structure and norms that hinder adoption and implementation of the knowledge management systems in the Medium-Sized Enterprises are to be recorded in the tables drawn in the note books after their extraction. The format of the tables will be aligned with the number of the respondents that posed them. This is significant in detailing the percentages of the respondents that view of the various organizational culture and norms as key factors that influence the adoption and implementation of the knowledge management systems in medium-sized enterprises. These instruments will be well formulated to ensure that all the key areas of the study will be addressed and sufficient information was provided to necessitate the accomplishment of the research objective. The format will be aligned with the chosen tool and thus appropriate for efficient and effective collection of data during the interview, to help researcher collect reliable, relevant and sufficient information for the making of the conclusive conclusion.
You have not discussed triangulation. Please discuss how you will triangulate the data. Please view https://www.youtube.com/watch?v=PfbsJyHRjrs
You will use triangulation with interviews, journaling and document analysis about the companies. Please discuss the following:
1. Triangulation and how you will use it.
2. Journaling instead of notetaking
3. Member Checking
and Analysis
The researcher will conduct a preliminary data collection using questionnaires structured for the purpose of establishing the data preliminary categorizations that would effectively collect sufficient data. During the interviews, 1030-36 respondents will be approached randomly and given 20-30 minutes to respond to the questions as structured. Interviewing in social research is very important in promoting adequate preparations. During this practice, the researcher will identify the resources required in the actual research study and adjustments that are necessary in the planning process. For instance, the preliminary study will provide the researcher with views that enable the categorization of the constraints to adoption and implementation of the knowledge management systems in medium-sized enterprises in America. Moreover, the responses will make it clear that the questionnaire has been formulated well. However, slight modification on the language and style of asking questions is inevitable and should also be considered. This is important in assisting the respondents to clearly understand the question and deliver response that is both informed and deliberate. Making sections in the questionnaire is also adopted from the questionnaire, while the time required administering and collecting the responses is adjusted alongside the entire time that each respondent will be allocated to complete the interview. Comment by Dr. Deanna Davis: You stated 30-36 respondents in your sample size
Study Procedure
After obtaining IRB approval for research, a list of the knowledge management personnel from the human resource department in the target companies will be consulted to recruit participants. All IT employees in the selected companies will be emailed the consent form and request for participation. Those who will reply in acknowledgement for the request, and state availability to take part in the study will be listed and contacted for a face-to-face appointment on a scheduled date. On the appointment day, the researcher will check each respondent’s details to validated inclusion in the study.
Data Collection
An appropriate data collection method is important in enhancing inclusivity of data collected, in-depth data collection, speed and reducing the cost of collecting data (Mellinger & Hanson, 2016; Oprit-Maftei, 2019; Ahmad, 2020). In this study, an in-depth interview will be employed through diversified lenses for the purposes of revealing multiple facets of the study topic. The use of in-depth interviews is most appropriate when the researcher is interested in obtaining concrete, contextual in-depth awareness about a specific real-world subject (Crowe et al., 2011; Gilbert et al., 2018). In this study, participants will be drawn from many known medium-sized enterprises, increasing the reliability and validity of the findings to generalize the conclusion on the entire population. Diverse samples will help increase diversity of the findings and reduce the bias of information, while collecting sufficient views that can be used to draw statistical conclusion (Rhee et al., 2015).
After all the necessary adjustment is completed, the data collection process will commence. Data collection will be done using checklists, tables, and note books. Observation and voice recording is also part of the data collection procedures. Checklists will be used to validate the participants’ criteria for inclusion in the study. Again, the list will help the researcher validate and meet the sample size required. The checklist will be largely used in preparation to the collection of data on the respondents’ demographic data, enterprises location, and size of the enterprise. Again, aspects of the categories of the resource constraints will be captured in the checklist. Sufficient note booksjournaling will also be provided. and theThe note booksjournaling will record specific information on the resource challenges that Medium-Sized Enterprises encountered will be noted in the categories identified as finance, time, human resource, and infrastructure. The nature of the challenge caused will be extracted from the participants’ opinions and recorded. The nature of the hindrances that identified constraints caused will also be recorded afterwards. Again, the information on the enterprises organizational structure and norms that hinder adoption and implementation of the knowledge management systems in the Medium-Sized Enterprises will be recorded in the tables drawn in the note books.
The collected data will be checked for completeness, cleaned, processed coded and captured into Microsoft Excel and NVivo software for analysis. Descriptive analysis will produce tables, frequencies, weighted mean, and percentages. These will be used to describe the basic features of the data in a study. They will provide simple summaries about the sample and the measures obtained from the participants.
A descriptive data analysis will be conducted in NVivovivo software and SPSS softwares which are critical in detecting dependent varablesthe phenomenon. The analysis will describe various themes obtained from the data set, and percentages of the participants that populated a certain theme. This information will be used to understand the themes and associated frequencies of the issues that affect knowledge management strategies in Medium-sized enterprises. Through the software, the respondents will provide key information about knowledge Management systems, which will be preliminary, classified as (i) Codification and efficiency, (ii) Efficiency and personalization, (iii) Innovation and codification, and (iv) Innovation and personalization. In the Nvivo NVivo analysis tool, these themes will be coded and input on its worksheet to showcase the relationship between the independent and dependent variables. The rest of the data regarding the limitations, demographic information and cause of the organizational reluctance in the adoption and implementation of the knowledge management systems will be coded on the questionnaires and imported in the tool. Comment by Dr. Deanna Davis: You cannot use SPSS software for a qualitative study. Comment by Dr. Deanna Davis: NVivo – correct spelling
During the analysis in Nvivo NVivo statistical tool, independent variables associated with the role of organizational culture and norms on the adoption and implementation of the knowledge management systems in the medium-sized enterprises will be done. The frequency tables, means, standard deviations and percentages will be derived. These results will be exported to the word documents for discussion. In the SPSS analysis tool, demographic data will be analyzed to yield table of frequencies with standard deviations and correlations between resource constraints and adoption of the knowledge management systems in the medium-sized enterprises. The themes derived from the data set will be evaluated for coherence using internal validity table. The significance of the variation in the number of the respondents’ listing financial constraints, time constraints, human resource constraints, and infrastructure constraints will be considered dependent variables and will be tested using the t-tests tool at p-value of 0.5% (Huang, 2019). This is likely to generate an ANOVA table that will be used to inference the significances of the various constraints identified in hindering the adoption and implementation of the knowledge management systems. Moreover, the significance of the knowledge management systems on the enhancing of the competitive advantage of the medium-sized enterprises will be tested with t-tests tool. The results will be presented in tables for precise and efficient display. Charts will also be developed to show the competitive advantage enhanced by the implementation of knowledge management systems to some extent. Comment by Dr. Deanna Davis: Please research the difference between quantitative and qualitative data analysis so that you can understand them. You are using quantitative terms.
Please research, describe and discuss how you will use:
1. Coding
2. Thematic Analysis
Assumption
Assumptions are concepts that researchers and peers who read the dissertation or thesis accept as true or at least reasonable (Hu & Plonsky, 2021). To understand the study’s findings, it is crucial to know what assumptions and restrictions will be used. The decisions researchers make in relation to the research methods have a direct impact on the conclusion and recommendation made at the end of the research. By adopting qualitative research, reality is structured and understood in a particular way. Therefore, some sought of assumptions must be made to achieve the research overall objectives. This research will be based on the case qualitative research method. Always, qualitative research methods have grey areas that must be observed and concluded before the research begins. Unless this is done, the study is likely to ignore important indicators of the sources of outliers, which might be difficult to identify later.
The first assumption the study is likely to make is the truthfulness and honest of the members. It is assumed that all the respondents will remain truthful and honest in the giving of the information. The data to be collected in this study heavily relies on the premise of honesty and truthfulness on participants. Obtaining key information from certain business organizations is not easy. Some employees are under oath not to give any information regarding the organization culture, structure, finances and other aspects without consent from the management. The study therefore may assume that the identified participants will give true and fair views of their business performance, challenges and future plans regarding the adoption and implementation of knowledge management systems. On the same note, the researcher may assume that such views cannot be quantified or analyzed apart from just a general observation by the researcher and see if the feedback reflects the physical outlook of the enterprises. In some scenario, comments made by the participants might be so diverse such that the researcher only asks for further verbal clarification, a scenario that might not be quantifiable. Comment by Dr. Deanna Davis: Please do not begin a sentence with It. Eliminate vague pronoun references. A pronoun must refer to a specific word in the sentence. “It” is a vague pronoun reference because it does not refer to a specific word in the sentence.
The second assumption is that the collection of the data will observe the normal distribution of the respondents and collect data from diverse employees in the medium-sized enterprises. This assumption is critical while developing the data collection tool and sampling of the respondents so that data collected reflect the divers views in the entire population represented. The third assumption is that the respondents will be mature, emotionally stable and partial about the working environment. The respondents therefore will not give their views retaliation and with intent to paint a bad image on the enterprises that have not implemented knowledge management systems. This assumption is very important especially when the objective of the study is to design a strategy that will help mitigate the negative impacts of not implementing the knowledge management systems. In such as instance, sober and informed observations and conclusions must be made.
Otherwise, the strategies that will be developed will be misleading and thus result in further detrimental effects of the medium-sized enterprises that implement knowledge management systems based on misleading information. Such enterprises might lose huge amount of money, experience high employee turnover, loose customers and loose confidential information fraudulently. The observations made therefore will consider the assumptions made and draw conclusive inferences. When applying the assumption techniques, the content assumed is reasoned to be cross cutting to most of the people that would come across the documentation. For instance, when conducting a qualitative research, an assumption can be that people will assume someone is a nerd if he were glasses but the reality could be different and the person may turn out to be an average person and not as witty as it is depicted by the way of wearing glasses.
Limitations
Social research studies face many limitations for complete implementations. Some significant implications will be drawn from this research. Therefore, the limitations of the study might hinder complete attainment of the research objective. Future research may look at the limitations of the study and develop a research problem. The first limitation might be time factor which should be handled before beginning the entire research process. In a population of 153, only 26 respondents might be a small population to collect representative data from the entire medium-sized enterprise population. Although, control factors such as firm size, industry type, process type and technology type will not be considered in this study’s initial restriction, such information might be very important if included in the study to enrich the findings. A second possible limitation is the complete investigation of the influence of knowledge management and product management on organizational performance as measured at the individual level. This information might not be easily obtained from the employees. However, organizations encourage their workers to collaborate in such research practices.
Discuss the measures taken to mitigate these limitations.
Delimitations
The study’s findings may have significant ramifications for the suggested paradigm. The link between knowledge management and product management may not be considered in the study’s suggested model. Product managers maybe assumed not to have access to a wealth of knowledge to be successful. Organizations operating in circumstances that demand rapid innovation will benefit greatly from product management efforts that include knowledge management (Hassan & Raziq, 2019). The product management operations are centered upon using, creating, and managing knowledge. Researchers should use various data gathering methods and provide specifics on the kind of questionnaires and interview questions they intend to utilize to ensure that any ambiguity is removed. However, the study’s delimitations is a major concern, the number of organizations participating in the study would appreciate information that is quantified to help them in making informed financial choices upon data analysis of their financial trends and future projections.
Ethical Assurances
The study will be prepared and sought approval from the IRB. The panel will approve consent letter, which details the research objectives, participants’ voluntary participation, benefits and incentives, expectations of the research and ultimately those of the data collected. This approval is important in ensuring confidentiality and privacy protection of data. The explanation of the study goal will enable the participants gauge the nature of the study and make informed decision to take part or not. The participation will be voluntary, signifying that the respondent is free to end the participation at whatever stage of the study without notice. Entirely, the terms in the consent letter will be used to ensure protection of the study and participants ethical dignity. While the participants will be adults, the researcher will prepare the following materials for the IRB application; (i) CITI certificate, (ii) eligibility criteria, (iii) recruitment materials, (iv) consent letter, (v) readability report and (vi) data collection instruments.
To participate in the interviews, each respondent will first fill the consent letter accepting to take part in the research. This also implies that the participant will understand the objectives of the study, expectations, benefits, and security of privacy and confidentiality of the information involved. There will be no physical or psychological damage inflicted on the respondents or the research assistants during the study. All data will be stored in a secured file to safeguard the privacy and confidentiality of the information for three years. Each participant will be recorded in a pseudonym to enhance privacy and data protection. While the data is stored in folders, such folders will be protected in safe cages and storage devices in addition to each of the files in the folder having a unique password. The data will therefore be used in an ethical manner and any inference will not be used for personal benefits. The respondents will not be given any incentives whatsoever, but will be given open guarantee to access the reports of the study at will.
Summary
This study looks at organizational culture and norms that promote investment in knowledge management strategies in Medium-Sized Enterprises. This research aims at developing informed inferences about the possible systematical management of the knowledge assets in the Medium-Sized Enterprise to meet strategic and tactical requirements and create value for the organization. Researchers will look at how procedurally the enterprises managed the knowledge in the organizations, and ease of generation of the new ideas, concepts and informed ideas ns to help SMEs transform into successful Multinational business enterprises.
The study will employ a qualitative research, which conducts interviews to collect data from the participants working in medium-sized enterprises in America. The researcher will use purposive sampling technique to attain a sample size of 26 30-36 participants with credible information and experience in the medium-sized enterprises in America to accomplish the objectives of the study. Individual in-person and video-conferencing interviews are the main research instruments since they will give the researcher more valid and reliable information on the topic under study. Ethical issues and conflict of interest will not be observed. Data will be analyzed using Microsoft Excel and NVivo software. Both tools will allow the research to deduce different descriptive statistics which will be critical in making the study conclusion and recommendations as per the results. Thematic data analysis will also be effectively used. This method will be appropriate for describing and understanding data set in relation to knowledge management strategies in the enterprises.
The study is likely to identify that infrastructure, finances, time and human resources are greater predictors of the medium-sized enterprises adoption and implementation of knowledge management systems. It is likely to be observed that medium-sized enterprises that have sound organizational culture and norms to improve knowledge management in the organizations increase its completive advantage in the market, grew and expanded rapidly. This may however be influenced by the changes in management styles posed by new persons in management position.
Chapter 4: Findings
This chapter outlines analysis of research data, research findings and finding discussions. The findings were evaluated according to research objectives and methodology to ensure that research questions are answered. The findings contain results related to demographic characteristics, descriptive analysis and inferential statistics. The study was carried out in the three universities based on the defined criteria in the methodology where lecturers, students and e-learning administrators were requested to provide their views and perception regarding knowledge management of Medium-Sized Enterprise on e-learning platforms.
Reliability of the Data
The Knowledge management (KM) strategies were evaluated and categorized by six criteria: KM objectives, processes, problems, content, strategy, and type of knowledge. The purpose was to find similarities among the sample units. Size, industry, and background information of the company, globalization (national, international), knowledge intensity of the industry, products, business processes, importance of innovation, and main audience of the KM initiative (business unit or whole organization) were also taken into account. Thus, the success of the knowledge management strategies was assessed using two criteria referring to organizational impact:
i. Was the identified problem resolved by the KM initiative (i.e. usefulness of knowledge management strategies)?
ii. Can the companies report monetary or non-monetary success stories (i.e. business performance)?
Results
The cases show that knowledge management (KM) strategies do not necessarily apply to the whole organization. Almost half of the cases supported business units or departments within an organization. Thus, we considered the business strategy of the company if the KM strategies apply to the whole company and we considered the business strategy of the unit if the KM initiative applies to a business unit. For example, we examined the KM strategies in the audit department of company D. The success of the department is based on the quality and the number of audit reports created by the department. The department delivers the reports directly to the executive board. Thus, its business strategy is to deliver fast and reliable reports to the executives and the goal is to make the audit process as efficient as possible.
The KM strategies can be categorized into four combinations of business strategy and KM strategy:
i. Codification and efficiency
ii. Efficiency and personalization.
iii. Innovation and codification.
iv. Innovation and personalization.
How does organizational culture affect knowledge management within the Medium-Sized Enterprise?
Conversely, companies who use knowledge management in order to improve the efficiency of operational processes use databases and information systems to disseminate ‘‘best practices’’ independently from the ‘‘human knowledge carrier’’.
Research Question 2
How does investment in knowledge management improve the competitive advantage for the Medium-Sized Enterprise?
The efficiency strategy relies primarily on the re-use of existing knowledge. It is not necessary to bring people together to share their knowledge directly and combine that knowledge by dialogue in order to create new knowledge.
Evaluation of the Findings
The analysis supported the relationship between business strategy and primary KM strategy. It also showed that some companies deploy both approaches – codification and personalization – within the same KM initiative. This supports propositions that codification and personalization are not two extremes but rather dimensions that can be combined. For example, some KM initiatives with the objective to improve process efficiency mainly relied on the codification strategy and also used instruments like discussions forums or newsgroups to give their employees the opportunity to exchange knowledge and best practices directly.
The case studies did not clearly indicate a higher level of success for the companies that used both approaches. But it can be assumed that a sole reliance on one strategy may be too one-sided, e.g. a sole concentration on codification and reuse of knowledge may not be enough to face the dynamic and turbulence of the market. On the other side, bringing people together does not necessarily lead to innovation if the knowledge is not exploited. We argued that the fit between efficiency and codification on the one side and innovation and personalization on the other side enhances the level of success of a KM initiative. However, it is not clear whether the combination of efficiency and personalization or innovation and codification necessarily lead to less performance of the organization in the long run.
Summary
The findings strongly suggest a relationship between the success of KM in terms of improving business performance of the organization or business unit respectively and the alignment of KM strategy and business strategy. The findings show a matching fit between KM strategy and business strategy. An organization whose business strategy requires efficiency of processes should rely primarily on a codification strategy. An organization whose business strategy requires product or process innovation should rely primarily on a personalization strategy. In addition, the KM initiative should support the objective of the business strategy. For the audit department of Company D, it was important to improve the quality and number of audits. It would have been less important for example to improve the process efficiency for booking flights for the auditors. The KM initiative did support the strategy that added the most value to the department. These findings can also be explained by organizational information processing theory that explains the need for processing information in order to reduce uncertainty and equivocality. Uncertainty deals with the problem of absence of information whereas equivocality means ambiguity and the existence of multiple and conflicting interpretations. Organizations that focus on innovations face high equivocality and need communication channels with high media richness such as face-to-face. Organizations with a focus on efficiency may face less equivocality and codification of knowledge is thus adequate for them.
Benefits
A. Why did your organization join this endeavor at first?
B. Why do you actually participate?
C. Where would you wish the company to be in terms of design and successes in 5 years?
Barriers encountered
D. Has collaborating with governments and civil society brought any unique difficulties?
E. Does your involvement with larger companies consume a considerable portion of your time?
F. You had difficulty establishing the firm’s agenda and objectives. Do you believe the group shares a vision?
Membership with local community
G. Do the majority of your customers favor your complicity?
H. Does opposition counter your participation in a large-scale endeavor?
I. Do you worry about losing contact with your regional support network?
J. Do you worry about straying from your initial mission?
References
Abbott, ML. & McKinney, J. (2012). Understanding and Applying Research Design. Hoboken, NJ: John Wiley & Sons.
Ahmad, F. (2020). Using video-conferencing for observations and interviews: Gathering data from ‘home’ while studying in New Zealand. Waikato Journal Of Education, 25, 109-117.
https://doi.org/10.15663/wje.v25i0.758
Aithal, P. S. (2017). ABCD Analysis as Research Methodology in Company Case Studies. International Journal of Management, Technology, and Social Sciences (IJMTS), 2(2), 40-54.
Asada, A., Basheerb, M. F., Irfanc, M., Jiangd, J., & Tahir, R. (2020). Open-Innovation and knowledge management in Small and Medium-Sized Enterprises (SMEs): The role of external knowledge and internal innovation. Revista Argentina de Clínica Psicológica, 29(4), 80-90.
Aspers, P., & Corte, U. (2019). What is Qualitative in Qualitative Research. Qual Sociol 42, 139–160. Retrieved from
https://doi.org/10.1007/s11133-019-9413-7
Aydoğdu, B., & Yüksel, M. (2019). Psychological Problems and Needs of Deaf Adolescents: A Phenomenological Research. Journal Of Qualitative Research In Education, 7(3), 1-18. https://doi.org/10.14689/issn.2148-624.1.7c.3s.7m
Bergman E., et al (2012). A guide to research paradigms relevant to medical education. Academic Medication Journal.
Bergold, J., & Thomas, S. (2012). Participatory research methods: A methodological approach in motion. Historical Social Research 191-222.
Blaikie, N. (2018). Confounding issues related to determining sample size in qualitative research. International Journal of Social Research Methodology, 21(5), 635-641.
Burke, M. J. (2018). Energy democracy and the co-production of social and technological systems in northeastern North America 1. In Energy, Resource Extraction and Society (pp. 88-104). Routledge.
Chen, D. N., Liang, T. P., & Lin, B. (2010). An ecological model for organizational knowledge management. Journal of Computer Information Systems, 50(3), 11-22.
Cronje, J. (2020). Designing Questions for Research Design and Design Research in eLearning. Electronic Journal Of E-Learning, 18(1). https://doi.org/10.34190/ejel.20.18.1.002
Crowe, S., Cresswell, K., Robertson, A. et al. The case study approach. BMC Med Res Methodol 11, 100 (2011).
https://doi.org/10.1186/1471-2288-11-100
Davenport, H. T., DeLong, D. W., & Beers, M. (2008). Successful knowledge management projects. Sloan Management Review, 39(2), 43-57.
Desouza, K. C., & Vanapalli, G. K. (2015). Securing knowledge in organizations. In K. C. Desouza (Ed.), New frontiers of knowledge management (pp. 76-98). NY: Palgrave/Macmillan.
Dierkes, M. (2011). Visions, technology, and organizational knowledge: An analysis of the interplay between enabling factors and triggers of knowledge generation. In J. de la Mothe & D. Foray (Eds.), Knowledge management in the innovation process (pp. 9-42). Boston, MA: Kluwer Academic Publishers.
Etikan, I. (2016). Comparison of Convenience Sampling and Purposive Sampling. American Journal Of Theoretical And Applied Statistics, 5(1), 1.
https://doi.org/10.11648/j.ajtas.20160501.11
Forbes 2022. Best Mid-size Employers.
https://www.forbes.com/lists/best-midsize-employers/?sh=6a764677210f
Gilbert, B., Meister, A., & Durham, C. (2018). RETRACTED: Escaping the Traditional Interview Approach: A Pilot Study of an Alternative Interview Process. Hospital Pharmacy, 54(1), NP2-NP4. https://doi.org/10.1177/0018578718758970
Gold, A. H., Malhotra, A., & Sedars, A. H. (2011). Knowledge management: An organizational capabilities perspective. Journal of Management Information Systems, 18(1), 185-214.
Goldschmidt, G., & Matthews, B. (2022). Formulating design research questions: A framework. Design Studies, 78, 101062. https://doi.org/10.1016/j.destud.2021.101062
Golinska-Dawson, P., Werner-Lewandowska, K., & Kosacka-Olejnik, M. (2021). Responsible Resource Management in Remanufacturing—Framework for Qualitative Assessment in Small and Medium-Sized Enterprises. Resources, 10(2), 19. https://doi.org/10.3390/resources10020019
Grimsdottir, E., & Edvardsson, I. R. (2018). Knowledge management, knowledge creation, and open innovation in Icelandic SMEs. Sage Open, 8(4), 2158244018807320.
Hamel, G. (1991). Competition for competence and inter-partner learning within international strategic alliances. Strategic Management Journal, 12(4), 83-103.
Hammarberg, K., Kirkman, M., & de Lacey, S. (2016). Qualitative research methods: when to use them and how to judge them. Human reproduction, 31(3), 498-501.
Hassan, N., & Raziq, A. (2019). Effects of knowledge management practices on innovation in SMEs. Management Science Letters, 9(7), 997-1008.
Huang, L. F. (2019, July). Using App Inventor to provide the one-way ANOVA table with blocks. In Proceedings of International Academic Conferences (No. 8710559). International Institute of Social and Economic Sciences.
Hussain, I., Mujtaba, G., Shaheen, I., Akram, S., & Arshad, A. (2020). An empirical investigation of knowledge management, organizational innovation, organizational learning, and organizational culture: Examining a moderated mediation model of social media technologies. Journal of Public Affairs, e2575.
Hu, Y., & Plonsky, L. (2021). Statistical assumptions in L2 research: A systematic review. Second Language Research, 37(1), 171-184.
Jayathilaka, A. (2021). Ethnography and Organizational Ethnography: Research Methodology. Open Journal Of Business And Management, 09(01), 91-102. https://doi.org/10.4236/ojbm.2021.91005
Jones, A., & Shideh, R. (2020). The Significance of Knowledge Management in the Knowledge Economy of the 21st Century. Significance, 13(3). Page numbers?
Kermally, S. (2002). Effective knowledge management: A best practice blueprint. New York, NY: Jon Wiley & Sons.
Kothari, C. R. (2004). Research methodology: Methods and techniques. New Age International.
Lam, W. (2005). Successful knowledge management requires a knowledge culture: A case study. Knowledge Management Research and Practice, 3(4), 206-217.
Leung, L. (2015). Validity, reliability, and generalizability in qualitative research. Journal of family medicine and primary care, 4(3), 324.The article was only one page?
Li, H., Chai, J., Qian, Z., & Chen, H. (2022). Cooperation strategies when leading firms compete with small and medium-sized enterprises in a potentially competitive market. Journal Of Management Science And Engineering.
https://doi.org/10.1016/j.jmse.2022.02.003
Lichtman, M. (2013). Qualitative research for the social sciences. SAGE publications.
Martins, V. W. B., Rampasso, I. S., Anholon, R., Quelhas, O. L. G., & Leal Filho, W. (2019). Knowledge management in the context of sustainability: Literature review and opportunities for future research. Journal of cleaner production, 229, 489-500.
Maxwell, J. A. (2012). Qualitative research design: An interactive approach. Sage publications.
Mazorodze, A. H., & Buckley, S. (2019). Knowledge management in knowledge-intensive organizations: Understanding its benefits, processes, infrastructure and barriers. South African Journal of Information Management, 21(1), 1-6.
Mearsheimer, J. J., & Walt, S. M. (2013). Leaving theory behind: Why simplistic hypothesis testing is bad for International Relations. European Journal of International Relations, 19(3), 427-457.
Mehrad, A., & Zangeneh, M. H. T. (2019). Comparison between qualitative and quantitative research approaches: Social sciences. International Journal For Research In Educational Studies, Iran.
Mohajan, H. K. (2018). Qualitative research methodology in social sciences and related subjects. Journal of Economic Development, Environment and People, 7(1), 23-48.
Mustafa, M., & Elliott, C. (2019). The curious case of human resource development in family‐small‐to‐medium sized enterprises. Human Resource Development Quarterly, 30(3), 281-290. https://doi.org/10.1002/hrdq.21370
Njie, B., & Asimiran, S. (2014). Case study as a choice in qualitative methodology. Journal of Research & Method in Education, 4(3), 35-40.
Nonaka, I., & Takeuchi, H. (2015). The knowledge creating company. New York: Oxford University Press.
Oprit-Maftei, C. (2019). Developing Interview Skills in English: How to Handle Interview Questions. International Multidisciplinary Scientific Conference On The Dialogue Between Sciences &Amp; Arts, Religion &Amp; Education, 3(1), 279-2284. https://doi.org/10.26520/mcdsare.2019.3.279-284
Ormston, R., Spencer, L., Barnard, M., & Snape, D. (2014). The foundations of qualitative research. Qualitative research practice: A guide for social science students and researchers, 2(7), 52-55.
Parker Webster, J., & Marques da Silva, S. (2013). Doing educational ethnography in an online world: methodological challenges, choices and innovations. Ethnography and Education, 8(2), 123-130.
Przysucha, Ł. (2017, August). Knowledge management in corporations–synergy between people and technology. Barriers and benefits of implementation. In IFIP International Workshop on Artificial Intelligence for Knowledge Management (pp. 1-11). Springer, Cham.
Raudeliūnienė, J., Davidavičienė, V., & Jakubavičius, A. (2018). Knowledge management process model. Entrepreneurship and Sustainability Issues, 5 (3), 542-554.
Ravi. (2022). Phenomenological Research: Methods And Examples. Harappa. Retrieved 28 May 2022, from
https://harappa.education/harappa-diaries/phenomenological-research/
.
Roberts, R. (2020). Qualitative Interview Questions: Guidance for Novice Researchers. The Qualitative Report. https://doi.org/10.46743/2160-3715/2020.4640
Schröpfer, V. L. M., Tah, J., & Kurul, E. (2017). Mapping the knowledge flow in sustainable construction project teams using social network analysis. Engineering, Construction and Architectural Management.
Seagren, A. T., Creswell, J. W., & Wheeler, D. W. (2013). The department chair: New roles, responsibilities, and challenges (Higher Education Report No. 1). Washington, DC: ASHE-ERIC.
Serra, M., Psarra, S., & O’Brien, J. (2018). Social and Physical Characterization of Urban Contexts: Techniques and Methods for Quantification, Classification and Purposive Sampling. Urban Planning, 3(1), 58-74. https://doi.org/10.17645/up.v3i1.1269
Shank, G. D. (2006). Qualitative research: A personal skills approach (2nd ed.). Upper
Saddle River, NJ: Pearson Prentice Hall.
Skyrme, D. J. (2009). Knowledge networking: Creating the collaborative enterprise. Woburn, MA: PlantATree.
Stenfors, T., Kajamaa, A., & Bennett, D. (2020). How to… assess the quality of qualitative research. The clinical teacher, 17(6), 596-599.
Syed, M., & McLean, K. (2022). Disentangling paradigm and method can help bring qualitative research to post-positivist psychology and address the generalizability crisis. Behavioral and Brain Sciences, 45. https://doi.org/10.1017/s0140525x21000431
The Tech Tribune. (2022). Best Tech Startups: 2018 Best Tech Startups in Oakland. Thetechtribune.com. retrieved from
Tomaszewski, L. E., Zarestky, J., & Gonzalez, E. (2020). Planning qualitative research: Design and decision making for new researchers. International Journal of Qualitative Methods, 19, 1609406920967174.
Tracy, E. M., Billingsley, J., Pollack, J. M., Barber Iii, D., Beorchia, A., Carr, J. C., … & Sheats, L. (2021). A behavioral insights approach to recruiting entrepreneurs for an academic study during the COVID-19 pandemic. Journal of Business Venturing Insights, 16, e00287.
Trice, H. M., & Beyer, J. M. (2013). The cultures of work organizations. Englewood Cliffs, NJ: Prentice Hall.
Trochim, W. M. K., & Donnelly, J. P. (2008). The research methods knowledge base (3rd
ed.). Mason, OH: Cengage Learning.
Umstead, L. K., & Mayton, H. (2018). Using correlational and causal-comparative research designs in Practice: Exploring relations among client variables. In Making Research Relevant (pp. 95-108). Routledge.
Walker, S. (2011). The interview process and beyond. The Bottom Line, 24(1), 41-45. https://doi.org/10.1108/08880451111142042
Wang, S., & Wang, H. (2020). Big data for small and medium-sized enterprises (SME): a knowledge management model. Journal of Knowledge Management.
Wei, Y., & Miraglia, S. (2017). Organizational culture and knowledge transfer in project-based organizations: Theoretical insights from a Chinese construction firm. International Journal of Project Management, 35(4), 571-585.
Wu, C., Shu, M., & Liu, S. (2016). A Situationally Sample-Size-Adjusted Sampling Scheme Based on Process Yield Verification. Quality and Reliability Engineering International, 33(1), 57-69. https://doi.org/10.1002/qre.1990
Yekkeh, H., Jafari, S. M., Mahmoudi, S. M., & ShamiZanjani, M. (2021). Designing the adaptive fuzzy-neural inference system to measure the benefits of knowledge management in the organization. Iranian Journal of Information processing and Management, 37(1), 288-303.
Yin, R. K. (2009). Case study research design and methods (4th ed.). Thousand Oaks,
CA: Sage.
Appendix A
Interview Questions
Your questions need to begin with How, What or Describe. Please see the attachments on how to align your interview questions with your Research Questions. You cannot use any of these questions because they do not relate to your research study. Please develop appropriate interview questions.
A.
Barriers encountered
D
Membership with local community
G.
Appendix
Interview Protocol Script
Before interview:
Hello, _____________. Thank you for your participation in this ________________ study examining ___________________________________________. I just want to take a few minutes to review the informed consent document before we begin.
Share the informed consent document on the screen using Zoom or In-Person. Please read through each section.
Ask participant if they have any questions or concerns about the informed consent document
Remind participant that their personal information will not be shared, and participation is voluntary and confidential.
Request permission if the interview will be audio, video or digitally recorded. Also inform the interviewee if you plan to take notes during the interview.
· Provide approximate length of interview
· Explain the purpose of the study
· Describe the interview structure (audio recording, video recording, note-taking)
· Ask the interviewee if they have questions before beginning
· Define any necessary terms
During the Interview:
· List questions from Interview Guide in this section
· Ask Follow up questions, if needed
After the Interview:
· Reiterate that this is a confidential interview
· Ask the participant if they have any questions or comments
· Inform participant when they can expect to receive transcripts/interview summary
· Discuss dates/times for any follow-up meetings
· Thank participant again for their time and contribution to the study
Interview Questions
RQ1: How soon will I complete my dissertation so that I can graduate?
RQ2: Why do dissertation students have to write so much content for their research study?
Research Questions |
Interview Questions |
RQ1: How soon will I complete my dissertation so that I can graduate? |
1. How do you feel about completing your dissertation in 2022? 2. How do you feel about completing your dissertation in 2023? 3. How do you feel about completing your dissertation in 2024? 4. Why do you think you can complete you dissertation today? |
RQ2: Why do dissertation students have to write so much content for their research study? |
5. What are the most important chapters in the dissertation? 6. How do you think research should be conducted? 7. Why do you love conducting research as a doctoral student? 8. Please describe how you feel research should be conducted for your dissertation. |
OR you can list the questions: 1. How do you feel about completing your dissertation in 2022? (RQ1) 2. What are the most important chapters in the dissertation? (RQ2) 3. |
InterviewProtocol
Introduction
Greetings and thank you for agreeing to take part in the study. My name is _________and my
study is on _______________. My research study will focus on _______________.I would like
to video and audio record our conversations today. For your information, all recordings and
transcripts will be kept confidential and will be eventually destroyed.
I have planned this interview to last no longer than _________ minutes. During this time, I have
several questions that I would like to ask you. While they are structured questions, please feel
free to elaborate as much as you feel necessary. After the interviews have been transcribed, I will
send you the transcription via email for you to review. I will also be sending you an incentive
(describe incentives)
Interview Questions
Demographic Questions:
1.
Research Question 1: XXXXXXXXXXXXXX
1. What interview question answers RQ1?
2. What interview question answers RQ1?
3. What interview question answers RQ1?
4. What interview question answers RQ1?
Research Question 2: XXXXXXXXXXXXX
5. What interview question answers RQ2?
6. What interview question answers RQ2?
7. What interview question answers RQ2?
8. What interview question answers RQ2?
Research Question 3: XXXXXXXXXXXXXXXXXX
9. What interview question answers RQ3?
10. What interview question answers RQ3?
11. What interview question answers RQ3?
Thank you again for agreeing to participate in this study. Once I am finished transcribing all
documents, I will share them with you via email. If you can take some time (10-15 minutes) to
review these for accuracy, I would greatly appreciate it.
Data Collection Materials
Semi-Structure Interview Questions Aligned by Research Questions
Research Questions Interview Questions
RQ1:
1.
2.
3.
RQ2:
4.
5.
6.
Appendix
Interview Protocol Script
Title of Dissertation:
Name of Interviewer:
Name of Interviewee:
Date: __________________________ Location: ______________________
Starting Time: ________ Ending Time: _______ Zoom Link: ____________________
Before interview:
Hello, _____________. Thank you for your participation in this ________________ study examining
___________________________________________. I just want to take a few minutes to review the
informed consent document before we begin.
Share the informed consent document on the screen using Zoom or In-Person. Please read through
each section.
Ask participant if they have any questions or concerns about the informed consent document
Remind participant that their personal information will not be shared, and participation is voluntary and
confidential.
Request permission if the interview will be audio, video or digitally recorded. Also inform the interviewee
if you plan to take notes during the interview.
• Provide approximate length of interview
• Explain the purpose of the study
• Describe the interview structure (audio recording, video recording, note-taking)
• Ask the interviewee if they have questions before beginning
• Define any necessary terms
During the Interview:
• List questions from Interview Guide in this section
• Ask Follow up questions, if needed
After the Interview:
• Reiterate that this is a confidential interview
• Ask the participant if they have any questions or comments
• Inform participant when they can expect to receive transcripts/interview summary
• Discuss dates/times for any follow-up meetings
• Thank participant again for their time and contribution to the study
Nurse Educatron TO&Y (1991) 11,461-466
0 Longman Group UK Lrd 1991
WORK
A method of analysing interview transcripts in
qualitative research
Philip Burnard
A method of analysing qualitative interview data is outlined as a stage-by-stage
process. Some of the problems associated with the method are identified. The
researcher in the field of qualitative work is urged to be systematic and open to the
difficulties of the task of understanding other people’s perceptions.
INTRODUCTION
Qualitative research methods are being used
increasingly to explore aspects of nurse edu-
cation. Often, such methods involve the use of
unstructured or semi-structured interviews as a
principle methodology. Sometimes, the inter-
view process is straightforward enough (though,
as we shall see, this is by no means always the
case). The difficulty often lies with the question
of how to analyse the transcrips once the inter-
views have been completed. As in all research, it
is essential to know what sort of method of
analysis you are going to use before you collect
data. This paper offers one method of analysis.
It should be noted that it is one method:
essentially, it is one that can be described as a
method of thematic content analysis. It has been
adapted from Glaser and Strauss’ ‘grounded
theory’ approach and from various works on
content analysis (Babbie 1979; Berg 1989; Fox
1982; Glaser & Strauss 1967).
Philip Burnard PhD MSc RMN RGN DipN Cert Ed RNT
Director of Postgraduate Nursing Studies, University
of Wales College of Medicine, Heath Park, Cardiff,
Wales
(Requests for offprints to Pl3)
Manuscript accepted 3 July 1991
Assumptions about the data
No one method of analysis can be used for all
types of interview data. The method described
here assumes that semi-structured, open-ended
interviews have been carried out and that those
interviews have been recorded in full. It is also
assumed that the whole of each recording has
been transcribed. It is suggested that an adapted
version of this method could also be used for
data arising from more clearly structured
interviews.
The method used to categorise and codify the
interview transcripts is best described though the
stages that are worked through by the resear-
cher. The method was developed out of those
described in the grounded theory literature
(Glaser & Strauss 1967; Strauss 1986) and in the
literature on content analysis (Babbie 1979;
Couchman & Dawson 1990; Fox 1982) and out
of other sources concerned with the analysis of
qualitative data (Bryman 1988; Field & Morse
1985).
AIM OF THE ANALYSIS
The aim is to produce a detailed and systematic
461
462 NURSE EDUCATION TODAY
recording of the themes and issues addressed in
the interviews and to link the themes and inter-
views together under a reasonably exhaustive
category system. Herein lies the first problem
that the researcher must remain aware of. To
what degree is it reasonable and accurate to
compare the utterances of one person with those
of another? Are ‘common themes’ in interviews
really ‘common’? Can we assume that one per-
son’s world view can be linked with another
person’s? The method described here takes it for
granted that this is a reasonable thing to do.
However, the researcher should stay open to the
complications involved in the process and not
feel that the method described here can be used
in a ‘doing by numbers’ sort of way.
STAGES OF THE ANALYSIS
l A major category seems to be emerging to
do with ‘coping with anger in counselling’.
Stage three
Transcripts are read through again and as many
headings as necessary are written down to
describe all aspects of the content, excluding
‘dross’. Field and Morse (1985) use the term
dross to denote the unusable ‘fillers’ in an
interview – issues that are unrelated to the topic
in hand. The ‘headings’ or ‘category system’
should account for almost all of the interview
data. This stage is known as ‘open coding’ (Berg
1989); categories are freely generated at this
stage. An example of this sort of coding is found
in Table 1.
Stage one Table 1
Notes are made after each interview regarding
interview transcript Open coding
the topics talked about in that interview. At times I suppose most people Open coding
throughout the research project, the researcher
need counselling at Most people need
some point in their lives. counselling . . .
also writes ‘memos’ (Field & Morse 1985) about I would think that some Some nurses are good at
ways of categorising the data. These serve as nurses are quite good at it. . .
memoryjoggers and to record ideas and theories
it. They have the skills. They have the skills . . .
Although I’m not sure if Not sure about
that the researcher has as he works with the data. many nurses get counselling training in
Such memos may record anything that attracts ~tin~f’~~~~r’,l~~~g as nurse education.. .
the researcher’s attention during the initial training.
phases of the analysis.
Stage two
Transcripts are read through and notes made,
throughout the reading, on general themes
within the transcripts. The aim, here, is to
become immersed in the data. This process of
immersion is used to attempt to become more
fully aware of the ‘life world’ of the respondent;
to enter, as Rogers (1951) would have it, the
other person’s ‘frame of reference’. Examples of
such notes could include:
The list of categories is surveyed by the resear-
cher and grouped together under higher-order
Stage four
headings. The aim, here, is to reduce the
numbers of categories by ‘collapsing’ some of the
ones that are similar into broader categories. For
example, it may be decided that the following
categories are all collapsed into one category
entitled ‘Counselling Training for Nurses’:
l There are lots of different sorts of counsell-
ing being described in these interviews,
l Some nurses have counselling training,
l Nurses have training in counselling,
l Need for counselling training.
Stage five
The new list of categories and sub-headings is
worked through and repetitious or very similar
headings are removed to produce a final list.
Stage six
Two colleagues are invited to generate category
systems, independently and without seeing the
researcher’s list. The three lists of categories are
then discussed and adjustments made as neces-
sary. The aim of this stage is to attempt to
enhance the validity of the categorising method
and to guard against researcher bias.
Stage seven
Transcripts are re-read alongside the finally
agreed list of categories and sub-headings to
establish the degree to which the categories
cover all aspects of the interviews. Adjustments
are made as necessary.
Stage eight
Each transcript is worked through with the list of
categories and sub-headings and ‘coded’ accord-
ing to the list of categories headings. Coloured
highlighting pens can be used here to distinguish
between each piece of the transcript allocated to
a category and sub-heading. Examples of the
way such colours could be used are as follows:
l Definitions of counselling: blue,
l Patients’ needs for counselling: red,
l Counselling and nurse training: green.
Alternatively, these categories can be identi-
fied on a computer, using a wordprocessor and a
coding scheme devised by the individual resear-
cher (Table 2).
Stage nine
Each coded section of the interviews is cut out of
the transcript and all items of each code are
collected together. Multiple photocopies of the
transcripts are used here to ensure that the
context of the coded sections is maintained.
NLJRSE EDUCATION TODAY 463
Table 2
Transcript Categories
Definition of counselling Counselling, for me, is a
question of helping
people to sort out their
own problems in their
own way.
There are other
definitions, of course.
This is not the only one
I would think that a lot of
nurses get taught
something like this
during their training
I suppose that we were
taught the non-directive
approach which says
that patients have to find
their own solutions.
Definition of counselling
Counselling and nurse
training
Definition of counselling
Everything that is said in an interview is aaid in a
context. Merely to cut out strings of words,
devoid of context is to risk altering the Tneaning of
what was said. A crude example of this might be
as follows. Here, the bracketed words have been
cut out to leave a phrase which, on its own,
clearly means something different than it does
when read with the words in brackets.
[I’m not sure. I don’t think that] eveTone needs
counselling in nursing, [to say that would be to
exaggerate a lot. . .]
The multiple copies allow for the sections
either side of the coded sections to be cut out
with the coded areas. A note of caution must be
sounded here. Once sections of interviews are
cut up into pieces, the ‘whole’ of the interview is
lost: it is no longer possible to appreciate the
context of a particular remark or piece of
conversation. For this reason, a second ‘com-
plete’ transcript must be kept for reference
purposes.
Stage ten
The cut out sections are pasted onto sheets,
headed up with the appropriate headings and
sub-headings.
464 NURSE EDUCATION TODAY
Stage eleven
Selected respondents are asked to check the
appropriateness or otherwise of the category
system. They are asked: ‘Does this quotation
from your interview fit t%zS category? . . . Does
this? . . .’ Adjustments are made as necessary.
This allows for a check on the validity of the
categorising process to be maintained. Another
method that can be used to validate the findings
is described in the next section.
Stage twelve
All of the sections are filed together for direct
reference when writing up the findings. Copies
of the complete interviews are kept to hand
during the writing up stage as are the original
tape recordings. If anything appears unclear
during the writing up stage of the project, the
researcher should refer directly back to the
transcript or the recording.
Stage thirteen
Once all of the sections are together, the writing
up process begins. The researcher starts with the
first section, selects the various examples of data
that have been filed under that section and offers
a commentary that links the examples together.
That researcher then continues on to the next
section and so on, until the whole project is
written up. All the time that this writing up
process is being undertaken, the researcher stays
open to the need to refer back to the original
tape recordings and to the ‘complete’ transcripts
of the interviews. In this way, it is possible to stay
closer to original meanings and contexts.
Stage fourteen
The researcher must decide whether or not to
link the data examples and the commentary to
the literature. Two options are available here.
First, the researcher may write up the findings,
using verbatim examples of interviews to
illustrate the various sections. Then, he may
write a separate section which links those
findings to the literature on the topic and make
comparisons and contrasts. Second, the resear-
cher may choose to write up the findings along-
side references to the literature. In this way, the
‘findings’ section of the research becomes both a
presentation of the findings and a comparison of
those findings with previous work. The first
approach seems more ‘pure’ but the second is
often more practical and readable.
In any analysis of qualitative data there is the
problem of what to leave out of an analysis of a
transcript. Ideally, all the data should be accoun-
ted for under a category or sub category (Glaser
& Strauss 1967). In practice there are always
elements of interviews that are unusable in an
analysis. Field and Morse (1985), as we have
noted, refer to this data as ‘dross’. In order to
illustrate what was not included in the analysis of
the interviews in a recent study, it may be helpful
to offer an example of data that were considered
not to be categorisable nor considered to add to
the general understanding of the field under
consideration.
‘I don’t know, like they say, now they say it was
alright, whereas before, perhaps, you
wouldn’t’.
Whilst the person in this example was trying to
convey something it would be difficult to know
what it was. As an aside to this discussion, it is
interesting to note that such ‘uncodable’ pieces
of transcript only appear to be unusable at the
analysis stage. During the interviews, all of what
was being said appeared to be quite coherent to
the researcher.
VALIDITY
The question of the validity of this categorisation
process must be considered. If, as Glaser and
Strauss (1967) suggest, the aim of ethnome-
thodological and phenomenological research is
to offer a glimpse of another person’s perceptual
world, then the researcher should attempt to
offset his own bias and subjectivity that must
creep through any attempt at making sense of
interview data. Two methods of checking for
validity can be recommended here. First, the
researcher asks a colleague who is not involved in
NURSE EDUCATION TODAY 465
any other aspect of the study but who is familiar
with the process of category generation in the
style of Glaser and Strauss, to read through three
transcripts and to identify a category system.
The categories generated in this way are then
discussed with the researcher and compared
with the researcher’s own category system. If the
two category analyses prove to be very similar, at
least three possibilities exist:
4
b)
4
the original category analysis was reason-
ably complete and accurate,
the original category analysis was too broad
and general in nature and thus easily
identified and corroborated by another
person,
the colleague was anticipating the sorts of
categories that the researcher may have
found and offering the researcher ‘what he
wanted to hear’.
The last possibility can be reasonably ruled out
if the colleague is unfamiliar with the subject and
content of the study prior to being asked to help
validate the category system.
The question of whether or not the category
system is too broad and general in nature may be
countered by the fact that the system described
here encourages a ‘funnelling’ process – many
categories are generated at first and these are
then distilled down to a smaller number by the
process of ‘collapsing’ described above. It is to be
hoped, therefore, that the agreement of two
independent parties over the category system
helps to suggest that the system has some inter-
nal validity.
The second check for validity is that of
returning to three of the people interviewed and
asking them to read through the transcripts of
their interviews and asking them to jot down
what they see as the main points that emerged
from the interview. This produces a list of
headings which can then be compared with the
researcher’s and the two lists can be discussed
with the respondents. Out of these discussions,
minor adjustments may be made to the category
system.
This paper has offered just one way of explor-
ing and categorising qualitative data. It is likely
that the method could be used with a range of
types of data – from interview transcripts to
articles and papers. It combines elements of
content analysis with aspects of the grounded
theory approach suggested by Glaser and
Strauss. One of the difficulties in this sort of
work is always going to be finding a method of
presenting findings in an honest and reliable
way. Arguably, the only real way of presenting
interview findings without any sort of manipula-
tion would be to offer the interview transcripts
whole and unanalysed. This, clearly, would not
be satisfactory and the reader of those trans-
cripts would have to find his own way of catego-
rising what was read. The method suggested
here is one that stays close to the original
material and yet allows for categories to be
generated which allow the reader of a researcher
report to ‘make sense’ of the data.
CONCLUSION
The issue of how to analyse qualitative data
remains a thorny one. This paper has identified
one method of attempting such analysis but has
also identified some of the many problems
associated with the process. It is stressed that this
is just one way to analyse data and that the
method must be used cautiously and with con-
stant awareness of possible problems. It is asser-
ted that the researcher needs to be both
systematic and alert to the complexity of the task.
On the other hand, the researcher must start
somewhere – attempts must be made to rep-
resent the thoughts and feelings of others in a
systematic but honest way. This paper has
offered one such method.
References
Babbie E 1979 The practice of social research, 3rd ed.
Wadsworth, Belmot, California
Berg B L 1989 Qualitative research methods for the
social sciences. Allyn and Bacon, New York
Bryman A 1988 Quantity and quality in social research.
Unwin Hyman, London
466 NURSE EDUCATION TODAY
Couchman W, Dawson J 1990 Nursing and health-care
research: the use and applications of research for
nurses and other health care professionals. Scutari,
London
Field P A, Morse J M 1985 Nursing research: the
application of qualitative approaches. Croom Helm,
London
ed. Appleton-Century-Crofts, Norwalk, New Jersey
Glaser B G, Strauss A L 1967 The discovery of
grounded theory. Aldine, New York
Rogers C R 195 1 Client centred therapy. Constable,
London
Fox D J 1982 Fundamentals of research in nursing, 4th
Strauss A L 1986 Qualitative data analysis for social
scientists. Cambridge University Press, Cambridge.
TheQualitative Report
Volume 21 | Number 5 How To Article 2
5-1-2016
Preparing for Interview Research: The Interview
Protocol Refinement Framework
Milagros Castillo-Montoya
University of Connecticut – Storrs, milagros.castillo-montoya@uconn.edu
Follow this and additional works at: https://nsuworks.nova.edu/tqr
Part of the Educational Methods Commons, and the Quantitative, Qualitative, Comparative, and
Historical Methodologies Commons
This How To Article is brought to you for free and open access by the
at NSUWorks. It has been accepted for inclusion in The
Qualitative Report by an authorized administrator of NSUWorks. For more information, please contact nsuworks@nova.edu.
Recommended APA Citation
Castillo-Montoya, M. (2016).
. The Qualitative
Report, 21(5), 811-831. Retrieved from https://nsuworks.nova.edu/tqr/vol21/iss5/2
http://nsuworks.nova.edu/tqr/?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
http://nsuworks.nova.edu/tqr/?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
https://nsuworks.nova.edu/tqr?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
https://nsuworks.nova.edu/tqr/vol21?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
https://nsuworks.nova.edu/tqr/vol21/iss5?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
https://nsuworks.nova.edu/tqr/vol21/iss5/2?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
https://nsuworks.nova.edu/tqr?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
http://network.bepress.com/hgg/discipline/1227?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
http://network.bepress.com/hgg/discipline/423?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
http://network.bepress.com/hgg/discipline/423?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
https://nsuworks.nova.edu/tqr/vol21/iss5/2?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
mailto:nsuworks@nova.edu
Preparing for Interview Research: The Interview Protocol Refinement
Framework
Abstract
This article presents the interview protocol refinement (IPR) framework comprised of a four-phase process
for systematically developing and refining an interview protocol. The four-phase process includes: (1)
ensuring interview questions align with research questions, (2) constructing an inquiry-based conversation,
(3) receiving feedback on interview protocols, and (4) piloting the interview protocol. The IRP method can
support efforts to strengthen the reliability of interview protocols used for qualitative research and thereby
contribute to improving the quality of data obtained from research interviews.
Keywords
Interviewing, Interview Protocols, Qualitative Pedagogy, Research Interviews
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License.
Acknowledgements
Thank you to the graduate students in Assessment, Evaluation, and Research in Student Affairs, Blanca
Rincón, Sarah Woulfin, and Robin Grenier for valuable input on this manuscript.
This how to article is available in The Qualitative Report: https://nsuworks.nova.edu/tqr/vol21/iss5/2
https://goo.gl/u1Hmes
https://goo.gl/u1Hmes
http://creativecommons.org/licenses/by-nc-sa/4.0/
http://creativecommons.org/licenses/by-nc-sa/4.0/
http://creativecommons.org/licenses/by-nc-sa/4.0/
https://nsuworks.nova.edu/tqr/vol21/iss5/2?utm_source=nsuworks.nova.edu%2Ftqr%2Fvol21%2Fiss5%2F2&utm_medium=PDF&utm_campaign=PDFCoverPages
The Qualitative Report 2016 Volume 21, Number 5, How To Article 1, 811-831
Preparing for Interview Research:
The Interview Protocol Refinement Framework
Milagros Castillo-Montoya
University of Connecticut, Storrs, Connecticut, USA
This article presents the interview protocol refinement (IPR) framework
comprised of a four-phase process for systematically developing and refining
an interview protocol. The four-phase process includes: (1) ensuring interview
questions align with research questions, (2) constructing an inquiry-based
conversation, (3) receiving feedback on interview protocols, and (4) piloting
the
interview protocol. The IRP method can support efforts to strengthen the
reliability of interview protocols used for qualitative research and thereby
contribute to improving the quality of data obtained from research interviews.
Keywords: Interviewing, Interview Protocols, Qualitative Pedagogy,
Research
Interviews
Interviews provide researchers with rich and detailed qualitative data for understanding
participants’ experiences, how they describe those experiences, and the meaning they make
of
those experiences (Rubin & Rubin, 2012). Given the centrality of interviews for qualitative
research, books and articles on conducting research interviews abound. These existing
resources typically focus on: the conditions fostering quality interviews, such as gaining access
to and selecting participants (Rubin & Rubin, 2012; Seidman, 2013; Weiss, 1994); building
trust (Rubin & Rubin, 2012); the location and length of time of the interview (Weiss, 1994);
the order, quality, and clarity of questions (Patton, 2015; Rubin & Rubin, 2012); and the overall
process of conducting an interview (Brinkmann & Kvale, 2015; Patton,
2015).
Existing resources on conducting research interviews individually offer valuable
guidance but do not come together to offer a systematic framework for developing and refining
interview protocols. In this article, I present the interview protocol refinement (IPR)
framework—a four-phase process to develop and fine-tune interview protocols. IPR’s four-
phases include ensuring interview questions align with the study’s research questions,
organizing an interview protocol to create an inquiry-based conversation, having the
protocol
reviewed by others, and piloting it.
Qualitative researchers can strengthen the reliability of their interview protocols as
instruments by refining them through the IPR framework presented here. By enhancing the
reliability of interview protocols, researchers can increase the quality of data they obtain from
research interviews. Furthermore, the IPR framework can provide qualitative researchers with
a shared language for indicating the rigorous steps taken to develop interview protocols
and
ensure their congruency with the study at hand (Jones, Torres, & Arminio, 2014).
IPR framework is most suitable for refining structured or semi-structured interviews.
The IPR framework, however, may also support development of non-structured interview
guides, which have topics for discussions or a small set of broad questions to facilitate the
conversation. For instance, from a grounded theory perspective, piloting interview
protocols/guides are unnecessary because each interview is designed to build from information
learned in prior interviews (Corbin & Strauss, 2015). Yet, given the important role the first
interview plays in setting the foundation for all the interviews that follow, having an initial
interview protocol vetted through the recursive process I outline here may strengthen the
quality of data obtained throughout the entire study. As such, I frame the IPR framework as a
viable approach to developing a strong initial interview protocol so the researcher is likely to
812 The Qualitative Report 2016
elicit rich, focused, meaningful data that captures, to the extent possible, the experiences of
participants.
The Four-Phase Process to Interview Protocol Refinement (IPR)
The interview protocol framework is comprised of four-phases:
Phase 1: Ensuring interview questions align with research questions,
Phase 2: Constructing an inquiry-based conversation,
Phase 3: Receiving feedback on interview protocols
Phase 4: Piloting the interview protocol.
Each phase helps the researcher take one step further toward developing a research instrument
appropriate for their participants and congruent with the aims of the research (Jones et al.,
2014). Congruency means the researchers’ interviews are anchored in the
purpose of the study
and the research questions. Combined, these four phases offer a systematic framework for
developing a well-vetted interview protocol that can help a researcher obtain robust and
detailed interview data necessary to address research questions.
Phase 1: Ensuring Interview Questions Align With Research Questions
The first phase focuses on the alignment between interview questions and research
questions. This alignment can increase the utility of interview questions in the research process
(confirming their purpose), while ensuring their necessity for the study (eliminating
unnecessary ones). A researcher wants intentional and necessary interview questions because
people have complex experiences that do not unravel neatly before the researcher. Instead,
helping participants explain their experiences takes time, careful listening, and intentional
follow up. A researcher wants to keep in mind:
The purpose of in-depth interviewing is not to get answers to questions… At
the root of in-depth interviewing is an interest in understanding the lived
experiences of other people and the meaning they make of that experience.…
At the heart of interviewing research is an interest in other individuals’ stories
because they are of worth. (Seidman, 2013, p. 9)
People’s lives have “worth” and a researcher wants to approach inquiring into their lives with
sensitivity. Given the complexity of people’s lives and the care needed to conduct an interview,
a researcher can benefit from carefully brainstorming and evaluating interview questions before
data collection. The questions help participants tell their stories one layer at a time, but also
need to stay aligned with the purpose of the study.
To check the alignment of questions you can create a matrix for mapping interview
questions onto research questions. Tables 1 and 2 offer examples of matrices with interview
questions listed in rows and research questions in columns. You can then mark the cells to
indicate when a particular interview question has the potential to elicit information relevant to
a particular research question (Neumann, 2008).
The process of creating this matrix can help display whether any gaps exist in what is
being asked. The researcher can now assess and adjust or add interview questions if too many
are related to one research question and too few to other research questions. Otherwise, you
may not notice the potential information gap until after data collection is complete. Also, the
matrix can help the researcher observe when questions are asked (e.g., beginning, middle, end).
Milagros Castillo-Montoya 813
Ideally, the researcher asks the questions most connected to the study’s purpose in the middle
of the interview after building rapport (Rubin & Rubin, 2012). Once a researcher has a sense
of which interview questions are most likely to address which research questions, he/she/ze
can mark them in the final interview protocol as the key questions to ask during the interview.
Confirming the alignment between interview questions and research questions does not
suggest that a researcher mechanically creates interview questions directly from the research
question without attention to the contexts shaping participants’ lives including
their everyday
practices or languages—a point further discussed below in phase 2. As Patton (2015) stated,
“you’re hoping to elicit relevant answers that are meaningful and useful in understanding the
interviewee’s perspective. That’s basically what interviewing is all about” (p. 471). In
summary, phase 1 focuses on the researcher developing an interview protocol aligned with the
study’s purpose. In the second phase, the researcher focuses on ensuring the interview protocol
supports
an inquiry-based conversation.
Phase 2: Constructing an Inquiry-Based Conversation
A researcher’s interview protocol is an instrument of inquiry—asking questions for
specific information related to the aims of a study (Patton, 2015) as well as an instrument for
conversation about a particular topic (i.e., someone’s life or certain ideas and experiences). I
refer to this balance between inquiry and conversation as an inquiry-based conversation. To
guide a conversation and move an inquiry forward takes both care and hard work (Rubin &
Rubin, 2012). Phase 2 entails the researcher developing an inquiry-based conversation through
an interview protocol with: a) interview questions written differently from the research
questions; b) an organization following social rules of ordinary conversation; c) a variety of
questions; d) a script with likely follow-up and prompt questions.
To develop a protocol that promotes a conversation, compose interview questions
different from how you would write research questions. As noted in phase 1, research
questions are different from interview questions. Maxwell (2013) pointed out the functional
difference between research questions and interview questions:
Your research questions formulate what you want to understand; your interview
questions are what you ask people to gain that understanding. The development
of good interview questions (and observational strategies) requires creativity
and insight, rather than a mechanical conversion of the research questions into
an interview guide or observation schedule, and depends fundamentally on your
understanding of the context of the research (including your participants’
definitions of this) and how the interview questions and observational strategies
will actually work in practice. (p. 101)
As the researcher you can use your knowledge of contexts, norms, and every-day
practices of potential participants, to write interview questions that are understandable and
accessible to participants. Brinkmann and Kvale (2015) stated, “The researcher questions are
usually formulated in a theoretical language, whereas the interview questions should be
expressed in the everyday language of the interviewees” (p. 158). As such, consider the terms
used by participants, ask one question at a time, and avoid jargon (Merriam, 2009; Patton,
2015).
Table 1 offers an example of the differences between research questions and interview
questions. It is an interview matrix I created for a study on first-generation college students’
developing sociopolitical consciousness through their learning of sociology (Castillo-Montoya,
814 The Qualitative Report 2016
2013). I interviewed the students who participated in that study three times throughout one
academic semester. Most of the first interview is represented in the Table 1.
Table 1—Interview Protocol Matrix for Study on College Students’ Sociopolitical
Consciousness (First Interview of Three)
Script prior to interview:
I’d like to thank you once again for being willing to participate in the interview aspect of my study. As I have
mentioned
to you before, my study seeks to understand how students, who are the first in their families to go to college,
experience
learning sociological concepts while enrolled in an introductory sociology course. The study also seeks to understand
how learning sociological concepts shapes the way students think about themselves, their community, and society. The
aim of this research is to document the possible process of learning sociological concepts and applying them to one’s life.
Our interview today will last approximately one hour during which I will be asking you about your upbringing, decision
to attend college, the college/university where you are enrolled, your sociology class and other college classes you’ve
taken, and ideas that you may have about yourself and your community (i.e. family, neighborhood, etc.).
[review aspects of consent form]
In class, you completed a consent form indicating that I have your permission (or not) to audio record our conversation.
Are you still ok with me recording (or not) our conversation today? ___Yes ___No
If yes: Thank you! Please let me know if at any point you want me to turn off the recorder or keep something you said off
the record.
If no: Thank you for letting me know. I will only take notes of our conversation.
Before we begin the interview, do you have any questions? [Discuss questions]
If any questions (or other questions) arise at any point in this study, you can feel free to ask them at any time. I would be
more than happy to answer your questions.
Research
Question #1: At
the start of an
introductory
sociology
course, how do
first-generation
African
American and
Latino
students
in a highly
diverse
institution of
higher
education
reflect
sociopolitical
consciousness
in their
discussions
about their
lives and sense
of self and
society?
How and to
what extent do
student
discussions
about their lives
and sense of self
and society
indicate:
Background
Information
awareness of
sociopolitical
forces (i.e.
race, class,
gender,
citizenship
status, etc.)?
understanding
of
sociopolitical
forces?
knowledge of
the
interconnection
of
sociopolitical
forces?
acts of
critiquing and
analyzing
sociopolitical
forces?
other ways of
thinking or
acting toward
sociopolitical
forces?
How do the
students
describe
themselves
and society in
relation to the
sociopolitical
forces
operating in
their everyday
lives?
Milagros Castillo-Montoya 815
Upbringing
To begin this interview, I’d like to ask you some questions about the neighborhood where you grew up.
1. Based on the
information
that
you provided in
the
questionnaire,
you went to high
school at
______. Did you
grow up in
______
___?
If yes: Go to
question #2
If no: Where did
you grow up?
[Open-ended
way to ask
question: Let’s
begin by
discussing the
neighborhood
where you grew
up. Where did
you grew up?
Follow up:
What was that
neighborhood/to
wn like when
you were
growing up
there?]
X
2. How would
you describe
__
_______
(state
neighborhood
where they grew
up)? In
answering this
question you can
focus on the
people, the
families, the
organizations, or
anything else
that stands out to
you the
most
when
you think
about your
childhood
neighborhood.
X X X X
3. People have
different ways
of viewing the
way their
neighborhoods
and
communities
function. How
would you
compare the
way you view
the
neighborhood
where you grew
X X X X X
816 The Qualitative Report 2016
up, to the way
your
parents
(or
guardians) view
that
neighborhood?
Follow up: Do
you see your
childhood
neighborhood in
the same way or
in a different
way from your
parents? How
so?
Follow up:
Why do you
think you see
your childhood
neighborhood
different or
similar to your
parents (or
guardians)?
[Rephrased to
avoid asking a
“why” question:
Can you tell me
more about what
makes you think
that you have a
different or
similar view of
your childhood
neighborhood
than your
parents (or
guardians)?
4. How do you
think that
growing up in
_________
influenced who
you are today?
X X X X X X
5. Sometimes a
common
experience,
language, or
way of being
leads a group of
people to
identify as a
community. For
example, there
are some people
who
identify as
part of a cultural
group because
they share a
common
experience. Is
there a
community with
which you
identify?
If says yes:
Which
X X X X X X X
Milagros Castillo-Montoya 817
community is
that?
Follow up:
A) What makes
you identify
with that
community?
B) Is there some
common
experience,
language, or
way of being
that defines
_
____ (name of
community) as a
community?
What are they?
C) How did you
know that you
also belonged to
____ (name of
community)?
D) When did
you realize that
you identified
with that
community?
E)
Do you think
others in your
family also
identify as
belonging to
____ (name of
community)
community?
Prompt: Please
tell me more
about this.
If says no: In the
questionnaire
you completed,
you marked off
that you identify
as ____(mention
what they
marked off).
Can you tell me
more about why
you identify as
___?
Follow up: Do
other people
who are ____
(identity marked
off) form a
community for
you?
6. Sometimes
there are
differences in
the way people
are viewed or
treated within a
community. The
differences
could be based
on lots of things.
X X X X X
818 The Qualitative Report 2016
Do you think
that being a
____ (male or
female)
influences the
way others in
your community
______) view
you or interact
with
you?
If says yes: How
so?
If says no: How
did you come to
see that being a
____ (male or
female) does not
matter in the
_______
community?
Follow up: Are
there other
differences that
matter within
the ____
community?
Prompt: Please
tell me more
about that.
Decision to Attend College
Thank you for you responses. I’d like to now ask you questions regarding your decision to attend
college.
7. In your
questionnaire,
you said that
your ___
(mother, father,
or guardian) had
a ___education.
Is
that correct?
If says yes: Does
that mean that
you are the first
in your family to
enroll in
college? If says
no: Who else in
your family has
gone to college?
X
8. Can you tell
me a bit about
how you went
about
making
the decision to
pursue a college
education?
Follow up: You
mentioned that
______ lead you
to decide to go
to college. Was
anyone else
involved in or
influential to
your decision to
go to
college?
If says yes: Who
else was
X X X X X
Milagros Castillo-Montoya 819
involved or
influential (i.e.
parents,
guidance
counselor,
etc.)? How
were they
involved or
influential in
your decision
making
process?
Follow up: Was
there anything
else that you
think made you
want to go to
college? How
did _____
influence you to
want to go to
college?
9. How did your
family respond
to your decision
to go to college?
X X X X
10. Once you
decided to
attend college,
how did you go
about selecting
which college to
attend?
X X X X X
Institution
Thank you for sharing information about your decision to attend college. I’d like to now ask you a few questions about
your college/university.
11. You
mentioned
earlier that you
went about
selecting a
college by___
(use
participant’s
words). At the
point that you
made the
decision to come
to this college,
what most
attracted you to
this school?
Follow up: Can
you tell me a bit
about that?
X X X X X
12. You’ve
taken ____
classes at this
college, what
classes stand out
to you the most?
Follow up: Can
you tell me what
made those
classes stand out
to you?
X X X X
820 The Qualitative Report 2016
Sociology Course
Thank you. I’d like to now ask you a few questions specifically about your sociology course.
13. Is this your
first class in
sociology?
If says yes:
What do you
think
the word
sociology
means?
If says no: What
other sociology
class have you
taken before?
Follow up:
A) When did
you take
that class
(or
classes)?
B) What
would you
say is the
most
important
thing you
learned in
that course
(or in those
classes)?
C) Based on
your
experience
in that class
(or classes),
what do
you think
the word
sociology
means?
X X X
Students Doing Something with What They Know
My final set of questions are focused on getting to know more about your outside of class experiences.
14. I know that
you have taken
____ (number)
classes college
classes so far.
Have you found
that sometimes
you remember
something that
you learned in
one class while
you are doing
something or
talking to
someone outside
of school?
If says yes: Can
you give me an
example of a
time when that
happened for
you?
Follow up:
X X X X X
Milagros Castillo-Montoya 821
A) What was
that
experience
like?
B) Does that
happen to
you often?
Before we conclude this interview, is there something about your experience in this college/university that you think influences how you
engage in your classes that we have not yet had a chance to discuss?
Table 1 includes the study’s first research question and related sub-questions: At the
start of an introductory sociology course, how do first-generation African American and Latino
students in a highly diverse institution of higher education reflect sociopolitical consciousness
in their discussions about their lives and sense of self and society? The sub-questions to this
first research question can be found across the first row. I did a similar, but separate matrix for
my second and third research questions. See Table 2 for an example of what an interview
protocol matrix would look like when the researcher includes all the research questions.
Table 2—Example of Interview Protocol Matrix
Background
Information
Research
Question 1
Research
Question 2
Research
Question 3
Interview Q 1 X
Interview Q 2 X
Interview Q 3 X
Interview Q 4 X X
Interview Q 5 X
Interview Q 6 X X
Interview Q 7 X
Interview Q 8 X X X
Interview Q 9 X
Interview Q 10 X
If I turned the research question from my study directly into an interview question, it would
look something like this: Please describe your sociopolitical consciousness relative to your life
and sense of self and society. This question, however, would overwhelm most people and is
likely too broad and difficult to answer. To get responses to address my research questions, I
asked a variety of interview questions (listed in Table 1). Some questions had students
822 The Qualitative Report 2016
discussing and describing the neighborhoods where they grew up. For instance, I asked, How
would you describe _________ (state neighborhood where they grew up)? Asking about their
childhood neighborhoods was not the only way to get at students’ sociopolitical consciousness,
but one way. It helped me capture whether they already viewed aspects of their neighborhood
from a structural perspective (thus reflecting a sociological view—a focus of that study). This
question, in particular, yielded valuable data, some of which was unexpected such as a theme
about violence in urban neighborhoods. The idea here is my research questions guided my
study’s purpose, while the interview questions’ tone and language made them accessible to the
participants.
A researcher may also want to follow the “social rules that apply to ordinary
conversation” (Rubin & Rubin, 2012, p. 96). In addition to making interview questions
distinct from research questions, a researcher wants to ask participants questions they can
answer by virtue of what they know or the time since the incident at hand (Willis, 1999). For
instance, question 10 in Table 1 asked students how they made the decision to pursue a college
education. Since at the time of the study they were enrolled in college, the question was
bounded by a period they could recall.
You also want to ask only one question at a time, try not interrupting participants when
they are speaking, indicate understanding through nodding or other gestures, ask clarifying
questions, transition from one topic to another, express gratitude, and communicate any
intentions to follow up before the interview ends (Rubin & Rubin, 2012). In Table 1, I have
included some transitions I used between topics. I also included places where I expressed
gratitude such as when I transitioned into asking participants about their decision to attend
college, Thank you for you responses. I’d like to now ask you questions regarding your decision
to attend college (see Table 1). Lastly, while in a social conversation you may inquire further
by asking why, in an interview participants may perceive why questions as judgmental. As the
researcher, you want to avoid framing questions from the position of why (Rubin & Rubin,
2012). See question 3 in Table 1 for an example of a why question reframed. Rubin and Rubin
(2012) suggest these alternatives to asking why: “What influenced, what caused, what
contributed to, and what shaped.” These rules can help you obtain important information while
maintaining a conversational tone.
Unlike an ordinary conversation, however, the purpose of an interview is to gain
further information relative to the study at hand. You can preserve the conversational and
inquiry goals of the research act by including four types of questions: (1) introductory
questions, (2) transition questions, (3) key questions, and (4) closing questions (Creswell, 2007;
Krueger & Casey, 2009; Merriam, 2009; Rubin & Rubin, 2012). Table 3 explains each type of
question and points to examples found in Table 1.
Introductory questions serve to help the researcher begin the interview with easy, non-
threatening questions that ask for narrative descriptions. For example, early in student
interviews I asked participants about where they grew up (see introductory example in Table
3). This question was non-threatening and provided the participants the opportunity to get used
to describing experiences (Patton, 2015). It was also relevant because one’s neighborhood may
shape one’s views of social relations, structures, and opportunities. Students’ responses to this
question lead me to ask additional questions more central to their upbringing, which provided
insights into their existing sociopolitical consciousness. This start to the interview helped set
the tone of a conversation, but also distinguished the interview as a form of inquiry.
Transition questions move the interview toward the key questions (Krueger & Casey,
2009) and keep the conversational tone of the interview. In Table 3, I provided an example of
a transitional question whereby I referred to the response the student provided in a
questionnaire to transition to questions about their first-generation college-going status. Each
interview I conducted (first or follow up interviews) had questions transitioning us slowly from
Milagros Castillo-Montoya 823
one topic to another. Under each new topic I started with less intrusive questions and slowly
worked toward asking questions that were more personal.
Table 3—Types of Interview Questions
Type of Question Explanation of Type of
Question
Example of Type of Question
Introductory Questions Questions that are relatively
neutral eliciting general and non-
intrusive information and that are
not threatening
Based on the information that you
provided in the questionnaire, you
went to high school at ______. Did
you grow up in _________?
If yes: Go to question #2
If no: Where did you grow up?
(see question 1 in Table 1)
Transition Questions Questions that that link the
introductory questions to the key
questions to be asked
In your questionnaire, you said
that your ___ (mother, father, or
guardian) had a ___education. Is
that correct?
If says yes: Does that mean that
you are the first in your family to
enroll in college? If says no: Who
else in your family has gone to
college? (see #9 in Table 1)
Key Questions Questions that are most related to
the research questions and
purpose of the study
What makes you identify with that
community? (see questions listed
under #7 in Table 1)
Closing Questions Questions that are easy to answer
and provide opportunity for
closure
Before we conclude this interview,
is there something about your
experience in this
college/university that you think
influences how you engage in your
classes that we have not yet had a
chance to discuss?
(see end of Table 1)
Key questions, also referred to as main questions, tend to solicit the most valuable
information (Krueger & Casey, 2009; Rubin & Rubin, 2012). The practice of identifying key
questions provides the researcher with a sense of the core questions to ask in the interview. For
example, in the first interview I held with students about their sociopolitical consciousness a
key question focused on whether and how they identified with a particular type of community.
Once students identified a community, I asked a series of questions to slowly get at the
communities with which students identified (see question 5 in Table 1) and eventually asking,
What makes you identify with that community? The question directly related with my research
focus on students’ sociopolitical consciousness as I had defined it for the study. Students’
answers to the series of questions that comprised question 5 (Table 1) was instrumental to my
learning of their awareness and understanding of cultures and other social identities, as well as
social structures shaping those identities. Students’ responses to question 5 (Table 1) lead to
important insights of how students’ identified and why. I was later able to analyze those
statements to arrive at a finding about the differences and similarities in students’ sociopolitical
consciousness regarding themselves and others.
As an interview ends, a researcher may want to ask easier questions and provide the
participant an opportunity to raise any issues not addressed. For instance, I ended the first
interview with students as follows: Before we conclude this interview, is there something about
your experience in this college/university that you think influences how you engage in your
classes that we have not yet had a chance to discuss? This question provided the participants
824 The Qualitative Report 2016
an opportunity to insert information and reflect, but also signaled a conclusion. Another closing
question asks participants to give advice: If you could give advice to another first-generation
college student to help them with their transition to college, what would that be? These sorts
of questions help the participants slowly transition out of the interview experience. They may
solicit unexpected and valuable responses, but their main purpose is to provide the participant
with a reflective, closing experience to the interview. The overall organization of questions
(beginning, transitional, key, and closing questions) can shape the interview protocol toward
an inquiry-based conversation.
To support the development of an inquiry-based conversation, a researcher may
also draft a script as part of the interview protocol. A script—written text that guides the
interviewer during the interview—supports the aim of a natural conversational style. In writing
a script, the researcher considers what the participants needs to know or hear to understand
what is happening and where the conversation is going. Developing a script also helps support
a smooth transition from one topic to another (Brinkmann & Kvale, 2015; Patton, 2015; Rubin
& Rubin, 2012) or one set of questions to another set of questions. A researcher might
summarize what they just learned and inform the participant that the conversation is now going
in a slightly different direction. For example, between questions 6 and 7 in Table 1 I said, Thank
you for you responses. I’d like to now ask you questions regarding your decision to attend
college.
A researcher may not read the script word-for-word during an actual interview, but
developing a script can mentally prepare the researcher for the art of keeping an interview
conversational. In part, the script is as much for the researcher (please stop and remember this
person needs to know what is happening) as it is for the participants (oh, I see, this person now
wants to discuss that part of my life).
Consider likely follow-up questions and prompts. As a final feature of preparing an
inquiry-based conversation, the researcher may want to also spend time considering the likely
follow-up questions and prompts that will help solicit information from the participant. Rubin
and Rubin (2012) provide detailed information on types of follow up questions and prompts
researchers may want to ask during an interview and their purpose. Essentially, while some
follow-up questions and prompts will surface on the spot, a researcher may want to think of
some possible follow-up questions likely needed to solicit further detail and depth from
participants. Doing so helps the researcher, again, consider the place of the participant and how
gently questions need to be asked. By gently I mean that instead of asking someone, “what
made you drop out of college?” a researcher may want to slowly build toward that sort of
information by asking questions and then follow ups and prompts. For instance, one may
instead ask about how long the person was in college, the area of study pursued, what college
was like, and then ask how he/she/ze reached the decision not to continue going to college.
Consideration of possible follow-ups can help the researcher identify the pace of questioning
and how to peel back information one layer at a time.
Phase 3: Receiving Feedback on the Interview Protocol
Through phases 1 and 2, the researcher develops an interview protocol that is both
conversational and likely to elicit information related to the study’s research questions. The
researcher can now work on phase 3—receiving feedback on the developed interview protocol.
The purpose of obtaining feedback on the interview protocol is to enhance its reliability—its
trustworthiness—as a research instrument. Feedback can provide the researcher with
information about how well participants understand the interview questions and whether their
understanding is close to what the researcher intends or expects (Patton, 2015). While a variety
Milagros Castillo-Montoya 825
of activities may provide feedback on interview protocols, two helpful activities include close
reading of the interview protocol and vetting the protocol through a think-aloud activity.
Table 4— Activity Checklist for Close Reading of Interview Protocol
Read questions aloud and mark yes or no for each item depending on whether you see that item present
in the interview protocol. Provide feedback in the last column for items that can be improved.
Aspects of an Interview Protocol Yes No Feedback for Improvement
Interview Protocol Structure
Beginning questions are factual in nature
Key questions are majority of the questions and are placed
between beginning and ending questions
Questions at the end of interview protocol are reflective and
provide participant an opportunity to share closing comments
A brief script throughout the interview protocol provides
smooth transitions between topic areas
Interviewer closes with expressed gratitude and any intents to
stay connected or follow up
Overall, interview is organized to promote conversational flow
Writing of Interview Questions & Statements
Questions/statements are free from spelling error(s)
Only one question is asked at a time
Most questions ask participants to describe experiences and
feelings
Questions are mostly open ended
Questions are written in a non-judgmental manner
Length of Interview Protocol
All questions are needed
Questions/statements are concise
Comprehension
Questions/statements are devoid of academic language
Questions/statements are easy to understand
826 The Qualitative Report 2016
A close reading of an interview protocol entails a colleague, research team member, or
research assistant examining the protocol for structure, length, writing style, and
comprehension (See Table 4 for an example of a guide sheet for proofing an interview
protocol). The person doing the close read may want to check that interview questions
“promote a positive interaction, keep the flow of the conversation going, and stimulate the
subjects to talk about their experiences and feelings. They should be easy to understand, short,
and devoid of academic language” (Brinkmann & Kvale, 2015, p. 157). When closely reading
over the protocol, researchers ask the people doing the close reading to put themselves in the
place of the interviewees in order to anticipate how they may understand the interview
questions and respond to them (Maxwell, 2013).
After engaging in a close reading of the protocol, it is important to “get feedback from
others on how they think the questions (and interview guide as a whole) will work” (Maxwell,
2013, p. 101). Insight into what participants are thinking as they work through their responses
to interview questions can elucidate whether questions are clear, whether interviewees believe
they have relevant answers, and whether aspects of questions are vague or confusing and need
to be revised (Fowler, 1995; Hurst et al., 2015; Willis, 1999, 2004). To get this feedback from
others the researcher can recruit a few volunteers who share similar characteristics to those
who will be recruited for the actual study. These volunteers can be asked to think-aloud as they
answer the interview questions so the researcher can hear the volunteer response and also ask
questions about how the participants arrived at their responses (Fowler, 1995). For example, to
see if the question is clear, you could ask: How difficult was it to answer that question? (Willis,
1999). For insight on participants’ thoughts as they answer questions, you could ask: Can you
describe what you were thinking about when I used the word, ______? It is important for the
researcher to spend time initially orienting participants on the purpose of a think-aloud
interview and how it will proceed so that they are not confused about why they are being asked
to answer the question as well as describe their thought process (Willis, 1999).
For my study on students’ sociopolitical consciousness, I shared some of the interview
questions with a couple of college students currently enrolled in the university where my study
took place, but who would not be participants in my study. Likewise, I also sought feedback
from faculty with similar teaching backgrounds on my faculty interview protocol. The feedback
was immensely helpful toward refining my interview protocols because I had a glimpse of how
the questions came across to potential participants and how I could refine them to make them
accessible and understandable.
Some studies have such a small sample that obtaining possible volunteers is difficult.
In that case, teaching assistants or other students may serve as “practice participants” where
they role-play and try to answer the questions as if they were the participants. While it is based
on role-play, students in my graduate courses have found it useful to gain hands-on practice
obtaining and providing feedback on interview protocols through peer review whereby peers
engage in close reading of each other’s interview protocols and think-aloud activities. Students
have expressed that the feedback is useful for refining their interview protocols because they
gain a better sense of what is unclear or confusing for others. They use those insights to refine
the interview protocol, thus enhancing its quality and trustworthiness.
This process of getting feedback from multiple sources aligns with the iterative nature
of qualitative research whereby the researcher is seeking information, feedback, and closely
listening for ways to continuously improve interviews to increase alignment with participants’
experiences and solicit relevant information for the study (Hurst et al., 2015). Further, this
process of obtaining feedback can be done in the beginning of study, but can also be a helpful
guide as a qualitative researcher tweaks questions once in the field. Obtaining feedback on
interview questions may be one way for a researcher to check on how his/her/zer evolving
questions will be heard and therefore responded to by participants. Hurst et al. (2015) pointed
Milagros Castillo-Montoya 827
to the possible value of this process for qualitative research: “Projects that neglect pretesting
run the risk of later collecting invalid and incomplete data. But, completing a pretest
successfully is not a guarantee of the success of the formal data collection for the study” (p.
57).
Phase 4: Piloting the Interview Protocol
After the three previous phases, the researcher has developed an interview protocol
aligned with the study’s purpose, the questioning route is conversational in nature, but also
inquiry-driven. The researcher has examined each question for clarity, simplicity, and
answerability. The researcher has also received feedback on the questions through close
reading of the protocol and think-aloud activities. At this point, the researcher is ready to pilot
the refined interview protocol with people who mirror the characteristics of the sample to be
interviewed for the actual study (Maxwell, 2013).
Distinct from phase 3, in phase 4 the researcher is simulating the actual interview in as
real conditions as possible. Any notes taken toward improving the interview protocol are based
on the interviewer’s experience of conducting the interview and not from an inquiry of the
interviewee’s thought process. Merriam (2009) pointed out that the “best way to tell whether
the order of your questions works or not is to try it out in a pilot interview” (p. 104). In this
step, the interviewer conducts interviews simulating rapport, process, consent, space,
recording, and timing in order to “try out” the research instrument (Baker, 1994). Through
piloting, the researcher aims to get a realistic sense of how long the interview takes and whether
participants indeed are able to answer questions. In phase 4, you take note of what might be
improved, make final revisions to interview protocols, and prepare to launch the study
(Maxwell, 2013). Some researchers may not have the time, money, or access to participants to
engage in a piloting phase. In that case, phase 3 (feedback) becomes even more crucial to
refining the interview protocol.
The Interview Protocol Refinement Framework
The interview protocol refinement framework (IPR) is comprised of four phases to
systematically develop and refine an interview protocol, to the extent possible, before data
collection (see Table 5). I developed these phases based on integration of the existing literature
and my own experience teaching and conducting qualitative research. Phase 1 entails the
researcher creating an interview protocol matrix to map the interview questions against the
research questions to ensure their alignment. In phase 2, the researcher balances inquiry with
conversation by carefully wording and organizing questions so they are clear, short,
understandable, and in a conversational order. Phase 3 involves researchers obtaining feedback
on their interview protocol through close reading and think-aloud activities. The feedback
gained through these activities can provide the researcher an opportunity to fine-tune the
interview protocol. Lastly, phase 4 is the piloting stage. In phase 4 the researcher has a small
sample of people who share similar characteristics with the study sample and carries out
interviews under real conditions. Here the researcher has a final opportunity to see how the
interview protocol functions live before conducting the actual study. This last phase, however,
is not possible for all researchers given other constraints (i.e., time, money, access).
While all four phases together comprise the IPR framework, some researchers may only
be able to carry out phases 1-3. In such cases, those researchers have taken important steps to
increase the reliability of their interview protocol as a research instrument and can speak to that
effort in their IRB applications as well as any presentations or publications that may result from
their research. The IPR framework makes transparent the effort and intentionality required
828 The Qualitative Report 2016
from researchers for developing effective interview protocols. IPR can be used by novice
researchers as well as researchers that are more experienced because it supports the aim to
garner rich and productive data to answer pressing research questions across a variety of fields.
Table 5—Interview Protocol Refinement (IPR) Method
Phase Purpose of Phase
Phase I: Ensuring interview questions align with
research
questions
To create an interview protocol matrix to map
the interview questions against the research
questions
Phase 2: Constructing an inquiry-based
conversation
To construct an interview protocol that balances
inquiry with conversation
Phase 3: Receiving feedback on interview
protocol
To obtain feedback on interview protocol
(possible activities include close reading and
think-aloud activities)
Phase 4: Piloting the interview protocol To pilot the interview protocol with small
sample
Although the IPR framework can support researchers’ efforts to have well-vetted and
refined interview protocols, it does not mean that a researcher cannot “unhook” from the
interview protocol (Merriam, 2009, pp. 103-104). The interview protocol is a research
instrument, but in qualitative research, the most useful instrument is the researcher. He/she/ze
can listen carefully and adjust, change paths, and otherwise follow intuition in a way that
his/her/zer protocol will never be able to do. Yet, by following the IPR framework, even if
some departure occurs in the field, the researcher will be more prepared (cognitively) to follow
intuition and yet, still have a map in their minds of the sorts of questions they hope to ask.
As such, the IPR framework can support the evolving nature of qualitative research that
often requires the researcher to be responsive to the data that emerges and possibly calling for
flexibility and openness to change.
The IPR framework is promising because it does not prohibit change, flexibility, or
openness. Rather, the IPR framework supports the development and refinement of interview
protocols whether at the beginning stage or throughout the life of a research project. It is
important to note that changes in interview protocols and even in research questions are
sometimes necessary in qualitative research. Nonetheless, changes that occur in the field
require careful thought. Interview questions developed in the field can solicit rich data when
they maintain congruence with any changes in the research questions (Jones et al., 2014). As
such, the IPR framework offers the researcher support to fine-tune an interview protocol and
ensure, to the extent possible, a well-developed instrument to engage in interview research.
References
Baker, T. L. (1994). Doing social research (2nd ed.). New York, NY: McGraw-Hill, Inc.
Brinkmann, S., & Kvale, S. (2015). Interviews: Learning the craft of qualitative research
interviewing (3rd ed.). Thousand
Oaks, CA: Sage.
Corbin, J., & Strauss, A. (2015). Basics of qualitative research: Techniques and procedures
for developing grounded theory.
Thousand Oaks, CA: Sage.
Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five
approaches (2nd ed.). Thousand Oaks, CA: Sage.
Milagros Castillo-Montoya 829
Fowler, F. J. (1995). Presurvey evaluation of questions. Improving survey questions: Design
and evaluation (pp. 104-135). Thousand Oaks,
CA: Sage.
Hurst, S., Arulogun, O. S., Owolabi, M. O., Akinyemi, R., Uvere, E., Warth, S., & Ovbiagele,
B. (2015). Pretesting qualitative data collection procedures to facilitate methodological
adherence and team building in Nigeria. International Journal of Qualitative Methods,
15, 53-64.
Jones, S. R., Torres, V., & Arminio, J. (2014). Issues in analysis and interpretation. In
Negotiating the complexities of qualitative research in higher education: Fundamental
elements and issues (2nd ed., pp. 157-173). New York, NY: Routledge.
Krueger, R. A., & Casey, M. A. (2009). Developing a questioning route. In Focus groups: A
practical guide for applied research (pp. 35-60).
Thousand Oaks, CA: Sage.
Maxwell, J. (2013). Qualitative research design: An interactive approach (3rd ed.). Thousand
Oaks, CA: Sage.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San
Francisco, CA: Jossey-Bass.
Neumann, A. (2008, Fall). The craft of interview research. Graduate course at Teachers
College, Columbia University, New York, NY.
Patton, M. Q. (2015). Qualitative research & evaluation methods (4th ed.). Thousand Oaks,
CA: Sage.
Rubin, H. J., & Rubin, I. S. (2012). Qualitative interviewing: The art of hearing data (3rd ed.).
Thousand Oaks, CA: Sage.
Seidman, I. (2013). Interviewing as qualitative research: A guide researchers in education and
the social sciences (4th ed.). New York, NY: Teachers College Press.
Weiss, R. S. (1994). Learning from strangers: The art and method of qualitative interview
studies. New York, NY: The Free Press.
Willis, G. B. (1999). Cognitive interviewing: A ‘how to’ guide. Research Triangle Institute:
Research Triangle, NC. www.hkr.se/pagefiles/35002/gordonwillis
Willis, G. B. (2004). Cognitive interviewing: A tool for improving questionnaire design.
Thousand Oaks, CA: Sage.
Author Note
Milagros Castillo-Montoya is assistant professor in the department of educational
leadership in the Neag School of Education at the University of Connecticut. Her research
focuses on teaching and learning in higher education with particular attention to classrooms
diverse by race, ethnicity, and social class. She also teaches courses on research and
assessment, specializing in qualitative methodologies with expertise in classroom observations
and interview research. Correspondence regarding this article can be addressed directly to:
milagros.castillo-montoya@uconn.edu.
Copyright 2016: Milagros Castillo-Montoya and Nova Southeastern University.
Acknowledgements
Thank you to the graduate students in Assessment, Evaluation, and Research in Student
Affairs, Blanca Rincón, Sarah Woulfin, and Robin Grenier for valuable input on this
manuscript.
http://www.hkr.se/pagefiles/35002/gordonwillis
mailto:milagros.castillo-montoya@uconn.edu
830 The Qualitative Report 2016
Article Citation
Castillo-Montoya, M. (2016). Preparing for interview research: The interview protocol
refinement framework. The Qualitative Report, 21(5), 811-831. Retrieved from
http://nsuworks.nova.edu/tqr/vol21/iss5/2
Chapter3 – Evaluation Rubric
Criteria Does Not Meet 0.01 points Meets/NA 1 point
Introductory Remarks The section is missing; or some topic areas are not included
in the Introduction or are not explained clearly.
The chapter outline is not provided and/or is unclear.
The reader is adequately oriented to the topic areas
covered. An outline of the logical flow of the chapter is
presented.
All major themes/concepts are introduced.
Research Methodology and
Design
There is a lack of alignment among the chosen research
method and design and the study’s problem, purpose, and
research questions.
There is a lack of justification and alternate choices for
methods.
For Qualitative Studies: Lacks clear discussion of the
study phenomenon, boundaries of case(s), and/or
constructs
explored.
Describes how the research method and design are aligned
with the study problem, purpose, and research questions.
Uses scholarly support to describe how the design choice
is consistent with the research method, and alternate
choices are discussed.
For Qualitative Studies: Describes the study
phenomenon, boundaries of case(s) and/or constructs
explored.
Population and Sample Lacks a description of the sample, demographics, and the
representation of the sample to the broader population.
There is little to no description of the inclusion/exclusion
criteria used to select the participants (sample)
of the study.
For Quantitative Studies: A power analysis is not
described and appropriately cited.
Provides a description of the target population and the
relation to the larger population.
Inclusion/exclusion criteria for selecting participants
(sample) of the study are noted.
For Quantitative Studies: Power analysis is
described and appropriately cited.
Materials/
Instrumentation
Lacks a description of the instruments associated with the
chosen research method and design used. Details missing
regarding instrument origin, reliability, and
validity.
For Quantitative Studies: (e.g., tests or surveys). Lacks
explanation of any permission needed to use the
instrument(s) and cites properly. Instrument permissions are
missing in appendices
For Qualitative Studies: (e.g., observation
checklists/protocols, interview or focus group discussion
Handbooks). Did not clearly explain the process for
conducting an expert review of instruments (e.g., provides
justification of reviewers being credible – reviewers may
include, but not limited to NCU dissertation team members,
professional
colleagues, peers, or non-research participants
representative of the greater population); and/or did not
clearly explain use of a field test if practicing the
administration of the
instruments is warranted.
For Pilot Study: Does not clearly explain the procedure for
conducting a pilot study (did not conduct pilot) if using a
self-created instrument (e.g., survey questionnaire); does not
include explanation of a field test if practicing the
administration
of the instruments is warranted.
Provides a description of the instruments associated with
the chosen research method and design used. Includes
information regarding instrument origin, reliability, and
validity.
For Quantitative Studies: (e.g., tests or surveys).
Includes any permission needed to use the instrument(s)
and cites properly.
For Qualitative Studies: (e.g., observation
checklists/protocols, interview or focus group discussion
Handbooks). Describes process for conducting an expert
review of instruments (e.g., provides justification of
reviewers being credible – reviewers may include, but not
limited to NCU dissertation team members, professional
colleagues, peers, or non-research participants
representative of the greater population); describes use of
a field test if practicing the administration of the
instruments is warranted.
For Pilot Study: Explains the procedure for conducting a
pilot study (requires IRB approval for pilot) if using a
self-created instrument (e.g., survey questionnaire);
explains use of a field test if practicing the administration
of the instruments is warranted.
Operational Definitions
of Variables
(Quantitative Studies
Only)
Discussion of the study variables examined is lacking
information and/or is unclear.
Describes study variables in terms of being measurable
and/or observable.
(Reviewer – mark Meets/NA for Qualitative studies)
Procedures Procedures are not clear or replicable. Steps are missing;
recruitment, selection, and informed consent are not
established. IRB ethical practices are missing or unclear.
Describes the procedures to conduct the study in enough
detail to practically replicate the study, including
participant recruitment and notification, and informed
consent. IRB ethical practices are noted.
Data Collection and
Analysis
Does not clearly provide a description of the data and the
processes to collect data. Lack of alignment between the
data collected and the research questions and/or hypotheses
of the study.
For Quantitative Studies: Does not clearly provide the data
analysis processes including, but not limited to, clearly
describing the statistical tests performed and the
purpose/outcome, coding of data linked to each RQ, the
software used (e.g.,
SPSS, Qualtrics).
For Qualitative Studies: Does not clearly identify the
coding process of data linked to RQs; does not clearly
describe the transcription of data, the software used for
textual analysis (e.g., Nivo, DeDoose), and does not justify
manual analysis by researcher. There is missing or unclear
explanation of the use of a member check to validate data
collected.
Provides a description of the data collected and the
processes used in gathering the data. Explains alignment
between the data collected and the research questions
and/or hypotheses of the study.
For Quantitative Studies: Includes the data analysis
processes including, but not limited to, describing the
statistical tests performed and the purpose/outcome,
coding of data linked to each RQ, the software used (e.g.,
SPSS, Qualtrics).
For Qualitative Studies: Identifies the coding process of
data linked to RQs. Describes the transcription of data,
the software used for textual analysis (e.g., Nivo,
DeDoose), and describes manual analysis by researcher.
Describes the use of a member check to validate data
collected.
Assumptions/Limitations/
Delimitations
Does not clearly outline the
assumptions/limitations/delimitations (or has missing
components) inherent to the
choice of method and design.
For Quantitative Studies: Does not include or lacks key
elements such as, but not limited to, threats to internal and
external validity,
credibility, and generalizability.
For Qualitative Studies: Does not include or lacks key
elements such as, but not limited to threats to credibility,
trustworthiness,
and transferability.
Outlines the assumptions/limitations/delimitations to the
choice of method and design.
For Quantitative Studies: Includes key elements such as,
but not limited to, threats to internal and external validity,
credibility, and generalizability.
For Qualitative Studies: Includes key elements such as,
but not limited to threats to credibility, trustworthiness,
and transferability.
Ethical Assurances Lacks discussion of compliance with the standards to
conduct research as appropriate to the proposed research
design and is not aligned to IRB requirements.
Describes compliance with the standards to conduct
research as appropriate to the proposed research design
and aligned to IRB requirements.
Summary Chapter does not conclude with a summary of key points
from the Chapter – elements are missing, incomplete,
and/new information is presented.
Chapter concludes with an organized summary of key
points discussed/presented in the Chapter.
Exploring
t
he Relationship between the Knowledge Quality of an Organization’s
Knowledge
Management System, knowledge worker productivity, and employee satisfaction
Dissertation Manuscript
Submitted to Northcentral University
School of Business
in Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF BUSINESS ADMINISTRATION
by
LAURALY DUBOIS
La Jolla California
June
2021
Approval Page
By
Approved by the Doctoral Committee:
Dissertation Chair: INSERT NAME Degree Held Date
Committee Member: INSERT NAME Degree Held Date
Committee Member: INSERT NAME Degree Held Date
10/04/2021 | 06:45:52 MSTPh.D.
Robert Davis
Leila Sopko
Ph.D., MBA 10/01/2021 | 06:01:14 MST
10/03/2021 | 08:46:21 MSTPh.D.
Garrett Smiley
LAURALY DUBOIS
Exploring the Relationship between the Knowledge Quality of an Organization’s
Knowledge Management
System, knowledge worker productivity, and employee satisfaction
Abstract
The problem addressed by this study was that there is often great difficulty encountered in trying
to retrieve knowledge assets about events in the past required for strategic decision-making
without an effective, in-place Knowledge Management System (KMS) (Oladejo & Arinola,
2019).
The purpose of this quantitative, correlational study was to explore the relationship
between the knowledge quality of an organization’s KMS, knowledge worker productivity, and
employee satisfaction for software industry organizations in California. The Jennex and Olfman,
Knowledge Management (KM) Success Model served as the basis of the framework resulting in
knowledge quality as the independent variable and knowledge worker productivity and employee
satisfaction as the dependent variables (Jennex & Olfman, 2006). Data collected from 154
participant surveys guided answers to the research questions. A Spearman correlation analysis
between knowledge quality and knowledge worker productivity was assessed for the first
research question. A significant positive correlation was observed (rs = 0.94, p < .001, 95% CI
[0.92, 0.96]). This correlation indicates that as knowledge quality increases, knowledge worker
productivity tends to increase. A Spearman correlation analysis between KMS knowledge quality
and employee satisfaction was assessed for the second research question. A significant positive
correlation was observed (rs = 0.93, p < .001, 95% CI [0.91, 0.95]). This correlation indicates
that as knowledge quality increases, employee satisfaction tends to increase. The most significant
implication from this study was the unexpected strength in the correlation coefficient for each
research question. This study contributed to the Knowledge Management research community
due to the failure of organizations to implement a successful KMS in the workplace. Further
research to include updated Knowledge Management performance indicators may be helpful to
organizations in several industries worldwide.
Acknowledgments
I would like to give thanks and praise to my Lord Jesus Christ for giving me the strength
to do all things! I cannot proclaim enough love and appreciation for my wonderful husband, Bob
as my motivator and cheerleader during this entire journey. I am thankful to my boys Erik, Brian,
and Roby for encouraging me during the rough times. I want to leave a legacy to my
grandchildren Lily, Noah, Elijah, Gideon, Gabriella, and Josiah that if Nana can do it, so can
you. Many famly members also supported me through the years and I love you all.
Hey Mom, I did it!
I would like to give a word of gratitude to my chair, Dr. Garrett Smiley for his guidance
and support that kept me going through the challenging moments. I appreciate very much Dr.
Robert Davis and Dr. Leila Sopko for serving as my Northcentral University dissertation
committee members and the feedback throughout this process.
Table of Contents
Chapter 1: Introduction ……………………………………………………………………………………………………. 1
Statement of the Problem ……………………………………………………………………………………………. 3
Purpose of the Study ………………………………………………………………………………………………….. 4
Theoretical Framework ………………………………………………………………………………………………. 5
Nature of the Study ……………………………………………………………………………………………………. 8
Research Questions ……………………………………………………………………………………………………. 9
Hypotheses ……………………………………………………………………………………………………………….. 9
Significance of the Study ……………………………………………………………………………………………. 9
Definitions of Key Terms …………………………………………………………………………………………. 10
Summary ………………………………………………………………………………………………………………… 12
Chapter 2: Literature Review ………………………………………………………………………………………….. 15
Theoretical Framework …………………………………………………………………………………………….. 20
Knowledge Worker ………………………………………………………………………………………………….. 28
Knowledge Management ………………………………………………………………………………………….. 32
Knowledge Management System ………………………………………………………………………………. 43
Knowledge Worker Productivity ……………………………………………………………………………….. 55
Employee Satisfaction ……………………………………………………………………………………………… 58
Summary ………………………………………………………………………………………………………………… 60
Chapter 3: Research Method …………………………………………………………………………………………… 63
Research Methodology and Design ……………………………………………………………………………. 65
Population and Sample …………………………………………………………………………………………….. 68
Instrumentation ……………………………………………………………………………………………………….. 69
Operational Definitions of Variables ………………………………………………………………………….. 70
Study Procedures …………………………………………………………………………………………………….. 74
Data Analysis ………………………………………………………………………………………………………….. 75
Assumptions ……………………………………………………………………………………………………………. 77
Limitations ……………………………………………………………………………………………………………… 78
Delimitations …………………………………………………………………………………………………………… 78
Ethical Assurances …………………………………………………………………………………………………… 79
Summary ………………………………………………………………………………………………………………… 80
Chapter 4: Findings ……………………………………………………………………………………………………….. 81
Validity and Reliability of the Data ……………………………………………………………………………. 83
Results ……………………………………………………………………………………………………………………. 87
Evaluation of the Findings ………………………………………………………………………………………… 91
Summary ………………………………………………………………………………………………………………… 93
Chapter 5: Implications, Recommendations, and Conclusions ……………………………………………. 95
Implications…………………………………………………………………………………………………………….. 96
Recommendations for Practice ………………………………………………………………………………… 103
Recommendations for Future Research …………………………………………………………………….. 104
Conclusions …………………………………………………………………………………………………………… 105
References ………………………………………………………………………………………………………………….. 107
Appendices …………………………………………………………………………………………………………………. 131
Appendix A ………………………………………………………………………………………………………………… 132
Appendix B ………………………………………………………………………………………………………………… 133
Appendix C ………………………………………………………………………………………………………………… 134
Appendix D ………………………………………………………………………………………………………………… 140
Appendix E ………………………………………………………………………………………………………………… 141
Appendix F…………………………………………………………………………………………………………………. 146
List of Tables
Table 1 Research Study Variables …………………………………………………………………………………. 140
Table 2 Shapiro-Wilk Test Results for all Study Variables Test for Normality ………………………. 87
Table 3 Summary of Descriptive Statistics ……………………………………………………………………….. 89
Table 4 KMS Success Survey Participants by Gender………………………………………………………. 141
Table 5 KMS Success Survey Participants by Age …………………………………………………………… 141
Table 6 KMS Success Survey Years Employed ………………………………………………………………… 142
Table 7 KMS Success Survey Years of KMS Usage ………………………………………………………….. 143
Table 8 KMS Success Survey Education Level ………………………………………………………………… 143
Table 9 KMS Success Survey Employment Position …………………………………………………………. 144
Table 10 KMS Success Survey Industry Employed …………………………………………………………… 145
Table 11 Spearman Correlation Result: KMS KQ and KWP ………………………………………………. 90
Table 12 Spearman Correlation Results: KMS Knowledge Quality and Employee Satisfaction . 9
1
List of Figures
Figure 1 Scatterplot of KMS KQ and KWP ………………………………………………………………………. 86
Figure 2
Scatterplot of KMS KQ and employee satisfaction
……………………………………………….. 86
Figure 3 G*Power Statistics Analysis…………………………………………………………………………….. 132
Figure 4 Halawi’s (2005) KMS survey permission request/approval ………………………………….. 133
Figure 5
Halawi KMS Survey Questions (2005)
………………………………………………………………. 134
Figure 6 IRB Approval Letter ……………………………………………………………………………………….. 146
1
Chapter 1: Introduction
Harnessing the power of organizational knowledge through Knowledge Management
(KM) activities complemented by an efficient Knowledge Management System (KMS) support
the utilization of knowledge assets to meet strategic objectives aimed to gain and maintain a
competitive advantage (Oladejo & Arinola, 2019). KM activities encompass the accumulation,
retrieval, distribution, storage, sharing, and application of learned knowledge (Al-Emran et al.,
2018; Shujahat et al., 2019). The internal and external learned knowledge leads to valuable
knowledge assets during the transformation of tacit information such as undocumented and
implicit knowledge into documented, explicit information for future consumption (Andrawina,
Soesanto, Pradana, & Ramadhan, 2018; Putra & Putro, 2017). Over twenty years ago,
Knowledge Management’s rapid growth facilitated the need to leverage these knowledge assets
within a KMS, acting as the mechanism to promote management capabilities for organizational
knowledge (Orenga-Roglá & Chalmeta, 2019). KMS appeared out of a specific information
technology system to support knowledge-centric practices to manage organizational learned
knowledge (Orenga-Roglá & Chalmeta, 2019; Wilson & Campbell, 2016).
Knowledge workers utilize an organization’s KMS to store and retrieve knowledge,
improve knowledge sharing, and access knowledge sources promoting Knowledge Management
capabilities (Levallet & Chan, 2018; Orenga-Roglá & Chalmeta, 2019; Surawski, 2019; Wang &
Yang, 2016; Xiaojun, 2017; Zhang & Venkatesh, 2017). Knowledge workers within an
organization represent employees assigned to a classified business position performing tasks
requiring a specific skill set to be productive when performing the assigned job role (Surawski,
2019). When knowledge workers take part in KM activities, these actions contribute to
knowledge assets within the KMS supporting future knowledge work (Shrafat, 2018). The
2
intended flow of knowledge from daily knowledge exchange events between knowledge
workers, the KMS, and organizational management lay the foundation for business leaders to
augment strategic decision-making (Alaarj et al., 2016; Buenechea-Elberdin et al., 2018). An
effective KMS to support KM activities is critical for capturing, retrieving, storing, sharing, and
applying organizational knowledge assets (Al-Emran et al., 2018; Shujahat
et al., 2019). The
successful implementation of the organization’s KMS lays the framework for KM activities to
generate knowledge assets to foster future decision-making capabilities (Orenga-Roglá &
Chalmeta, 2019; Putra & Putro, 2017). De Freitas and Yáber (2018) describe three factors
required for the successful implementation of an organization’s KMS, including stakeholders,
technology, and organizational constructs. The need to measure the successful KM activities
within a KMS of an organization resulted in the arrival of the Jennex and Olfman KM Success
Model (Jennex, 2017; Jennex & Olfman, 2006; Karlinsky-Shichor & Zviran, 2016). In this study,
the implementation of the KMS as a tool to support KM activities spotlights the technical aspect
of the KMS from the vantage point of the knowledge worker’s use of the KMS.
The Jennex and Olfman KM Success Model named six performance indicators for
measuring the implementation of an organization’s KMS to support KM activities (Jennex, 2017;
Jennex & Olfman, 2006). Six high-level categories serving as performance indicators include
knowledge quality, system quality, service quality, intent to use/perceived benefit, use/user
employee satisfaction, and net system benefits
(Jennex, 2017; Jennex & Olfman, 2006).
The
implementation of the KMS affects the knowledge quality component determining the accuracy
and timeliness when knowledge workers retrieve the stored knowledge assets within the correct
context to perform job tasks (Wang & Yang, 2016). The ability to retrieve accurate, timely, and
contextual knowledge assets when knowledge workers query the KMS enables value-added
3
benefits supporting knowledge worker productivity (Kianto, Shujahat, Hussain, Nawaz, & Ali,
2019; Shujahat et al., 2019). The KMS knowledge quality affects knowledge workers’
productivity, enabling value-added activities, and increased business performance (Drucker,
1999; Iazzolino & Laise, 2018).
Scholars report business leaders fail to implement an effective KMS empowering KM
activity necessary to promote knowledge worker productivity (Jennex, 2017; Karlinsky-Shichor
& Zviran, 2016; Sutanto, Liu, Grigore, & Lemmik, 2018; Vanian, 2016; Xiaojun, 2017). The
failure to enable knowledge worker productivity during KM activities influences business
et al., 2019). The
relationship between the knowledge quality of an organization’s Knowledge Management
System, knowledge worker productivity, and employee satisfaction forms the basis of this study.
Statement of the Problem
The problem addressed by this study was that there is often great difficulty encountered
in
trying to retrieve knowledge assets about events in the past required for strategic decision-
making without an effective, in-place Knowledge Management System (KMS) (Oladejo &
Arinola, 2019). Knowledge Management (KM) is challenging to implement and requires
exploration and improvement in its’ continued application and development (Putra & Putro,
2017). Additional difficulties associated with the lack of an effective KMS include knowledge
asset unavailability, improper knowledge asset documentation, excessive time consumption
associated with searching for knowledge assets, decision-making overhead, and duplication of
effort (Oladejo & Arinola, 2019). A substantial number of
Knowledge Management System
(KMS) implementations have not achieved their intended outcomes, such as employee
performance and employee satisfaction (Zhang & Venkatesh, 2017).
4
Researchers continue to seek additional perspectives on factors preventing knowledge
workers from retrieving expected benefits from an organization’s KMS (Iazzolino & Laise, 2018;
Karlinsky-Shichor & Zviran, 2016; Shujahat et al., 2019; Zaim et al., 2019).
This problem is
significant as knowledge systems have infiltrated every aspect of the business process requiring
an organization’s capability to implement and successfully use a KMS (Nusantara, Gayatri, &
Suhartana, 2018). According to Vanian (2016), the continued loss of millions of dollars flows
from businesses’ failure to implement an efficient KMS to support knowledge worker
productivity that requires further research. When organizations fail to implement a successful
KMS, KM strategies depending on the use of knowledge assets for knowledge worker
productivity and employee satisfaction also fail (De Freitas & Yáber, 2018; Demirsoy &
Petersen, 2018; Putra & Putro, 2017; Xiaojun, 2017).
Purpose of the Study
The purpose of this quantitative, correlational study was to explore the relationship
between the knowledge quality of an organization’s KMS, the knowledge worker productivity,
and employee satisfaction for software industry organizations in California. This study is
relevant and contributes to the Knowledge Management research community as millions of
dollars in losses from unsuccessful KMS implementations fail to satisfy expected benefits in
knowledge assets to support business performance (Fakhrulnizam et al., 2018; Levallet & Chan,
2018; Nusantara et al., 2018; Vanian, 2016). Multiple challenges remain toward achieving KM
capabilities and the successful implementation of an organization’s KMS to benefit the use of
knowledge assets for knowledge worker productivity and employee satisfaction (De Freitas &
Yáber, 2018; Demirsoy & Petersen, 2018; Putra & Putro, 2017; Xiaojun, 2017).
5
The researcher conducted this study with an online survey as the research instrument and
gathered data from knowledge workers employed in software industry firms in California. The
independent variable, KMS knowledge quality construct, contains dimensions of the KM
strategy/process, richness, and linkages originating from the Jennex and Olfman KM Success
Model, providing the framework in the theoretical context of performance indicators (Jennex,
2017; Jennex & Olfman, 2006). Knowledge worker productivity and employee satisfaction
within the context of KMS usage represent the dependent variables (Jennex, 2017; Jennex &
Olfman, 2006). Halawi granted permission in writing to extract KMS survey questions to
operationalize the variables within the online survey using a 7-point Likert scale (Halawi, 2005).
An encrypted Microsoft Excel program served as the tool for storing and analyzing the survey
data using IBM SPSS Statistics version 26. G*Power 3.1.9.4 version software produced the
output for a priori power analysis (medium effect size = .0625, error = .05, power = .95,
predictors = 1) resulting in 153 as a required sample size in participants displayed in
Appendix A
Figure 3.
Theoretical Framework
Initially, the design of only one information system designated to support management
processes and decision-making considered a reasonable cost for the organization (Carlson, 1969;
Ermine, 2005). At this time, information systems theory (IST) offered connections between
computational logic and the technology used to process data for supplying information known
only as the information system (IS) (Lerner, 2004). The rapid progression of emerging
technologies shifted the business processes needs spawning the separation of information
systems based on the purpose of the system, including Management Information Systems (MIS),
Decision Support Systems (DSS), and Expert Systems (ES) (Devece Carañana et al., 2016;
6
Medakovic & Maric, 2018; Mentzas, 1994). MIS supplied management with the capability to
analyze the business information for the organization related to technology management and the
management of technology use (Devece Carañana et al., 2016). DSS diverged from the primary
information system as a mechanism to supply graphical or logical data analysis for semi-
structured business problems in support of strategic decision-making and support (Medakovic &
Maric, 2018; Mentzas, 1994). ES enabled the gathering and organizing of organizational learned
knowledge toward specific technology applications for all management levels (Medakovic &
Maric, 2018; Mentzas, 1994). The split of an all-encompassing information system into distinct
organizational support systems set up the framework for systems used worldwide.
Answering the call to the evolution of innovative information systems, the DeLone and
McLean Information System (IS) Success Model supplied organizations the context to measure
the performance indicators of their various information systems (DeLone & McLean, 1992;
DeLone & McLean, 2003; DeLone & McLean, 2004). This model provided the approach in
measuring dimensions of information quality, system quality, service quality, system use and
usage intentions, user employee satisfaction, and net system benefits (Liu, Olfman, & Ryan,
2005; Zuama, Hudin, Puspitasari, Hermaliani, & Riana, 2017). The emergence of the KMS
became popular due to the infusion of knowledge-centric practices to manage knowledge assets
giving birth to the Knowledge Management System (Alavi & Leidner, 2001; Wu & Wang, 2006;
Zhang & Venkatesh, 2017). The need to measure Knowledge Management’s success resulted in
the introduction of the Jennex and Olfman KM Success Model to find performance indicators
while using the organization’s Knowledge Management Systems (Jennex, 2017; Jennex &
Olfman, 2006). The transformation of the DeLone and McLean Information System (IS)
Success Model into the Jennex and Olfman KM Success Model brought the Knowledge
7
Management System constructs within an organization’s information system for insertion of
the Knowledge Management processes.
The KM Success Model maintained the six similar categories as the DeLone and McLean
IS Success Model transforming only the measurement components as needed to accommodate
the specific measurement needs of the KMS (DeLone & McLean, 1992; DeLone & McLean,
2003; DeLone & McLean, 2004; Jennex, 2017; Jennex & Olfman, 2006). The knowledge quality
dimension within an organization’s KMS described in the KM Success Model guided the basis of
the theoretical framework in this study (Jennex, 2017; Jennex & Olfman, 2006). The dimensions
of KMS knowledge quality as an independent variable operationalize into three components
defined as KM strategy/process, richness, and linkages as measurements of success within the
KMS (Jennex, 2017; Jennex & Olfman, 2006; Liu et al., 2008). Numerous researchers have
studied how the implementation and maintenance of an organization’s KMS determine the
knowledge workers’ ability to retrieve accurate and timely organizational stored knowledge
(Andrawina et al., 2018; De Freitas & Yáber, 2018; Ferolito, 2015; Xiaojun, 2017; Zhang &
Venkatesh, 2017). The role of the knowledge worker and the outcome of knowledge worker
productivity and employee satisfaction undoubtedly continue to evolve in reaction to future KMS
features and capabilities to address KMS knowledge quality challenges in the workplace
(Fakhrulnizam et al., 2018; Jabar & Alnatsha, 2014). As businesses seek to increase knowledge
worker productivity, this demand for the successful KMS implementation deems an improved
usage of an organization’s KMS (Levallet & Chan, 2018). Therefore, examining the relationship
between the knowledge quality of an organization’s KMS, knowledge worker productivity, and
employee satisfaction assists researchers and business leaders in identifying potential barriers in
the implementation strategies of the organization’s KMS.
8
Nature of the Study
This study is a quantitative, correlational study to explore the relationship between the
knowledge quality of an organization’s KMS, knowledge worker productivity, and employee
satisfaction within the software industry in California. The quantitative research method is
appropriate and applied in this study to explore the relationship between the independent and
dependent variables based on reliable data collection methods and instruments for interpreting
the data analysis to present unbiased results (Hancock et al., 2010). The correlational research
design in this study statistically determined the relationship between the designated study
variables with online survey data (Hancock et al., 2010). The quantitative, correlational study
supported the study’s problem statement, purpose, and research questions as reflected in the
operationalized variables and statistical tests based on the relationship between the variables
(Field,
2013).
Probability sampling to collect data from a random sample of employees with the desired
characteristics evaluated the potential relationship between the variables (O’Dwyer & Bernauer,
2013). The target sample size included at least 153 qualified knowledge worker participants. The
online survey questions were available on the Qualtrics web platform, presenting a 7-point Likert
scale for each question grounded on Halawi’s KMS Success survey (Halawi, 2005). The survey
questions supplied the basis of measurement in the relationship between the designated study
variables after analyzing the collected data from knowledge workers in their natural environment
(O’Dwyer & Bernauer, 2013). IBM SPSS and Microsoft Excel tools analyzed the collected data
with statistical tests to determine if the rejection of the null hypothesis was necessary (Field,
2013).
9
Research Questions
The list of research questions applicable in this survey research method supports the
quantitative method. The research questions support the goal of this study to examine the
relationship between the knowledge quality of an organization’s Knowledge Management
System, knowledge worker productivity, and employee satisfaction. These research questions
form the basis for the research method and design, reflecting the statement of the problem and
purpose of the study. Each research question corresponds with the hypothesis statements.
RQ1.
To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and knowledge worker productivity?
RQ2. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction?
Hypotheses
H10. There is not a statistically significant relationship between the knowledge quality of
an
organization’s KMS and knowledge worker
productivity.
H1a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and
knowledge worker productivity.
H20. There is not a statistically significant relationship between the knowledge quality of
an
organization’s KMS and employee satisfaction.
H2a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and employee satisfaction.
Significance of the Study
The continued failure to manage knowledge assets costs businesses millions of dollars in
10
(IDC) identify the various attempts to harness the knowledge assets by implementing KMS
online systems that have not provided the desired productivity result (Ferolito, 2015; Vanian,
2016). A global attempt to address the lack of standards for implementing KMS resulted in
creating ISO 30401:2018 to support Knowledge Management standards within organizations
(“ISO 30401:2018,” 2018). Researchers continue to seek additional perspectives to identify other
contingency factors preventing knowledge workers from retrieving expected benefits from an
organization’s KMS (Iazzolino & Laise, 2018; Karlinsky-Shichor & Zviran, 2016; Shujahat et
al., 2019; Zaim et al., 2019). This problem is significant as knowledge systems have infiltrated
every aspect of the business process, necessitating the implementation and successful usage of a
KMS (Nusantara et al., 2018). According to Vanian (2016), the continued loss of millions of
dollars from the failure of businesses to implement an efficient KMS to support knowledge
worker productivity requires further
research.
Definitions of Key Terms
Data
Data is a fact collected or communicated without a specific meaning until analyzed and
transitioned into information (Becerra-
Fernandez et al., 2008).
Employee Satisfaction
Employee satisfaction is one of the components of the KMS knowledge quality
independent variable in showing a successful experience based on the actual use of the KMS
from each use.
Explicit Knowledge
Explicit knowledge is represented in a codified such as words or numbers and easily
shared in any format (Becerra-Fernandez et al., 2008).
11
Information
Information is the transformation of data representing a specific set of values (Becerra-
Fernandez et al., 2008).
KM Strategy/Process
KM Strategy/Process is one of the three components of the knowledge quality
performance indicator is based on the specific actions of the knowledge users to locate the
knowledge asset within the KMS and the process for the knowledge strategy when using the
system
(Jennex, 2017; Jennex & Olfman, 2006).
Knowledge
Knowledge is the transformation of information into facts capable of making decisions
and enabling responses (Becerra-Fernandez et al., 2008).
Knowledge Management
KM is the management of knowledge collected within the organization within an
organization for strategic decision-making (Becerra-Fernandez et al., 2008).
Knowledge Work
Knowledge work is the action of creating and using knowledge to perform tasks required
to generate output required for an organization’s products and services (Shujahat
et al., 2019).
Knowledge Worker
A knowledge worker is an employee within an organization assigned to a classified
business position performing tasks requiring a specific skill set to perform the job role
(Surawski, 2019).
12
Knowledge Worker Productivity
The productivity of knowledge workers demands the timely completion of the intellectual
task delivering a quality service or product in an efficient manner while exhibiting innovative
methods (Shujahat et al., 2019).
Linkage
Linkage is one of the components of the KMS knowledge quality independent variable
describing the internal mappings of code to provide the results from the search query entered by
the knowledge worker (Jennex, 2017; Jennex & Olfman, 2006; Levallet & Chan, 2018).
Richness
Richness is one of the components of the KMS knowledge quality independent variable
indicating the accuracy and timeliness of the knowledge retrieved from the KMS, and the
applicable context expected by the user of the KMS (Jennex, 2017; Jennex & Olfman, 2006).
Tacit Knowledge
Tacit knowledge of an intangible nature requiring interpretation for contextual
understanding (Becerra-Fernandez et al., 2008).
Summary
The failure of businesses to implement a successful KMS causes them to lose millions of
dollars annually from reduced knowledge worker productivity (Ferolito, 2015; “ISO
30401:2018,” 2018 et al., 2019). International organizations and academic institutions
call for further research to identify factors contributing to the lack of KMS success (Iazzolino &
Laise, 2018; Karlinsky-Shichor & Zviran, 2016; Shujahat et al., 2019; Vanian, 2016; Zaim et al.,
2019). The researcher used the quantitative, correlational research method and design to explore
the relationship between the knowledge quality of an organization’s Knowledge Management
13
System, knowledge worker productivity, and employee satisfaction. The Jennex and Olfman KM
Success Model support the theoretical framework representing the KMS containing knowledge-
centric practices to provide knowledge assets utilized to accomplish an organizational business
purpose (Alavi & Leidner, 2001; Ermine, 2005; Jennex, 2017; Jennex & Olfman, 2006; Wu &
Wang, 2006).
The list of research questions applicable in this correlational design supports the
quantitative method exploring if a relationship exists between the knowledge quality dimension
within the KMS, knowledge worker productivity, and employee satisfaction. This problem is
significant as knowledge systems have infiltrated every aspect of the business process requiring
an organization’s capability to implement and successfully use a KMS (Nusantara et al., 2018).
The continued loss of millions of dollars from the failure of businesses to implement an efficient
KMS to support knowledge worker productivity requires further research (Iazzolino & Laise,
2018; Karlinsky-Shichor & Zviran, 2016; Shujahat et al., 2019; Zaim
et al., 2019).
In Chapter 2, a synthesized review studied the relationship between the quality of a
KMS, knowledge worker productivity, and employee satisfaction and examined the historical
and current research for each major theme. The domains contributing to the knowledge of this
topic included knowledge worker (KW), Knowledge Management (KM), Knowledge
Management System (KMS), knowledge worker productivity (KWP), and employee satisfaction.
In chapter 3, descriptions of the research design and method supported the identified population
and sample participant selection. The planned instrumentation for preparing the data collection
and analysis of variables led to the assumptions, limitations, delimitations, and ethical assurances
applicable in this study. In chapter 4, the findings and evaluation from data analysis allowed the
researcher’s presentation summary of the research questions and hypothesis results. In chapter 5,
14
the final implications, recommendations, and conclusions will serve as the basis for this
researcher to present recommendations for future researchers in this domain. The researcher will
present this study to contribute to the body of knowledge contributing to the failure of
organizations to implement a successful KMS to support knowledge worker productivity and
employee satisfaction in the workplace.
15
Chapter 2: Literature Review
Business managers continue to fail in the successful implementation of the organization’s
Knowledge Management System (KMS), causing millions of dollars in annual losses from lack
of knowledge worker productivity (Ferolito, 2015; Levallet & Chan, 2018; Vla
This correlational study examines whether a relationship exists between knowledge worker
productivity and employee satisfaction with the knowledge quality of the organization’s KMS
(Jennex, 2017; Jennex & Olfman, 2006; Liu et al., 2008). The research questions align with the
problem and purpose statements in this study. The researcher used the research questions as a
guide to support the problem statement and purpose statement. The researcher used the first
research question to guide the data collection and analysis and evaluated if a relationship existed
between the knowledge quality of
an organization’s KMS and knowledge worker productivity.
Next, the evaluation of employee satisfaction during the usage of the organization’s KMS
supported the second research question.
Organizations implement a KMS expecting a return on investment through improved
performance, quality, and productivity (Fakhrulnizam et al., 2018; Gunadham &
Thammakoranonta, 2019; Jahmani, Fadiya, Abubakar, & Elrehail, 2018). Yet, businesses
continue to report a loss in worker productivity despite efforts in using the KMS to manage
knowledge assets ISO 30401
offered standards and guidance in an attempt to assist organizations across the globe to
implement a successful KMS (Byrne, 2019; Corney, 2018; “ISO 30401:2018,” 2018). Scholars
are asking for continued research to gain additional insight into unknown factors related to
knowledge worker productivity while using the organization’s KMS (Iazzolino & Laise, 2018;
Karlinsky-Shichor & Zviran, 2016; Shujahat et al., 2019; Zaim et al., 2019).
16
Knowledge worker productivity begins by harnessing the power of knowledge assets
within the KMS to retrieve explicit knowledge (Ali et al., 2016; Wilson & Campbell, 2016;
Yuqing Yan & Zhang, 2019). The knowledge worker must have tacit knowledge of the task
subject allowing the worker to seek unknown knowledge assets within the KMS. The knowledge
worker interacts with the KMS applying tacit knowledge to begin searching for the stored,
explicit knowledge within the KMS. Due to the knowledge worker’s tacit knowledge of the
subject, the knowledge worker understands if the retrieved results from the KMS displaying the
explicit knowledge offers the knowledge assets expected. When the KMS search results do not
return the expected explicit knowledge assets, the knowledge worker loses productivity resulting
in financial losses for the organization over the long
Researchers have found that knowledge worker productivity while using the KMS is impacted
by the ability to retrieve knowledge and locate explicit knowledge assets as intended (Andrawina
et al., 2018; Zaim & Tarim, 2019).
There is a lack of research studies investigating knowledge worker productivity with the
specific knowledge quality of the KMS. Researchers with similar studies exploring connections
between the components of the KMS and knowledge worker productivity give attention to the
social and cultural relationships and not the KMS itself (Kianto et al., 2019; Iazzolino & Laise,
2018; Shujahat et al., 2019). Overarching research in the literature reveals there are many
knowledge sharing barriers in the workplace while using the KMS from a social perspective that
produces a hindrance to knowledge worker performance (Alattas & Kang, 2016; AlShamsi, &
Ajmal, 2018; Caruso, 2017; Eltayeb & Kadoda, 2017; Ghodsian, Khanifar, Yazdani, & Dorrani,
2017; Muqadas, Rehman, Aslam, & Ur-Rahman, 2017). Knowledge sharing as a construct does
not apply to this research study due to the social nature connection to knowledge worker
17
productivity. This researcher seeks to generate theoretical constructs toward examining potential
relationships between the dimensions of knowledge quality and employee satisfaction within the
KMS and not from a social aspect.
Knowledge quality as a component of the Jennex and Olfman KM Success Model is
relevant to the success of the knowledge worker’s productivity based on the context of the
explicit knowledge provided by the KMS (Jennex, 2017). The knowledge quality component as
one of six performance indicators of the KMS is operationalized into knowledge quality
constructs from the Jennex and Olfman KM Success Model (2017) to identify potential
relationships between knowledge worker productivity and employee satisfaction in this study.
When knowledge worker productivity becomes linked to financial outcomes, the timeliness of
the results returned by the KMS search query provides a measured value-added based on time
wasted or gained by the knowledge worker to perform their job tasks from the results (Iazzolino
& Laise, 2018). When the search query provides no results or results without the correct context,
the knowledge worker must perform another search query or search for the knowledge assets
using another tool or method (Jennex, 2017; Karlinsky-Shichor & Zviran, 2016; Sutanto et al.,
2018;
Zhang, 2017).
Organizations’ consideration of time spent without results could equate to a loss in
potential earned revenue (Levallet & Chan, 2018). The problem faced by organizations using a
KMS realizing financial losses due to knowledge worker productivity presents a problem to the
a review of the literature to examine the relationship between KMS knowledge quality,
knowledge worker productivity, and employee satisfaction. The documentation and search
18
strategy used to accumulate scholarly and peer-review literature relevant to this study completes
the introduction.
A review of the literature included scholarly books, scholarly and peer-reviewed articles
in journals, empirical research, dissertations, and industry-focused Internet publications forming
the foundation of this study. A review in the origins of information systems later split into
various functions such as the KMS to harness knowledge assets and learning to form the
foundation for one dimension of the system. Further review to show the constructs of knowledge
worker productivity connections to KMS yielded the framework to investigate the relationship
between the knowledge quality of an organization’s KMS, knowledge worker productivity, and
employee satisfaction outcomes. The components of KMS knowledge quality as a performance
indicator incorporate KM strategy/process, richness, and linkages within the KMS as facets of
knowledge quality framed the search criteria for this study. Multiple databases accessed through
Northcentral University’s online library, including ProQuest, Sage, EBSCO, and Gale articles
enabled the literature basis for this study. Various searches from scholarly, primary resources
over the past four years using a combination of keywords included Information System Theory,
KMS, Knowledge Management System, information system, DeLone and McLean IS Success
Model, Jennex and Olfman KM Success Model, KM process, KM strategy, Knowledge
Management, KM model, knowledge worker, knowledge worker productivity, knowledge
sharing, information system theory, employee satisfaction, job satisfaction, knowledge theory,
organizational theory, and system quality.
In the next section, a review of the theories for the theoretical frameworks of this study
formulated the applicable knowledge themes based on the research. A brief review of multiple
seminal foundational theories evolving into the current model supported the basis of this study
19
beginning with Information Systems Theory (IST) (Langefors, 1977; Lerner, 2004). IST
supported the infancy of information systems when differing definitions between data and
information caused controversy among scholars and business leaders. The birth of separate
information systems to address separate business needs entreated performance indicators of
success resulting in the DeLone and McLean Information Systems (IS) Success Model (DeLone
& McLean, 1992; DeLone & McLean, 2003; DeLone & McLean, 2004; Liu et al., 2005; Zuama
et al., 2017). The rise in the digital management of knowledge assets resulted in Knowledge
Management Systems disrupting the information system arena resulting in the creation of the
Jennex and Olfman KM Success Model (Jennex, 2017; Jennex & Olfman, 2006). This
progression in specific technology systems to address a specific business need to be functioned
as a segway for each model to supply performance indicators of success for each system. The
need to finish standards and governance of Knowledge Management Systems has not progressed
since the offering of the Jennex and Olfman KM Success Model. Therefore, the context of
knowledge quality as a component of the KMS originates from the Jennex and Olfman KM
Success Model’s definition of knowledge quality, including the constructs. A review of these
models gives the content of the theoretical framework section to support the remaining theme
domains.
The review of each subcategory within each theme domain reveals a historical
perspective of past applications relevant to this research study. Next, current implications
applicable to the study topic became known based on existing research. Research findings on
each domain, including opposing views in the literature, allow a multi-faceted view of each
theme. Evidence of research in each theme forms the basis of this chapter in exploring if a
relationship exists between the knowledge quality of an organization’s KMS, knowledge worker
20
productivity, and employee satisfaction while using the KMS. The relevant knowledge themes
within this study form the major sections, including knowledge workers, Knowledge
Management, Knowledge Management Systems, knowledge worker productivity, and employee
satisfaction. Finally, the summary section of this chapter reiterates key concepts, reviews gaps in
the literature, and establishes the research and design methodology as a segway to chapter 3.
Theoretical Framework
The brief overview of seminal theories supported the selected theoretical framework as
the foundation aligning the purpose of this study, the problem statement, research questions,
and hypothesis. Next, an in-depth description of each framework applicable to this study
examined significant components. The next section describes theories within the management
of engineering and technology discipline and theories applicable to the study topic. Finally, an
introduction to the themes supporting the purpose of this study was included as a prelude to
the remaining Literature Review section and closed with a review of chapter two in the
summary section.
The historical evolution of the selected theoretical framework portrayed the natural
progression within the technology industry concerning the systematic management of
knowledge within an organization. Initially, one information system could support all facets
of the business needs due to the limited processing capabilities (Medakovic & Maric, 2018).
The grandfather of this framework developed by Borje Langefors originated from the
disagreement in definition and purpose between data and information within technological
systems progressed into Information Systems Theory (IST) (Langefors, 1977; Lerner, 2004).
Multiple information systems sprang quickly to life fostering the need for a framework
measuring information system performance met by the DeLone and McLean IS Success
21
Model (DeLone & McLean, 1992; DeLone & McLean, 2003; DeLone & McLean, 2004; Liu
et al., 2005; Zuama et al., 2017). Innovative technology later manifested the need to clarify
the definition, purpose, and value between information and knowledge within the business
units. The Jennex and Olfman KM Success Model repurposed the DeLone and McLean IS
Success Model performance indicators to address the flood of knowledge becoming readily
available throughout the organization derived from new innovative technology (Jennex, 2017;
Jennex & Olfman, 2006). The Jennex and Olfman KM Success Model support the purpose of
this study aligning the specific system component of knowledge quality within an
organization’s KMS to determine if a relationship exists based on the measurement of the
system component to the outcome of knowledge worker productivity and employee
satisfaction.
Information Systems Theory (IST)
The seminal foundational framework of the study originates from the information
systems theory (IST), historically addressing the underlying computational logic and the
technology used to process data for providing information known only as of the information
systems (Langefors, 1977; Lerner, 2004). Langefors (1977) developed the Information Systems
Theory to address the lack of theoretical framework distinguishing information as a separate
construct from data. During this era, the terms data and information had become synonymous
leading Langefors (1977) to develop the Information systems theory to identify the specific
output of information through multiple, distinct technology systems. Volkova and Chernyi
(2018) describe applications to Information Systems Theory within current systems to magnify
theoretical implications in an efficient information flow throughout the workplace, creating a
new cultural existence (Volkova & Chernyi, 2018). The emergence of the Management
22
Information Systems (MIS), Decision Support Systems (DSS), and Expert Systems (ES) now
served distinct functional purposes within the organization (Devece Carañana et al., 2016;
Medakovic & Maric, 2018; Mentzas, 1994). A standard method to measure the performance
indicators of these various information systems did not exist until DeLone and McLean (2006)
surveyed the literature to identify six main components.
DeLone and McLean IS Success Model
As these innovative technology systems gained popularity within the business arena, the
DeLone and McLean IS Success Model provided a framework for measuring individual
information systems success (DeLone & McLean, 1992; DeLone & McLean, 2003; DeLone &
McLean, 2004). Regardless of the purpose of the organization’s information system, the high-
level categories to measure performance indicators included information quality, system quality,
service quality, system use and usage intentions, user employee satisfaction, and net system
benefits (DeLone & McLean, 1992; DeLone & McLean, 2003; DeLone & McLean, 2004; Liu et
al., 2005; Zuama et al., 2017). The first component of the DeLone and McLean IS Success
Model offered information quality as a performance indicator serving as the foundation of the
system’s capabilities such as the storage of information and capability to deliver the stored
information (DeLone & McLean, 1992; DeLone & McLean, 2003; DeLone & McLean, 2004).
DeLone and McLean describe system quality as the second component acting as an indirect
capability of the system based on the benefits during usage of the system (DeLone & McLean,
2004). The third component, service quality, represents the user of the system’s initial intent to
use the system and monitor user employee satisfaction (2004). Next, DeLone and McLean
(2004) outline the system use/usage intentions as the fourth component are dependent on the
previous experience from using the same system and user allowing indication of future intent to
23
use the system based on the previous quality of results received from the system. DeLone and
McLean (2004) present user employee satisfaction as the fifth component focusing on the result
of the user’s experience upon completion of using the system. The final component of the
DeLone and McLean IS Success Model (2004) called net system benefits offers a combined
overall performance indicator of the information system based on the perceived value as a direct
result of the previous components’ performance indicator outcomes. Upon the evolution of
leveraging information into knowledge, quality as a performance indicator would necessitate a
performance indicator for the quality of knowledge addressed within in the offering of the
Jennex and Olfman KM Success Model (Jennex & Olfman, 2006).
Jennex and Olfman KM Success Model
Upon the birth of the KMS as an individual system encapsulating knowledge assets, the
Jennex and Olfman KM Success Model transformed the DeLone and McLean IS Success Model
(2004) to address the additional knowledge component of the organization’s Knowledge
Management System (Jennex, 2017; Jennex & Olfman, 2006). The six high-level performance
indicator categories within the Jennex and Olfman KM Success Model remained similar to the
DeLone and McLean IS Success Model (2004) except for replacing information quality with
knowledge quality specific to the KMS (Jennex, 2017; Jennex & Olfman, 2006). However,
Jennex and Olfman (2006) determined the underlying components of each performance indicator
category in the KM Success Model specifically support the needs of the KMS. The six high-level
categories comprising knowledge-specific performance indicators include knowledge quality,
system quality, service quality, intent to use/perceived benefit, use/user employee satisfaction,
and net system benefits (Jennex, 2017; Jennex & Olfman, 2006).
24
Knowledge quality as the first component of the Jennex and Olfman KM Success Model
transforms the DeLone and McLean IS Success Model’s first component of information quality
based on the differentiation of the knowledge system needs (Jennex, 2017; Jennex & Olfman,
2006). Jennex and Olfman (2006) incorporated subcategories within the knowledge quality
component containing three performance indicators: knowledge strategy/process, richness, and
linkages. KM strategy/process as a component of the knowledge quality performance indicator
focuses on the user’s specific actions and the process for the knowledge strategy when using the
system, according to Jennex and Olfman (2006). Richness, as the next performance indicator
within knowledge quality and one of the KMS knowledge quality components of this study,
indicates the accuracy and timeliness of the knowledge retrieved from the KMS, as well as the
applicable context expected by the user of the KMS (Jennex, 2017; Jennex & Olfman, 2006).
The third indicator of knowledge quality offered by Jennex and Olfman (2006) is the linkage
supplying the internal mappings of code to provide the results from the search query entered by
the user (Levallet & Chan, 2018).
The second component of the model includes system quality ascribing three performance
indicators as technological resources, the form of KMS, and levels of KMS tying performance
indicators to the organization’s specific KMS technology frameworks (Jennex, 2017; Jennex &
Olfman, 2006). Next, service quality as the third component describes management support, user
KM service quality, and information system KM service quality as indicators of performance
from the organization’s management and governance perspective (Jennex, 2017; Jennex &
Olfman, 2006). System use/usage intentions represent the fourth component in the Jennex and
Olfman KM Success Model (2006). These intentions become dependent on the previous
experience using the same system and user, allowing indication of future intent to use the system
25
based on the previous quality of results received from the system (2006). Similar to the DeLone
and McLean IS Success Model, Jennex and Olfman (2006) describe use/user employee
satisfaction as the fifth component of a performance indicator referencing a successful
experience based on the actual use of the KMS and employee satisfaction from each use. The
final component of the Jennex and Olfman KM Success Model (2006), similar to the DeLone
and McLean IS Success Model (2004), is called net system benefits as an indicator of the KMS
based on perceptions of value.
Frameworks out of Scope
A review of the literature found information-based theories out of scope, including the
Technology Acceptance Model (TAM) and Task Technology Fit Theory (TFF). The Technology
Acceptance Model (TAM) focuses on the user perspective from a social facet during interaction
with the technology system and not the system component (Nugroho & Hanifah, 2018). The
Task-Technology Fit (TTF) theory incorporates system components within the theoretical
constructs yet bases the outcome measurement on the relationship from the social perspective
(Wipawayangkool & Teng, 2016). These two frameworks do not align with the purpose of this
study based on the emphasis from the user perspective and not from the knowledge quality
component of the system. Organizational theories in the cited research denote the framework
evident in the collective behaviors, attitudes, and norms (Hamdoun et al., 2018; Mousavizadeh et
al., 2015; Nuñez et al., 2016; Ping-Ju Wu et al., 2015). Dynamic capabilities theory incorporates
the actions an organization must take to integrate competencies within business practices to
ensure competencies within the market, including knowledge-sharing processes (Meng et al.,
2014). Resource-based view within the literature further proposes an approach in supporting
business performance when knowledge sharing behaviors align with business processes (Meng et
26
al., 2014; Oyemomi, Liu, Neaga, Chen, & Nakpodia, 2018). The resource-based view provides
the framework of an organization’s capability to strategically manage the resources on an implicit
and explicit level to achieve a competitive advantage and often correlate with the organizational
culture comprised of the behaviors, and attitudes of the collective group within the organization
(Mousavizadeh et al., 2015; Nuñez Ramírez et al., 2016; Oyemomi et al., 2018; Ping-Ju Wu et
al., 2015). These organizational theories lean toward the social-cultural constructs which do not
align with the purpose of this study.
Relevant Knowledge Domains
Knowledge workers, Knowledge Management, Knowledge Management Systems,
knowledge worker productivity, and employee satisfaction comprise the five major themes
encircling subdomains forming the context of this research. A review of the literature provides
the basis for the theoretical framework included in each subdomain. The introduction of a
knowledge worker as a significant theme throughout the literature review introduces the
construct of knowledge work, knowledge workers in the technology industries, and the 21st-
century role of knowledge workers (Drucker, 1999; Iazzolino & Laise, 2018; Karlinsky-Shichor
& Zviran, 2016; Moussa et al., 2017; Shrafat, 2018; Shujahat et al., 2019; Surawski, 2019;
Turriago-Hoyos et al., 2016; Zhang, 2017; Zaim et al., 2019). The topic of Knowledge
Management within the literature includes the role of KM in an information technology industry,
the business leader’s use of KM, and KM utilization within the KMS (Banerjee et al., 2017;
Iazzolino & Laise, 2018; Karlinsky-Shichor & Zviran, 2016; Martinez-Conesa et al., 2017;
Shrafat, 2018; Shujahat et al., 2019; Zaim et al., 2019). KMS as the third major theme includes
the digital management of knowledge assets, implementation strategies of the KMS, and the
components of the KMS and the measurement of the components (Karlinsky-Shichor & Zviran,
27
2016; Nusantara et al., 2018; Shrafat, 2018; Shujahat et al., 2019; Surawski, 2019; Zhang, 2017;
Zaim et al., 2019). Knowledge worker productivity is the fourth theme in this study incorporates
research exploring the financial costs related to knowledge worker productivity using the KMS
and the promotion and enablement of knowledge worker productivity using the KMS (Drucker,
1999; Karlinsky-Shichor & Zviran, 2016; Iazzolino & Laise, 2018; Shrafat, 2018; Shujahat et al.,
2019; Zhang, 2017; Zaim et al., 2019). A review of the literature supports employee satisfaction
as a final theme in this study identifying user satisfaction during KMS use, and KMS knowledge
quality constructs including KM process/strategy, richness, and linkage to indicate performance
success of KM activities (Jennex, 2017; Jennex & Olfman, 2006; Zamir, 2019; Zhang &
Venkatesh, 2017).
The historical evolution in the separation of data, information, and knowledge into a
unique asset within organizations led to the natural progression of framework models. The
Information System Theory (Langefors, 1977), DeLone and McLean IS Success Model
(DeLone & McLean, 1992; DeLone & McLean, 2003; DeLone & McLean, 2004), and the
Jennex and Olfman KM Success Model (Jennex & Olfman, 2006) arose from the need to offer
business performance indicators to assess the organization’s successful utilization of these
assets from an information system perspective. The theoretical framework supports the five
major knowledge domains as subtopics: knowledge workers, Knowledge Management,
Knowledge Kanagement Systems,
knowledge worker productivity, and employee satisfaction.
Each major theme incorporates subdomains supported by literature throughout the literature
review’s remaining body through the lens of the Jennex and Olfman KM Success Model
(Jennex, 2017; Jennex & Olfman, 2006).
28
Knowledge Worker
Researchers point to Peter Drucker as the creator of the term knowledge worker in
response to the innovation of technology and changes in workforce needs (Archibald et al., 2018;
Banerjee et al., 2017; Ebert & Freibichler, 2017; Surawski, 2019; Turriago-Hoyos et al., 2016).
Drucker (1999) predicted that knowledge workers would become the most valuable asset within
an organization in the 21st century. Though a review of the literature does not set one definitive
characteristic for all knowledge workers, a common prerequisite theme indicates intellectual and
cognitive job tasks as acceptable requirements (Moussa et al., 2017; Surawski, 2019; Turriago-
Hoyos et al., 2016). Knowledge workers shift from previous expectations to produce a quantity
of work to results-based output and the ability to self-govern and generate knowledge (Turriago-
Hoyos et al., 2016). Today, knowledge workers exist in technical and nontechnology industries
(Surawski, 2019).
Researchers describe additional characteristics of the knowledge worker to contain self-
sufficiency in planning, managing, and auditing the quality of the knowledge work tasks to
complete the intended business process (Iazzolino & Laise, 2018; Shrafat, 2018; Surawski, 2019;
Zhang, 2017). Surawski (2019) describes multiple references to knowledge workers within the
business and education with labels comprising information workers, data workers, professionals,
specialists, experts, white-collar workers, and office workers. Surawski believes white collar is
synonymous with present-day knowledge workers with subcategories of office workers,
information workers, and data workers. Some authors delineate professionals as a type of skilled
knowledge worker often associated with a management role within the business or education
industry (Lee et al., 2019; Surawski, 2019; Vuori et al., 2019). Specialists contained within the
category of a knowledge worker are associated with staff possessing a specific skill set, not
29
within a management position (Nikolopoulos & Dana, 2017; Surawski, 2019). On the other
hand, subject matter experts possess a skill set within a non-management role obtained through
experience often sought after by their peers in a specific knowledge area (Surawski, 2019; Vuori
et al., 2019).
Knowledge worker themes found in the literature incorporate knowledge work,
knowledge workers in the technology industry, and the 21st-century role of knowledge workers
in support of this study.
Knowledge Work
To operationalize knowledge workers, one must distinguish knowledge work as the act of
creating and using knowledge to perform organizational tasks required to generate output
necessary for an organization’s products and services (Drucker, 1999; Moussa et al., 2017;
Shujahat et al., 2019). The role of knowledge workers requires the self-management of tasks to
perform these expected deliverables as knowledge work. In direct contrast to manual labor
workers dependent upon the completion of another co-worker’s laborious tasks, the knowledge
worker possesses an intellectual aptitude necessary for performing each task to complete their
work based on their skillset (Costas & Karreman, 2016; Drucker, 1999; Shujahat et al., 2019).
Costas and Karreman (2016) further describe knowledge work on a higher level as a fulfillment
aspect, inspiring innovation, and autonomy for the knowledge worker. Knowledge work, as
performed by qualified knowledge workers, often allows independence creating connectivity
between the knowledge worker and the organization through ownership of knowledge work tasks
according to Costas and Karreman. These tasks require performance by experienced knowledge
workers possessing specific skillsets to create, transfer, and utilize knowledge to perform the
assigned knowledge work task (Costas & Karreman, 2016; Surawski, 2019). Knowledge work
tasks often heed quality over quantity to retrieve the desired outcome fostering the organization’s
30
challenge to measure the output of knowledge worker tasks due to the intangible nature of
knowledge work (Iazzolino & Laise, 2018; Shrafat, 2018).
Knowledge Workers in the Technology Industries
A review of the literature reveals knowledge workers span across multiple industries
while much of the research denotes the majority attributed to technology roles or industries
(Nikolopoulos & Dana, 2017; Surawski, 2019; Zelles, 2015). Within our digital age, technology-
based businesses require the employment of knowledge workers possessing a specific skill set to
perform the intellectual knowledge work (Surawski, 2019). The technical skill sets of knowledge
workers across a variety of industries within the technology-focused industries (Zelles, 2015).
Businesses within the technology industry fall within the “Information Sector” industry,
according to the U.S. Bureau of Labor Statistics spanning a wide range of technology-specific
products and services (“Industries at a Glance,” 2020). Software-specific industry firms
publishing software to market these products for installation or offerings of services for
installation of software fall within the software industry category of NAICS code #51121
(“Banner Reports,” 2019). Knowledge workers in the technology industries become
organizational assets offering intellectual capabilities to ensure firm innovation through the
creation of value without physical labor (Vuori et al., 2019; Zelles, 2015). Just as innovation
burst onto the scene with the emergence of the 21st Century, knowledge workers’ role
exponentially experienced a shift of knowledge work due to a growing digital era (Vuori et al.,
2019;
Shujahat et al., 2019).
21st Century Role of Knowledge Workers
Changes in the knowledge workers’ role in the workplace influenced the activities needed
to perform expected business outcomes (Turriago-Hoyos et al., 2016; Vuori et al., 2019; Wessels
31
et al., 2019). Surawski (2019, p. 114) outlines the progression of standard terms to describe
present-day knowledge workers described as “knowledge age workers,” “professionals,”
“specialists,” and “intellectual workers.” Generating knowledge for contributions to
organizational innovation and value-added activities toward the business’s successful
performance has become the norm in the 21st Century knowledge worker. The advancements in
technology continue to impact the knowledge worker’s role through improved tools to produce
knowledge more efficiently and enhanced work conditions for the knowledge worker (Wessels et
al., 2019). Regardless of the industry, multiple employment titles meet the definition of
knowledge worker found within the occupational employment code “15-0000 Computer and
Mathematical Occupations,” according to the Bureau of Labor Statistics (“Occupational
Employment Statistics,” 2018). However, the role of knowledge workers in the 21st Century will
not be limited to only technical and intellectual work tasks, rather the proficiencies in creativity,
visionary, and collaborative insights will embody the very nature of knowledge work in this
digital era (Archibald et al., 2018; Vuori et al., 2019). Challenges faced by businesses throughout
the sequence of knowledge workers coinciding with technological advancements include
managing the knowledge assets and processes (Shujahat et al., 2019).
Knowledge worker themes found in the literature incorporate knowledge work,
knowledge workers in the technology industry, and the 21st-century role of knowledge workers
in support of this study. Current findings indicate knowledge as the foundation for innovation
requiring knowledge workers to generate knowledge assets (Costas & Karreman, 2016;
Turriago-Hoyos et al., 2016). Researchers want future research to incorporate the office space
and conditions of knowledge workers to support knowledge work activities (Krozer 2017;
Surawski, 2019). Additionally, researchers seek contributions in the role of organizational
32
management impacts on knowledge workers (Turriago-Hoyos et al., 2016; Ullah et al., 2016;
Wei & Miraglia, 2017).
Knowledge Management
Knowledge management allows business leaders to leverage the accumulation of
knowledge assets originating from people, processes, and technology acting as sources for
strategic decisions, innovation, and competitive advantage (Koc et al., 2019; Shujahat et al.,
2019; Yuqing Yan & Zhang, 2019). Business leaders must assimilate multiple facets of
knowledge assets to determine strategic business decisions aimed to leverage innovation and
ensure a competitive advantage in the marketplace (Martinez-Conesa et al., 2017). Knowledge
management supports an organization’s innovation capabilities based on processes encouraging
knowledge exchange activities inspiring new products and services unique to the business
(Martinez-Conesa et al., 2017). The way knowledge becomes capitalized determines the
organization’s innovative ability during the Knowledge Management process cycle in the
capture, creation, storage, retrieval, sharing, and utilization of knowledge (Al-Emran et al., 2018;
Shujahat et al., 2019).
Innovative products and services require support from the restructuring of business
processes integrated with Knowledge Management to address emerging technologies and
et al.
facilitate improved business performance, laying the foundation for the competitive advantage
based on effective Knowledge Management practices (Martinez-Conesa et al., 2017; Roldán et
al., 2018). Researchers Al Ahbabi et al. (2019) found a significant positive relationship between
the Knowledge Management processes and the organization’s performance, proposing future
research in the information sector and inclusion of additional contexts within Knowledge
33
Management processes. Additional subdomains surrounding Knowledge Management found in
the literature incorporate the history of Knowledge Management, components of Knowledge
Management, and organizational units of Knowledge Management within the Knowledge
Management theme.
History of Knowledge Management
The first efforts to connect the management of knowledge assets to the firm’s ability to
achieve business goals and impact performance originated three decades ago by knowledge
experts Ikujiro Nonaka and Hirotaka Takeuch (Hoe, 2006; Nonaka, 1991). Early developments
in Knowledge Management efforts pursued asset management of skilled employee knowledge,
business processes, and all organizational learned knowledge for capturing intellectual capital
(Koc et al., 2019). As technology systems evolved, the capability to digitally manage knowledge
covering all facets of the organization set the stage using the Knowledge Management System
et al., 2018; Yuqing Yan & Zhang, 2019). The capability to manage knowledge assets
cultivated by innovative technology in the workplace gave life to Knowledge Management in
supporting an improved competitive advantage (Yuqing Yan & Zhang, 2019). Researchers’
descriptions of these components of Knowledge Management have changed although the end
goal is the same in managing the organization’s knowledge in a manner leading to a net benefit
for the organization (Hashemi et al., 2018; Hoe, 2006; Mousavizadeh et al., 2015; Oyemomi et
al., 2018; Shujahat et al., 2019).
Components of Knowledge Management
Knowledge management represents the organization’s capabilities to create, capture,
share, store, and consume knowledge assets as the gateway to prevent loss of knowledge, inspire
innovation, and gain a competitive advantage (Caruso, 2017; Costa & Monteiro, 2016; Intezari et
34
al., 2017; Martinez-Conesa et al., 2017; Navimipour & Charband, 2016; Shujahat et al., 2019;
Yuqing Yan & Zhang, 2019). Standard components of Knowledge Management listed in the
research represent the creation of knowledge assets, capturing knowledge assets, sharing
knowledge assets, storage of knowledge assets, and the consumption of knowledge assets
necessary to promote the successful management of knowledge (Hashemi et al., 2018; Hoe,
2006; Mousavizadeh et al., 2015; Oyemomi et al., 2018; Shujahat et al., 2019). Each component
contributes to the cyclical phases within Knowledge Management processes organizations’
follow contributing to innovation and competitive advantage (Caruso, 2017; Costa & Monteiro,
2016; Intezari et al., 2017; Martinez-Conesa et al., 2017; Navimipour & Charband, 2016;
Shujahat et al., 2019; Yuqing Yan & Zhang, 2019).
The creation of knowledge assets as a component of Knowledge Management supports
the overarching goal of acquiring knowledge and distributing it across the organization to
improve business performance (Caruso, 2017; Levallet & Chan, 2018). The creation of
knowledge assets begins during the synthesis of tacit and explicit knowledge creating new
perspectives and significance of new knowledge through collaborative efforts (Al Ahbabi et al.,
2019; Cannatelli et al., 2017; Shujahat et al., 2019). This generation of new knowledge requires
the harmonious blend of tacit and explicit knowledge as a natural process within a conducive
workplace environment (Hoe, 2006; Oyemomi et al., 2018). Researchers believe the creation of
knowledge assets sets the stage for innovative opportunities and future capabilities supporting
business value assets and performance (Caruso, 2017; Hashemi et al., 2018; Mousavizadeh et al.,
2015; Shujahat et al., 2019). The creation of new knowledge requires collaboration across
organizational boundaries to stimulate creative mixtures of new and existing knowledge contexts
(Al Ahbab et al., 2019; Cannatelli et al., 2017). Researchers relay the benefits of continuous
35
knowledge creation may improve innovative capacities fostering new innovative services and
products, KM activities, and improved business performance (Cannatelli et al., 2017; Costa &
Monteiro, 2016). The use of technology offering multiple tools for generating new knowledge
may also aid in the flow of knowledge assets creation (Roldán, Real, & Ceballos, 2018).
Once knowledge assets are acquired or created, the capturing of this knowledge requires
cooperation from business units in the deliberate collection of explicit knowledge for the desired
purpose to improve business performance and innovation (Muqadas et al., 2017; Rutten et al.,
2016; Shujahat et al., 2019). The capturing of new knowledge assets progresses through a
lifecycle of collection, organization, and defining of explicit knowledge for reuse within the
organization (Al Ahbabi et al., 2019). Knowledge workers may capture new organizational
explicit knowledge gained through the performance of work tasks and processes (Al Ahbab et
al., 2019; Shujahat et al., 2019). Al Ahbab et al. note that capturing tacit knowledge stemming
from individual knowledge workers’ job experience and skillsets is difficult yet possible to
achieve new knowledge assets. Once new knowledge becomes captured, the process of
codification ensures the reuse of the knowledge as a tangible asset throughout the organization
and accessible within the KMS (Al Ahbab et al., 2019; Cannatelli et al., 2017). The capturing of
new knowledge assets is a continuous process invoking the remaining components of the unit’s
Knowledge Management efforts (Cannatelli et al., 2017; Costa & Monteiro, 2016; Mao et al.,
2016; Roldán et al., 2018).
Researchers describe knowledge sharing as a critical component of the Knowledge
Management process due to the distribution of knowledge assets through technology tools or
people across the business units (Al Ahbabi et al., 2019; Al Shamsi & Ajmal, 2018; Loebbecke,
van Fenema, & Powell, 2016; Muqadas et al., 2017; Oyemomi et al., 2018; Shujahat et al.,
36
2019). This critical component in Knowledge Management hinges upon the unlimited collection
of internal knowledge not yet known to others within the organization dwelling within the minds
of knowledge workers (Shujahat et al., 2019). Knowledge workers participating in the capturing
and creating of new knowledge remain essential to the successful sharing of knowledge in an
organization while fostering improved process standards, innovation capabilities, and business
performance (Muqadas et al., 2017).
Influences in workplace knowledge sharing derive from diverse sources within the
organization, including workplace culture, technology, organizational leadership, and knowledge
workers (Al Shamsi & Ajmal, 2018; Shujahat et al., 2019; Wei & Miraglia, 2017). However,
several researchers pose organizational culture as a leading facilitator in the promotion of
knowledge sharing within the organization (AlShamsi & Ajmal, 2018; Caruso, 2017; Costa &
Monteiro, 2016; Mao et al., 2016; Muqadas et al., 2017; Oyemomi et al., 2018). The role of
technology tools and systems toward knowledge sharing as a component of Knowledge
Management is unclear due to varying types of technology and implementation strategies
impacting measuring outcome success (Al Shamsi & Ajmal, 2018; Costa & Monteiro, 2016;
Mao et al., 2016; Oyemomi, 2018). Nevertheless, some researchers reflect the influence of
leadership as a determinant in the compliance of knowledge sharing of this component within the
Knowledge Management process based on the support or lack thereof from business
management (Al Shamsi & Ajmal, 2018; Cannatelli et al., 2017; Costa & Monteiro, 2016).
Ultimately, knowledge workers within the organization determine when to share knowledge
through social interaction and technology or when to withhold knowledge to use as leverage for
personal gain (Costa & Monteiro, 2016; Koenig, 2018; Muqadas et al., 2017).
37
Al Shamsi and Ajmal (2018) conducted a study to examine direct influences on the
knowledge sharing of service organizations within the technology industry. Data collection
efforts retrieved a total of 222 manager-employee responses in service organizations for the
identification of knowledge-sharing behaviors. The results show that organizational leadership is
an essential factor that impacts knowledge sharing in technology-intensive organizations,
followed by organizational culture, organizational strategy, corporate performance,
organizational process, employee engagement, and organizational structure. According to the
results, the least impactful factor is human resource management (Al Shamsi & Ajmal, 2018).
In contrast, many scholars seek to understand the barriers in knowledge sharing within
the workplace from the aspects of the knowledge worker, organization, and management as
barriers (Intezari et al., 2017; Muqadas et al., 2017; Orenga-Roglá & Chalmeta, 2019; Shrafat,
2018). Knowledge workers may function as a barrier agent preventing knowledge sharing within
the organization from a personal fear of job loss or loss of power or control (Intezari et al., 2017;
Muqadas et al., 2017; Orenga-Roglá & Chalmeta, 2019). Recent studies describe organizational
barriers to knowledge sharing based on the corporate culture and behavioral norms formulating
the knowledge worker’s motivation to share knowledge (Intezari et al., 2017; Shrafat, 2018;
Zimmermann et al., 2018). Within the digital arena, barriers to knowledge sharing evolve from
the technology employed by the organization aimed to share knowledge hinders the free flow of
awareness among the workplace (Al Shamsi & Ajmal, 2018; Costa & Monteiro, 2016; Mao et
al., 2016; Oyemomi, 2018).
The storage of knowledge assets requires a continuous cycle in Knowledge Management
as tacit knowledge becomes converted to explicit knowledge and stored using the organization’s
t
38
may encompass various forms of documents, digital media, databases, or data warehouses
selected by the organizational unit. Oftentimes, organizations offer multiple forms of storage
differing by the type of knowledge and knowledge worker skillsets (Hashemi et al., 2018). In
addition to the storage of knowledge assets, Knowledge Management Systems offer convenience
in managing the creation, capture, sharing, and retrieval of knowledge (Santoro et al., 2018;
Zhang, 2017; Yuqing Yan & Zhang, 2019). Regardless of the knowledge storage mechanism,
storage knowledge assets enable the consumption of this knowledge within the Knowledge
Management lifecycle to support innovation and business performance (Al Ahbabi et al., 2019;
Hashemi et al., 2018).
The retrieval of stored knowledge assets impacts knowledge workers’ capability to
consume the knowledge required to perform assigned tasks (Hashemi et al., 2018). The
successful cycle of Knowledge Management as an ever-evolving process in an endless stream of
creating, capturing, sharing, storing, and consuming knowledge assets embodies the goals of an
organization’s Knowledge Management strategy (Shujahat et al., 2019). The consumption of
knowledge produces a value-added asset that allows knowledge workers to apply new
knowledge to inspire innovation, the intellectual capability, and skillset to utilize this knowledge
toward problem-solving or new knowledge (Al Ahbabi et al., 2019; Shujahat et al., 2019).
Overall, the application of new knowledge promotes improved business processes, innovation
competencies, and business performance enabling the knowledge creation process across the
organization to continue (Al Ahbabi et al., 2019; Costa & Monteiro, 2016). Koc et al. (2019)
describe differing organizational units of Knowledge Management within the organization as
information management, process management, people management, innovation management,
and asset management assisting leadership in successful governance. Business leaders ascribe to
39
a variation of these organizational units of Knowledge Management based on organizational
needs (Koc et al., 2019; Shujahat et al., 2019; Yuqing Yan & Zhang, 2019).
Organizational Units of Knowledge Management
The continual demand for effective management of knowledge throughout the
organization formulated the maturity of the approach known today as Knowledge Management
(Koc et al., 2019; Shujahat et al., 2019; Yuqing Yan & Zhang, 2019). Knowledge management
perspectives continue to evolve as technology disruptions emerge. Knowledge management
encompasses cross-functional sectors within the firm requiring business leader oversight into the
management of the organization’s processes, the knowledge workers, and the workspace acting
as an essential business strategy (Al-Emran et al., 2018; Shujahat et al., 2019). Researchers Koc
et al. (2019) present current organizational units within Knowledge Management into categories
encompassing the management of information, processes, people, innovation, and assets within
the organization. These organizational unites of Knowledge Management subcategories are
described below.
Information management as a unit of Knowledge Management encapsulates the explicit
and recorded knowledge transformed from the organization’s intangible knowledge assets (Koc
et al., 2019). The management of information supports the organization’s knowledge strategy by
awareness and utilization of critical information transformed into knowledge suitable for the
organization (García-Alcaraz et al., 2019; Ramayani et al., 2017). Tacit and explicit knowledge
derives from the transformation of information into a useable form captured and stored for the
express purpose of reutilizing the knowledge for the benefit of the organization and preventing
loss of unique knowledge assets (Koc et al., 2019; Martinez-Conesa et al., 2017; Ramayani et al.,
2017; Shrafat, 2018). Information management is the key to unlock the door to initiate
40
Knowledge Management processes aimed at creating, capturing, storing, sharing, and processing
knowledge within the organization (García-Alcaraz et al., 2019; Ramayani et al., 2017).
Process management ensures the embedded knowledge unique to the organization
required to perform business processes reflects the operational knowledge and workflows of
procedures needed to perform knowledge work tasks (Koc et al., 2019; Steinau et al., 2019).
Knowledge management strategies seek process management efforts by putting the
organization’s data activities front and center regardless of the technology tools used to capture
the knowledge (Steinau et al., 2019). Organizations implement multiple business process
management approaches complimenting current business strategies, process tools, and
technology systems in place to support automation of processes, innovation capabilities, and
business performance (Nikiforova & Bicevska, 2018; Steinau et al., 2019).
Managing employees through a Knowledge Management lens encourages knowledge
workers to capture and share tacit and explicit knowledge, further supporting learning and future
access to the organization’s knowledge assets (Koc et al., 2019). The management of converting
tacit knowledge embedded in the knowledge worker’s understanding and expertise gained
through experience, research, and communication requires consistent people management efforts
(Yuqing Yan & Zhang, 2019). People management in the promotion of Knowledge Management
business processes must emphasize the knowledge worker cooperation in transcending tacit into
explicit knowledge because of workplace activities (Andrews & Smits, 2019; Olaisen & Revang,
2018). Management support of an environment fostering a cohesive workplace mindset
encourages new knowledge generation (Andrews & Smits, 2019; Koc et al., 2019).
The innovation management in Knowledge Management represents knowledge
conversion enabled from singular and collaborative learned knowledge, leading to discoveries
41
and development, and guiding improved business performance (Koc et al., 2019; Yuqing Yan &
Zhang, 2019). Innovation management relies upon the continual flow of knowledge creation and
innovation processes to increase business performance (Briones-Peñalver et al., 2019; Leopold,
2019). Leopold (2019) reviews innovation processes as the key ingredient to innovation
management, including shared Knowledge Management elements through transforming
organizational employees’ tacit knowledge into explicit knowledge, spurring innovation from a
combination of this knowledge, and the promotion of knowledge sharing. Managing innovation
within an organization falls within the Knowledge Management cycle flowing from the
transformation of tacit knowledge into explicit knowledge and, ultimately, into reusable
knowledge to support the innovation efforts within the organization (Sánchez-Segura et al.,
2016). Leopold describes the initial creation of knowledge through the end product requires a
full circle of the Knowledge Management cycle upheld by the collaboration of knowledge
workers within the organization. Collaboration empowers the creation and attainment of internal
knowledge assets supporting the formation of new products and services, facilitating continued
Briones-Peñalver
et al., 2019).
The asset management component reflects managing the intellectual capital of the
organization utilizing the intangible assets as leverage to gain a competitive advantage (
et al., 2019; Bacila & Titu, 2018; Koc et al., 2019; Prusak, 2017). The components of intellectual
capital within an organization consist of human, internal, and external capital assets supporting
the firm’s business performance and competitive capabilities (Prusak, 2017). Human capital
contributions to the intellectual capital of an organization impacted by the workplace culture and
knowledge worker capabilities to produce new knowledge and develop distinctive products or
42
2017). Internal or organizational capital stems from the knowledge captured within the
organization as explicit knowledge in technology applications and systems later formulated into
et al., 2019; Prusak, 2017). External capital includes the intellectual capital based on marketing
strategies through offerings of select products and services aimed to gain new customers and
demand for these products and services (Prusak, 2017). Asset management is a component of an
organization’s Knowledge Management strategy. It exploits the human, internal, and external
capital comprising the intellectual capital aimed to gain a competitive advantage within the
market (Junior et al., 2019).
A review of Knowledge Management themes applicable to this study supported by the
literature included the history of Knowledge Management, components of Knowledge
Management, and organizational units of Knowledge Management. Regardless of the
organization’s Knowledge Management unit strategies, researchers’ findings report the capture,
creation, transfer, storage, and consumption of knowledge assets contribute to the successful
management of knowledge aimed to prevent loss of knowledge and inspire innovation and gain a
competitive advantage (Costa & Monteiro, 2016; Shujahat et al., 2019; Yuqing Yan & Zhang,
2019; Zaim et al., 2019). Nevertheless, differing research within the literature reports
organizational leadership and organizational culture supporting successful Knowledge
Management contributing to innovation (AL Shamsi & Ajmal, 2018; Intezari et al., 2017;
Mousavizadeh et al., 2015). Intezari et al. (2017) call for future exploration into factors
supporting successful Knowledge Management outcomes. In contrast, other researchers seek to
contribute organizational cultural factors toward knowledge workers’ impacts on Knowledge
43
Management success. (Ullah et al., 2016; Wei & Miraglia, 2017). The emergence of
sophisticated technology systems supports the management of knowledge assets within a
Knowledge Management System (Nurulin et al., 2019; Zhang, 2017). Managers seek Knowledge
Management tools, including Knowledge Management Systems, to capitalize on the creation,
storage, sharing, and utilization of knowledge within the organization (Mao et al., 2016; Peng et
al., 2017; Roldán et al., 2018).
Knowledge Management System
An introduction to Knowledge Management System (KMS) as a significant theme within
the literature provides the history of KMS, types of KMS, implementation of KMS, knowledge
worker use of KMS, and KMS performance indicators as the constructs of this domain
contributing to this study. Zhang (2017) describes the Knowledge Management System as a type
of information system specifically aimed to manage knowledge assets. Innovative technologies
spurred Knowledge Management capabilities through new software and hardware offerings,
leading to a variety of KMS technology platforms and offerings (Demirsoy & Petersen, 2018;
Schwartz, 2014). The development of Knowledge Management Systems evolved throughout the
decades in response to Knowledge Management’s role within organizations (Nurulin et al.,
2019). Initial features of the KMS allowed knowledge workers the capability to search the
system for knowledge, additional access knowledge articles internal or external to the
organization, create or edit knowledge artifacts, and label existing artifacts assisting
collaborators for reuse of contextual knowledge (Orenga-Roglá & Chalmeta, 2019; Zhang &
Venkatesh, 2017). KMS tools continue to evolve, enhancing new capabilities assisting in
collaborative effectivity, encourage knowledge exchange, social, and sharing (Lee et al., 2019;
Orenga-Roglá & Chalmeta, 2019; Zhang, 2017).
44
Business leaders leverage KMS as a structured tool to manage knowledge assets and
enable Knowledge Management processes, including capturing, creating, sharing, and applying
knowledge (Santoro et al., 2018). The type of KMS system becomes selected based on the
perceived support toward the needs of the Knowledge Management processes with supportive
features and tools (Demirsoy & Petersen, 2018; Lee et al., 2019). Organizations often find value
in adopting a KMS comprised of collaboration tools allowing the users to communicate
efficiently, exchange knowledge, and manage the stored knowledge and communication
mechanisms (Del Giudice & Della Peruta, 2016; Orenga-Roglá & Chalmeta, 2019; Zhang,
2017). Shrafat (2018) conducted a study of small and medium-sized businesses to identify
additional influences contributing to successful Knowledge Management practices and realized
benefits from implementing KMS. Shrafat analyzed survey results from 247 participants from
multiple small to mid-sized businesses incorporating the adoption of a KMS. Results from the
study indicated that various organizational readiness and capabilities encompassing Knowledge
Management, knowledge sharing, organizational learning, and technology contribute to the KMS
success outcome (Shrafat, 2018). In contrast, researchers conclude there remain barriers to
effective use of an organization’s KMS due to the ever-growing storage of multiple media
formats of knowledge within the system, the semantics of finding data within context, and lack
of standard system features (Kimble et al., 2016; Peng, Wang, Zhang, Zhao, & Johnson, 2017).
History of KMS
In 1978, the first rudimentary Knowledge Management System featured hyperlinks
within internal organizational groupware connecting remote workers across geographical
locations as a method to collaborate and share knowledge (Ceptureanu et al., 2012). The
popularity of internal Knowledge Management Systems within the business arena extended to
45
external networks upon the emergence of the internet, launching Knowledge Management
conferences and associations in the 1990s (Ceptureanu et al., 2012; Koenig, 2018). Multiple
types of Knowledge Management System product offerings soon materialized on the market
equipped with new features promising to provide successful management of knowledge assets
(Koenig, 2018). Initially, technological capabilities drove the type of Knowledge Management
Systems available to organizations and the features offered for the utilization of knowledge
assets (Koenig, 2018; Zhang, 2017).
Types of KMS
Several types of information systems exist for collaboration within the workplace among
knowledge workers aimed to support daily work tasks and share knowledge (Centobelli et al.,
2018; Dong et al., 2016; Lee et al., 2019; Zhang & Venkatesh, 2017). However, a review of the
literature revealed studies with multiple types of Knowledge Management Systems to support
Knowledge Management processes that exist within the workplace (Dong et al., 2016; Lee et al.,
2019; Zhang & Venkatesh, 2017). Therefore, discussions denoting types of KMS within an
organization referred to these types based on the purpose of the KMS and the categories
referenced in the literature. Classifications of the KMS are held dependent upon the use of the
system usage, including codified systems encompassing expert coding knowledge and
knowledge exchange systems aimed to share knowledge among employees (Venters, 2010;
Zhang, 2017).
The goal of the codification systems within the KMS domain is to capture tacit
knowledge from subject matter experts into the system, becoming codified explicit knowledge
available to knowledge workers within the workplace (Zhang, 2017). The codification process
within the KMS supports Knowledge Management efforts capturing multiple inputs of
46
information into a well-defined meaningful knowledge asset capable of reuse among knowledge
workers (Kimble et al., 2016). The progression of transforming data into knowledge, known as
semantics, is the basis of the KMS as a codification tool for retrieving varying forms of explicit
knowledge (Kimble et al., 2016). The KMS classification encompassing knowledge exchange
systems covers a wide range of systems supporting the organization’s Knowledge Management
activities and governance (Centobelli et al., 2018; Dong et al., 2016; Lee et al., 2019; Zhang &
Venkatesh, 2017). These systems encourage the exchange of knowledge, the transfer of tacit and
explicit knowledge, and the application of learned knowledge leading to the creation of new
knowledge (Zhang & Venkatesh, 2017). Zhang and Venkatesh explain the encouragement of
employee learning through interactive KMS offering knowledge workers numerous options to
interact and share knowledge within the workplace.
An organization’s KMS may belong to several categories depending on the features and
capabilities of the system purchased by the business to address specific business needs
(Centobelli et al., 2018; Del Giudice & Della Peruta, 2016; Dong et al., 2016.; Koenig, 2018;
Lee et al., 2019; Zhang, 2017; Zhang & Venkatesh, 2017). Koenig (2018) designates these
categories as content management, enterprise location, lessons learned, and communities of
practice A KMS with the sole purpose of warehousing knowledge assets for retrieval by
knowledge workers without collaboration features fall into the category of content management
(Koenig, 2018; Zhang, 2017). The KMS, as an enterprise location tool, provides built-in lookup
capabilities to find organizational subject matter experts possessing both tacit and explicit
knowledge needed by the knowledge worker required to complete a work task (Centobelli et al.,
2018; Zhang & Venkatesh, 2017). Database systems designed to contain and share knowledge
47
assets reflecting business expertise categorized as best practices or lessons learned systems
(Koenig, 2018).
Zhang (2017) expounds on additional features of KMS as lessons learned systems
provide knowledge workers the ability to create ‘how-to’ processes and knowledge-based articles
based on experiences in exercising tacit and explicit knowledge within the workplace meant to
share these experiences with co-workers. The recent development within the KMS domain
includes the emergence of communities of practice enhancing the capture of knowledge through
social means and the advancement of technology, including Web 2.0 tools (Del Giudice & Della
Peruta, 2016; Dong et al., 2016; Orenga-Roglá & Chalmeta, 2019). KMS inclusive of
communities of practice provides advanced technology features to capture tacit knowledge using
Web 2. 0 technologies such as video and audio intended for social collaboration (Del Giudice &
Della Peruta, 2016). Emerging technologies in the marketplace have revolutionized the
capabilities of KMS, incorporating multiple communication tools to support the Communities of
Practice platform (Lee et al., 2019).
Common characteristics of the KMS include a database providing a repository of
captured knowledge, access to the intranet, and or internet, aimed to support the creation,
capture, storage, and consumption of knowledge assets (Centobelli et al., 2018; Dong et al.,
2016; Lee et al., 2019; Zhang & Venkatesh, 2017). Powerful KMS containing several
capabilities to support all aspects of the organization’s Knowledge Management processes also
offer collaboration spaces for editing of knowledge within the system, conferencing capabilities,
and automation of knowledge flow within the system (Dong et al., 2016; Lee et al., 2019; Zhang
&
Venkatesh, 2017).
48
Implementation of KMS
The implementation of KMS may include both the initial installation of the KMS within
an organization by setting up a technology system and the application of the Knowledge
Management procedures and knowledge worker expected use of the KMS (Intezari & Gressel,
2017; Zhang, 2017; Zhang & Venkatesh, 2017). Kimble et al. (2016) suggest that organizations
identify the business needs before selecting the technology platform for the KMS. On the other
hand, researchers Orenga-Roglá and Chalmeta (2019) emphasize recognizing the organizational
culture and technology impacts the KMS will generate. Organizations may select the
functionality of the KMS as single-threaded seen in a content management system or the
selection of multi-functional capabilities as experienced with a community of practice
incorporating advanced system features for an enhanced experience within the KMS (Orenga-
Roglá & Chalmeta, 2019; Zhang & Venkatesh, 2017). Researchers agree that a systematic
approach for implementing the KMS is essential to knowledge worker job performance and
successful Knowledge Management while strategies to conduct this task lack within the literature
(Orenga-Roglá & Chalmeta, 2019; Zhang & Venkatesh, 2017).
Researchers describe key benefits from KMS implementations that deliver enhanced
Knowledge Management capabilities, financial gains, increased learning opportunities, and
competitive advantage for the organization (Intezari & Gressel, 2017; Zhang, 2017). Intezari and
Gressel (2017) relay the benefits of this implementation also transforms daily employee
knowledge encounters into new explicit experiences leading to innovative products and services.
Wang and Wang (2016) analyzed completed surveys from 291 Taiwanese businesses and found
relationships between innovation, organizational influences, and environmental factors within
49
KMS implementation. The researchers suggested future studies should include additional
performance indicators between the KMS and implementation factors (Wang & Wang, 2016).
Knowledge Worker Use of KMS
Regardless of the type of KMS or implementation strategies organizations employ to
manage knowledge, innovative technologies continue to improve knowledge worker utilization
of the KMS (Demirsoy & Petersen, 2018; Özlen, 2017). Business leaders desire knowledge
workers to contribute learned knowledge to the content of the KMS, utilize the KMS as a
knowledge-based source, and support knowledge work task efficiency (Sutanto et al., 2018).
Frequent interaction with the KMS encourages knowledge sharing and integration of knowledge
assets for the reuse of knowledge among knowledge workers (Martins et al., 2019; Özlen, 2017;
Shujahat et al., 2019). As knowledge workers utilize the system, these KMS activities enable the
continuous knowledge life cycle to create, capture, store, access and share knowledge (Demirsoy
& Petersen, 2018; Jahmani et al., 2018; Surawski, 2019). Participation within this cycle
contributes to transforming tacit knowledge into explicit knowledge in a reusable format (Tserng
et al., 2016). Knowledge workers then become motivated to seek solutions within the KMS by
searching stored knowledge to retrieve explicit instruction within their work tasks (Demirsoy &
Petersen,
2018; Zhang, 2017).
Researchers have pursued motives contributing to knowledge worker use of the KMS,
such as ease of using the system and collaboration capabilities (Del Giudice & Della Peruta,
2016; Dong et al., 2016; Lee et al., 2019; Zhang & Venkatesh, 2017). The knowledge worker
assesses the ease of using the KMS forms based on experiences resulting from interaction with
the system and perceived success (Del Giudice & Della Peruta, 2016; DeLone & McLean, 2003;
Dong et al., 2016). Based on the features available during the use of the KMS, knowledge
50
workers may be willing to interact with the system to seek available knowledge to perform a
work task or contribute learned knowledge to the KMS (Zhang & Venkatesh, 2017). Also,
knowledge workers continue to rely upon the KMS when the interaction with the system
produces expected results (Del Giudice & Della Peruta, 2016; Lee et al., 2019). Researchers
speculate the degree of utilization of the organization’s KMS contributes to Knowledge
Management success due to contributing factors encumbering organizational culture, the KMS
infrastructure, and knowledge worker impression of the KMS technology infrastructure (Lee et
al., 2019; Özlen, 2017).
Xiaojun (2017) performed a study reviewing the implementation features of one
organization’s KMS to identify potential factors influencing knowledge worker tasks, KMS
usage, and user experience. Xiaojun retrieved surveys and interviews from 1,441 knowledge
workers in finance and budgeting, accounting, personnel, customer management, sales,
advertising, and public relations business units. Xiaojun’s findings from the study signified a
positive relationship for knowledge workers based on the moderating influences from the
specific task, KMS, knowledge worker, and leadership.
KMS Performance Indicators
Jennex and Olfman (2006) outlined six key performance indicators within the Jennex
and Olfman KM Success Model as specific components utilized to measure the performance of
the KMS. These knowledge-specific performance indicators remain intact since the original
model known as knowledge quality, system quality, service quality, intent to use/perceived
benefit, use/user employee satisfaction, and net system benefits (Jennex, 2017; Jennex &
Olfman, 2006). The Jennex and Olfman KM Success Model (2006) are very similar to the
DeLone and McLean IS Success Model (1994) to address Knowledge Management dimensions
51
within each category. These categories include information/knowledge quality, system quality,
service quality, intent to use, use/user employee satisfaction, and net benefits (Jennex & Olfman,
2006). Jennex and Olfman presented the KM Success Model to address the growing need to
measure an organization’s KMS key performance indicators and knowledge dimensions updating
the construct of captured information into reusable knowledge assets (Jennex, 2017; Jennex &
Olfman, 2006; Liu et al., 2008).
Karlinsky-Shichor and Zviran (2016) reveal that scholars continue to interchange
information quality and knowledge quality as the same KMS component. This exchange is
evident as researchers continue to use theoretical constructs within the DeLone and McLean IS
Success Model when analyzing the dimensions of an organization’s KMS in conjunction with the
Jennex and Olfman KM Success Model (Liu et al., 2008; Nusantara et al.2018; Wu & Wang,
2006). Karlinsky-Shichor and Zviran (2016) presented a model in their study similar to both the
DeLone and McClean (2003) IS Success Model the Jennex and Olfman (2006) KM Success
Model analyzing only the information quality, system quality, and service quality. Instead, the
researchers added moderators in the user competence during usage of the system based on the
organization’s Knowledge Management capabilities (Karlinsky-Shichor & Zviran, 2016). In this
study, the researcher’s results based on 100 participants belonging to a knowledge-focused role
within the software industry indicated business leaders must consider both technical and cultural
influences of the KMS to foster acceptance from knowledge workers (Karlinsky-Shichor &
Zviran, 2016).
Alarming IDC survey results reveal the need for performance metrics to identify KMS
impact on business performance. The IDC reported that organizations with at least 1,000
employees would comprise at least 45% of global technology spending in 2020 (Vanian, 2016).
52
Medium-sized businesses consisting of 100 to 999 employees, intend to spend the most on
improving business performance (Vanian, 2016). The ISO recently created ISO 30401:2018 to
support organizations in developing a KMS for promotion and enablement of knowledge worker
productivity (“Knowledge Management Systems,” 2018). This evidence supports the application
of the Jennex and Olfman (2006) KM Success Model for organizations seeking key performance
indicators of KMS success based on the original six critical success factors. The performance
indicators include knowledge quality, system quality, service quality, intent to use/perceived
benefit, use/user employee satisfaction, and net system benefits (Jennex, 2017; Jennex &
Olfman, 2006). A reciprocal stream of influence often flows between the KMS and KM
capabilities within an organization leveraging the ability to harness the effective management of
knowledge, thereby leading to increased KMS acceptance and use
(Shrafat, 2018).
Several authors supplement Knowledge Management capabilities with knowledge
sharing and KMS usage, further influencing business performance (Martinez-Conesa et al., 2017;
Meng et al., 2014; Nahapiet & Ghoshal, 1998; Zhang et al., 2018). However, other authors
present the effect of Knowledge Management capabilities incorporating knowledge sharing and
KMS usage as a direct stimulus on increased innovation within the organization (Hamdoun et al.,
2018; Hock, Clauss, & Schulz, 2016; Martinez-Conesa et al., 2017; Oparaocha, 2016). The
Jennex and Olfman KM Success Model (2006) key performance indicators offer the organization
a mechanism to measure the performance of the KMS into individual components and
dimensions as markers of knowledge sharing often connected to business performance. In this
model, the knowledge worker’s perspective of the reliability and accuracy of the retrieved results
within the KMS encouraging knowledge sharing measured by the richness dimension of
knowledge quality serving as the focal of this study.
53
KMS Knowledge Quality. Knowledge quality as the first of six components in the
Jennex and Olfman KM Success Model (2006) holds the spotlight as the performance indicator
in examining a potential relationship between KM process/strategy, richness, and linkage as a
dimension of knowledge quality within the KMS. Researchers agree the deficits in knowledge
quality stored within an organization’s KMS prevents knowledge workers from retrieving
accurate information (Jennex, 2017; Karlinsky-Shichor & Zviran, 2016; Sutanto et al., 2018;
Zhang, 2017). Researchers attempt to operationalize the knowledge quality of a KMS from the
knowledge worker perception during utilization of the system (Jennex, 2017; Jennex & Olfman,
2006; Karlinsky-Shichor & Zviran, 2016; Sutanto, Liu et al., 2018; Zhang, 2017). In this study,
the construct of information quality and knowledge quality within a KMS synonymously contain
dimensions of the Knowledge Management processes/strategies, the richness of the retrieved
results, and the linkage of the KMS from the KMS knowledge quality construct (Karlinsky-
Shichor & Zviran, 2016). Knowledge workers expect to easily recover knowledge within the
organization’s KMS within the context of the search query in a timely and accurate fashion
(Jahmani et al., 2018). This expected knowledge content quality allows the knowledge worker to
combine current knowledge with the explicit knowledge system to generate new knowledge
assets available for reuse within the system (Jahmani et al., 2018).
The KMS incorporates the foundation of the Information Systems Theory (IST), bridging
the interaction with the system the underlying computational logic based on the models
embedded within the system to return expected results (Langefors, 1977; Lerner, 2004). These
results impact the knowledge worker experience using search queries to retrieve knowledge
determined by the relevant content (Demirsoy & Petersen, 2018; Karlinsky-Shichor & Zviran,
2016). Demirsoy & Petersen (2018) describe the Bayes classifier model and vector space model
54
as existing computational logic used as a framework for systems, including the KMS. These
models search only the vocabulary meanings of each word complicated by word vagueness, and
multiple definitions of one word may return irrelevant query results (Demirsoy & Petersen,
2018). Advanced queries logic often incorporates classifying and clustering combined with
statistical word frequencies to rank the sequence of assumed relevant results (Demirsoy &
Petersen, 2018).
In contrast, semantic information searches combine the computational logic of the KMS
based on the underlying semantic logic model to interpret the definition of words contained in
the search query with continued knowledge worker use of the system to improve the relevancy of
the results (Demirsoy & Petersen, 2018). As a component of knowledge quality within a KMS,
the linkages represent the structure of retrieved results returned after performing a search within
the KMS (Karlinsky-Shichor & Zviran, 2016). The richness as a component of knowledge
quality within a KMS signifies the accuracy and timeliness impacting the context of retrieved
results within the KMS (Karlinsky-Shichor & Zviran, 2016). The technology infrastructure of the
KMS, the continued use of the KMS, and the constant update of knowledge with the system
contribute to the knowledge quality of the KMS (Demirsoy & Petersen, 2018). Therefore, the
Knowledge Management processes and strategies as an element of knowledge quality within a
KMS comprise the unique methods, activities, and procedures the business sets in place to
manage knowledge assets (Demirsoy & Petersen, 2018; Karlinsky-Shichor & Zviran, 2016).
The degree of knowledge content quality retrieved from the KMS impacts the knowledge
worker’s perceived usefulness of the system (Jahmani et al., 2018; Zhang, 2017). This perceived
usefulness produces attitudes and behaviors related to knowledge workers’ motivation to utilize
the KMS for knowledge sharing within the organization (Jahmani et al., 2018; Zhang, 2017).
55
However, (Sutanto et al., 2018) maintain the knowledge worker efficiency during the use of the
KMS does not improve based on the perception of the quality of retrieved results. Jahmani et al.
(2018) concluded a study from multiple hospitals surveying healthcare staff regarding KMS
components finding that knowledge content quality retrieved from the knowledge worker is a
functional requirement for the perceived usefulness of KMS. Eltayeb and Kadoda (2017)
performed semi-structured interviews with experts, managers, and employees from multiple
organizations in the region to identify connections between Knowledge Management and
business performance. The researchers concluded a significant relationship between Knowledge
Management practices, future business strategies, and business performance and request future
researchers to explore the quality of knowledge information stored within the KMS (Eltayeb &
Kadoda, 2017).
Knowledge Worker Productivity
Knowledge worker productivity (KWP) as the fourth major theme within this study was
discussed as KWP measurement challenges, KWP enablement, and the KWP financial
implications while using the organization’s Knowledge Management System. Peter Drucker
rationalized the productivity capability of knowledge workers led to increased business
performance and financial gains (Drucker, 1999; Iazzolino & Laise, 2018). Productivity efforts
may benefit by empowering knowledge workers, instilling autonomy, enacting continuous
improvement for innovation, allowing self-governance of quality, and treating knowledge
workers as assets and not merely resources (Drucker, 1999; Iazzolino & Laise, 2018).
KWP Measurement Challenges
Challenges in measuring knowledge worker productivity arise from capturing potential
intangible tasks aimed to assign a metric for comparison of changes in productivity (Iazzolino &
56
Laise, 2018; Karlinsky-Shichor & Zviran, 2016). Productivity is one of the perceived KMS
benefits often cumbersome to measure (Karlinsky-Shichor & Zviran, 2016; Turriago-Hoyos et
al., 2016). The underlying activities impacting knowledge worker productivity and measurable
outcomes become intertwined with current Knowledge Management processes and knowledge
worker perceptions (Turriago-Hoyos et al., 2016). Iazzolino and Laise (2018) reviewed Drucker
and Public’s model to identify the meaning and measurement of productivity surrounding the
knowledge workers, management, and stakeholders. Iazzolino and Laise perceived like Drucker;
Public believed the measure of knowledge workers was comparable to manual workers based on
the value-added of activities and the ways of the employees. Public’s proposal to translate
knowledge worker’s activities into value-added metrics aligns with the purpose of the income
statement. Identifying the investment of human capital supports the notion of workers as
investments and not merely a line-item cost (Iazzolino & Laise, 2018). Methods rating the
measurement of knowledge worker productivity as a ratio calculation between the value-added
metric of each knowledge worker and the total number of employees and the differences in
productivity vary among the research (Duarte, 2017; Iazzolino & Laise, 2018).
KWP Enablement
The enablement of knowledge worker productivity begins with the effective use of the
KMS by knowledge workers within an organization supporting the Knowledge Management
capabilities (Shrafat, 2018). KMS usage as a dimension of Knowledge Management capabilities
is often noted as the most desired capability an organization strives to achieve expressed as an
exchange of organizational information, knowledge, and skills (Caruso, 2017; Intezari et al.,
2017; Navimipour & Charband, 2016). The knowledge worker productivity may easily be
measured when operationalized into variables measuring the number of times a user successfully
57
retrieved knowledge within an existing KMS (Dey & Mukhopadhyay, 2018). Researchers link
Knowledge Management activities as contributors to increased business performance and
productivity stemming from knowledge capturing, sharing, and application of knowledge assets
(Shrafat, 2018).
KWP Financial
Implications
The failure of businesses to implement a successful KMS to retrieve knowledge assets
reported productivity losses well over 5.7 million published by Fortune 500 organizations
(Ferolito, 2015). The International Organization for Standardization (ISO) developed ISO
30401:2018 as the standard for the KMS organization followed for the promotion and
enablement of knowledge creation. The creation of the ISO 30401:2018 certification for
organizations with existing KMS supports organizations’ efforts to empower knowledge worker
productivity through efficient retrieval of organization knowledge for knowledge sharing (ISO
30401, 2018). Losses in knowledge worker productivity continue to plague businesses incapable
of leveraging the organization’s KMS to sustain knowledge sharing activities supporting
increased business performance (Ferolito, 2015; “ISO 30401:2018,” 2018; ).
Scholars report business leaders fail to ensure the organization’s KMS incorporates Knowledge
Management processes necessary for the promotion of knowledge worker productivity (Jennex,
2017; Karlinsky-Shichor & Zviran, 2016; Sutanto et al., 2018; Vanian,
2016; Xiaojun, 2017).
While utilizing the organization’s KMS, knowledge worker productivity then requires the
capability to accomplish the knowledge work task in an efficient and timely manner (Shujahat et
al., 2019). According to Vanian (2016), the continued loss of millions of dollars flows from the
failure of businesses to implement an efficient KMS to support knowledge worker productivity
that necessitates further research.
58
Employee Satisfaction
Employee satisfaction is one of the desired outcomes after implementing the
organization’s KMS in support of KM activities (Zhang & Venkatesh, 2017). The specific KM
capabilities realized within the organization lead to employee satisfaction as an outcome through
employee empowerment to perform assigned job tasks (Zamir, 2019). Employee satisfaction as
the final major theme in this study is based on literature in context from the KMS usage
perspective (Jennex, 2017; Jennex & Olfman, 2006; Zamir, 2019; Zhang & Venkatesh, 2017). A
review of the literature further reveals subthemes related to this study as the user satisfaction
during the usage of the organization’s KMS and the KMS knowledge quality constructs defined
as KM strategy/process, richness, and linkage (Jennex & Olfman, 2006; Jennex, 2017; Kumar,
2018; Zamir, 2019; Zhang & Venkatesh, 2017).
User Satisfaction
The Jennex and Olfman KM Success Model describe the user satisfaction dimension in
this model as an indicator of the employee’s satisfaction with their interaction with the KMS
(Jennex & Olfman, 2006; Jennex, 2017).
The user’s experience during the successful retrieval of
knowledge assets performing KM activities has linked these results with employee satisfaction to
gain the desired knowledge (Popa et al., 2018). During the employee’s use of the organization’s
KMS to support KM activities, user satisfaction comes from the capability to retrieve the
knowledge asset from the system to complete assigned job tasks as noted in the Jennex and
Olfman KM Success Model (Jennex & Olfman, 2006; Jennex, 2017). Employee satisfaction
varies based on the KMS user’s experience evidenced in the knowledge outcome determined by
the user’s specific job task (Khanal and Raj Poudel, 2017).
59
KM Strategy/Process. The KM Strategy/process construct is the first indicator of KMS
knowledge quality described as the KM strategies determining the KM processes for knowledge
workers while performing KM planned activities (Jennex & Olfman, 2006; Jennex, 2017). The
organization’s KM strategy determines the processes upheld for the flow of knowledge assets
within the KMS and affects the KMS knowledge quality (Popa et al., 2018). Research study
results connect employee satisfaction with KM’s capabilities and KMS strategies, allowing the
knowledge worker to perform tasks because of KM activities(Khanal and Raj Poudel, 2017).
Richness. The richness constructs depicted as the second indicator of the organization’s
KMS knowledge quality was described as the result of the accuracy and timeliness retrieval of
the knowledge asset within the KMS (Jennex, 2017; Jennex & Olfman, 2006). Richness is an
indicator of retrieving the desired results from performing a search within the KMS and the
success of those results (Zhang & Venkatesh, 2017). Employee satisfaction within the context of
KMS use becomes a consequence of the knowledge worker’s expectations from the KMS to
perform knowledge work (Jennex, 2017; Jennex & Olfman,
2006; Zhang & Venkatesh, 2017).
Linkage. The internal codification of the stored knowledge assets creates an internal
mapping within the KMS. Behind the scenes, codification impacts the knowledge worker based
on the search query contributing as the third indicator of the organization’s KMS knowledge
quality (Jennex, 2017; Jennex & Olfman, 2006). During the implementation of the KMS, the
internal structure originates in an internal network of logic from the available KMS features and
capabilities (Karlinsky-Shichor & Zviran, 2016). Employee satisfaction becomes affected by the
internal link competencies supporting the additional constructs of KMS knowledge quality
(Jennex, 2017; Jennex & Olfman, 2006).
60
Summary
The literature review followed the lens of the Jennex and Olfman KM success model’s
knowledge quality within a Knowledge Management System as one of the components (Jennex,
2017; Jennex & Olfman, 2006; Liu et al., 2008). Information systems theory served as the
seminal foundation of this model relevant to the underlying technology influencing the
knowledge quality of the KMS (Langefors, 1977; Lerner, 2004). The Jennex and Olfman KM
Success Model (2006) allowed the appropriate framework for this study. This model adds
specific knowledge aspects asserting comparative dimensions of performance indicators detailed
in the DeLone and McLean IS Success Model (DeLone & McLean, 1992; DeLone & McLean,
2003; DeLone & McLean, 2004; Liu, Olfman, & Ryan, 2005; Zuama et al., 2017). Related
knowledge domains listed within the literature review include knowledge workers, Knowledge
Management, Knowledge Management Systems, knowledge worker productivity, and employee
satisfaction as the five major themes encompassing subdomains forming the context of this
research.
Additional subcategories documented within the knowledge worker domain section
included knowledge work, knowledge workers in the technology industries, and the 21st-century
role of knowledge workers. Researchers Eltayeb and Kadoda (2017) concluded a significant
relationship between Knowledge Management practices, future business strategies, and business
performance upon analysis from interviews with experts, managers, and employees in the region.
Eltayeb and Kadoda request future researchers to explore the quality of knowledge information
stored within the KMS. Within the Knowledge Management major theme, the subcategories are
the history of Knowledge Management, components of Knowledge Management, and
organizational units of Knowledge Management. Additional dimensions within the components
61
of the Knowledge Management subcategory included the creation of knowledge assets, capturing
knowledge assets, sharing knowledge assets, storage of knowledge assets, and consumption of
knowledge assets. The organizational unit Knowledge Management also contains additional
dimensions within the subcategory supported by the literature, including information
management, process management, people management, innovation management, and asset
management. AlShamsi and Ajmal’s (2018) report result from a study investigating the critical
success factors to promote knowledge sharing identifying leadership, culture, and strategy as
leading indicators. AlShamsi and Ajmal encourage future studies to review additional essential
elements of success in new industries. The third major theme in this literature review is the
Knowledge Management System. This theme incorporates five subcategories comprised of the
history of KMS, types of KMS, implementation of KMS, knowledge worker use of KMS, KMS
performance indicators, and KMS knowledge quality. Iskandar et al. (2017) accumulated top
research articles to request future studies to identify additional features of the KMS supporting
the organization’s Knowledge Management processes.
The fourth domain supported within the literature is knowledge worker productivity,
including the subcategories KWP measurement challenges, KWP enablement, and KWP
financial implications. Shujahat et al. (2019) collected data from 369 knowledge workers within
the IT industry to identify potential relationships between the knowledge worker productivity
and Knowledge Management processes. The results indicated significant linkages between
knowledge creation and knowledge utilization to increased innovation influenced by
productivity. Shujahat et al. (2019) request future research to include additional Knowledge
Management process influences.
62
Employee satisfaction as the final theme in this study supported by a review of the
literature encompasses user satisfaction, and the three constructs of the KMS knowledge quality
dimension of the Jennex and Olfman KM Success Model (Jennex, 2017; Jennex & Olfman,
2006). Researchers agree the common thread to employee satisfaction within the context of KM
activities while using the KMS is the enablement for the knowledge worker to retrieve
knowledge assets required to perform job tasks (Jennex, 2017; Jennex & Olfman, 2006; Zamir,
2019; Zhang & Venkatesh, 2017). Employee satisfaction is one of the desired outcomes after the
implementation of the organization’s KMS in support of KM activities (Zhang & Venkatesh,
2017). The specific KM capabilities realized within the organization lead to employee
satisfaction as an outcome through the empowerment of the employees to efficiently perform
assigned job tasks (Zamir, 2019).
Employee satisfaction as the final major theme in this study is based on literature in
context from the KMS usage perspective (Jennex, 2017; Jennex & Olfman, 2006; Zamir, 2019;
Zhang & Venkatesh, 2017). A review of the literature further reveals subthemes related to this
study as the user satisfaction during the usage of the organization’s KMS and the KMS
knowledge quality constructs defined as KM strategy/process, richness, and linkage (Jennex &
Olfman, 2006; Jennex, 2017; Kumar, 2018; Zamir, 2019; Zhang & Venkatesh, 2017).
Researchers identified correlations between the successful implementation of an organization’s
KMS to promote KM activities, employee performance, and satisfaction (Zamir, 2019; Zhang &
Venkatesh, 2017).
63
Chapter 3: Research Method
A thorough review of the literature in chapter two revealed that the Jennex and Olfman
KM Success Model supports the appropriate theoretical framework in this study (Jennex, 2017;
Jennex & Olfman, 2006; Liu et al., 2008). Five domain sections within the literature review in
support of this framework included knowledge workers, Knowledge Management, Knowledge
Management Systems, knowledge worker productivity, and employee satisfaction, forming the
background of this research. The literature review supports the research method and design
implemented in this study to examine the relationship between KMS knowledge quality,
knowledge worker productivity, and employee satisfaction.
The failure of business leaders to implement a KMS capable of providing knowledge
quality reduces knowledge worker productivity and results in millions of dollars in annual losses
(Ferolito, 2015; Vanian, 2016). Numerous researchers have studied how the implementation and
maintenance of an organization’s KMS affect the knowledge workers’ ability to retrieve
knowledge assets (Andrawina et al., 2018; De Freitas & Yáber, 2018; Ferolito, 2015; Xiaojun,
2017; Zhang & Venkatesh, 2017). Deficiencies in the quality of information stored within an
organization’s KMS prevents knowledge workers from retrieving these knowledge assets,
thereby reducing knowledge worker productivity and employee satisfaction (Jennex, 2017;
Jennex & Olfman, 2006; Karlinsky-Shichor & Zviran, 2016; Khanal & Raj Poudel, 2017; Popa
et al., 2018; Sutanto et al., 2018; Xiaojun, 2017; Zhang & Venkatesh, 2017). The International
Data Corporation (IDC) reports that one-third of a knowledge worker’s daily responsibilities will
require the searching for and acquiring of information needed across several knowledge systems,
and only 56% of the time, the information becomes found (Ferolito, 2015; Vanian, 2016).
64
The purpose of this quantitative survey research method is to explore the relationship
between the knowledge quality of an organization’s Knowledge Management System, knowledge
worker productivity, and employee satisfaction for firms in the software industry in California.
The theoretical concept to measure the KMS knowledge quality stems from the Jennex and
Olfman KM Success Model adapted after the DeLone and McLean IS Success Model
incorporating the knowledge quality expectations from an organization’s KMS (DeLone &
McLean, 1992; DeLone & McLean, 2003; DeLone & McLean, 2004; Jennex & Olfman, 2006).
The research questions support the statement of the problem and purpose of the study, forming
the basis for the research method and design.
RQ1. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and knowledge worker productivity?
RQ2. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction?
The remainder of this chapter includes details of this quantitative research methodology
and correlational design supporting the study’s defined problem and purpose. Next is a
description of the population and sample participants intended for this study. This is followed by
the questionnaire and survey as the planned instrument and tool deliver the gateway for the
remainder of the chapter. The details of the independent and dependent variables used in this
study served as the operational definitions of variables supported by research. In the study
procedures section of this chapter, a description of actions used to gather the data for the number
of sample participants provided researchers a clear understanding when replicating this study.
The intended strategies to test the hypotheses when answering the research questions were listed
in this section also used to address the problem in this study based on participant data collection.
65
A description of the assumptions with supporting rationale, limitations, delimitations, and ethical
assurances finalized the sections enclosed in this chapter, supplemented the reader’s
understanding of this study’s goals. Finally, the significant points of this study listed in summary
conveyed the foundational concepts supporting the study topic in this chapter.
Research Methodology and Design
This quantitative, correlational research study explored if a relationship exists between
the KMS knowledge quality, knowledge worker productivity, and employee satisfaction. This
quantitative research method was the most relevant to this study and accomplished the goal of
exploring the relationship between the identified variables supporting the problem, purpose, and
research questions from the same participant sample (Mellinger & Hanson, 2016). This
correlational research design applied correlational statistical methods and identified the
relationship between variables (Cavenaugh, 2015). The Pearson’s Correlation test was planned to
be used to examine if a positive, negative, or zero association existed between the variables;
however, the assumption of the normal distribution was not met and replaced by the Spearman’s
coefficient of rank correlation test as the distribution was not normal (Field, 2013). The
quantitative research questions in this study assisted the researcher in devising this study’s
structure, allowed the researcher to answer questions tested by using the hypotheses statements
as a guide during the collection and analysis of the data (Punch, 2013). The quantitative research
design in this study assisted the researcher in implementing the research questions and identify
the relationship between the independent variable identified as KMS knowledge quality and the
dependent variables determined as knowledge worker productivity and employee satisfaction
(Jennex & Olfman, 2006; Jennex, 2017).
66
Descriptive, causal-comparative/quasi-experimental, and experimental quantitative
research designs were considered inappropriate for this study due to the noncausal relationship-
focused method (Mellinger & Hanson, 2016; Punch, 2013). Since the descriptive design does not
form a hypothesis until after the participant data collection, this design was determined as
irrelevant as this study’s hypothesis was based on current literature. Causal-comparative/quasi-
experimental design aims to identify cause-effect between identified variables and do not
manipulate the dependent variable. An experimental design that utilizes the scientific method to
manipulate the independent variable to identify the outcome of the dependent variable in a
controlled environment did not apply to this study.
Qualitative and mixed methods research methods determined the least desired as the
methodology due to the purpose of this study to analyze potential relationships between the
independent and dependent variables (Mukhopadhyay & Gupta, 2014). Qualitative research
methods did not meet the goal of the proposed research study due to the variables of the study
identified and the planned statistical measurement (Mukhopadhyay & Gupta, 2014). Another
constraint when considering qualitative research includes the time and cost of gathering and
interpreting the data. Qualitative research includes conducting interviews, observing participants,
and facilitating focus groups to categorize and codify based on the prerequisites to data analysis
(Johnson & Onwuegbuzie, 2004). Mixed methods as a research method did not deem appropriate
for this study based on the variables requiring validity and reliability efforts when reporting the
analysis of the relationships (Johnson & Onwuegbuzie, 2004). Researchers utilize qualitative
designs based on unknown variables and problem generalization, guiding the purpose of the
study and not applicable based on known variables in this study (Flick, 2018). An experimental
design was not amiable for this study. The correlational research design was best suited for this
67
study in support of the research questions to determine if a relationship exists between the KMS
knowledge quality as the independent variable and the dependent variables identified as
knowledge worker productivity and
employee satisfaction.
Researchers agree that the knowledge quality within an organization’s KMS proves a
vital performance indicator of success within the digital management of knowledge assets
(Jennex, 2017; Jennex & Olfman, 2006; Karlinsky-Shichor & Zviran, 2016). The knowledge
quality of the KMS system comprises multiple dimensions of KM strategy/process, richness, and
linkage working in the background within the KMS knowledge quality construct as the focus of
this study (Jennex, 2017; Jennex & Olfman, 2006). Jennex and Olfman (2006) further describe
knowledge quality as a component of the KM Success Model, signifying the success of the
knowledge worker’s productivity and user satisfaction based on the context of the explicit
knowledge retrieved within the KMS. Researchers also describe correlational ties of knowledge
worker productivity based on interaction activities with the organization’s KMS to innovation
and financial outcomes (Drucker, 1999; Karlinsky-Shichor & Zviran, 2016; Iazzolino & Laise,
2018; Shrafat, 2018; Shujahat et al., 2019; Zhang, 2017; Zaim et al., 2019). Within the Jennex
and Olfman KM Success Model (2006), the productivity outcome of knowledge workers points
to dimensions of KMS knowledge quality. At the same time, researchers report that lack of
knowledge quality within an organization’s KMS affects knowledge workers during the retrieval
of knowledge within the KMS (Jennex, 2017; Karlinsky-Shichor & Zviran, 2016; Sutanto et al.,
2018; Zhang, 2017).
According to Bloomfield and Fisher (2019), quantitative research methods enlist the
positivism paradigm approach adopting research guided by logistical collection and analysis to
generate the reality of the phenomena. This study follows the same practices of current research
68
employing a quantitative research method collecting data from participants by anonymous online
survey techniques. Next, the researcher plans to identify the population and sample participants,
noting the estimated size and characteristics supported by evidence of this population reflecting
the problem, purpose, and research questions in this study.
Population and Sample
The identified population for this study includes businesses classified within the software
industry. The focus of this study further narrowed the population to businesses within California
within the software industry with headquarters in California. Requirements for participants in
this study introduced as those employed in the software industry referred to as the knowledge
worker category (Moussa et al., 2017; Surawski, 2019; Turriago-Hoyos et al., 2016). Using the A
priori power analysis within the G*Power software allows the identification of the sampling
sizes needed for this correlational study indicating a sample size of 153 participants because of
the output for the priori power analysis (medium effect size = .0625, error = .05, power = .95,
predictors = 1) in Appendix A Figure 3. These factors used an alpha level of p = .05, allowed the
researcher to achieve an 80% probability of accurately finding a significant result supported by
the alpha level indication of a significant difference among the groups. The target sample size
included 153 qualified knowledge worker participants. Survey questions included the constructs
of KMS knowledge quality, knowledge worker usage, knowledge worker productivity, and
employee satisfaction questions for data collection efforts. The data collection efforts for this
study began by contracting Qualtrics panel services to solicit online participation for knowledge
workers in the software industry in California. Qualtrics panel services allowed the targeting of
participants from several organizations for full and part-time employees within the existing
69
technology-based departments, considered knowledge workers, to complete an anonymous
online survey.
Instrumentation
The Jennex and Olfman KM Success Model served as the basis for selecting the KMS
knowledge quality as the independent variable and knowledge worker productivity and employee
satisfaction during the use of the KMS as the dependent variables (Jennex, 2017; Jennex &
Olfman, 2006). The researcher gained permission to use Halawi’s (2005) KMS Success survey
instrument for data collection that formulated the research questions as shown in
Appendix B
Figure 4. The online survey for this study required a conversion of Halawi’s (2005) original
printed KMS Success survey as an instrument in Appendix C Figure 5 to an online survey
questionnaire using Qualtrics panel services. The online KMS Success survey offered the same
survey questions as the original KMS Success survey. The online KMS Success survey questions
ensured the capture of KMS usage in terms of the knowledge quality of the KMS, knowledge
productivity, employee satisfaction, and six demographic questions using the same 7-point Likert
scale (Halawi, 2005). According to Halawi (2005), the validity of the KMS Success survey
instrument confirmed the measurement of the variables in context, and the reliability of the
survey instrument held consistent as a result of thorough preliminary methods implementing a
documented pre-test, pilot test, factor analysis, and internal consistency validations (Halawi,
2005).
Data collected from the online survey instrument served as the mechanism that provided
the researcher answers to the study research questions and guidance for the acceptance or
rejection of the null hypothesis (Wright, 2017). The researcher implemented the research
questions as a guide to the survey questions and response choices. Advantages exist when using
70
a survey to collect data articulating answers representing the data from the population (Kelley-
Quon, 2018). If the sample size is large enough to represent the population, collected data should
yield answers like those received if the entire population took the same survey. Another
advantage of administering online surveys to collect data is the removal of researcher
subjectivity in the participant’s answers (Kelley-Quon, 2018). Disadvantages during the sole use
of online surveys as instruments to collect data may introduce the offending of participants due
to the general question format aimed toward the entire population (Wright, 2017). Another
disadvantage to online surveys prevents capturing the participant’s emotional response to
questions which generally provides the researcher with the depth of the emotion attached to the
question (Wright, 2017).
The researcher contracted the use of Qualtrics panel survey services and implemented
Halawi’s (2005) KMS Success survey online in the collection of desired data to answer the
research questions in this study. The reliability and validity of the survey instrument were
confirmed by using Halawi’s proven KMS Success instrument based on the Jennex and Olfman
KM Success Model (2017) in previous studies. Knowledge workers, Knowledge Management,
Knowledge Management Systems, knowledge worker productivity, and employee satisfaction
comprise the five major themes based on the review of the literature to answer the research
questions. Halawi (2005) performed Cronbach’s alpha to measure internal consistency reliability
and confirmed the variables’ accurate measurement (Allen, 2017). In this study, the researcher
entered the collected data into IBM SPSS
Operational Definitions of Variables
This quantitative, correlational research study consisted of one independent variable,
KMS knowledge quality, and two dependent variables, knowledge worker productivity and
71
employee satisfaction, displayed in Appendix D Table 1. The Jennex and Olfman KM Success
Model served as a guide for the context and operational definitions of these variables (Jennex,
2017; Jennex & Olfman, 2006; Liu et al., 2008). Halawi’s (2005) KMS Success survey
instrument was the tool used to answer the research questions further supported by the
independent and dependent variables in the Jennex and Olfman KMS Success Model context. A
7-point Likert scale within the online survey instrument included the same questions based on
Halawi’s (2005) KMS Success survey instrument. The independent variable and dependent
variables utilizing the Likert 7-point scale model followed the standard scores of 1 = Strongly
Disagree, 2 = Moderately Disagree, 3 = Somewhat Disagree, 4 = Neutral, 5 = Somewhat Agree,
6 = Moderately Agree, and 7 = Strongly Agree. Peer-reviewed research articles regarding KMS
knowledge quality, knowledge worker productivity, and employee satisfaction facilitated the
appropriate variable constructs and definitions of each variable. The independent and dependent
variables calculated score comprised the average results from the Likert scale representing
interval scale measurements from each question answered on the KMS Success survey
instrument and formed the transformed independent and dependent variables.
KMS Knowledge Quality
This independent, interval variable was transformed from a subset of ordinal 7-point
Likert scale question objects representing the three levels comprising KMS knowledge quality
defined as KM strategy/process, richness, and linkages as supported within the Jennex and
Olfman KM Success Model (Jennex, 2017; Jennex & Olfman, 2006; Liu et al., 2008).
Researchers have found the implementation of the organization’s KMS to affect the knowledge
worker’s ability to retrieve knowledge assets (Andrawina et al., 2018; De Freitas & Yáber, 2018;
Ferolito, 2015; Xiaojun, 2017; Zhang & Venkatesh, 2017).
72
KM Process/Strategy
KM process/strategy is one of three dimensions within KMS knowledge quality serving
as the independent variable contributing to the research questions in this study. As an indicator of
KMS knowledge quality, the KM strategies and processes determine how the knowledge worker
will use the KMS during planned KM activities (Jennex & Olfman, 2006; Jennex, 2017). The
direct result of the KM strategies and processes establishes the capability of retrieving
knowledge assets within the KMS, directly affecting the KMS knowledge quality (Popa et al.,
.2018). Researchers link the capability of the knowledge worker to perform KM activities and
employee satisfaction (Khanal and Raj Poudel, 2017).
Richness
Richness is one of three dimensions within KMS knowledge quality, serving as the
independent variable contributing to the research questions in this study. The knowledge worker
performs search queries within the KMS, expecting successful results of the knowledge assets
within the context of each search (Zhang & Venkatesh, 2017). This indicator of knowledge
quality reflects the accuracy and timeliness of the knowledge assets retrieved from the KMS and
within the context of the expected knowledge return (Jennex, 2017; Jennex & Olfman, 2006). In
KMS use, employee satisfaction becomes a consequence of the knowledge worker’s realized
outcomes from the KMS to complete knowledge work tasks (Jennex, 2017; Jennex & Olfman,
2006; Zhang & Venkatesh, 2017).
Linkage.
Linkage is one of three dimensions within KMS knowledge quality, serving as the
independent variable contributing to the research questions in this study. As an indicator of KMS
knowledge quality, the internal codification of the stored knowledge assets enables an internal
73
mapping within the KMS (Jennex, 2017; Jennex & Olfman, 2006). During the implementation of
the KMS, the initial structure of these mappings became revealed to the knowledge worker after
retrieving the knowledge assets based on the internal logic to create those mappings (Karlinsky-
Shichor & Zviran, 2016). Employee satisfaction becomes affected by the internal linkage of
knowledge asset mappings supporting the additional constructs of KMS knowledge quality
(Jennex, 2017; Jennex & Olfman, 2006).
Knowledge Worker Productivity
Knowledge worker productivity as the first dependent, interval variable became
represented by the knowledge worker’s value-added activities resulting from interaction with the
organization’s KMS (Kianto et al., 2019; Shujahat et al., 2019). Researchers link knowledge
worker productivity concerning KM activities as influencers on business performance and
financial results (Shrafat, 2018; Vanian, 2016; ). Knowledge worker
productivity becomes operationalized as a mechanism to the ability of the user to successfully
retrieving knowledge assets within an existing KMS (Dey & Mukhopadhyay, 2018). The
analysis of data collected using the survey instrument measured the first research question to
determine if a statistically significant relationship existed between the knowledge quality of an
organization’s KMS and knowledge worker productivity.
Employee Satisfaction
Employee satisfaction as the second dependent, interval variable depicted the successful
experience based on the knowledge worker’s actual use of the KMS while performing KM
activities (Zamir, 2019; Zhang & Venkatesh, 2017). Similarly, Jennex and Olfman (2006)
describe use/user employee satisfaction as a component of the performance indicator, referencing
a successful experience based on the actual use of the KMS and employee satisfaction from each
74
use. The analysis of data collected using the survey instrument measured the second research
question and identified if a statistically significant relationship existed between the knowledge
quality of an organization’s KMS and employee satisfaction.
Study Procedures
The following steps describe the completed actions ensuring this study may be replicated.
Northcentral University’s Institutional Review Board (IRB) approved the study before data
collection efforts began as displayed in Appendix F Figure 6. The questions from the original
printed KMS Success survey instrument created and validated by Halawi (2005) were used for
this study displayed in Appendix C Figure 5. The researcher converted Halawi’s (2005) KMS
Success survey printed questions into an online Qualtrics survey. The online KMS Success
survey offered the same survey questions as the original KMS survey, including the KMS
Success questions. The online survey tabulated KMS usage in terms of the knowledge quality of
the KMS, knowledge productivity, employee satisfaction, and six demographic questions using
the 7-point Likert scale (Halawi, 2005). Once collected, the researcher used the online survey
data to align the 7-point Likert scale responses to the independent and dependent variables for
statistical analysis. Using the transform tool in SPSS, the mean of the KMS knowledge quality
survey question objects computed into the independent variable. Similarly, the SPSS transform
tool was used to convert the mean of the survey question objects into each applicable dependent
variable, knowledge worker productivity and employee satisfaction within the context of KMS
usage (Jennex, 2017; Jennex & Olfman, 2006). As shown in Appendix D Table 1, the
independent and dependent variables were a result of the average calculations from the ordinal
Likert scale questions representing interval scale measurements from each question answered on
the KMS Success survey instrument.
75
The researcher contracted Qualtrics panel services to solicit responses from knowledge
workers employed in the software industry residing in California. The survey link was active for
two weeks until the required 153 survey responses were collected. The first webpage of the
online survey listed the purpose of the study and the risks and benefits of participating in the
online survey, and efforts to ensure anonymous participation. On the first page of the survey, the
participant was required to click the ‘I agree’ button to begin the survey registering their consent
to the outlined risks. Once the participants clicked the button, the indication of informed consent
to collect data from each answered survey question was logged. The second page of the online
survey further prequalified the participants listing questions to confirm the participant’s
classification as knowledge workers in the software industry. The researcher stored the data on a
password-protected Microsoft Excel spreadsheet using an alphanumeric labeling system for each
participant’s set of survey responses to secure the participants’ identity. After uploading the
survey data into SPSS, the researcher removed the participants identifying information and
assigned a generic alpha ID for each participant’s responses. Statistical tests were performed on
the collected data, and the production of tables and charts were interpreted and described in the
chapter 4 findings.
Data Analysis
The instrument for data collection was an online survey from Qualtrics with pre-validated
survey questions from the original Halawi’s (2005) KMS Survey in Appendix C Figure 5. Data
collection for this study originated from the survey contracted by Qualtrics panel services that
utilized pre-recruited surveys from knowledge workers in California using the probability
sampling framework. The online KMS Success survey questions matched the same survey
questions as the original printed KMS Success survey. The online survey included the KMS
76
questions tabulating KMS usage in terms of the knowledge quality of the KMS, knowledge
productivity, employee satisfaction, and six demographic questions on a 7-point Likert scale
(Halawi, 2005). The research questions examining KMS knowledge quality, knowledge, worker
productivity, and employee satisfaction were measured using an acceptable medium effect size
of .0625 (Field, 2013). The assumption in using an alpha level of p = .05 for the Type 1 error
probability rejecting the null hypothesis when it is true applies to this quantitative, correlational
survey research method. The researcher performed the analysis with IBM SPSS after
transforming the survey question ordinal objects into the interval variables. These statistical tests
supported the researcher’s ability in answering the research questions that determined the type of
relationship among the variables.
Research integrity represents the threats to internal validity when incorrect data collection
procedures introduce unknown biases during data collection (Dewitt et al., 2018; Siedlecki,
2020). External validity ensures the study methodology is repeatable on the more significant
applicable population requiring the sample data to be a representation of the applicable
population (2020). In this way, future studies to repeat the study design and methodology.
Ethical considerations for this study require receiving informed consent by each participant after
initial contact to participate in the study survey. Informed consent reduces the potential pressure
applied by management to complete the survey in a manner expected by the employee’s
management (Rawdin, 2018; Vehovar & Manfreda, 2017). Care to ensure the employee’s
responses cannot become an identifiable method from the management personnel. Upon the
return of the required 153 surveys, the survey was considered open for participation for pre-
qualified survey partakers. Each response then became coded using an alphanumerical system
for identification purposes only and the initial responses were permanently deleted. Informed
77
consent was acquired by providing the risks and benefits on the first page of the survey with an ‘I
Agree’ button clicked by each participant.
The support for this research design reflected the ability to quantify the variable
measurements determining the relationship between the knowledge quality within a KMS,
knowledge worker productivity, and employee satisfaction through Likert scale data collection.
The vulnerability of this research design included self-reported data and potential outlier
variables (Hughes, 2012). The verification of reliability and validity of the research variables for
confirmation factor analysis (CFA) was achieved through structural equation modeling fully
validated by Halawi (2005). Validity efforts of the research variables utilized survey questions
evident in research studies, including a pre-test, pilot, and complete analysis performed by
Halawi’s KMS Success survey. Pre-qualification efforts used by Qualtrics panel services guided
each participant using questions confirming job classification type and assigned department. This
researcher conducted ethical assurances to ensure the anonymous data collection was performed
without risking the employees’ confidentiality, preventing social status and job safety concerns
based on collected responses.
Assumptions
On of the assumptions of this study was that California businesses utilize one type of
electronic Knowledge Management System to capture, store, and share knowledge. Next, this
study presumed that each participant answered honestly to each survey question. A final
assumption is that knowledge workers employed in California make use of an organization’s
KMS to search for knowledge assets. Most organizations have a robust organizational structure
preventing outliers and allowing research data to capture information toward knowledge worker
productivity.
78
Limitations
The researcher identified many limitations in this research study. Data collection methods
when using online surveys limit the ability of the researcher to determine if the participant
answered the online questions honestly (Siedlecki, 2020). Halawi’s (2005) KMS Success survey
comprised multiple questions and asked the same context as was validated by Halawi that
performed Cronbach’s alpha reliability tests for each variable (Allen, 2017). The researcher
contracted Qualtrics panel services for data collection that administered the online survey to
increase participation rates and reduce the study limitations. These limitations included a lack of
control to verify participants lived in California participating in the online survey. Finally, the
respondents in the survey may not have represented the desired knowledge worker employed in
the software industry. The Qualtrics panel services prequalified each survey response and
ensured each participant was identified as a knowledge worker employed in the software
industry living in California before the survey closed with over 153 qualified responses
collected.
Delimitations
The purpose of this quantitative, correlational study was to determine if a relationship
exists between KMS knowledge quality, knowledge worker productivity, and employee
satisfaction. A review of the research revealed that knowledge workers using the organization’s
KMS represent employees performing tasks requiring a specific skill set to be productive when
the assigned job role tasks were executed (Levallet & Chan, 2018; Orenga-Roglá & Chalmeta,
2019; Surawski, 2019; Wang & Yang, 2016; Xiaojun, 2017; Zhang & Venkatesh, 2017). While
knowledge workers exist across the globe, the researcher selected only employees in the software
industry in California that ensured a large enough sample size was acquired for this study. The
79
researcher selected only the KMS knowledge quality as one of six components in the Jennex and
Olfman KM Success Model (2006) that identified the relationship between knowledge worker
productivity and employee satisfaction. A review of the literature identified KMS knowledge
quality may enable Knowledge Management capabilities and business performance, yet business
leaders continue to fail to implement an effective KMS (Drucker, 1999; Iazzolino & Laise,
2018); Jennex, 2017; Karlinsky-Shichor & Zviran, 2016; Sutanto et al., 2018; Vanian, 2016;
Xiaojun, 2017).
Ethical Assurances
The researcher gained approval from the Northcentral University Institutional Review
Board (IRB) before the data collection began as displayed in Appendix F Figure 6. The
researcher minimized the risk to the participants of the online survey by briefly documenting the
process of the data collection, the storage procedures of the participant data, took steps to ensure
the participant identity and answers remained protected, and the planned destruction of the data
process within the survey (Dewitt et al., 2018). Each participant agreed to informed consent
when the “I Agree” button was clicked on the information page and confirmed the outlined risks
were accepted for participation in the online survey. The researcher stored the data from the
online survey on a password-protected Microsoft Excel spreadsheet followed by an
alphanumeric labeling system for each participant’s set of survey responses which secured the
identity of the participants. The researcher took additional steps that prevented bias during the
data collection and the analysis of data. The researcher contracted with Qualtrics panel services
and retrieved an unbiased collection of data applicable for this research study. Submitted online
surveys with missed answers to the KMS question sections were not used for data analysis to
remove unknown participant bias (Siedlecki, 2020).
80
Summary
In summary, this quantitative, correlational study aligned with the problem, purpose, and
research questions that examined if a relationship exists between the knowledge quality
component of the KMS, knowledge worker productivity, and employee satisfaction. The
selection of the quantitative research method and correlational survey design resulted from the
support of research also supported by the framework comprised of knowledge workers,
Knowledge Management, Knowledge Management Systems, knowledge worker productivity,
and employee satisfaction (DeLone & McLean, 1992; DeLone & McLean, 2003; DeLone &
McLean, 2004; Liu, Olfman, & Ryan, 2005; Zuama et al., 2017). Online surveys hinged upon the
direction of the Jennex and Olfman KM Success Model served as the basis for selecting
variables for content validity (Jennex, 2017; Jennex & Olfman, 2006). The operational
definitions of the KMS knowledge quality, knowledge worker productivity, and employee
satisfaction variables were transformed and computed by the means from the collection of the
Likert scale question objects.
The specific study procedures describe the steps taken during data collection and allowed
an informed decision from each participant to complete the survey agreeing to informed consent.
Assumptions, limitations, delineations, and ethical assurances supported the validity and ethical
considerations of this study. Data collection intentions and analysis of the collected data
supported the research method and design of this study. In chapter 4, the findings on the analysis
of collected data were reported and segregated by each research question that identified patterns
in the findings. Descriptive information and explanations for each statistical test allowed the
reader to interpret the results, including inferred assumptions.
81
Chapter 4: Findings
The problem addressed in this study was that there is often great difficulty encountered in
trying to retrieve knowledge assets about events in the past required for strategic decision-
making without an effective, in-place Knowledge Management System (KMS) (Oladejo &
Arinola, 2019). Knowledge Management (KM) is challenging to implement requires exploration
and improvement in its’ continued application and development (Putra & Putro, 2017).
Additional difficulties associated with the lack of an effective KMS include knowledge asset
unavailability, improper knowledge asset documentation, excessive time consumption associated
with searching for knowledge assets, decision-making overhead, and duplication of effort
(Oladejo & Arinola, 2019). A substantial number of Knowledge Management System (KMS)
implementations have not achieved their intended outcomes, such as employee performance and
employee satisfaction (Zhang & Venkatesh, 2017).
The purpose of this quantitative, correlational study was to explore the relationship
between the knowledge quality of an organization’s KMS, the knowledge worker productivity,
and employee satisfaction for software industry organizations in California. This study is
relevant and contributes to the Knowledge Management research community as millions of
dollars in losses from unsuccessful KMS implementations fail to satisfy expected benefits in
knowledge assets to support business performance (Fakhrulnizam et al., 2018; Levallet & Chan,
2018; Nusantara et al., 2018; Vanian, 2016). When organizations fail to implement a successful
KMS, KM strategies depending on the use of knowledge assets for knowledge worker
productivity and employee satisfaction also fail (De Freitas & Yáber, 2018; Demirsoy &
Petersen, 2018; Putra & Putro, 2017; Xiaojun, 2017).
82
The research questions of this study were used to examine the relationship between the
knowledge quality of an organization’s Knowledge Management System, knowledge worker
productivity, and employee satisfaction. These research questions formed the basis for the
research method and design, reflected the statement of the problem and purpose of the study.
Each research question corresponded with the null and alternative hypothesis as follows:
RQ1. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and knowledge worker productivity?
H10. There is not a statistically significant relationship between the knowledge quality of
an organization’s KMS and knowledge worker productivity.
H1a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and knowledge worker productivity.
RQ2. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction?
H20. There is not a statistically significant relationship between the knowledge quality of
an organization’s KMS and employee satisfaction.
H2a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and employee satisfaction.
In the remaining sections of this chapter, the researcher organized the sections based on
their relevance to the research questions, including the validity and reliability of the data.
Descriptions in this chapter also include the assumptions of statistical tests, results from the data
analysis to answer the research questions, and the hypothesis outcomes. Next, the evaluation of
the findings based on existing research and theory are provided, followed by a summary of the
research findings and Chapter 5.
83
Validity and Reliability of the Data
The KM Success survey instrument established by Halawi (2005) to measure the success
of Knowledge Management Systems was converted to an online survey using the same questions
and a 7-point Likert scale. In this study, this researcher performed statistical tests on data
gathered from the online Qualtrics survey and explored if a statistically significant relationship
existed between the knowledge quality of an organization’s KMS, knowledge worker
productivity, and employee satisfaction. Halawi (2005) initiated a pre-test to confirm the survey
question’s clarity among the small set of participants, followed by a pilot study in further
confirmation of modified survey questions to represent identified variables. The evaluation for
the validity of the final KMS Success survey instrument reported by Halawi (2005) consisted of
discriminant, construct, and convergent validity measures.
Throughout the discriminant validity tests, Halawi (2005) dropped constructs based on
factor loading values with correlations lower than 0.5 ensured no overlap of factors existed.
Also, construct validity tests were performed using factor analysis to examine the relationship
between the survey items supporting variables described within the study’s theoretical context of
Halawi’s (2005) study. Halawi also performed convergent validity tests by analyzing the survey
items to the total correlation based on each survey item’s correlation summed by the additional
survey items. Halawi (2005) measured the reliability of the final survey instrument using
Cronbach’s Alpha tests for internal consistency across the study’s constructs. Halawi (2005)
reported that the Pearson’s correlation coefficients test to identify the strength and direction for
the study’s variables were statistically significant at the .01 level.
84
KMS Success Survey Instrument
In this study, the same questions from Halawi’s (2005) KMS Success survey instrument
were converted from printed form to an online survey hosted in the Qualtrics web platform
included in Appendix C Figure 5. Halawi’s (2005) survey instrument was used to test the
dimensions of the Delone and McLean model (1992, 2002, 2003) for measuring the success of
an organization’s KMS and the relationship among the dimensions. The online KMS Success
survey for this study served the purpose of data collection in answering the research questions
and explored the relationship between the knowledge quality of an organization’s KMS, the
knowledge worker productivity, and employee satisfaction. According to Halawi (2005), the
validity of the KMS Success survey instrument confirmed the measurement of the variables in
context, and the reliability of the survey instrument was held reliable.
Common Construct Measures
The questions in Halawi’s (2005) original survey measured the six construct dimensions
in the transformed Delone and McLean model (1992, 2002, 2003) contained within the Jennex
and Oflman KMS Success Model (2003). The survey questions and constructs applied to this
study’s variables remained relevant to the same construct dimensions within the Jennex and
Oflman KMS Success Model (2003) identified as knowledge quality, employee satisfaction (user
satisfaction), and knowledge worker productivity (net system benefits). This researcher
combined the common survey question items representing each construct into one specific
variable as specified in this study to answer the research questions. Multiple survey items
representing each dependent variable, employee satisfaction, and knowledge worker productivity
were combined into one variable for each dependent variable construct. Although several
dimensions exist within KMS knowledge quality as the independent variable as discussed in
85
Chapter 2, for this study, the independent variable as a single construct was measured against the
dependent variables to answer the research questions and test the null hypothesis.
Assumptions
Spearman’s correlation coefficient was used to answer each of the research questions
assessing the statistical assumptions, including the level of measurement and monotonic
relationship. Pearson’s correlation coefficient was eliminated as the statistical measure due to the
violation of normality and further recommendation to use Spearman’s test for survey ordinal data
(de Winter et al., 2016).
Monotonic Relationship. A Spearman correlation requires that the relationship between
each pair of variables does not change direction (Schober et al., 2018). Schober et al. state that
this assumption is violated if the points on the scatterplot between any pair of variables appear to
shift from a positive to negative or negative to a positive relationship. Figure 1 presents the
scatterplot of the correlation between the KMS knowledge quality independent variable and the
dependent variable knowledge worker productivity. Figure 2 presents the scatterplot of the
correlation between the KMS knowledge quality independent variable and the dependent
variable employee satisfaction.
86
Figure 1
Scatterplot of KMS KQ and KWP
Scatterplot of KMS KQ and KWP
Figure 2 Scatterplot of KMS KQ and employee satisfaction
Scatterplot of KMS KQ and employee satisfaction
87
Level of Measurement. The level of measurement when using the Spearman
correlational coefficient test is more relaxed than that of Pearson’s correlation coefficient
assumptions (Schober et al., 2018). Schober et al. state the internal, ratio or ordinal levels of
measurement meet the assumption for the Spearman correlational coefficient test.
Test for Normality. The Shapiro-Wilk tests were conducted to identify if the
distributions of KMS knowledge quality, knowledge worker productivity, and employee
satisfaction resulted as significantly different from a normal distribution. The variables had
distributions that significantly differed from normality based on an alpha of 0.05: KMS
knowledge quality (W = 0.90, p < .001), knowledge worker productivity (W = 0.88, p < .001),
and employee satisfaction (W = 0.89, p < .001). The results are presented in Table 2.
Table 2
Shapiro-Wilk Test Results for all Study Variables Test for Normality
Shapiro-Wilk Test Results for all Study Variables Test for Normality
Variable W p
KMS knowledge quality 0.90 < .001
knowledge worker productivity 0.88 < .001
employee satisfaction 0.89 < .001
Results
In this section, results from the data analysis used to answer the research questions and to
test the hypothesis are presented, describing the overall study sample size, descriptive statistics,
and demographic summary. The target sample size of N = 153 was determined using G*Power
software displayed in Appendix A Figure 3. An online survey with eighty-three questions about
the employees’ current KMS was distributed using Qualtrics panel services for two weeks until a
88
total of at least 153 valid responses from knowledge workers employed in software industry
firms in California were fulfilled. A total of 154 valid participant responses were recorded and
analyzed for this study. Survey questions were transformed into the independent variable KMS
knowledge quality and the two dependent variables, knowledge worker productivity, and
employee satisfaction.
Descriptive Statistics
Descriptive statistics were calculated for the KMS knowledge quality, knowledge worker
productivity, and employee satisfaction. The results for KMS knowledge quality as the
independent variable were calculated using SPSS as an average of 5.30 (SD = 1.35, SEM = 0.11,
Min = 1.09, Max = 7.00, Skewness = -1.15, Kurtosis = 1.09). The results for knowledge worker
productivity were calculated as an average of 5.35 (SD = 1.39, SEM = 0.11, Min = 1.00, Max =
7.00, Skewness = -1.25, Kurtosis = 1.22). The results for employee satisfaction were calculated
as an average of 5.25 (SD = 1.55, SEM = 0.13, Min = 1.00, Max = 7.00, Skewness = -1.03,
Kurtosis = 0.40). In a comparison of the means, all three variable values were close in value to
one another whereas the employee satisfaction variable with a higher standard deviation was
interpreted as more dispersed from the mean than the KMS knowledge quality or knowledge
worker productivity. The summary of descriptive statistics can be found in Table 3.
89
Table 3
Summary of Descriptive Statistics
Summary of Descriptive Statistics
Variable M SD n SEM Min Max Skewness Kurtosis
KMS knowledge quality 5.30 1.35 154 0.11 1.09 7.00 -1.15 1.09
Knowledge worker productivity 5.35 1.39 154 0.11 1.00 7.00 -1.25 1.22
Employee satisfaction 5.25 1.55 154 0.13 1.00 7.00 -1.03 0.40
Demographic Summary
Demographic information was voluntary and collected from participants completing the
online KMS Success survey. Frequencies and percentages were calculated for each participant’s
voluntary demographic data for gender, age, years employed, years of KMS usage, education
level, employment position, and industry in Appendix E. The most frequently noted category of
gender was Male (n = 132, 86%). The noted frequencies for age had an average of 41.07 (SD =
7.90, Min = 20.00, Max = 68.00). The most frequently noted category of years employed was
greater than ten years (5) (n = 57, 37%). The most frequently noted category of years of KMS
usage was greater than five years (5) (n = 51, 33%). The most frequently noted category of
education level was master’s degree or beyond (4) (n = 110, 71%). The most frequently noted
category of employment position was Sr. Manager/Director (3) (n = 63, 41%). Frequencies and
percentages tables and values are presented in Appendix E Tables 4 – 10.
Research Question 1/Hypothesis
RQ1. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and knowledge worker productivity?
90
H10. There is not a statistically significant relationship between the knowledge quality of
an organization’s KMS and knowledge worker productivity.
H1a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and knowledge worker productivity.
A Spearman correlation analysis was conducted between KMS knowledge quality and
knowledge worker productivity for the first research question to assess if a significant statistical
relationship exists between KMS knowledge quality and knowledge worker productivity. The
minimum required sample size of N = 153 according to the G*Power analysis in Appendix A
Figure 3 was collected over two weeks resulting in 154 valid participant survey responses. The
result of the correlation was examined based on an alpha value of 0.05. A significant positive
correlation was observed between KMS knowledge quality and knowledge worker productivity
(rs = 0.94, p < .001, 95% CI [0.92, 0.96]). The correlation coefficient between KMS knowledge
quality and knowledge worker productivity was .94, indicating a large effect size (Cohen, 1988).
This correlation indicates that as KMS knowledge quality increases, knowledge worker
productivity tends to increase. Table 11 presents the
output of the correlation test.
Table 11 Spearman Correlation Result: KMS KQ and KWP
Spearman Correlation Result: KMS Knowledge Quality and Knowledge Worker Productivity
Combination rs 95% CI p
KMS knowledge quality-knowledge worker productivity 0.94 [0.92, 0.96] < .001
Note. n = 154.
Research Question 2/Hypothesis
RQ2. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction?
91
H20. There is not a statistically significant relationship between the knowledge quality of
an organization’s KMS and employee satisfaction.
H2a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and employee satisfaction.
A Spearman correlation analysis was conducted between KMS knowledge quality and
employee satisfaction for the second research question to assess if a significant statistical
relationship exists between KMS knowledge quality and employee satisfaction. The minimum
required sample size of N = 153 according to the G*Power analysis in Appendix A Figure 3 was
collected over two weeks resulting in 154 valid participant survey responses. The result of the
correlation was examined based on an alpha value of 0.05. A significant positive correlation was
observed between KMS knowledge quality and employee satisfaction (rs = 0.93, p < .001, 95%
CI [0.91, 0.95]). The correlation coefficient between KMS knowledge quality and employee
satisfaction was 0.93, indicating a large effect size. This correlation indicates that as KMS
knowledge quality increases, employee satisfaction tends to increase. Table 12 presents the
output of the correlation test.
Table 12
Spearman Correlation Results: KMS Knowledge Quality and Employee Satisfaction
Spearman Correlation Results: KMS Knowledge Quality and Employee Satisfaction
Combination rs 95% CI p
KMS knowledge quality-employee satisfaction 0.93 [0.91, 0.95] < .001
Note. n = 154.
Evaluation of the Findings
The assessment of the findings in this study was interpreted based on the Jennex and
Olfman KM Success (2017) forming the theoretical framework discussed in chapter 1 and
92
chapter 2. As presented in existing research, the findings support the theoretical framework
representing the importance in the success of the KMS knowledge-centric practices providing
knowledge assets utilized to accomplish an organizational business purpose (Alavi & Leidner,
2001; Ermine, 2005; Jennex, 2017; Jennex & Olfman, 2006; Wu & Wang, 2006). The theoretical
framework in this study was the basis for the research questions and interpretation of the results
using the Jennex and Olfman KM Success Model as performance indicators while using the
organization’s Knowledge Management Systems (Jennex, 2017; Jennex & Olfman, 2006).
Research Question 1
To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and knowledge worker productivity?
The null hypothesis for this research question was that there is not a statistically significant
relationship between the knowledge quality of an organization’s KMS and knowledge worker
productivity and was rejected. The results of the inferential test analysis using Spearman’s
correlation resulted in a significant strong positive correlation between the knowledge quality of
an organization’s KMS and knowledge worker productivity. There was an indication between the
variables in that when KMS knowledge quality increases, knowledge worker productivity tends
to increase.
Research Question 2
To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction? The null hypothesis for
this research question was that there is not a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction and was rejected. The
results of the inferential test analysis using Spearman’s correlation resulted in a significant strong
93
positive correlation between the knowledge quality of an organization’s KMS and employee
satisfaction. There was an indication between the variables in that when KMS knowledge quality
increases, employee satisfaction tends to increase as seen in the hypothesis testing results for the
first research question.
Summary
The purpose of this quantitative, correlational study was to explore the relationship
between the knowledge quality of an organization’s KMS, the knowledge worker productivity,
and employee satisfaction for software industry organizations in California. The researcher
conducted this study using an online survey as the research instrument and collected data from
knowledge workers employed in software industry firms in California. The data collected were
analyzed based on the research questions from the 154 completed participant responses in this
study. Halawi’s (2005) past validity and reliability efforts for the KMS Success survey
instrument were considered appropriate to support this study.
Two research questions were tested against the null hypothesis, and both were rejected
based on the statistical significance results. The Spearman’s correlational nonparametric test was
used to investigate if a relationship existed between the KMS knowledge quality, knowledge
worker productivity, and employee satisfaction. Descriptive statistics results were noted for the
independent and dependent variables with similar results. A demographic summary of voluntary
participant information was captured for gender, age, years employed, years of KMS usage,
education level, employment position, and type of software industry as displayed in Appendix E.
An evaluation of the findings shows the null hypothesis for both research questions was rejected
as the significant relationship test results were evident between the KMS knowledge quality and
the knowledge worker productivity and the KMS knowledge quality and employee satisfaction.
94
The findings in this chapter will be used as the basis for the implications, recommendations, and
conclusions of this study in chapter 5.
95
Chapter 5: Implications, Recommendations, and
Conclusions
This chapter continues the study topic that explored the relationship between the quality
of a KMS, knowledge worker productivity, and employee satisfaction. The problem addressed
by this study was that there is often great difficulty encountered in trying to retrieve knowledge
assets about events in the past required for strategic decision-making without an effective, in-
place Knowledge Management System (KMS) (Oladejo & Arinola, 2019). The purpose of this
quantitative, correlational study was to explore the relationship between the knowledge quality
of an organization’s KMS, the knowledge worker productivity, and employee satisfaction for
software industry organizations in California. The quantitative research method achieved the
goal of exploring the relationship between the identified variables supporting the problem,
purpose, and research questions from the same participant sample (Mellinger & Hanson, 2016).
This correlational research design using correlational statistical methods identified the
relationship between independent and dependent variables (Cavenaugh, 2015). The results from
the assumption of the normal distribution compelled Spearman’s coefficient of rank correlation
test as the statistical method of choice (Field, 2013). Statistical tests on data gathered from the
online Qualtrics survey were used to explore if a statistically significant relationship existed
between the knowledge quality of an organization’s KMS, knowledge worker productivity, and
employee satisfaction. The results of this study were based on two research questions tested
against the null hypothesis. both research questions’ null hypothesis were rejected based on the
statistical significance results. A significant positive correlation was observed between KMS
knowledge quality and knowledge worker productivity. Likewise, a significant positive
correlation was observed between KMS knowledge quality and employee satisfaction. The
researcher identified limitations in this research study beyond control. The Qualtrics panel
96
services acquired 154 participants responses self-identifying as meeting the requirements over 18
years old, read/understand English, interact with the organization’s KMS, live in California, and
currently employed in the software industry. Another limitation impacting this study arises from
the lack of additional research similar to the Jennex and Olfman KM Success Model (2006)
addressing recent KMS applications.
The research questions and corresponding hypothesis served as a guide for the basis of
this study, supported by the research method and design alignment with the statement of the
problem and purpose. The researcher used the first research question to determine if a
statistically significant relationship existed between the knowledge quality of an organization’s
KMS and knowledge worker productivity. The researcher used the second research question to
determine if a statistically significant relationship existed between the knowledge quality of an
organization’s KMS and employee satisfaction. Each research question was used to test the
hypothesis. The remainder of this chapter will include a review of the implications of this study,
recommendations for practice based on the results of this study, and recommendations for future
research to further explore the results of this study. Finally, the conclusion will summarize the
problem, purpose, and importance of the study, finalizing this section discussing applications for
future professional and academic stakeholders based on the findings of this study.
Implications
The findings reflected in chapter 4 guided the implications of this study. These findings
align with the study’s theoretical framework based on the Jennex and Olfman KM Success
Model. The Jennex and Olfman KM Success Model (2006) supported KMS knowledge quality
as the independent variable followed by knowledge worker productivity and employee
satisfaction during the use of the KMS as the dependent variables for this study (Jennex, 2017;
97
Jennex & Olfman, 2006). Five significant themes formed the context of this study’s theoretical
framework supported by a review of the literature, including knowledge workers, Knowledge
Management, Knowledge Management Systems, knowledge worker productivity, and employee
satisfaction. In following this model, two research questions were used to guide this study to
explore the relationship between the knowledge quality of an organization’s KMS, the
knowledge worker productivity, and employee satisfaction. Within each research question
section below, the researcher will discuss results relative to the literature review in chapter 2, the
problem and purpose of the study supported by the theoretical framework, and the significance.
Each research question will include the findings and how the study results contribute to the
existing body of research.
Research Question 1/Hypothesis
RQ1. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and knowledge worker productivity?
H10. There is not a statistically significant relationship between the knowledge quality of
an organization’s KMS and knowledge worker productivity.
H1a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and knowledge worker productivity.
The first research question guided the analysis of the data to determine if a significant
relationship between the knowledge quality of an organization’s KMS and knowledge worker
productivity existed. The null hypothesis was tested and rejected based on the findings indicating
a significant positive correlation was observed between KMS knowledge quality and knowledge
worker productivity. The results indicated that as KMS knowledge quality increases, knowledge
worker productivity tends to increase. Table 11 presents the output of the correlation test.
98
Factors that might have influenced the interpretation of the results originate from the
contextual implications regarding the knowledge quality of KMS and the knowledge worker
productivity definition derived within the literature (Jennex 2017; Jennex & Olfman, 2006). The
knowledge quality of the organization’s KMS served as the independent variable within the first
research question. Jennex and Olfman (2006) outlined six key performance indicators within the
Jennex and Olfman KM Success Model as specific components utilized to measure the
performance of the KMS, noting knowledge quality as the first key indicator applicable to this
study. Additional researchers describe the degree of knowledge content quality retrieved
determines the knowledge worker’s perceived usefulness of the system (Jahmani et al., 2018;
Zhang, 2017). Knowledge worker productivity was identified as the dependent variable within
the first research question. Researchers describe knowledge worker productivity as the value-
added activities resulting from the knowledge worker’s interaction with the organization’s KMS
(Kianto et al., 2019; Shujahat et al., 2019).
The study results address the problem in the difficulty encountered in trying to retrieve
knowledge assets about events in the past. The knowledge quality dimensions comprised of KM
process/strategy, richness, and linkage impact the knowledge worker’s capability to retrieve the
desired knowledge assets within the organization’s KMS (Jennex & Olfman, 2006). The study
results
support the purpose of exploring the relationship between the knowledge quality of an
organization’s KMS and the knowledge worker productivity for software industry organizations
in California. Data collected from 154 participant surveys representing knowledge workers
employed in the software industry living in California answered questions regarding KMS
knowledge quality, productivity, and perceived satisfaction during usage of the KMS. The results
99
indicated a significant positive correlation observed between KMS knowledge quality and
knowledge worker productivity.
The study results contribute to the existing literature and theoretical framework based on
the accepted KMS knowledge quality and knowledge worker definitions within the literature and
framework derived by the Jennex and Olfman KM Success Model (2006). Also, this researcher
converted the written Halawi’s (2005) KM Success survey into an online survey based on the
foundation of the Jennex and Olfman KM Success Model (2006) used as the basis of the data
analyzed in this study. The researcher determined the existing literature supported the need for
this study as the continued financial losses from unsuccessful KMS implementations fail to
satisfy expected benefits in knowledge assets to support business performance (Fakhrulnizam et
al., 2018; Levallet & Chan, 2018; Nusantara et al., 2018; Vanian, 2016). The study results are
consistent with existing research as discussed by Jahmani et al. (2018) from study conclusions
surveying healthcare staff from multiple hospitals regarding KMS components finding that
knowledge content quality retrieved from the knowledge worker as a functional requirement
perceived usefulness of KMS. Also, several researchers report that the failure to ensure the
organization’s KMS incorporates successful KM process outcomes impact knowledge worker
productivity (Jennex, 2017; Karlinsky-Shichor & Zviran, 2016; Sutanto et al., 2018; Vanian,
2016; Xiaojun, 2017).
The study results are also consistent with existing Jennex and Olfman KM Success
(2006) theory for KMS implementation success indicators as to the degree of knowledge content
quality retrieved during usage of the KMS impacted the knowledge worker’s perceived
usefulness of the system (Jahmani et al.,2018; Zhang, 2017). The study results provided an
unexpectedly large effect size in the correlation strength between the independent variable KMS
100
knowledge quality and the dependent variable knowledge worker productivity. While not a
causal relationship, the implications associated with the strength of the correlation coefficient
when measuring the relationship between the variables indicate that as the KMS knowledge
quality improves, the knowledge worker productivity may also improve.
Research Question 2/Hypothesis
RQ2. To what extent, if any, is there a statistically significant relationship between the
knowledge quality of an organization’s KMS and employee satisfaction?
H20. There is not a statistically significant relationship between the knowledge quality of
an organization’s KMS and employee satisfaction.
H2a. There is a statistically significant relationship between the knowledge quality of an
organization’s KMS and employee satisfaction.
The second research question guided the data analysis to determine if a significant
relationship between the knowledge quality of an organization’s KMS and employee satisfaction
existed. The null hypothesis was tested and rejected based on the findings indicating a significant
positive correlation between KMS knowledge quality and employee satisfaction. The results
indicated that as KMS knowledge quality increases, employee satisfaction tends to increase.
Table 12 presents the output of the correlation test.
Factors that might have influenced the interpretation of the results originate from the
contextual implications regarding the knowledge quality of KMS and the employee satisfaction
definition derived within the literature. The knowledge quality of the organization’s KMS served
as the independent variable within the second research question. The Jennex and Olfman KM
Success Model (2006) describes knowledge quality within the KMS as the performance indicator
of KMS success comprised of KM process/strategy, richness, and linkage dimensions. The
101
Jennex and Olfman KM Success Model (2006) describes the employee ‘users’ satisfaction as one
component of the KM Success performance indicators, noted as a successful experience based on
the actual use of the KMS and employee satisfaction from each use. Employee satisfaction was
identified as the dependent variable within the second research question. Employee satisfaction
is also portrayed as a successful experience based on the knowledge worker’s actual use of the
KMS while performing KM activities (Zamir, 2019; Zhang & Venkatesh, 2017).
The study results speak to the difficulty retrieving the knowledge asset due to the KMS
knowledge quality impacting the employee satisfaction and potentially the intent not to use the
KMS in the future (Jennex and Olfman, 2006; Oladejo & Arinola, 2019). The study results
support the purpose of exploring the relationship between the knowledge quality of an
organization’s KMS and employee satisfaction for software industry organizations in California.
Data collected from 154 participant surveys representing knowledge workers employed in the
software industry living in California answered questions regarding KMS knowledge quality,
productivity, and perceived satisfaction during usage of the KMS. The results indicated a
significant positive correlation observed between KMS knowledge quality and employee
satisfaction. The study results contribute to the existing literature and theoretical framework
based on the accepted KMS knowledge quality and employee satisfaction definitions within the
literature and framework derived by the Jennex and Olfman KM Success Model (2006). Also,
this researcher converted the written Halawi’s (2005) KM Success survey into an online survey
based on the foundation of the Jennex and Olfman KM Success Model (2006) used as the basis
of the data analyzed in this study. The researcher determined the existing literature supported the
need for this study as the continued financial losses from unsuccessful KMS implementations
102
fail to satisfy expected benefits in knowledge assets to support business performance
(Fakhrulnizam et al., 2018; Levallet & Chan, 2018; Nusantara et al., 2018; Vanian, 2016).
The study results are consistent with existing research for employee satisfaction reviewed
by Zamir ( 2019) as realized through employee empowerment to perform assigned job tasks.
Several researchers concur the employee satisfaction is a desired outcome during the usage of the
organization’s KMS based on the successful retrieval of knowledge assets (Jennex & Olfman,
2006; Jennex, 2017; Kumar, 2018; Zamir, 2019; Zhang & Venkatesh, 2017). The study results
are also consistent with existing Jennex and Olfman KM Success (2006) theory for KMS
implementation success indicators as a link between the user’s experience during the successful
retrieval of knowledge assets performing KM activities and employee satisfaction in achieving
the desired knowledge within the KMS (Popa et al., 2018). The study results provided an
unexpectedly large effect size in the correlation strength between the independent variable KMS
knowledge quality and employee satisfaction as the dependent variable. While not a causal
relationship, the implications associated with the strength of the correlation coefficient when
measuring the relationship between the variables indicate that as the KMS knowledge quality
improves, employee satisfaction may also improve.
The most significant implications of this study are the unexpected strength in the
correlation coefficient when examining the relationship for each research question. The
implications from both research questions indicate that as the KMS knowledge quality improves,
the knowledge worker productivity may improve. As the KMS knowledge quality improves,
employee satisfaction may also improve. The consequences of the study results support the
continued efforts to identify indicators to effectively manage the knowledge assets within an
organization’s KMS to achieve the desired productivity result (Ferolito, 2015; Vanian, 2016).
103
This study was presented to contribute to the body of knowledge due to the failure of
organizations to implement a successful KMS to support knowledge worker productivity and
employee satisfaction in the workplace. The results of this study found a statistically significant
relationship between the knowledge quality of an organization’s KMS and knowledge worker
productivity and between the knowledge quality of an organization’s KMS and employee
satisfaction. The study results contribute to the existing body of research as the continued
unsuccessful KMS implementations failing to efficiently manage knowledge assets have resulted
in millions of dollars in loss of employee productivity to support business performance
(Fakhrulnizam et al., 2018;
et al., 2019).
Recommendations for Practice
The literature review based on the Jennex and Olfman KM Success Model (2006)
framework sought to address organizations need to implement an effective KMS to retrieve
knowledge assets (Fakhrulnizam et al., 2018; Levallet & Chan, 2018; Nusantara et al., 2018;
Oladejo & Arinola, 2019; Vanian, 2016). The recommendations for practice are based on the
study findings guided by the research questions to explore the relationship between the
knowledge quality of an organization’s KMS, knowledge worker productivity, and employee
satisfaction.
Knowledge Management Business Leaders
Organizational business leaders responsible for the Knowledge Management strategies
should follow ISO 30401:2018 Knowledge Management guidelines due to the global problem of
KMS implementation failures (“ISO 30401:2018,” 2018). Attempts to implement KMS online
systems have not provided the desired productivity result across the globe based on the lack of
104
organizational standards (Ferolito, 2015; Vanian, 2016). The results showed a significant
positive correlation between KMS knowledge quality and knowledge worker productivity
displayed in Table 11. Also, a significant positive correlation was observed between KMS
knowledge quality and employee satisfaction displayed in Table 12. Business leaders should note
the results of this study as the survey participants included 63% of participants who had
interacted with the organization’s KMS for at least three or more years. Moreover, 94.2 % of
participants held the position of manager or higher.
Recommendations for Future Research
Several recommendations for future research are provided based on the limitations noted
in this study and gaps realized within the literature. A change in the participant requirements is to
open the online survey to participants on a global scale and not just in California. Also, the
online survey should allow participants to translate into any supported language and not just
English. Finally, the survey should be expanded to allow participants employed in health and
education in addition to the software industry. A new survey instrument is recommended
creating a new pilot survey further approved by subject matter experts in each industry updating
the existing survey questions to capture newer KMS capabilities. Two options to include
additional performance indicators are recommended. First, this study only analyzed some of the
Jennex and Olfman KM Success Model (2006) performance indicators. Future research should
include all performance indicators to replicate all Jennex and Olfman KM Success Model (2006).
Second, an updated KM Success Model includes changes in the Knowledge Management
strategies and KMS performance indicators. The next logical short-term step for future
researchers could be to expand the participant requirements, as previously noted. Next, a
modification of the current online survey should include the key performance indicators as
105
mentioned in the Jennex and Olfman KM Success Model (2006) or additional key performance
indicators as supported by the literature.
Conclusions
This quantitative, correlational study explored the relationship between the knowledge
quality of an organization’s KMS, knowledge worker productivity, and employee satisfaction
within the software industry in California. These findings align with the study’s theoretical
framework based on the Jennex and Olfman KM Success Model (2006). The Jennex and Olfman
KM Success Model supported KMS knowledge quality as the independent variable followed by
knowledge worker productivity and employee satisfaction during the use of the KMS as the
dependent variables for this study (Jennex, 2017; Jennex & Olfman, 2006). The study results
supported the difficulty encountered in retrieving knowledge assets about events in the past and
the KMS knowledge quality impacting employee satisfaction (Jennex and Olfman, 2006;
Oladejo & Arinola, 2019). The most significant implication from this study was the unexpected
strength in the correlation coefficient when measuring the relationship for each research
question. The study results support the importance of continuing efforts to identify performance
indicators to effectively manage the knowledge assets within an organization’s KMS to achieve
the desired productivity result (Ferolito, 2015; Vanian, 2016). This study contributed to the body
of knowledge and future research efforts for Knowledge Management business leaders due to the
failure of organizations to implement a successful KMS to support knowledge worker
productivity and employee satisfaction in the workplace. Further research to include updated KM
performance indicators for the KM Success for organizations with an existing KMS may be
helpful to organizations in several industries worldwide. Efforts to identify key KM performance
indicators may provide helpful information for business leaders to improve the successful
106
retrieval of knowledge assets, improve the knowledge quality of the KMS, and improve
knowledge worker productivity.
107
References
Al Ahbabi, S.A., Singh, S.K., Balasubramanian, S. and Gaur, S.S. (2019). Employee perception
of impact of Knowledge Management processes on public sector performance. Journal of
Knowledge Management, 23(2), 351-373. https://doi.org/10.1108/JKM-08-2017-0348
Al Shamsi, O., & Ajmal, M. (2018). Critical factors for knowledge sharing in technology-
intensive organizations: Evidence from UAE service sector. Journal of Knowledge
Management, 22(2), 384-412. https://doi.org/10.1108/JKM-05-2017-0181
Alaarj, S., Zainal, A. M., & Bustamam, U.S.B.A. (2016). Mediating role of trust on the effects of
Knowledge Management capabilities on organizational performance. Procedia – Social
and Behavioral Sciences, 235(2016), 729–738.
http://doi.org/10.1016/j.sbspro.2016.11.074
Alattas, M., & Kang, K. (2016). The relationship between organizational culture and knowledge
sharing towards business system success. arXiv.org. https://arxiv.org/abs/1606.02460.
Alavi, M., & Leidner, D. E. (2001). Knowledge Management and Knowledge Management
Systems: Conceptual foundations and research issues. MIS Quarterly, 25(1), 107–
136
.
https://doi.org/10.2307/3250961
Allen, M. (2017). In the SAGE Encyclopedia of Communication Research Methods (Vol. 1-4).
SAGE. https://doi.org/10.4
135
/9781483381411
Al-Emran, M., Mezhuyev, V., Kamaludin, A., & Shaalan, K. (2018). The impact of Knowledge
Management processes on information systems: A systematic review. International
Journal of Information Management, 43, 173–187.
https://doi.org/10.1016/j.ijinfomgt.2018.08.001
108
Ali, N., Tretiakov, A., Whiddett, D., & Hunter, I. (2016). Knowledge management systems
success in healthcare: Leadership matters. International Journal of Medical
Informatics, 97, 331–340. https://doi.org/10.1016/j.ijmedinf.2016.11.004
A literature study. Business Management Dynamics, 8(12), 1–12.
https://doi.org/10.1016/j.chb.2016.03.075
Andrawina, L., Soesanto, R. P., Pradana, S. I., & Ramadhan, G. (2018). Measuring Knowledge
Management System implementation readiness. Pertanika Journal of Social Sciences &
Humanities, 26, 219.
Andrews, M., & Smits, S. (2019). Using tacit knowledge exchanges to improve teamwork. ISM
Journal of International Business, 3(1), 15–23. doi.org/10.1201/
1078/43194.18.1.200101
Archibald, T., Sharrock, G., Buckley, J., & Young, S. (2018). Every practitioner a “knowledge
worker”: Promoting evaluative thinking to enhance learning and adaptive management in
international development. New Directions for Evaluation, 2018(158), 73–91.
https://doi.org/10.1002/ev.20323
of R&D spillovers in innovation development. Journal of Security & Sustainability, 9(2),
409–420. https://doi.org/10.9770/jssi.2019.9.2(1)
Bacila, M. L., & Titu, M. A. (2018). Structural capital and organizational culture – an approach
regarding the development of valuable intellectual capital. Review of General
Management, 28(2), 66–74. https://doi.org/10.1007/s11135-015-0183-3
109
Banerjee, P., Gupta, R., & Bates, R. (2017). Influence of organizational learning culture on
knowledge worker’s motivation to transfer training: testing moderating effects of learning
transfer climate. Current Psychology: A Journal for Diverse Perspectives on Diverse
Psychological Issues, 36(3), 606. https://doi.org/10.1007/s12144-016-9449-8
Barnes Reports: Software Publishing Industry (NAICS 51121). (2019). United States
remediation services industry report, 1–196. https://www.barnesreports.com
Becerra-Fernandez, I., Leidner, D. E., & Leidner, D. (2008). Knowledge management: An
evolutionary view. https://ebookcentral.proquest.com
Bloomfield, J., & Fisher, M. J. (2019). Quantitative research design. Journal of the Australasian
Rehabilitation Nurses’ Association (JARNA), 22(2), 27–30.
https://doi.org/10.33235/jarna.22.2.27-30
Briones-Peñalver, A. J., Bernal-Conesa, J. A., & de Nieves Nieto, C. (2019). Knowledge and
innovation management model. Its influence on technology transfer and performance in
Spanish Defense industry. International Entrepreneurship and Management Journal, 1.
https://doi.org/10.1007/s11365-019-00577-6
Byrne, B. (2019). Return on Expectations: An Academic Assessment of a Large KM
Project. Proceedings of the European Conference on Knowledge Management. Journal of
Knowledge Management Application and Practice, 1, 201. Retrieved from
http://www.naturalspublishing.com/
Cannatelli, B., Smith, B., Giudici, A., Jones, J., & Conger, M. (2017). An expanded model of
distributed leadership in organizational knowledge creation. Long Range Planning, 50(5),
582–602. https://doi-org.proxy1.ncu.edu/10.1016/j.lrp.2016.10.002
110
Carlson, A. (1969). Information Systems: Theory and Practice. Accounting Review, 44(4), 852–
854. Retrieved from http://www.jstor.org/stable/243690
Caruso, S. J. (2017). A Foundation for Understanding Knowledge Sharing: Organizational
Culture, Informal Workplace Learning, Performance Support, and Knowledge
Management. Contemporary Issues in Education Research, 10(1), 45–52.
https://doi.org/10.19030/cier.v10i1.9879
Cavanagh, R. (2015). A unified model of student engagement in classroom learning and
classroom learning environment: one measure and one underlying construct. Learning
Environments Research, 18(3), 349–361. https://doi.org/10.1007/s10984-015-9188-z
Centobelli, P., Cerchione, R., & Esposito, E. (2018). How to deal with Knowledge Management
misalignment: a taxonomy based on a 3D fuzzy methodology. Journal of Knowledge
Management, 22(3), 538. https://doi.org/10.1108/JKM-10-2016-0456
Ceptureanu, S.I., Ceptureanu, E. G., Zgubea, F., & Tudorache, A. (2012). Economic Survey on
Knowledge Based Management in Romanian Companies. Review of International
Comparative Management. Revista de Management Comparat International, 13(2), 325–
336. Retrieved from https://www.ceeol.com/
Corney, P. J. (2018). As KM evolves, so will the ISO standard. Business Information
Review, 35(4), 165. https://doi.org/10.1177/0266382118810825
Costas, J., & Karreman, D. (2016). The bored self in knowledge work. Human Relations, 69(1),
61–83. https://doi.org/10.1177/0018726715579736
-Sikora, A., Sikora, J., Rorat, J., & Niemiec, M.(2018). Information
technology tools in corporate Knowledge Management. Ekonomia i Prawo, 1(1), 5.
https://doi.org/10.12775/EiP.2018.00. (2018).
111
De Freitas, V., & Yáber, G. (2018). Information management as a determinant of success in
Knowledge Management Systems. Journal of Business, 10(2), 88.
https://doi.org/10.1590/s1984-296120180034
DeLone, W. H., & McLean, E. R. (1992). Information systems success: The dependent variable.
Information Systems Research, 3(1), 60–95. https://doi.org/10.1287/isre.3.1.60
DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean Model of Information
Systems Success: A Ten-Year Update. Journal of Management Information Systems,
19(4), 9–30. https://doi.org/10.1080/07421222.2003.11045748
DeLone, W. H., & McLean, E. R. (2004). Measuring e-commerce success: Applying the DeLone
& McLean Information Systems Success Model. International Journal of Electronic
Commerce, 9(1), 31–47. https://doi.org/10.1080/10864415.2004.11044317
Demirsoy, A. & Petersen, K. (2018). Semantic Knowledge Management System to support
software engineers: Implementation and static evaluation through interviews at Ericsson.
E-Informatica Software Engineering Journal, 1(1), 237. https://doi.org/10.5277/e-
Inf180110
Dewitt J., Capistrant B., Kohli N., Rosser B., Mitteldorf D., Merengwa E., West W. (2018).
Addressing participant validity in a small internet health survey (The Restore Study):
Protocol and recommendations for survey response validation. Journal of Medical
Internet Research, 20(4), 1. https://doi.org/10.2196/resprot.7655
Dey, T., & Mukhopadhyay, S. (2018). Linkage between contextual factors, knowledge-sharing
mediums, and behaviour: Moderating effect of knowledge-sharing intentions. Knowledge
& Process Management, 25(1), 31–40. https://doi-org.proxy1.ncu.edu/10.1002/kpm.1558
112
Dong, T.-P., Hung, C.-L., & Cheng, N.-C. (2016). Enhancing knowledge sharing intention
through the satisfactory context of continual service of Knowledge Management
Systems. Information Technology & People, 29(4), 807. https://doi.org/10.1108/ITP-09-
2014-0195
Drucker, P. F. (1999). Knowledge-Worker Productivity: The biggest challenge. California
Management Review, 41(2), 79–94. https://doi.org/10.2307/41165987
Duarte, C. H. (2017). Productivity paradoxes revisited: Assessing the relationship between
quality maturity levels and labor productivity in Brazilian software companies. Empirical
Software Engineering: An International Journal, 22(2), 818.
https://doi.org/10.1007/s10664-016-9453-5
Dun&Bradstreet. (2020). Business Directory. D&B. https://www.dnb.com/business-
directory.html
Duvall Antonacopoulos, N. M., & Serin, R. C. (2016). Comprehension of online informed
consents: Can it be improved? Ethics & Behavior, 26(3), 177–193.
https://doi.org/10.1080/10508422.2014.1000458
Ebert, P., & Freibichler, W. (2017). Nudge management: applying behavioural science to
increase knowledge worker productivity. Journal of Organization Design, 6(1), 1.
https://doi.org/10.1186/s41469-017-0014-1
Eltayeb, S. & Kadoda, G. (2017). The impact of Knowledge Management practices on business
strategies and organizational performance. (2017). 2017 Sudan Conference on Computer
Science and Information Technology (SCCSIT), Computer Science and Information
Technology (SCCSIT), 2017 Sudan Conference On, 1.
https://doi.org/10.1109/SCCSIT.2017.8293062
113
Ermine, J.L. (2005). A theoretical and formal model for Knowledge Management Systems. St.
Louis: Federal Reserve Bank of St Louis.
Fakhrulnizam, M., Rusli, A., Marzanah, J., Rozi Nor, H., & Nor Aida, A. R. (2018). Towards the
integration of Quality Management Systems and Knowledge Management System in
Higher Education institution: Development of Q-Edge Kms Model. Acta Informatica
Malaysia, 1(2), 4. https://doi.org/10.26480/aim.02.2018.04.09
Ferolito, D. (2015). Unlocking the hidden value of information | AI-driven intelligent enterprise
search software. https://www.bainsight.com/
Field, A. (2013). Discovering statistics using IBM SPSS statistics. Washington D.C.: Sage
Publications, Inc.
Flick, U. (2018). The SAGE handbook of qualitative data collection. (2018). SAGE Publications,
Ltd. https://doi.org/10.4135/9781526416070
Fortune. (2019). Fortune.com. Retrieved from https://fortune.com/fortune500
García-Alcaraz, J. L., Montalvo, F. J. F., Avelar-Sosa, L., Pérez de la Parte, M. M., Blanco-
Fernández, J., & Jiménez-Macías, E. (2019). The importance of access to information and
knowledge coordination on quality and economic benefits obtained from Six Sigma.
Wireless Networks: The Journal of Mobile Communication, Computation, and
Information, 1. https://doi.org/10.1007/s11276-019-02180-7
Ghodsian, N., Khanifar, H., Yazdani, H., & Dorrani, K. (2017). The effective contributing
factors in knowledge sharing and knowledge transfer among academic staff at Tehran
University of Medical Sciences: A Qualitative Study. Journal of Medical Education,
1(2). https://doi.org/10.22037/jme.v16i2.18038
114
Gunadham, T. & Thammakoranonta, N. (2019). Knowledge Management Systems
Functionalities Enhancement in Practice. In Proceedings of the 5th International
Conference on Frontiers of Educational Technologies (ICFET 2019). Association for
Computing Machinery, 83–88. https://doi.org/10.1145/3338188.3338213
Halawi, L. A. (2005). Knowledge management system success in knowledge-based
organizations: An empirical validation utilizing the DeLone and McLean IS
success model (Publication No. 3169717) [Doctoral dissertation, Northcentral
University]. ProQuest Dissertations Publishing.
Hamdoun, M., Jabbour, C. J., & Ben Othman, H. (2018). Knowledge transfer and organizational
innovation: Impacts of quality and environmental management. Journal of Cleaner
Production, 193, 759–770. https://doi.org/10.1016/j.jclepro.2018.05.031
Hancock, G. R., Mueller, R. O., & Stapleton, L. M. (Eds.). (2010). The reviewer’s guide to
quantitative methods in the social sciences. https://ebookcentral.proquest.com
Hashemi, P., Khadivar, A., & Shamizanjani, M. (2018). Developing a domain ontology for
Knowledge Management technologies. Online Information Review, 42(1), 28. Retrieved
from https://www.emerald.com/insight
Hock, M., Clauss, T., & Schulz, E. (2016). The impact of organizational culture on a firm’s
capability to innovate the business model. R&D Management, 46(3), 433–450.
https://doi-org.proxy1.ncu.edu/10.1111/radm.12153
Hoe, S. (2006), “Tacit knowledge, Nonaka and Takeuchi SECI model and informal knowledge
processes”, International Journal of Organization Theory & Behavior, 9(4), 490-502.
https://doi.org/10.1108/IJOTB-09-04-2006-B002
115
Hughes, J. (2012). SAGE Library of Research Methods: SAGE internet research methods, 1-4.
SAGE Publications Ltd. https://doi.org/10.4135/9781446263327
Iazzolino, G., & Laise, D. (2016). Value creation and sustainability in knowledge-based
strategies. Journal of Intellectual Capital, 17(3), 457–470. https://doi.org/10.1108/JIC-
09-2015-0082
Iazzolino, G., & Laise, D. (2018). Knowledge worker productivity: is it really impossible to
measure it? Measuring Business Excellence, 22(4), 346. https://doi.org/10.1108/MBE-06-
2018-0035
Industries at a Glance: Publishing Industries (except Internet): NAICS 511. (2020). BLS.
https://www.bls.gov/iag/tgs/iag511.htm
Intezari, A., & Gressel, S. (2017). Information and reformation in KM systems: big data and
strategic decision-making. Journal of Knowledge Management, 21(1), 71.
https://doi.org/10.1108/JKM-06-2016-0216
Intezari, A., Taskin, N., & Pauleen, D. J. (2017). Looking beyond knowledge sharing: An
integrative approach to Knowledge Management culture. Journal of Knowledge
Management, 21(2), 492-515. https://doi.org/10.1108/JKM-06-2016-0216
Iskandar, K., Jambak, M. I., Kosala, R., & Prabowo, H. (2017). Current issue on Knowledge
Management System for future research: A systematic literature review. Procedia
Computer Science, 116, 68–80. https://doi.org/10.1016/j.procs.2017.10.011
ISO 30401:2018: Knowledge Management Systems — Requirements. (2018). ISO.
https://www.iso.org/obp/ui/#iso:std:iso:30401:ed-1:v1:en
Jabar, M. A. & Alnatsha, A. S. M. (2014). Knowledge Management System quality: A survey of
knowledge management system quality dimensions, 2014 International Conference on
116
Computer and Information Sciences (ICCOINS), 2014, pp. 1-5.
http://doi.org/10.1109/ICCOINS.2014.6868438.
Jahmani, K., Fadiya, S. O., Abubakar, A. M., & Elrehail, H. (2018). Knowledge content quality,
perceived usefulness, KMS use for sharing and retrieval. VINE: The Journal of
Information & Knowledge Management Systems, 4(4), 470.
https://doi.org/10.1108/VJIKMS-08-2017-0054
Jennex, M. (2017). Re-examining the Jennex Olfman Knowledge Management Success Model.
Proceedings of the 50th Hawaii International Conference on System Sciences.
https://doi.org/10.24251/HICSS.2017.567
Jennex, M. E., & Olfman, L. (2006). A Model of Knowledge Management Success.
International Journal of Knowledge Management, 2(3), 1.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed Methods Research: A Research Paradigm
Whose Time Has Come. Educational Researcher, 33(7), 14-26.
Junior, H. J., Barbosa, C. E., de Lima, Y. O., & de Souza, J. M. (2019). Approaching future-
oriented technology analysis strategies in Knowledge Management processes. 2019 IEEE
23rd International Conference on Computer Supported Cooperative Work in Design
(CSCWD), Computer Supported Cooperative Work in Design (CSCWD), 2019 IEEE
23rd International Conference On, 99–104.
https://doi.org/10.1109/CSCWD.2019.8791886
nonparametric factor analytical methods. Educational Sciences: Theory & Practice,
16(1), 153–171. https://doi.org/10.12738/estp.2016.1.0220
117
Kaplan, D. (2002). Review of structural equation modeling: Foundations and extensions. Journal
of Educational Measurement, 39(2), 183–186. https://doi.org/10.1111/j.1745-
3984.2002.tb01142.x
Karlinsky-Shichor, Y., & Zviran, M. (2016). Factors influencing perceived benefits and user
employee satisfaction in Knowledge Management Systems. Information Systems
Management, 33(1), 55–73. https://doi.org/10.1080/10580530.2016.1117873
Kelley-Quon, L. I. (2018). Surveys: Merging qualitative and quantitative research methods.
Seminars in Pediatric Surgery, 27(6), 361–366.
https://doi.org/10.1053/j.sempedsurg.2018.10.007
Khanal, L., and Raj Poudel, S., 2017. Knowledge management, employee satisfaction, and
performance: Empirical evidence from Nepal. Saudi Journal of Business and
Management Studies, 2(2), pp. 82-91. https://doi.org/10.21276/sjbms.2017.2.2.3
Khasseh, A. A., & Mokhtarpour, R. (2016). Tracing the historical origins of Knowledge
Management issues through referenced publication years spectroscopy (RPYS). Journal
of Knowledge Management, 20(6),
139
3. https://doi.org/10.1108/JKM-01-2016-0019
Kianto, A., Shujahat, M., Hussain, S., Nawaz, F., & Ali, M. (2019). The impact of Knowledge
Management on knowledge worker productivity. Baltic Journal of Management, 14(2),
178. https://doi.org/10.1108/BJM-12-2017-0404
Kimble, C., Vasconcelos, J., & Rocha, Á. (2016). Competence management in knowledge
intensive organizations using consensual knowledge and ontologies. Information Systems
Frontiers, 18(6), 1119. https://doi.org/10.1007/s10796-016-9627-0
118
Koc, T., Kurt, K., & Akbiyik, A. (2019). A brief summary of Knowledge Management domain:
10-year history of the Journal of Knowledge Management. Procedia Computer Science.
https://doi.org/10.1016/j.procs.2019.09.128
Koenig, M. (2018). What is KM? Knowledge management explained. KM World.
https://www.kmworld.com
Koenig, M., & Neveroski, K. (2008). The origins and development of Knowledge Management.
Journal of Information & Knowledge Management, 7(4), 243.
https://doi.org/10.1142/S0219649208002111
Kraemer, H. D. & Blasey, C. (2016). How many subjects? Statistical power analysis in research.
SAGE Publications, Ltd. https://doi.org/10.4135/9781483398761
Krozer, Y. (2017). Innovative offices for smarter cities, including energy use and energy-related
carbon dioxide emissions. Energy, Sustainability and Society, 7(1), 1.
https://doi.org/10.1186/s
137
05-017-0104-5
Kumar, M. (2018). Nature of knowledge technology across Indian organizations. Delhi Business
Review, 19(1), 69. https://doi.org/10.2139/ssrn.3400833
Langefors, B. (1977). Information systems theory. Information Systems, 1(4), 207–219.
https://doi.org/10.1016/0306-4379(77)90009-6
Labafi, S. (2017). Knowledge hiding as an obstacle of innovation in organizations a qualitative
study of software industry. Ad-Minister, 1(30), 131–148. https://doi.org/10.17230/ad-
minister.30.7
Lee, O.-K. D., Choi, B., & Lee, H. (2019). How do Knowledge Management resources and
capabilities pay off in short term and long term? Information & Management, 103166.
https://doi.org/10.1016/j.im.2019.05.001
119
Lee, J. Y., Yoo, S., Lee, Y., Park, S., & Yoon, S. W. (2019). Individual and organisational
factors affecting knowledge workers’ perceptions of the effectiveness of informal
learning: A multilevel analysis. Vocations & Learning, 12(1), 155.
https://doi.org/10.1007/s12186-019-09218-z
Leopold, H. (2019). Social media and corporate innovation management—Eight rules to form an
innovative organisation. E & i Elektrotechnik Und Informationstechnik, 136(3), 241.
https://doi.org/10.1007/s00502-019-0729-5
Levallet N, Chan YE.(2018). Organizational knowledge retention and knowledge loss. Journal
of Knowledge Management. 2019; 23(1):176. https://doi.org/10.1108/JKM-08-2017-0358
Liu, S.-C., Olfman, L., & Ryan, T. (2008). Knowledge Management System success: Empirical
assessment of a theoretical model. IGI Global. https://www.igi-global.com
Mao, H., Liu, S., Zhang, J., & Deng, Z. (2016). Information technology resource, Knowledge
Management capability, and competitive advantage: The moderating role of resource
commitment. International Journal of Information Management, 36(6), 1062–1074.
https://doi.org/10.1016/j.ijinfomgt.2016.07.001
Martinez-Conesa, I., Soto-Acosta, P., & Carayannis, E. G. (2017). On the path towards open
innovation: Assessing the role of Knowledge Management capability and environmental
dynamism in SMEs. Journal of Knowledge Management, 21(3), 553-570.
https://doi.org/10.1108/JKM-09-2016-0403
Martins, V. W. B., Rampasso, I. S., Anholon, R., Quelhas, O. L. G., & Leal Filho, W. (2019).
Knowledge management in the context of sustainability: Literature review and
opportunities for future research. Journal of Cleaner Production, 489–500.
https://doi.org/10.1016/j.jclepro.2019.04.354
120
Medakovic, V., & Maric, B. (2018). A Model of Management Information System for Technical
System Maintenance. Acta Technica Corvininesis – Bulletin of Engineering, 11(3), 85–
90. Retrieved from http://acta.fih.upt.ro
Mellinger, C. D., & Hanson, T. A. (2016). Quantitative Research Methods in Translation and
Interpreting Studies. Taylor & Francis.
Mentzas, G. (1994). Towards intelligent organizational information systems. International
Transactions in Operational Research, 1(2), 169. https://doi.org/10.1016/0969-
6016(94)90018-3
Mousavizadeh, M., Harden, G., Ryan, S., & Windsor, J. (2015). Knowledge management and the
creation of business value. Journal of Computer Information Systems, 55(4), 35-45.
https://doi.org/10.1080/08874417.2015.11645785
Moussa, M., Bright, M., & Varua, M. E. (2017). Investigating knowledge workers’ productivity
using work design theory. International Journal of Productivity & Performance
Management, 66(6), 822–834. https://doi.org/10.1108/IJPPM-08-2016-0161
Mukhopadhyay, S., & Gupta, R. K. (2014). Survey of Qualitative Research Methodology in
Strategy Research and Implication for Indian Researchers. Vision (09722629), 18(2),
109-123. https://doi.org/10.1177/0972262914528437
Muqadas, F., Rehman, M., Aslam, U., & Ur-Rahman, U.-. (2017). Exploring the challenges,
trends, and issues for knowledge sharing. VINE: The Journal of Information &
Knowledge Management Systems, 47(1), 2. https://doi.org/10.1108/VJIKMS-06-2016-
0036
Musyoki, J., Bor, T., & Tanui, T. A. (2017). Effects of Knowledge Management Facilitators and
Mechanisms on Organizational Performance in the Hospitality Industry. CLEAR
121
International Journal of Research in Commerce & Management, 8(11), 37–42.
NAICS Association. (2017). https://data.census.gov/cedsci/
Nikiforova, A., & Bicevska, Z. (2018). Application of LEAN Principles to Improve Business
Processes: A Case Study in Latvian IT Company. Baltic Journal of Modern Computing,
6(3), 247. https://doi.org/10.22364/bjmc.2018.6.3.03
Nikolopoulos, K., & Dana, L. (2017). Social capital formation in EU ICT SMEs: The role played
by the mobility of knowledge workers. European Management Review, 14(4), 409–422.
https://doi.org/10.1111/emre.12113
Nonaka, I. (1991). The knowledge-creating company. Harvard Business Review, 1(6), 96.
https://eric.ed.gov/?id=EJ1126832
Nugroho, E.A.K. Suroso, J. S., & Hanifah, P. (2018). A Study of Knowledge Management
System Acceptance in Halo Bca. JUTEI (Jurnal Terapan Teknologi Informasi), 1(1), 43.
https://doi.org/10.21460/jutei.2018.21.91
Nuñez, M. A., Wendlandt, T. R., & Álvarez, M. T. (2016). The relationship between
organizational culture and Knowledge Management in Tequila companies from Mexico.
International Journal of Advanced Corporate Learning, 9(1), 44–50.
https://doi.org/10.3991/ijac.v9i1.5748
Nurulin, Y., Skvortsova, I., Tukkel, I., & Torkkeli, M. (2019). Role of knowledge in
management of innovation. Resources (2079-9276), 8(2), 87.
https://doi.org/10.3390/resources8020087
122
Nusantara, P. D., Gayatri, N. A. G., & Suhartana, M. (2018). Combining two models of
successful information system measurement. Telkomnika, 16(4), 1793–1800.
https://doi.org/10.12928/TELKOMNIKA.v16i4.7737
O’Dwyer, L. M., & Bernauer, J. A. (2013). Quantitative research for the qualitative researcher.
SAGE Publications.
Occupational Employment Statistics. (2018). May 2018 State Occupational Employment and
Wage Estimates California. BLS.gov. https://www.bls.gov/oes/current/oes_tn.htm#15-
0000
Oladejo, B. F., & Arinola, A. G. (2019). University Knowledge Management System for
decision support for disciplinary procedures using a case-based reasoning technique.
International Journal of Technology, Knowledge & Society: Annual Review, 15(2), 31–
41. https://doi.org/10.18848/1832-3669/CGP/v15i02/31-41
Olaisen, J., & Revang, O. (2018). Exploring the performance of tacit knowledge: How to make
ordinary people deliver extraordinary results in teams. International Journal of
Information Management, 43, 295–304. https://doi.org/10.1016/j.ijinfomgt.2018.08.016
Oparaocha, G. O. (2016). Towards building internal social network architecture that drives
innovation: a social exchange theory perspective. Journal of Knowledge Management,
20(3), 534-556.
http://www.emeraldgrouppublishing.com/products/journals/journals.htm?id=jkm
Orenga-Roglá, S., & Chalmeta, R. (2019). Methodology for the implementation of Knowledge
Management Systems 2.0. Business & Information Systems Engineering, 61(2), 195.
https://doi.org/10.1007/s12599-017-0513-1
123
Oyemomi, O., Liu, S., Neaga, I., Chen, H., & Nakpodia, F. (2018). How cultural impact on
knowledge sharing contributes to organizational performance: Using the fsQCA
approach. Journal of Business Research. doi:10.1016/j.jbusres.2018.02.027
Palvalin, M. (2017). How to measure the impacts of work environment changes on knowledge
work productivity – Validation and improvement of the SmartWoW tool.
Measuring Business Excellence, 21(2), 175–190. https://doi.org/10.1108/MBE-05-2016-
0025
Peng, G., Wang, H., Zhang, H., Zhao, Y., & Johnson, A. L. (2017). A collaborative system for
capturing and reusing in-context design knowledge with an integrated representation
model. Advanced Engineering Informatics, 33, 314–329.
https://doi.org/10.1016/j.aei.2016.12.007
Ping-Ju Wu, S., Straub, D. W., & Liang, T. (2015). How information technology governance
mechanisms and strategic alignment influence organizational performance: Insights from
a matched survey of business and IT managers. MIS Quarterly, 39(2), 497-A7
Knowledge Management practices on employee satisfaction in the Romanian healthcare
system. Amfiteatru Economic, 20(49), 553–566.
https://doi.org/10.24818/EA/2018/49/553
Prusak, R. (2017). The impact of the level of market competition intensity on enterprises
activities in area of intellectual capital. Management (1429-9321), 21(2), 49–61.
https://doi.org/10.1515/manment-2017-0004
Punch, K. F. (2013). Introduction to social research: Quantitative and qualitative Approaches.
SAGE Publications.
124
Putra, R. J., & Putro, B. L. (2017). Knowledge Management System (KMS) readiness level
based on group areas of expertise to improve science education and computer science
quality (cross-fertilization principle) (Case study: Computer science program course
FPMIPA UPI). 2017 3rd International Conference on Science in Information Technology
(ICSITech), Science in Information Technology (ICSITech), 2017 3rd International
Conference, 701–705. Https://doi.org/10.1109/ICSITech.2017.8257203
Ramayani, H., Wang, G., Prabowo, H., Sriwidadi, T., Kodirun, R., & Gunawan, A. (2017).
Improving knowledge assets management (KM) through cloud-based platform in higher
education. 2017 International Conference on Information Management and Technology
(ICIMTech), Information Management and Technology (ICIMTech), 2017 International
Conference, 10–13. https://doi.org/10.1109/ICIMTech.2017.8273502
Rawdin, C. (2018). Calming the ‘perfect ethical storm’: A virtue-based approach to research
ethics. Ethics & Education, 13(3), 346–359.
https://doi.org/10.1080/17449642.2018.1477230
Roldán, J. L., Real, J. C., & Ceballos, S. S. (2018). Antecedents and consequences of Knowledge
Management performance: The role of IT infrastructure. Intangible Capital, 14(4), 518–
535. https://doi-org.proxy1.ncu.edu/10.3926/ic.1074
Santoro, G., Vrontis, D., Thrassou, A., & Dezi, L. (2018). The Internet of Things: Building a
Knowledge Management System for open innovation and Knowledge Management
capacity. Technological Forecasting & Social Change.
https://doi.org/10.1016/j.techfore.2017.02.034
125
Sarnikar, S., & Deokar, A. V. (2017). A design approach for process-based Knowledge
Management Systems. Journal of Knowledge Management, 21(4), 693-717.
https://doi.org/10.1108/JKM-09-2016-0376
Schmitt, U., & Gill, T. G. (2019). Synthesizing design and informing science rationales for
driving a decentralized generative Knowledge Management agenda. Informing Science:
The International Journal of an Emerging Transdiscipline, 1.
https://doi.org/10.28945/4264
Schober, P., Boer, C., & Schwarte, L. A. (2018). Correlation coefficients: Appropriate use and
interpretation. Anesthesia & Analgesia, 126(5), 1763-1768.
https:/www.10.1213/ANE.0000000000002864
Schwartz, D. G. (2014). The disciplines of information: Lessons from the history of the
discipline of medicine. Information Systems Research, 25(2), 205–221.
https://doi.org/10.1287/isre.2014.0516
Shieh, G. (2006). Exact Interval Estimation, Power Calculation, and Sample Size Determination
in Normal Correlation Analysis. Psychometrika, 71(3), 529–540.
Shrafat, F. D. (2018). Examining the factors influencing Knowledge Management System
(KMS) adoption in small and medium enterprises SMEs. Business Process Management
Journal, 24(1), 234–265. https://doi.org/10.1108/BPMJ-10-2016-0221
Shujahat, M., Sousa, M. J., Hussain, S., Nawaz, F., Wang, M., & Umer, M. (2019). Translating
the impact of Knowledge Management processes into knowledge-based innovation: The
neglected and mediating role of knowledge-worker productivity. Journal of Business
Research, 94, 442–450. https://doi.org/10.1016/j.jbusres.2017.11.001
126
Siedlecki, S. L. (2020). Understanding descriptive research designs and methods. Clinical Nurse
Specialist: The Journal for Advanced Nursing Practice, 34(1), 8–12.
https://doi.org/10.1097/NUR.0000000000000493
Slavinsky, J. (2016). Relating Knowledge Management Success Factors to Economic Value
within United States’ Airline Industry Firms (Publication No. 10242942) [Doctoral
dissertation, Northcentral University]. ProQuest Dissertations Publishing.
Standard Occupational Classification. (2018). 2018 Standard Occupational Classification
System. Retrieved from https://www.bls.gov/soc/2018/major_groups.htm
Steinau, S., Marrella, A., Andrews, K., Leotta, F., Mecella, M., & Reichert, M. (2019). DALEC:
a framework for the systematic evaluation of data-centric approaches to process
management software. Software & Systems Modeling, 18(4), 2679.
https://doi.org/10.1007/s10270-018-0695-0
Surawski, B. (2019). Who is a “knowledge worker” – clarifying the meaning of the term through
comparison with synonymous and associated terms. Management (1429-9321), 23(1),
105–133. https://doi.org/10.2478/manment-2019-0007
Sutanto, J., Liu, Y., Grigore, M., & Lemmik, R. (2018). Does knowledge retrieval improve work
efficiency? An investigation under multiple systems use. International Journal of
Information Management, 40, 42–53. https://doi.org/10.1016/j.ijinfomgt.2018.01.009
Tserng, H. P., Lee, M.-H., Hsieh, S.-H., & Liu, H.-L. (2016). The measurement factor of
employee participation for Knowledge Management System in engineering consulting
firms. Journal of Civil Engineering & Management, 22(2), 154–167.
https://doi.org/10.3846/13923730.2014.897963
Turriago-Hoyos, A., Thoene, U., & Arjoon, S. (2016). Knowledge workers and virtues in
127
Peter Drucker’s management theory. SAGE Open, 6(1), 1–9.
https://doi.org//10.1177/2158244016639631
Vanian, J. (2016). Businesses Expected to Spend $2.7 Trillion on I.T. by 2020.
https://www.fortune.com.
Vaske, J. J., Beaman, J., & Sponarski, C. C. (2017). Rethinking Internal Consistency in
Cronbach’s Alpha. Leisure Sciences, 39(2), 163–173.
http://dx.doi.org/10.1080/01490400.2015.1127189
Vehovar, V. & Manfreda, K. (2017). Overview: Online surveys. In N. Fielding R. Lee & G.
Blank. The SAGE Handbook of online research methods, 143-161. SAGE Publications
Ltd. https://doi.org/10.4135/9781473957992.n9
Venters, W. (2010). Knowledge management technology-in-practice: A social constructionist
analysis of the introduction and use of Knowledge Management Systems. Knowledge
Management Research & Practice, 8(2), 161-172. https://doi.org/10.1057/kmrp.2010.8
intelligence as promoter of knowledge transfer in multinational companies. Journal of
Business Research, 94, 367–377. https://doi.org/10.1016/j.jbusres.2018.01.033
Volkova, V. N., & Chernyi, Y. Y. (2018). Application of Systems Theory Laws for Investigating
Information Security Problems. Automatic Control and Computer Sciences, 52(8), 1164.
https://doi.org/10.3103/s0146411618080424
Vuori, V., Helander, N., & Okkonen, J. (2019). Digitalization in knowledge work: the dream of
enhanced performance. Cognition, Technology & Work, 21(2), 237.
https://doi.org/10.1007/s10111-019-00543-w
128
Wang, M.-H., & Yang, T.-Y. (2016). Investigating the success of Knowledge Management: An
empirical study of small- and medium-sized enterprises. Asia Pacific Management
Review, 21(2), 79–91. https://doi.org/10.1016/j.apmrv.2015.12.003
Wang, Y.-M., & Wang, Y.-C. (2016). Determinants of firms’ Knowledge Management System
implementation: An empirical study. Computers in Human Behavior, 64, 829–842.
https://doi.org/10.1016/j.chb.2016.07.055
Wei, Y., & Miraglia, S. (2017). Organizational culture and knowledge transfer in project-based
organizations: Theoretical insights from a Chinese construction firm. International
Journal of Project Management, 35(4), 571–585.
https://doi.org/10.1016/j.ijproman.2017.02.010
Wetcher-Hendricks, D. (2011). Analyzing quantitative data : An introduction for social
researchers. Retrieved from https://ebookcentral.proquest.com
Wilson, J. P., & Campbell, L. (2016). Developing a Knowledge Management policy for ISO
9001: 2015. Journal of Knowledge Management, 20(4), 829–844.
https://doi.org/10.1108/JKM-11-2015-0472
Wipawayangkool, K., & Teng, J. T. C. (2016). Paths to tacit knowledge sharing: knowledge
internalization and individual-task-technology fit. Knowledge Management Research &
Practice, 14(3), 309. https://doi.org/10.1057/kmrp.2014.33
Wright, K. (2017). Researching Internet-based populations: Advantages and disadvantages of
online survey research, online questionnaire authoring software packages, and Web
survey services. Journal of Computer-Mediated Communication, 10(3).
https://doi.org/1083-6101.2005.tb00259.x
129
Wu, J.-H., & Wang, Y.-M. (2006). Measuring KMS success: A specification of the DeLone and
McLean’s model. Information & Management, 43(6), 728–739.
https://doi.org/10.1016/j.im.2006.05.002
Xiaojun, Z. (2017). Knowledge Management System use and job performance: A multilevel
contingency model. MIS Quarterly, 41(3), 811-A5.
https://doi.org/10.25300/MISQ/2017/41.3.07
Yuqing Yan, A., & Zhang, Z. A. (2019). Knowledge transfer, sharing, and management system
based on causality for requirements change management. Information System and Data
Mining, 201. https://doi.org/10.1145/3325917.3325947
Zaim, H., Muhammed, S., & Tarim, M. (2019). Relationship between Knowledge Management
processes and performance: critical role of knowledge utilization in
organizations. Knowledge Management Research & Practice, 17(1), 24.
https://doi.org/10.1080/14778238.2018.1538669
Zamir, Z. (2019). The Impact of Knowledge Capture and Knowledge Sharing on Learning,
Adaptability, Job Satisfaction, and Staying Intention: A Study of the Banking Industry in
Bangladesh. International Journal of Entrepreneurial Knowledge, 7(1), 46–64.
https://doi.org/10.2478/IJEK-2019-0004
Zhang, X. (2017). Knowledge Management System use and job performance: A multilevel
contingency model. MIS Quarterly, 41(3), 811-A5.
https://doi.org/10.25300/MISQ/2017/41.3.07
Zhang, X., & Venkatesh, V. (2017). A Nomological network of Knowledge Management System
use: Antecedents and consequences. MIS Quarterly, 41(4), 1275–1306.
130
Zimmermann, A., Oshri, I., Lioliou, E., & Gerbasi, A. (2018). Sourcing in or out: Implications
for social capital and knowledge sharing. Journal of Strategic Information Systems,
27(1), 82–100. https://doi.org/10.1016/j.jsis.2017.05.001
Zuama, R. A., Hudin J.M., Puspitasari, D., Hermaliani, E.H., & Riana, D. (2017). Quality
dimensions of Delone-McLean model to measure students’ accounting computer
employee satisfaction: An empirical test on accounting system information. (2017). 2017
5th International Conference on Cyber and IT Service Management (CITSM).
131
Appendices
132
Appendix A
Figure 3
G*Power Statistics Analysis
133
Appendix B
Figure 4
Halawi’s (2005) KMS survey permission request/approval
134
Appendix C
Figure 5
Halawi KMS Survey Questions (2005)
135
136
137
138
139
140
Appendix D
Table 1
Research Study Variables (Jennex, 2017; Jennex & Olfman, 2006)
Variable name Variable type Variable scale Value
KMS knowledge quality Independent Variable
(IV)
Interval 1 – 7
Knowledge worker productivity Dependent Variable
(DV)
Interval 1 – 7
Employee satisfaction Dependent Variable
(DV)
Interval 1 – 7
141
Appendix E
Table 4
KMS Success Survey Participants by Gender
Gender Frequency Percent Valid
percent
Cumulative
percent
Male 132 85.7 85.7 85.7
Female 22 14.3 14.3 100
Total 154 100 100
Table 5
KMS Success Survey Participants by
Age
Participant
Age
N Minimum Maximum Mean
Age 146 20 68 41.0685
Total Reported 146
142
Table 6
KMS Success Survey Years Employed
Years employed Frequency Percent Valid
percent
Cumulative
percent
Less Than One Year (1) 1 0.6 0.6 0.6
One to Three Years (2) 8 5.2 5.2 5.8
Three to Five Years (3) 39 25.3 25.3 31.2
Five to Ten Years (4) 49 31.8 31.8 63
Greater Than Ten Years
(5)
57 37 37 100
Total 154 100 100
143
Table 7
KMS Success Survey Years of KMS Usage
Years usage Frequency Percent Valid
percent
Cumulative
percent
Less Than One Year (1) 2 1.3 1.3 1.3
One to Two Years (2) 23 14.9 14.9 16.2
Two to Three Years (3) 32 20.8 20.8 37
Three to Five Years (4) 46 29.9 29.9 66.9
Greater Than Five Years
(5)
51 33.1 33.1 100
Total 154 100 100
Table 8
KMS Success Survey Education Level
Education level Frequency Percent Valid
percent
Cumulative
percent
Some or No College Degree (1) 2 1.3 1.3 1.3
Associates Degree (2) 2 1.3 1.3 2.6
Bachelor’s Degree (3) 39 25.3 25.5 28.1
Master’s Degree or beyond (4) 110 71.4 71.9 100
Total 153 99.4 100
Unreported 1 0.6
Grand Total 154 100
144
Table 9
KMS Success Survey Employment Position
Position Frequency Percent Valid
percent
Cumulative
percent
Non-Mgmt. Professional (1) 9 5.8 5.8 5.8
Supervisor/Manager (2) 45 29.2 29.2 35.1
Sr. Manager/Director (3) 63 40.9 40.9 76
Executive (4) 8 5.2 5.2 81.2
President/CEO/COO/CIO/CKO(5) 29 18.8 18.8 100
Total 154 100 100
145
Table 10
KMS Success Survey Industry Employed
Industry Frequency Percent Valid
percent
Cumulative
percent
Banking/Financial Servicesl (1) 4 2.6 2.6 2.6
Government (4) 2 1.3 1.3 3.9
Health Care (5) 2 1.3 1.3 5.2
Information Technology (6) 137 89 89 94.2
Industrial/Manufacturing (7) 2 1.3 1.3 95.5
Wholesale/Retail (9) 1 0.6 0.6 96.1
Other/Specify (10) 6 3.9 3.9 100
Total 154 100 100
146
Appendix F
Figure 6
IRB Approval Letter
147
ProQuest Number:
INFORMATION TO ALL USERS
The quality and completeness of this reproduction is dependent on the quality
and completeness of the copy made available to ProQuest.
Distributed by ProQuest LLC ( ).
Copyright of the Dissertation is held by the Author unless otherwise noted.
This work may be used in accordance with the terms of the Creative Commons license
or other rights statement, as indicated in the copyright statement or in the metadata
associated with this work. Unless otherwise specified in the copyright statement
or the metadata, all rights are reserved by the copyright holder.
This work is protected against unauthorized copying under Title 17,
United States Code and other applicable copyright laws.
Microform Edition where available © ProQuest LLC. No reproduction or digitization
of the Microform Edition is authorized without permission of ProQuest LLC.
ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106 – 1346 USA
28716266
2021
Developing a Cloud Computing Risk Assessment Instrument for Small to Medium Sized
Enterprises: A Qualitative Case Study using a Delphi Technique
Dissertation Manuscript
Submitted to Northcentral University
School of Business
in Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF PHILOSOPHY IN BUSINESS ADMINISTRATION
by
MATTHEW WHITMAN MEERSMAN
San Diego, California
May 2019
ProQuest Number:
All rights reserved
INFORMATION TO ALL USERS
The quality of this reproduction is dependent upon the quality of the copy submitted.
In the unlikely event that the author did not send a complete manuscript
and there are missing pages, these will be noted. Also, if material had to be removed,
a note will indicate the deletion.
ProQuest
Published by ProQuest LLC ( ). Copyright of the Dissertation is held by the Author.
All rights reserved.
This work is protected against unauthorized copying under Title 17, United States Code
Microform Edition © ProQuest LLC.
ProQuest LLC.
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106 – 1346
13901056
13901056
2019
Approval Page
By
Approved by the Doctoral Committee:
Dissertation Chair: INSERT NAME Degree Held Date
Committee Member: INSERT NAME Degree Held Date
Committee Member: INSERT NAME Degree Held Date
DocuSign Envelope ID: 23DAA0D2-1F9A-4BB4-87A7-E419FCD9F58D
06/20/2019 | 10:46:15 PDT
Sherri Braxton
Sc.D.
Marie Bakari
06/20/2019 | 11:44:54 MST
DBA, MBA
Ph.D. 06/20/2019 | 10:44:31 MST
Garrett Smiley
Developing a Cloud Computing Risk Assessment Instrument for Small to Medium Sized
Enterprises: A Qualitative Case Study using a Delphi Technique
MATTHEW WHITMAN MEERSMAN
ii
Abstract
Organizations’ leaders lack security expertise when moving IT operations to the Cloud. Almost
all small to medium enterprises need to solve this problem. This qualitative research study using
a Delphi technique, addressed the problem that there is no commonly understood and adopted
best practice standard for small to medium sized enterprises (SMEs) on how to specifically
assess security risks relating to the Cloud. Experienced risk experts from a local chapter of an
international organization in Washington, D.C. responded to three sets of questions through
SurveyMonkey. This research study shows the current state of Cloud risk assessments has not
kept up with the changes in business and IT brought on by Cloud computing. Almost all
respondents indicated that staffing and non-technical concerns affect SMEs that are transitioning
to the Cloud. SMEs are not properly risk assessing and auditing Cloud computing environments.
The primary mitigation recommendation for SMEs is outsourcing or using third parties on some
or all of the SMEs’ Cloud computing. As the use of Cloud computing is becoming an inflection
point for SMEs, SMEs’ decision makers need more research on both the process and the results.
Cloud computing is fundamentally changing the daily pace of business, and this research study
shows that SMEs have not kept pace. SMEs would greatly benefit from more research to help
them adopt Cloud computing securely and effectively.
iii
Acknowledgements
Thank you to my parents, whose erudition set high standards, and my family for their
support. Thank you to Katherine Scott whose help was essential during my research phase.
Thank you to my dissertation chair Dr. Garrett Smiley for his expert guidance and weekly phone
calls. Thank you to my committee subject matter expert Dr. Sherri Braxton. Thank you to my
committee academic reader Dr. Marie Bakari.
iv
Table of Contents
Chapter 1: Introduction ……………………………………………………………………………………………………. 1
Statement of the Problem ……………………………………………………………………………………………. 3
Purpose of the Study ………………………………………………………………………………………………….. 4
Theoretical Conceptual Framework ……………………………………………………………………………… 5
Nature of the Study ……………………………………………………………………………………………………. 7
Research Questions ……………………………………………………………………………………………………. 9
Significance of the Study ………………………………………………………………………………………….. 10
Definitions of Key Terms …………………………………………………………………………………………. 11
Summary ………………………………………………………………………………………………………………… 11
Chapter 2: Literature Review ………………………………………………………………………………………….. 13
Introduction …………………………………………………………………………………………………………….. 13
Documentation ………………………………………………………………………………………………………… 15
Theoretical Framework …………………………………………………………………………………………….. 16
Themes …………………………………………………………………………………………………………………… 21
Chapter 3: Research Method …………………………………………………………………………………………… 59
Research Methodology and Design ……………………………………………………………………………. 60
Population and Sample …………………………………………………………………………………………….. 63
Materials and Instrumentation …………………………………………………………………………………… 64
Study Procedures …………………………………………………………………………………………………….. 65
Data Collection and Analysis…………………………………………………………………………………….. 67
Assumptions ……………………………………………………………………………………………………………. 69
Limitations ……………………………………………………………………………………………………………… 71
Delimitations …………………………………………………………………………………………………………… 72
Ethical Assurances …………………………………………………………………………………………………… 73
Summary ………………………………………………………………………………………………………………… 74
Chapter 4: Findings ……………………………………………………………………………………………………….. 76
Trustworthiness of the Data ………………………………………………………………………………………. 76
Results ……………………………………………………………………………………………………………………. 82
Evaluation of the Findings ………………………………………………………………………………………. 109
Summary ………………………………………………………………………………………………………………. 112
Chapter 5: Implications, Recommendations, and Conclusions ………………………………………….. 114
Implications…………………………………………………………………………………………………………… 117
v
Recommendations for Practice ………………………………………………………………………………… 121
Recommendations for Future Research …………………………………………………………………….. 122
Conclusions …………………………………………………………………………………………………………… 123
References ………………………………………………………………………………………………………………….. 125
Appendix A Survey Answers Aggregate …………………………………………………………………… 163
Appendix B Survey one individual answers ………………………………………………………………. 207
Appendix C Survey two individual answers ………………………………………………………………. 253
Appendix D Survey Three Individual Answers ………………………………………………………….. 322
Appendix E Validated survey instrument ………………………………………………………………….. 352
vi
List of Figures
Figure 1. Creswell data analysis spiral. ……………………………………………………………………………. 68
vii
List of Tables
Table 1 Survey 1. Q9: What IT related frameworks (partially or completely) do you see SMEs
adopting?……………………………………………………………………………………….86
Table 2 Survey 1, Q 13: What Cloud security configuration baselines have you seen used by
SMEs? …………………………………………………………………………………………………………………………. 87
Table 3 Survey 3, Q14: Have you seen Cloud risk assessments change other previously
completed SME risk assessments in the ways listed below? ……………………………………………….. 89
Table 4 Survey 3, Q15: How do you see SMEs changing their risk and audit teams to adapt to
Cloud environments? …………………………………………………………………………………………………….. 90
Table 5 Survey 1, Q15: What non-technical areas of concern do you see when SMEs are
contemplating Cloud adoption?………………………………………………………………………………………..92
Table 6 Survey 1, Q17: What IT (non-security) areas of concern do you see for SMEs as they
adopt Cloud computing? ………………………………………………………………………………………………… 93
Table 7 Survey 2, Q8: When starting to plan a transition to a Cloud environment, what have you
seen SMEs start with before risk assessments or collections of requirements? ……………………… 94
Table 8 Survey 3, Q10: When assessing risk of Cloud environments, do you see SMEs changing
their process in the ways listed below? …………………………………………………………………………….. 96
Table 9 Survey 1, Q11: What IT security control standards do you see SMEs using? ……………. 98
Table 10 Survey 1, Q 19: What Cloud security controls do you see SMEs adopting? ………….. 101
Table 11 Survey 2, Q10: Which portions of a transition to a Cloud environment have you seen
recommended to be outsourced? ……………………………………………………………………………………. 104
Table 12 Survey 3, Q 13: Once controls have been identified for the SME’s environment, what
effect do they have on existing SME IT controls? ……………………………………………………………. 106
viii
Table 13 Survey 2, Q10: Which portions of a transition to a Cloud environment have you seen
recommended to be outsourced? ……………………………………………………………………………………. 109
1
Chapter 1: Introduction
The information technology (IT) industry is a recent addition to the world of business but
has become a critical part of every enterprise level organization in this century (Jeganathan,
2017). Information technology has become critical to any business over a certain size and has
become a significant part of most large business’ IT budgets (Ring, 2015). IT is a field of truly
breathtaking change, and industry standards for storing company data (Al-Ruithe, Benkhelifa, &
Hameed, 2016), reaching out to customers (Alassafi, Alharthi, Walters, & Wills, 2017), and
creating new value from company assets (Khan & Al-Yasiri, 2016) can change on a frequent
basis. No part of IT is safe from rapid change, including fundamental concepts such as what a
computer is, and where a company should put the computer (Bayramusta & Nasir, 2016; Funk,
2015). A change that is gathering speed in the IT industry that has the potential to disrupt almost
every daily task for cybersecurity professionals is Cloud computing.
Cloud computing at its simplest is using someone else’s computers to perform the
organization’s IT operations (Rao & Selvamani, 2015). Instead of using hardware that the
organization has bought and takes care of, the organization uses virtual servers in an
environment usually built and maintained by another company. Some versions of private Clouds
use the organization’s own hardware, but that is rare, and is more of a semantical redefinition of
using virtual servers in house (Molken & van Wilkins, 2017). Cloud computing as commonly
understood in the IT industry, is a virtual computing environment hosted, maintained, and at
least partially secured by a third party (Lian, Yen, & Wang, 2014). The use of Cloud computing
by an organization allows its IT team to focus on more strategic and business aligned activities
and removes physical maintenance and other activities related to owning and caring for computer
servers (Rahul, & Arman, 2017). By adopting Cloud computing, an organization can remove
2
high cost items from its capital expenditure (CapEx) budget such as server rooms with expensive
cooling and huge electrical needs and replace them with more cost-efficient operating expenses
(OpEx) budget items such as virtual server rentals from Cloud service providers (CSP)
(Mangiuc, 2017).
Although cost savings are a primary driver for many organizations that adopt Cloud
computing, the reasons why there is need for more research on Cloud computing is much more
interesting (Bayramusta & Nasir, 2016). Cloud computing is rapidly changing the fundamental
underlying foundational paradigms of IT and how businesses use IT (Chatman, 2010;
Hosseinian-Far, Ramachandran, Sarwar, 2017; Wang, Wood, Abdul-Rahman, & Lee, 2016).
Even though IT is a new and fast-moving field compared to most parts of business, Cloud
computing is an even more powerful change agent. Many careers in IT are changing or
disappearing because of Cloud computing (Khan, Nicho, & Takruri, 2016). Basic ideas in IT
such as what describes a server most accurately, or how an IT business process comes to fruition,
change very rapidly because of Cloud computing (El Makkaoui, Ezzati, Beni-Hssane, &
Motamed, 2016). The primary constraint that has prevented some organizations from adopting
Cloud computing is the security of the organization’s data in the Cloud (Ring, 2015). One of the
most important business processes that organizations can use to evaluate and resolve Cloud
security concerns is risk assessment and analysis (Damenu & Balakrishma, 2015; Viehmann,
2014; Weintraub & Cohen, 2016). Researching ways organizations successfully address these
security concerns is an important contribution to the industry and the academic field of research
(Alassafi, Alharthi, Walters, & Wills, 2017; Al-Ruithe, Benkhelifa, & Hameed, 2016; Ray,
2016).
3
There are multiple academic and practical approaches to securing Cloud computing
environments (Aljawarneh, Alawneh, & Jaradat, 2016; Casola, De Benedictis, Rak, & Rio, 2016;
Choi & Lee, 2015). Many solutions rely on organizations trusting the CSP (Trapero, Modic,
Stopar, Taha, & Sur, 2017). Organizations that do not trust their CSPs or other vendors to
provide complete Cloud security either by mandate or by common business practice (Cayirci,
Garaga, Santana de Oliveira, & Roudier, 2016; Preeti, Runni, & Manjula, 2016) need a different
approach. Organizations that modify or adapt their current business practices or the Cloud
computing environment the organization uses, may provide a useful template for future academic
research into Cloud security.
Statement of the Problem
The researcher used this study to address the problem that there is no commonly
understood and adopted best practice standard for small to medium sized enterprises (SMEs) on
how to specifically assess security risks relating to the Cloud (Coppolino, D’Antonio, Mazzeo, &
Romano, 2016; El Makkaoui, Ezzati, Beni-Hssane, & Motamed, 2016; Raza, Rashid, & Awan,
2017). Existing business processes and industry frameworks follow a design created for larger,
on premise environments and as such, do not effectively address Cloud computing security
concerns for smaller organizations (El-Gazzar, Hustad, & Olsen, 2016; Gleeson & Walden,
2016). To date, larger organizations are relying on Cloud Service Providers (CSPs) to supply
their own security tools (Jaatun, Pearson, Gittler, Leenes, & Niezen, 2015), Service Level
Agreements (SLAs) (Barrow, Kumari, & Manjula, 2016), better Cloud services customer
education (Paxton, 2016), new data classification laws and regulations (Gleeson & Walden,
2016), comprehensive security and management frameworks (Raza, Rashid, & Awan, 2017), and
a myriad of tailored solutions to specific problems, leaving a baseline that could be leveraged by
4
lesser sized organizations for Cloud security risk assessment to be addressed by others. SMEs
have slowed their adoption of Cloud computing even though Cloud computing improves many
business processes and offers significant savings (Issa, Abdallah, & Muhammad, 2014; Khan,
Nicho, & Takruri, 2016). There are several interesting academic solutions to Cloud security
issues published (Alasaffi, Alharthi, Walters, & Wills, 2017; Al-Ruithe, Benkhelifa, & Hameed,
2016; Gleeson & Walden, 2016), but rarely a case of a solution adopted in the corporate world
that a smaller sized organization could leverage (Rebello, Mellado, Fernandez-Medina, &
Mouratidis, 2014); these studies do not build upon current industry best practices and are not
viable for most organizations. Organizations that are adopting Cloud computing are facing these
new security issues without a consensus solution that all organizations, regardless of their size
can use (Coppolino, D’Antonio, Mazzeo, & Romano, 2016; El Makkaoui, Ezzati, Beni-Hssane,
& Motamed, 2016; Raza, Rashid, & Awan, 2017).
Purpose of the Study
The purpose of this qualitative case study-based research study was to discover an
underlying framework for research in SME risk analysis for Cloud computing and to create a
validated instrument that SMEs can use to assess their risk in Cloud adoption. Unlike SMEs, the
vast majority of medium to large enterprises use risk assessments before adopting new
computing environments (Cayirci, Garaga, Santana de Oliveira, & Roudier, 2016; Jouini &
Rabai, 2016). SMEs need a process or validated instrument such as a risk assessment to
determine if they should move to the Cloud (Bildosola, Rio-Belver, Cilleruelo, & Garechana,
2015; Carcary, Doherty, & Conway, 2014; Hasheela, Smolander, & Mufeti, 2016). Research
shows that SMEs using a risk-based approach have not reached a consensus on how to identify
and address Cloud security risks (Carcary, Doherty, Conway, & McLaughlin, 2014; Kumar,
5
Samalia, & Verma, 2017). The target population for this research study was risk professionals
that were either employed by or contracted to SMEs to perform Cloud security risk assessments.
Members of a Washington D.C. area chapter of a professional risk association represent
the target population and the sample population consisted of those chapter members that respond
to the web survey. The population of risk experts in the chapter is approximately three thousand
members. This research study used a web survey predicated on a Delphi technique with two or
three rounds as the method to gather data from a group of IT risk experts based on membership
in the Washington D.C. area local chapter of ISACA. ISACA is a global organization of
information systems auditors and risk assessors. ISACA publishes information security
governance and risk assessment guides including COBIT (ISACA GWDC, 2018). Recent
studies using a Delphi technique show useful results with sample population sizes of forty or
fewer participants (Choi & Lee, 2015; El-Gazzar, Hustad, & Olsen, 2016; Johnson, 2009). With
a potential population of approximately three thousand and a participation rate as low as one per
cent, the resulting sample size of close to thirty experts was more than enough to complete the
research study and return useful results. Using case study procedures, this research study
addressed the concerns of SMEs looking at Cloud adoption (Glaser, 2016). Case study review
and analysis was the procedure inductively used to create a thesis from the survey results. A risk
assessment instrument for SMEs follows from the generated theory.
Theoretical Conceptual Framework
The underlying conceptual framework for this research study is that SMEs have different
needs than large enterprises regarding Cloud computing environment risk assessments, and
academic research has not answered those needs yet. Cloud environment risk assessments create
new problems for organizations of all sizes (Assante, Castro, Hamburg, & Martin, 2016;
6
Hussain, Hussain, Hussain, Damiani, & Chang, 2017). While SMEs of different regions may
have distinct issues (Bildosola, Rio-Belver, Cilleruelo, & Garechana, 2015; Carcary, Doherty, &
Conway, 2014; Kumar, Samalia, & Verma, 2014), a common factor for all SMEs is that SMEs
have greater challenges solving these problems as SMEs have less resources and fewer skilled
employees to resolve Cloud computing risk assessment issues (Assante, Castro, Hamburg, &
Martin, 2016; Carcary, Doherty, & Conway, 2014; Chiregi & Navimipour, 2017). A common
factor for all SMEs is that many academic solutions to properly risk assessing Cloud computing
environments require highly technical knowledge and skills (Hasheela, Smolander, & Mufeti,
2016; Kumar, Samalia, & Verma, 2017) or large budgets (Mayadunne & Park, 2016; Moyo &
Loock, 2016). While properly deployed large enterprise solutions may show positive results if
used by SMEs, the cost in both employee skills and financial outlay prohibit these solutions in
the real world (Bildosoia, Rio-Belver, Cillerueio, & Garechana, 2015; Carcary, Doherty,
Conway, & McLaughlin, 2014). Appropriate solutions for large multi-national enterprises are not
the correct answer for SMEs.
The smaller budgets of SMEs require Cloud environment risk assessments that not only
require smaller initial cost or capital expenditures (CapEx), but also require small to no
continuing costs or operating expenses (OpEx). Academic solutions to Cloud environment risk
assessment needs that cannot exist in the smaller confines of the SME world, do not contribute to
the field of SME Cloud computing environment risk assessments. While foundational research
that indicates SMEs need workable Cloud computing environment risk assessments is present
(Lacity & Reynolds, 2014; Phaphoom, Wang, Samuel, Helmer, & Abrahamsson, 2015;
Priyadarshinee, Raut, Jha, & Kamble, 2017), the next step of addressing the need is not yet
current (Trapero, Modic, Stopar, Taha, & Suri, 2017; Wang, Wood, Abdul-Rahman, & Lee,
7
2016). Addressing the need of SMEs to properly assess the risk of using Cloud computing
environments is a new field of research hampered by factors unique to the SME paradigm. Due
to the reduced levels of skill and budget amounts available to SMEs, the research for SME Cloud
computing risk assessments must focus on simpler ways to assess and reduce the risk of Cloud
computing adoption. This research study attempted to supply a workable risk assessment
instrument based on research that other researchers can extend and amplify going forward.
Nature of the Study
A qualitative approach using a case study methodology is the best solution as the theory
relating to a successful Cloud computing risk assessment does not yet exist. A problem solved by
using a qualitative case study approach is that the subject population of risk-based Cloud
computing research experts were able to respond with qualitative data but not quantitative
numbers to avoid compromising their organization’s security (Beauchamp, 2015). Even though
the audience for this research study commonly works in quantitative ways, the audience will find
value in qualitative case study research on this topic (Liu, Chan, & Ran, 2016). A truly
experimental design for this research study was not feasible as the topic is not a general one, and
a random selection of the population would not possess the requisite knowledge needed to
address the topic of Cloud security. Even narrowing the population to that of cybersecurity
engineers, Cloud computing expertise is in short supply, and Cloud computing security even
more so (Khan, Nicho, & Takruri, 2016).
Other qualitative research approaches lack the flexibility needed to discover and refine a
new theory and a validated tool for SMEs from existing industry standards. Ethnographic,
phenomenological, or narrative approaches do not work for this research study based on data
regarding Cloud computing risk assessments. A grounded theory approach was not appropriate
8
for several reasons, but most importantly because of the security of the participating subjects’
organizations. Cybersecurity professionals do not commonly discuss specifics in their fight to
keep their organizations secure, which limits a researcher’s ability to ask questions and develop
connections during a coding process (Rebello, Mellado, Fernandez-Medina, & Mouratidis,
2014). If details of the cybersecurity professionals’ organizations’ defenses are common
knowledge, then their adversaries gain an advantage. This organizational security concern is also
the primary factor regarding ethical concerns for this research study.
The proposed case study research study design includes use of the Delphi technique. The
RAND Corporation created the Delphi technique to facilitate the collation and distillation of
expert opinions in a field (Hsu & Sanford, 2007). The Delphi technique seems well designed for
the Internet with current researchers using “eDelphi” based web surveys (Gill, Leslie, Grech, &
Latour, 2013). Although Cloud security is a very new field, some illustrative research is evident
in the field using Delphi techniques (Choi & Lee, 2015; El-Gazzar, Hustad, & Olsen, 2016; Liu,
Chan, & Ran, 2016). These studies use the Delphi technique in different manners, but similar to
this proposed research study, all rely on electronic communications with groups of experts.
Although the research topic is very specialized compared to some business research
topics, the topic is broad enough to select a sample population large enough to meet the needs of
this research study. Using a Delphi technique with two or three rounds of surveys further reduces
the appropriate number of subjects needed for this research study, although Delphi techniques
have subject based issues also. For this research study, ethical concerns focused on protecting the
anonymity of the respondents and their organizations. There is no consensus in the industry or
the academic research field on what works for Cloud security (Ring, 2015), so a case study of
9
even a very successful effort to secure Cloud computing would not have much external validity
or replication interest.
The data collection procedures and data analysis followed accepted Delphi technique
practices. As academic research is still exploring the current state of Cloud security, the
informative value of industry-based practitioners’ tools and techniques is very high (Lynn,
VanDer Werff, Hunt, & Healy, 2016). The researcher employed a Delphi technique to gather and
distill the current frameworks, categories, controls, and recommended mitigation used by a
representative expert group of risk professionals. Although the experts in this research study are
not academics, the results still add to the field of research because the field is so new.
Research Questions
The questions used in the survey elicited details on how organizations adopt current risk
assessment processes and other business procedures used to approve new IT computing
environments, and/or what new paradigms organizations are using. Additionally, by using a
Delphi technique this research study was able to identify if there is a consensus on what works
and if there are processes and procedures that have shown success.
RQ1. What are the current frameworks being leveraged in Cloud specific risk
assessments?
RQ2. What are the primary categories of concern presently being addressed in Cloud
specific risk assessments?
RQ3. What are the commonly used and tailored security controls in Cloud specific risk
assessments?
RQ4. What are the commonly recommended mitigations in Cloud specific risk
assessments?
10
Significance of the Study
This research study is important because it contributes to the academic field of Cloud
computing security risk assessment solutions, and to the security of SMEs adopting Cloud
computing. The answers to the research questions posed by this study have importance to both
SMEs and the academic field. IT risk assessment solutions for on-premises computing have
achieved maturity from an academic viewpoint, but those frameworks and existing IT solutions
are not adequate for Cloud based computing (Coppolino, D’Antonio, Mazzeo, & Romano, 2016;
El Makkaoui, Ezzati, Beni-Hssane, & Motamed, 2016; Raza, Rashid, & Awan, 2017). Even
though researchers have proposed several academic frameworks to improve risk assessments for
Cloud based computing for large enterprises, this research study is one of the first to provide
evidence of which frameworks experts in the field are starting to use. SMEs can use the results of
this research study to better secure their Cloud computing environments. While many non-viable
frameworks are interesting thought experiments and contribute to the body of academic
knowledge, researchers that are interested in real world feedback on proposed Cloud computing
risk assessments solutions will be able to use this research study to provide direction for SMEs.
Researchers can also use this research study as an example of effective Delphi techniques for
research in the Cloud security field.
Industry based professionals will find significance in this research study as it provides
guidance on what expert practitioners are using. The IT field has issues with sharing solutions.
This is because any publication describing organizations security solutions can provide
information that bad actors could use to find weaknesses in the organization’s security (Jouini &
Rabai, 2016). Through this research study, the researcher provides pertinent and accurate
11
information regarding Cloud computing risk assessments that may not be available by other
means.
Definitions of Key Terms
Cloud Data Storage: Cloud storage is a way for organizations to store data on the
Internet as a service instead of using on-premises storage systems. Cloud data storage key
features include standard Cloud features such as just-in-time capacity, and no up-front CapEx
expenditures (Phaphoom, Wang, Samuel, Helmer, & Abrahamsson, 2015)
Cloud Service Provider (CSP): The current term for an organization that offers services
to customers from a remote data center connected via the Internet. Major public CSPs include
AWS, Google, and Microsoft. (Cayirci, Garaga, Santana de Oliveira, & Roudier, 2016).
Security as a Service (SECaaS): Security as a service (SECaaS) is where a third party
provides an organization’s IS needs. Due to the structure of most CSPs, SECaaS is becoming
increasingly important. (Aldorisio, 2018)
Service Level Agreement (SLA): A service-level agreement (SLA) defines the level of
service an organization expects from a third party. Cloud SLAs are becoming an option for an
organization’s security requirements. (Overby, Greiner, & Paul, 2017)
Summary
Research in IT fields has a hard time keeping up with real world applications due to the
high rate of change in the industry. This issue increases almost exponentially when one focuses
on Cloud computing security. Many research studies have taken the first step and identified risk-
based organizational concerns with Cloud computing security, and a few authors have proposed
novel solutions. Evidence of what organizations are doing to satisfy their risk requirements in
Cloud computing adoption is not clear. A qualitative multiple case study-based research study
12
adds to the body of knowledge and further the research in the field of Cloud security, as the field
is not at the point where consensus of what are the successful frameworks and theories has
emerged. Through this study the researcher addressed the problem that researchers cannot
identify commonly understood and adopted best practice standards for small to medium sized
enterprises (SMEs) on how to specifically assess security risks relating to the Cloud. The
creation of a new framework for academic treatment of SME Cloud computing risk, and the
creation of a validated instrument that SMEs can use to assess their risk in Cloud adoption were
the reasons for this research study. A survey with a Delphi technique of industry experts is a
good step to resolving those concerns of SMEs adopting Cloud computing and is a good step to
increasing the knowledge in the academic field of Cloud security. The guiding framework of this
research study is that the risk assessment process for Cloud computing environments is
fundamentally different for SMEs than large enterprises and the primary data collection
instrument is a web survey of risk experts with a Delphi technique. The population for this
research study has constraints on security information that they can share. A qualitative case
study-based theory approach was the only way for a researcher to gather the data needed to
propose a unifying theory for SME Cloud computing risk assessment. As the state of research in
SME risk assessment tools and procedures is still in the nascent stages, case study-based theory
is the correct framework to advance the field and to create a validated instrument for SME Cloud
computing risk assessments.
13
Chapter 2: Literature Review
Introduction
This was a qualitative case study-based theory-based research project using a Delphi
technique. The researcher’s goal for this study was two-fold. The first goal was to contribute to
the academic field of research regarding SMEs adoption of Cloud computing. The second goal
was to create a validated risk instrument for use by SMEs to evaluate and assess the various risks
involved in adopting Cloud computing environments. The researcher with this research study
used several rounds of a web-based survey instrument to question a population sample of risk
subject matter experts as defined by membership in the Greater Washington D.C. chapter of
ISACA (ISACA GWDC, 2018). This research study was a direct result of the literature review
which revealed the lack of academically sound solutions for SMEs to resolve Cloud security
issues (Assante, Castro, Hamburg, & Martin, 2016; Mayadunne & Park, 2016; Rasheed, 2014).
New theory must spring from collected data, a very good fit for qualitative case study-based
theory approaches (Glaser, 2016; Mustonen-Ollila, Lehto, & Huhtinen, (2018; Wiesche, Jurisch,
Yetton, & Krcmar, 2017).
The search strategies used in researching this study included the use of Northcentral’s
online library, Google Scholar, and other online resources. The searches using Northcentral
library included all possible databases including the Institute of electrical and electronics
engineers (IEEE), the Association for computing machinery (ACM), the ProQuest computing
database, and the Gale information science and technology collection. Almost all resources are
from peer reviewed journals or conference papers that are less than five years old. Including
conference papers was a necessary decision because the general field of Cloud computing is
new, and specific sub-fields more so. Even though most conference papers are short and
14
primarily descriptive, conference papers are the leading edge of published work in a field and
very important for a field undergoing as rapid a growth as Cloud computing. As any aspect of
Cloud computing is a very young academic field, there are very few seminal articles in the field,
so none will appear in the literature review (Bayramusta & Nasir, 2016; Chang, Y., Chang, P.,
Xu, Q., Ho, K., Halim, 2016; Lian, Yen, & Wang, 2014). Perhaps the closest to seminal in Cloud
research is the U.S. Department of Commerce, National Institute of Standards (NIST) special
publications regarding Cloud computing found in this research study’s list of citations (Li & Li,
2018; Mell & Grance, 2011).
This chapter includes the literature review which starts with the broad theme of
cybersecurity and Cloud computing and moves towards the more specific topic of Cloud
computing risk assessments. In this literature review, the researcher continues by addressing
several themes related to Cloud security, including themes such as improving Cloud security
with technical approaches such as encryption and new tool designs for Cloud computing
environments. The next theme is business process approaches to securing Cloud computing
environments including Security as a service (SecaaS) and service level agreements (SLAs)
including security service level agreements (SecSLAs). Cloud computing is a new field of
academic research, but there are signs of consensus among researchers on several topics (Al-
Anzi, Yadav, & Soni, 2014; Rao & Selvamani, 2015).
Once the researcher presents sufficient detail regarding Cloud computing environments
and the security tools and techniques needed to secure Cloud computing environments, the
literature review will move to a SME related discussion. Research on SMEs and Cloud
computing tend to follow predictable patterns. There is an abundance of SME and Cloud
literature based on the geographical location of the SMEs (Carcary, Doherty, & Conway, 2014;
15
Carcari, Doherty, Conway, & McLaughlin, 2014; Hasheela, Smolander, & Mufeti, 2016; Kumar,
Samalia, & Verma, 2017; Qian, Baharudin, & Kanaan-Jeebna, 2016). Another popular topic
regarding SMEs and adopting Cloud computing environments focuses on the difference between
SMEs and large enterprises (Bildosola, Rio-Belver, Cilleruelo, & Garechana, 2015; Gastermann,
Stopper, Kossik, & Katalinic, 2014; Llave, 2017; Mayadunne & Park, 2016; Seethamraju, 2014).
The best research on the differences between SMEs and large enterprises adopting Cloud
computing environments, however, points to SMEs needing their own risk assessment processes
for adopting Cloud computing environments (Senarathna, Yeoh, Warren, & Salzman, 2016;
Vasiljeva, Shaikhulina, Kreslins, 2017; Wakunuma & Masika, 2017).
With the themes following the SME discussion the researcher focused on risk, risk
assessments, and Cloud computing risk assessments. The research regarding risk assessments has
a much longer history than research regarding Cloud computing adoption both in industry and in
academia (Alcantara & Melgar, 2016; Vijayakumar & Arun, 2017). Perhaps because of the well
understood and researched nature of risk assessments, there is a tendency to equate what works
with on-premise risk assessments with Cloud risk assessments, but the best of recent research
shows that solutions must change with several possible directions (Brender & Markov, 2013;
Rittle, Czerwinski, & Sullivan; Togan, 2015).
Documentation
The search terms used for this literature review generally followed the presentation of
themes. Parameters were bounded by peer reviewed journals, and 2014 or later for all searches.
The first set of searches using all available databases were Cloud, Cloud computing. Cloud
service provider, Cloud adoption. The next set of searches focused on Cloud security, improving
Cloud security, Cloud security solutions, Cloud security problems. Following searches drilled
16
down into specific types of Cloud security solutions including (Cloud OR virtual) security AND
encryption, hypervisor, network, software, or framework. Based on the previous searches, Cloud
SLAs, Cloud SecSLAs, Cloud SecaaS, were the next set of searches. Moving on to SMEs
included searches such as (SME OR small medium) Cloud, Cloud security, Cloud adoption,
Cloud security solutions. Risk based searches included Cloud risk, Cloud adoption risk, Cloud
risk assessment, SME Cloud risk solutions.
Theoretical Framework
The underlying conceptual framework for this research study is that SMEs have different
needs than large enterprises regarding Cloud computing environment risk assessments, and
academic research has not answered those needs yet (Haimes, Horowitz, Guo, Andrijcic, &
Bogdanor, 2015; Gritzalis, Iseppi, Mylonas, & Stavrou, 2018; Moncayo, & Montenegro, 2016).
This is not a giant leap into the unknown, but more of a much-needed enhancement and
specialized focus of the current business risk paradigm. Practitioners have done work on
adapting large enterprise risk assessment paradigms for Cloud computing environments but even
so, the current conceptual framework of business risk assessments does not work for SMEs
evaluating Cloud computing environments (Mahmood, Shevtshenko, Karaulova, & Otto, 2018;
Priyadarshinee, Raut, Jha, & Kamble, 2017; Wang & He, 2014). SMEs trying to use standard
risk assessment processes based on previous academic research will make incorrect decisions
regarding the risk posed by adopting Cloud computing (Kritikos & Massonet, 2016; Vasiljeva,
Shaikhulina, & Kreslins, 2017). This can lead to SMEs making poor financial decisions and
costing SMEs a competitive advantage in their field (Al-Isma’ili, Li, Shen, & He, 2016;
Fernando & Fernando, 2014). Researchers trying to use current on-premises paradigms to guide
their research efforts in SME risk assessments regarding Cloud computing adoption will not
17
discover useful validated theory. A refined conceptual framework focused on Cloud computing
risks and threats is needed both for use by SMEs in the business world and for academic
researchers trying to discover how SMEs can best use Cloud computing environments (Ali,
Warren, & Mathiassen, 2017; Islam, Fenz, Weippl, & Mouratidis, 2017).
Risk assessments are a standard business process for organizations making significant
changes to their operations (Mahmood, Shevtshenko, Karaulova, & Otto, 2018; Weintraub &
Cohen, 2016). The codification of risk assessments as part of an organization’s decision-making
process have been going on since business practices started (Lanz, 2015; Szadeczky, 2016). As
new opportunities and environments including IT present themselves, organizations assess the
potential risk of changing the way the organization does business (Djuraev & Umirzakov, 2016;
Gupta, Gupta, Majumdar, & Rathore, 2016). The incredible growth of IT use in business has led
to mature and well accepted standard frameworks for addressing IT risk for large enterprises
(Atkinson, & Aucoin, 2015; Lanz, 2015; Lawson, Muriel, & Sanders, 2017).
IT risk assessments are an integral part of major changes in large enterprise’s IT
operations and there are several large-scale industry created IT risk and operations frameworks
(Calvo-Manzano, Lema-Moreta, Arcilla-Cobián, & Rubio-Sánchez, 2015; Moncayo, &
Montenegro, 2016). COBIT and ITIL as examples of industry-based frameworks, work very well
for large enterprises but are too much work for a typical SME (Devos, & Van de Ginste, 2015;
Oktadini & Surendro, 2014). SMEs have previously used ad-hoc risk assessment tools and
smaller scale solutions when evaluating IT risk (Erturk, 2017; Gastermann, Stopper, Kossik, &
Katalinic, 2014). SME risk tools are fairly well adapted to evaluating on-premises computing
risks but do not address important Cloud computing environment issues and threats (Aljawarneh,
Alawneh, & Jaradat, 2016; Lalev, 2017).
18
Cloud computing is still in its infancy and many SMEs that perform IT risk assessments
are trying to use their current on-premises IT risk assessment frameworks to evaluate whether or
not Cloud computing will save costs or provide a competitive advantage (Erturk, 2017;
Gastermann, Stopper, Kossik, & Katalinic, 2014). Their existing frameworks do not accurately
capture or describe the advantages and disadvantages of Cloud computing environments
(Goettlemann, Dahman, Gateau, Dubois, & Godart, 2014; Lai & Leu, 2015). SMEs also do not
have the capacity to adopt large enterprise risk frameworks that can create modifications for use
in evaluating Cloud computing environments (Devos, & Van de Ginste, 2015; Oktadini &
Surendro, 2014). For example, a central tenet of current SME on-premise IT risk assessments is
that an organization’s data can only be truly secure on the organization’s own IT infrastructure
(Chiregi & Navimipour, 2017). If this is a core principle of an SME’s IT security policy, then the
SME cannot use Cloud computing, as the definition of Cloud computing is that of using someone
else’s hardware. By enhancing and focusing the existing SME risk assessment framework to
properly identify Cloud computing environment risks, this research study adds to the academic
field of SME Cloud risk assessment frameworks and create a validated risk instrument that
SMEs may use.
A large number of SMEs are not IT focused and have used their existing business
processes related to risk instead of trying to adapt current large enterprise IT risk frameworks
(Haimes, Horowitz, Guo, Andrijcic, & Bogdanor, 2015; Tisdale, 2016). Some SMEs trying to
avoid using the current frameworks of on-premises IT risk assessments by following non-IT
business practices and mitigating or transferring IT risk to a third party using managed IT
solutions or managed security services providers (MSSP) (Chen & Zu, 2017; Torkura, Sukmana,
Cheng, & Meinel, 2017). Even for those SMEs that do not use IT risk assessment frameworks or
19
transfer risk by using MSSPs, Cloud computing is a very attractive alternative primarily due to
lower costs (Bildosola, Río-Belver, Cilleruelo, & Garechana,2015; Lacity & Reynolds, 2013).
The SMEs not using the dominant on-premises IT risk assessment frameworks will be able to
use this research study’s framework in a manner similar to the SMEs current business practices
with third party IT providers and MSSPs (Chen & Zu, 2017; Torkura, Sukmana, Cheng, &
Meinel, 2017). These SMEs will be able to adapt to Cloud computing using the framework of
this research study by identifying important SLAs and SecSLAs that can be understood by non-
IT risk processes and business practices, thereby realizing the promise of lower costs when using
Cloud computing (Luna, Suri, Iorga, & Karmel, 2015; Oktadini & Surendro, 2014; Na & Huh,
2014).
Organizations of all sizes from all income level countries would benefit from a Cloud
environment risk assessment, although the research study focuses on high-income country-based
businesses (Assante, Castro, Hamburg, & Martin, 2016; Hussain, Hussain, Hussain, Damiani, &
Chang, 2017). Although SMEs based in all income level countries have distinct issues
(Bildosola, Rio-Belver, Cilleruelo, & Garechana, 2015; Carcary, Doherty, & Conway, 2014;
Kumar, Samalia, & Verma, 2014), every SME can face greater challenges solving these
problems as SMEs have less resources and fewer skilled employees to resolve Cloud computing
risk assessment issues than large scale enterprises (Assante, Castro, Hamburg, & Martin, 2016;
Carcary, Doherty, & Conway, 2014; Chiregi & Navimipour, 2017). A common factor for all
SMEs is that many academic solutions to properly risk assessing Cloud computing environments
require highly technical knowledge and skills that are not found in an SME’s IT staff (Hasheela,
Smolander, & Mufeti, 2016; Kumar, Samalia, & Verma, 2017) or large enterprise sized budgets
(Mayadunne & Park, 2016; Moyo & Loock, 2016). While a SME could use a good large
20
enterprise solution, the cost in both employee skills and financial outlay prohibit these solutions
in the real world (Bildosoia, Rio-Belver, Cillerueio, & Garechana, 2015; Carcary, Doherty,
Conway, & McLaughlin, 2014). This literature review illuminates how appropriate solutions for
large multi-national enterprises are rarely the correct answer for SMEs in most IT solutions, and
certainly not in Cloud security risk assessment activities.
The smaller budgets of SMEs require Cloud environment risk assessments that not only
require smaller initial cost or capital expenditures (CapEx), but also require small or controllable
continuing costs or operating expenses (OpEx). Academic solutions to Cloud environment risk
assessment needs do not always contribute to the field of SME Cloud computing environment
risk assessments. While foundational research that indicates SMEs need workable Cloud
computing environment risk assessments is present (Lacity & Reynolds, 2014; Phaphoom,
Wang, Samuel, Helmer, & Abrahamsson, 2015; Priyadarshinee, Raut, Jha, & Kamble, 2017), the
next step of addressing the need is not yet current, and the research study helps address that gap
(Trapero, Modic, Stopar, Taha, & Suri, 2017; Wang, Wood, Abdul-Rahman, & Lee, 2016).
Addressing the need of SMEs to properly assess the risk of using Cloud computing environments
is a new field of research hampered by factors unique to the SME paradigm. Due to the reduced
levels of skill and budget amounts available to SMEs (Bieber, Grivas, & Giovanoli, 2015;
Hanclova, Rozehnal, Ministr, & Tvridkova, 2015; Ndiaye, Razak, Nagayev, & Ng, 2018), the
research for SME Cloud computing risk assessments must focus on simpler ways to assess and
reduce the risk of Cloud computing adoption. This research study attempts to supply a workable
risk assessment instrument based on research that other researchers can extend and amplify
going forward.
21
Themes
Before focusing on specific themes, it is important to describe the fundamental constructs
used in this research study and in the literature review. In almost all research papers cited in this
literature review, researchers base their definition of Cloud computing on the NIST description
(Bayramusta & Nasir, 2016; Chang, Chang, Xu, Ho, & Halim, 2016; Doherty, Carcary, &
Conway, 2015; Dhingra & Rai, 2016; Tang & Liu, 2015; Zissis & Lekkas, 2012). No matter the
journal or the authors’ academic associations, the NIST definition of Cloud computing is the
standard. This makes perfect sense as the NIST definition for Cloud and Cloud security is as
close to foundational concepts as Cloud research has (Alijawarneh, Alawneh, & Jaradat, 2016;
Coppolino, D’Antonio, Mazzeo, & Romano, 2017; Demirkhan & Goul, 2011; Hallabi &
Bellaiche, 2018). While chapter one of the thesis presents most of these definitions, it is
important to define the terms here as all discussions in the cited research papers and the analysis
and synthesis in this literature review are based on a common understanding of what Cloud
computing is. The NIST definition of Cloud computing includes the following essential
characteristics:
On-demand self-service: The organization has full control of the virtual server or service
creation without intervention by the CSP (Mell & Grance, 2011).
Broad network access: The organization and its customers can reach the virtual server or
services over the Internet without proprietary tools provided by the CSP (Mell & Grance, 2011).
Resource pooling: Although the organization may be able to specify which data center the CSP
uses, the CSP uses shared resources in a multi-tenant model. The CSP allocates the resources
desired by the organization in a manner in which the CPS chooses the physical hardware and
networking systems (Mell & Grance, 2011).
22
Rapid elasticity: The organization may increase, change, or decrease the virtual server or
services in rapid and almost unlimited fashion (Mell & Grance, 2011).
Measured service: the CSP bills resource usage in units of time, and provides the organization
with the ability to monitor and control resource usage (Mell & Grance, 2011).
The NIST definition of Cloud computing include the concept of service models:
Software as a service (SaaS): The CSP provides access to an application for the organization
and its customers. The CSP is responsible for all aspects of the underlying virtual and physical
hardware with the exception of some user related settings (Mell & Grance, 2011).
Platform as a service (PaaS): PaaS is a step lower into control of the Cloud environment where
the organization is able to deploy its own applications and control most aspects of the application
without having to manage and control the underlying virtual server and network environment
(Mell & Grance, 2011).
Infrastructure as a service (IaaS): IaaS is the lowest level of control and responsibility
provided to the organization by the CSP. The organization can control the servers’ operating
systems, size, and speed, storage characteristics, networking, and accessibility by its customers
to the organization’s servers and applications (Mell & Grance, 2011).
The NIST definition of Cloud computing includes the concept of deployment models:
Private Cloud: a private Cloud is one provisioned for use by a single organization. The
organization or the CSP may manage the physical location and control over the hardware (Mell
& Grance, 2011).
Community Cloud: a community Cloud gets created for use by a group of organizations sharing
similar concerns or requirements. As with a private Cloud, one of the organization or the CSP
may manage the physical location and control over the hardware (Mell & Grance, 2011).
23
Public Cloud: the CSP allows any organization to provision virtual servers or services in its
Cloud computing environment. Popular examples in North America include Amazon web
services (AWS), Google Cloud, and Microsoft Azure (Mell & Grance, 2011).
Hybrid Cloud: a combination of two or more Cloud environments tied together through
technology to allow virtual servers and services to move from one Cloud environment to another
(Mell & Grance, 2011).
Risk is an important concept to define for this literature review also. Risk as used in the
cited articles and this literature review is not as specific as the Cloud definitions. One commonly
defines risk in IT using an equation as shorthand. Risk = probability x impact / cost (Choo, 2014;
Jouini & Rabai, 2016). This literature review and the research project slightly expands this
definition to include any risk that a risk assessment tool can measure. This definition of risk
includes the risk of insufficient management buy in and the risk of the organization’s technical
staff not having the requisite Cloud knowledge and skills (Ahmed & Abraham, 2013; Luna, Suri,
Iorga, & Karmel, 2015; Shao, Cao, & Cheng, 2014).
A definition of SME as used in this literature review and the research study is important
as none of the research papers included in this literature review specify what a small enterprise or
medium enterprise is. Based on a careful reading of the research papers cited in this literature
review; the European commission definition of SMEs seems accurate. The EC definition of
SMEs is are small (15 million or less in annual revenue) to medium (60 million or less in annual
revenue) sized enterprises that are not subsidiaries of large enterprises or governments, or wholly
or partially supported by large enterprises or governments (Papdopoulos, Rikana, Alajaasko,
Salah-Eddine, Airaksinen, & Loumaranta, 2018). While Gartner may consider anything under a
billion dollars a year as a medium enterprise (Gartner, 2014), that is a very American centric
24
viewpoint and as usual Gartner is wrong, and there is no evidence that the authors of the cited
research papers relied on Gartner’s definitions of SMEs. The U.S. small business administration
(SBA) has a very complicated spreadsheet showing what types of SMEs are in a large number of
different industries (SBA, 2017). The SBA SME spreadsheet overview tab has over one
thousand entries alone and is very confusing to use, so there is no real number to gain from a
hypothetical use of the SMB spreadsheet, and there is no evidence that any of the cited research
articles used the spreadsheet when calculating enterprise sizes.
Cybersecurity
The broader field within which this research study resides is cybersecurity, or the security
of computing and IT environments. Earlier studies use the term information security or IS, but
cybersecurity has become the dominant phrase for the subject of protecting computing and IT
environments (Bojanc & Jerman-Blazic, 2008; Rahman & Choo, 2015; Tang, Wang, Yang, &
Wang, 2014).Significant cybersecurity research is only a few decades old, and as expected in
such a young field, paradigms and foundational studies are still not clear (Anand, Ryoo, & Kim,
2015; Ho, Booth, & Ocasio-Velasquez, 2017; Paxton, 2016). Rapid change is a central theme of
this literature review and a strong reason why case study-based theory and a Delphi technique
are so important for the research study. Cybersecurity is a smaller part of the information
technology (IT) field, and Cloud cybersecurity focuses on the security of Cloud based computing
environments. The IT field as a whole, is a new one compared to many business-related fields.
Cybersecurity is even newer with rapidly changing paradigms that require new research on an
ever-increasing pace (Fidler, 2017).
Cloud computing at its simplest and perhaps most disparagingly is virtual computing on
someone else’s hardware (Daylami, 2015). This simple and accurate, yet limiting definition
25
highlights the fundamental changes needed in cybersecurity. For the past thirty years,
cybersecurity has grown from a physical model to a model that is more abstract (Rabai, Jouini,
Aissa, & Mili, 2013). The first major cybersecurity paradigm of perimeter defense used physical
similes such as fences with barbed wire and armed guards, or locked server room doors and
secure server rack cages to describe the cybersecurity process. The reliance on physical examples
and thought processes based on physical security has always hampered Cybersecurity but
continued through succeeding paradigm shifts such as defense in depth, and “assume your
network is compromised” (Kichen, 2017). Cloud risk assessments suffer from the same limited
view based on physical properties (Iqbal, Kiah, Dhaghighi, Hussain, Khan, Khan, & Choo, 2016;
Mishra, Pilli, Varadharajan, & Tupakula, 2017; Rao & Selvamani, 2015). Dominant paradigms
based on physical models are obsolete when considering Cloud computing environments and so
are risk assessments for Cloud computing risk assessment. SMEs need the creation of new tools
and frameworks to stay current with Cloud security.
The rapid growth of Cloud computing is drastically changing cybersecurity (Dhingra &
Rai, 2016; Khalil, Khreishah, & Azeem, 2014; Singh, Jeong, & Park, 2016). As more SMEs
adopt Cloud computing, the industry needs to adapt to SME specific concerns, and one of the
results of the research study is a validated risk instrument that SMEs can freely use. The
cybersecurity industry already has put some effort into solutions for SMEs but SMEs need more
research (Chiregi & Navimpour, 2017; Vasiljeva, Shaikhulina, & Kreslins, 2017). In some ways
Cloud computing is similar to other fields where solutions created for large enterprises can be
pared down to SME size, such as using SaaS solutions (Assante, Castro, Hamburg, & Martin,
2016; Bildosola, Rio-Belver, Cilleruelo, & Garechana, 2015; Wang & He, 2014) or micro virtual
computing concepts such as Docker or containers (Salapura & Harper, 2018; Sun, Nanda, &
26
Jaeger, 2015). In other ways, cybersecurity has failed SMEs and Cloud computing may be a way
to avoid those mistakes (Ali, Khan, & Vasilakos, 2015; Assante, Castro, Hamburg, & Martin,
2016; Hasheela, Smolander, & Mufeti, 2016). The validated risk instrument that is one goal of
the research study will attempt to advance cybersecurity for SMEs in Cloud computing
environments and the other goal of contributing to the new academic field of Cloud computing
security will do the same.
Cloud Computing
The adoption of Cloud computing has become a business inflection point for all
organizations of any size, necessitating new research and new industry solutions for both IT and
cybersecurity (Chen, Ta-Tao, & Kazuo, 2016). Industry and the academic field are having to
react to an amazingly fast rate of change in Cloud computing industry and research topics
(Ramachandra, Iftikhar, & Khan, 2017). Although even the broader field of IT and computing in
general are very fast-moving fields of research, research in Cloud computing has to proceed at a
breakneck pace to keep up with current industry practice (Tang & Liu, 2015; Tunc & Lin, 2015).
Even with researchers working as fast as they can to describe and create theory on Cloud
computing, there are difficulties speed alone will not solve. Researchers following accepted and
respected models of academic research are falling behind in predicting and describing current
Cloud computing in the real world as it is hard to get data regarding organizations’ security
policies and procedures (Hart, 2016; Ardagna, Asal, Damiani, & Vu, 2015). Very few
organizations are willing to expose the inner workings of their IT and Cloud computing
operations (Quigley, Burns, & Stallard, 2015; Sherman et al, 2018). CSPs are even more reticent
for several reasons (Hare, 2016; Elvy, 2018). The design of survey instruments for the research
study take this into effect and avoid asking for potentially compromising information. The
27
inability to ask pertinent demographic question in the survey instruments for the research study is
another indicator of why a qualitative case study-based theory research study using a Delphi
approach with three rounds is the appropriate approach to answering the research questions.
Although research discussed under improving Cloud security theme is starting to
recommend that CSPs differentiate themselves by offering different security options (Preeti,
Runni, & Manjula, 2016; Coppolino, D’Antonio, Mazzeo, & Romano, 2017; Paxton, 2016),
there is not much evidence that CSPs are doing so (Ring, 2015; Singh, Jeong, & Park, 2016). In a
following section discussing SLAs and SecSLAs more detail on the potential security options
will be discussed but interest in these options is currently limited to smaller CSPs which have
their own drawbacks for SMEs or the solutions are not likely to be adopted (Elsayed &
Zulkernine, 2016; Furfaro, Gallo, Garro, Sacca, & Tundis, 2016; Lee, Kim, Kim, & Kim, 2017).
Potential solutions or paradigm shifts in Cloud computing security are not particularly relevant to
SMEs until SMEs understand what they need from a Cloud computing environment and what
those security risks are (Huang, Shen, Zhang, & Luo,2015; Karras, 2017; Lanz, 2015). SMEs
are not generally up to date on Cloud computing security or what potentials solutions are their
best choices (Hasheela, Smolander, & Mufeti, 2016; Hussain, Hussain, Hussain, Damiani, &
Chang, 2017), although there is some research showing that SMEs do not rank their ignorance as
a primary factor (Qian, Baharudin, & Kanaan-Jeebna, 2016; Mohabbattlab, von der Heidt, &
Mohabbattlab, 2014).
At the beginning of the Cloud computing adoption wave, many organizations and
researchers tried to evaluate Cloud computing environments based on their existing on-premises
computing security paradigms (Koualti, 2016; Barrow, Kumari, & Manjula, 2016; Paxton,
2016). To some extent, this is still the case in academic research. Researchers have done
28
foundational work on the general topic of Cloud computing and Cloud computing adoption, and
the research on these topics has reached the point where meta analyses are possible.
Bayranmusta and Nasir present an excellent example of a literature review of two hundred and
thirty-six papers with their article titled “A fad or future of IT? A comprehensive literature
review on the Cloud computing research: (Bayranmusta & Nasir, 2016). Many other papers
cover similar ground regarding Cloud computing adoption. Some papers present qualitative
survey results (Oliveira, Thomas, & Espadanal, 2014) some comprehensive quantitative results
(Phaphoom, Wang, Samuel, Helmer, & Abrahamsson, 2015), and some papers use advanced
techniques such as neural networks (Priyadarshinee, Raut, Jha, & Gardas, 2017). The best papers
covering Cloud computing adoption approach seminal status in this very recent field by
presenting quantitative results in clear and convincing fashion (Ray, 2016; Wang, Wood, Abdul-
Rahman, & Lee, 2016). Some of the papers in the field that do not rise to the level of seminal
works remain interesting as their research focuses on specific parts of the business or academic
world, or particular parts of the world (Chang, Chang, Xu, Ho, & Halim, 2016; Lian, Yen, &
Wang, 2014; Musungwini, Mugoniwa, Furusa, & Rebanowako, 2016). Because many research
articles describing Cloud computing have achieved the goal of repeatable findings, and find no
significant new findings, it is time to research more specific topics in Cloud computing. The
more important of these topics are under separate theme headings in this literature review.
Cloud Security General
One of the ways researchers have been advancing the field past that of the original
researchers discussed above, is to focus on Cloud security issues, rather than just stating Cloud
security is a concern (Raza, Rashid, & Awan, 2017; Coppolino, D’Antonio, Mazzeo, & Romano,
2016; Khalil, Khreishah, & Azeem, 2014). This is a natural outgrowth of the results found by the
29
researchers cited in the general Cloud theme of this literature review. A clear and consistent
result from those studies is that Cloud security is a major concern for organizations moving to
the Cloud (El Makkaoui, Ezzati, Beni-hssane, & Motamed, 2016; Anand, Ryoo, & Kim, 2015;
Dhingra & Rai, 2016). Most researchers writing about Cloud security use standard US
Department of Commerce National Institute of Standards (NIST) Cloud and Cloud security
definitions (Ring, 2015; Khan & Al-Yasiri, 2016; Charif & Awad, 2016). Most researchers focus
on broader public Cloud security concerns (Dhingra & Rai, 2016; Khalil, Khreishah, & Azeem,
2014), although research based on particular parts of the world such as China tend to use the
private Cloud paradigm (Lian, Yen, & Wang, 2014). Even though most researchers use the same
definitions of Cloud, risk and SMEs, there are differences between researchers in what they think
is the correct approach to improving Cloud security.
Interesting differences start to appear when looking at research articles regarding general
Cloud security articles based on the journals that published the articles. Articles from more
computer science and engineering-based journals such as the Journal of Network and Computer
Applications or Annals of Telecommunications tend to focus on lower levels of the Cloud
computing stack such as hypervisor escapes or VMware based virtual computing environments
(Mishra, Pilli, Varadharajan, & Tupakula, 2017; Raza, Rashid, & Awan, 2017). Some would
argue, however, that there is a small distinction between virtual computing and Cloud
computing, with Cloud computing a subset of virtual computing (Khan & Al-Yasiri, 2016; Iqbal
et al., 2016). Cloud computing environments, especially software as a service (SaaS) Cloud
computing environments, however, have diverged so much from hypervisor-based virtualization
that researchers have to consider the topics separately (Huang & Shen, 2015; Goode, Lin, Tsai,
& Jiang, 2014). Journals not focused on computer scientists or engineers such as the Journal of
30
International Technology & Information Management, or the Journal of Business Continuity &
Emergency Planning tend to produce articles more focused on private or public Cloud computing
environments offered by Cloud Service providers (CSP) such as Microsoft Azure or Amazon
web services (AWS) (Ferdinand, 2015; Srinivasan, 2013).
Even with the different approaches based on the academic field that the researcher
focuses on, there do not seem to be many Cloud security improvements that are specific to SMEs
(Aljawarneh, Alawneh, & Jaradat, 2016; Assante, Castro, Hamburg, & Martin, 2016; Hasheela,
Smolander, & Mufeti, 2016). The technical solutions require large well-trained cybersecurity
teams that have budgets to support mathematicians or encryption subject matter experts (Feng &
Yin, 2014; Khamsemanan, Ostrovsky, & Skeith, 2016; Mengxi, Peng, & HaoMiao, 2016). The
research articles based on governance or adoption of industry frameworks such as COBIT, ITIL,
or ISO 2700 clearly do not focus on anything but large enterprises (Barton, Tejay, Lane, &
Terrell, 2016; Tisdale, 2016; Vijayakumar, & Arun, 2017). Compliance focused research articles
do not seem to scale down to SME budgets and staff either (Bahrami, Malvankar, Budhraja,
Kundu, Singhal, & Kundu, 2017; Kalaiprasath, Elankavi, & Udayakumar, 2017; Yimam &
Fernandez, 2016) even literature reviews with large numbers of articles (Halabi & Bellaiche,
2017). Improving Cloud security with SLAs and SecSLAs would seem to be the most promising
avenue for a large-scale solution working for SMEs, however, there are two main issues with this
approach. The first issue is that the currently proposed solutions are still too complicated for the
cybersecurity staff of a normal SME (Demirkan & Goul, 2011; Na & Huh, 2014; Oktadini &
Surendro, 2014) even if the cost is low (Rojas, et al, 2016). The second issue for SMEs using
SLAs and SecSLAs to secure their Cloud computing environments is that CSPs only modify
31
SLAs and SecSLAs CSPs when the customer is a very large one (Kaaniche, Mohamed, Laurent,
& Ludwig, 2017; Trapero, Modic, Stopar, Taha, & Suri, 2017).
This literature review and the research study are in partial fulfillment of the requirements
for a PhD from the school of business so the focus of this literature review is not on specific
virtual environment software or hypervisors. Given the accelerating rate of change in Cloud
computing, and the increasing adoption of public Cloud offerings in the western world, it would
not make sense to focus on specific software that will be outdated by the publication date of this
literature review. This literature review is based on English language articles and the researcher’s
focus is primarily on Western world public Cloud offerings. The American based public Cloud
providers such as AWS and Azure are the largest and fastest growing Cloud computing providers
(Darrow, 2017) and it makes sense to focus primarily on those types of offerings for this
dissertation.
Improving Cloud Security
The themes may seem to be very small slices on a single issue, but Cloud computing
security as an industry activity and an academic research field is very new and rapidly
expanding, it makes sense to separate improving Cloud security from Cloud security in general.
Many Cloud articles in the past five years are still doing important academic work by simply
defining what Cloud security is and listing potential fixes for specific issues (Bhattacharya &
Kumar, 2017; Diogenes, 2017; Ferdinand, 2015; Iqbal, Mat, Dhaghinghi, Hussein, Khan, Khan,
& Choo, 2016; Soubra & Tanriover, 2017; Srinivasan, 2013; Van Till, 2017). Cloud computing
and Cloud computing security are very new academic research fields (Bayramusta & Nasir,
2016; Chang, Chang, Xu, Ho, Halim & 2016; Lian, Yen, & Wang, 2014; Oliveira & Espanal,
2014; Priyaadarshinee, Raut, Jha, & Gardas, 2017). Articles baselining what Cloud computing
32
and Cloud computing security have been very valuable in these early days of Cloud computing
(Ab Rahman & Choo, 2015; Aich, Sen, & Dash, 2015; Gangadharan, 2017; Novkovic & Korkut,
2017; Phaphoom, Wang, Samuel, Helmer, & Abrahamsson, 2015). It is fairly simple in 2018 to
identify Cloud security as a concern for organizations that are looking to adopt Cloud computing
including SMEs (Albakri, Shanmugam, Samy, Idris, & Ahmed, 2014; Haimes, Horowitz, Guo,
Andrijcic, & Bogdanor, 2015; Vasiljeva, Shaikhulina, & Kreslins, 2017). It is not so simple to
take the next step and identify solutions that work in the business world today (Ali, Warren, &
Mathiassen, 2017; Devos & van de Ginste, 2015; Moral-Garcia, Moral-Rubio, Fernandez, &
Fernandez-Medina, 2014). Many potential solutions discussed in other themes of this literature
review may end up as the dominant paradigm in Cloud security in the next decade but they do
not help in the current business environment. If SMEs cannot find a workable solution to Cloud
security issues, the SMEs will not be secure as they adopt Cloud computing (Assante, Castro,
Hamburg, & Martin, 2016; Bildosola, Rio-Belver, Cilleruelo, & Garechana, 2015; Carcary,
Doherty, & Conway, 2015; Kumar, Samalia, & Verma, 2017; Lacity & Reynolds, 2013;
Moncayo & Montenegro, 2016; Wang & He, 2014).
This section is based on a smaller group of academic papers than most of the other
themes in this literature review. This is the simplest indication of the gap in research and the size
of the Cloud security problem, there are very few useful and even fewer accurate papers that
provide Cloud security solutions (Aljiwaneh, Alawneh, & Jaradat, 2016; Imran, Hlavacs, Haq,
Jan, Khan, & Ahmed, 2017; Islam, Fenz, Weippl, & Mouratidis, 2017). The research study plans
to fit into this section as the research goal is to determine if there is a consensus on how
organizations use a risk-based orientation to make business decisions to help secure an
organization’s Cloud computing environment. Later themes in this literature review focus on
33
new paradigm changing approaches to Cloud security that future researchers may consider
foundational ten years from now but have not yet bridged the gap between academic research
and real-life application (Cao, Moore, O’Neil, O’Sullivan, & Hanley, 2016; Carvalho, Andrade,
Castro, Coutinho, & Agoulmine, 2017; Casola, DeBenedeictis, Modic, Rak, & Villano, 2014;
Dasgupta & Pal, 2016; Torkura, Sukana, Cheng, & Meinel, 2017). Ten years may seem like a
reasonable gap between academic theory creation and industry-based applications in many fields,
but a decade in Cloud computing security is more than half the lifetime of the field itself
(Fernandes, Soares, Gomes, Freire, & Inacio, 2014; Daylami, 2015; Bunkar and Rei, 2017).
Currently available research that moves beyond discussing specific Cloud security
problems such as data storage (Paxton, 2016; Wang, Su, Dio, Wang, & Ge, 2018), moving
current physical device security paradigms to the Cloud (Khalil, Khreishah, & Azeem, 2014;
Mishra, Pilli, Varadharajan, & Tupakula, 2017), or IAM solutions (Iqbal, Mat Kiah, Dhaghighi,
Hussain, Khan, Khan, & Choo, 2016; Younis, Kifayat, & Merabti, 2014), use a variety of
methods and techniques to improve Cloud security. Solutions range from the systems
development life cycle (Aljawarneh, Alawneh, & Jaradat, 2016) to increased hypervisor security
(Coppolino, D’Antonio, Mazzeo, & Romano, 2017), to the trusted computer base (TCB)
(Navanati, Colp, Aiello, & Warfield, 2014) to a laundry list of current vulnerabilities and
solutions (Iqbal et al, 2016; Khan & Al-Yasiri, 2016), to vendor specific solutions (Diogenes,
2017).
There is a very promising encryption-based solution discussed in deeply technical
computer science and electrical engineering journals named homomorphic encryption (Bulgurcu,
Cavusoglu, & Benbasat, 2016; Feng & Xin, 2014; Khamsemanan, Ostrovsky, & Skeith, 2016;
Zibouh, Dalli, & Drissi, 2016) but the only business process focused article that mentions it that I
34
have found so far is the one by Coppolino, D’Antonio, Mazzeo, and Romano published in 2017
(Coppolino, D’Antonio Mazzeo, & Romano, 2017). Although homomorphic encryption looks
like it will allay many security concerns regarding Cloud computing adoption, it is not in use yet
and looks to be expensive and complicated, negating its use for SMEs (Dasgupta & Pal, 2016;
Feng & Xin, 2014; Souza & Puttini, 2016). Perhaps the most effective argument against
homomorphic encryption in an academic setting is that it is just another Band-Aid on IT and
Cloud security (Potey, Dhote, & Sharma, 2016; Ren, Tan, Sundaram, Wang, Ng, Chang, &
Aung, 2016). Homomorphic encryption solutions allow organizations including SMEs to make
the same choices and mistakes that they do in an on-premises environment (Elhoseny, Elminir,
Riad, & Yuan, 2016; Mengxi, Peng, & Hao Miao, 2016; Wu, Chen, & Weng, 2016). Just as the
research project that this literature review is a part of does not propose a once in a generation
paradigm change to Cloud computing, homomorphic encryption lets organizations make the
same security mistakes they have been making since computing became a major business
function (Bulgurcu, Cavsogliu, & Benbasat, 2016; Potey, Dhote, & Sharma, 2016). The other
major drawback to homomorphic encryption as a topic for a thesis submitted to a business
department is that it requires very advanced mathematical skills and knowledge.
Industry-based Framework Solutions for Large Enterprises
There is a large amount of academic research in changing or transforming industry-based
framework solutions for large enterprises into solutions for SMEs (Bildosola, Rio-Belver,
Cilleruelo, & Garechana, 2015; Moyo & Loock, 2016; Seethamraju, 2014). This section of
literature review includes research papers based on industry-based frameworks such as data-
based governance (Al-Ruithe, Benkhelifa, & Haneed, 2016), general governance (Barton, Tejay,
Lane, & Terrell, 2016; Elkhannoubi & Belaissaoui, 2016), or industry standard based governance
35
efforts such as ISACA’s control objectives for information and related technologies (COBIT)
(Devos & Van de Ginste, 2015). There are several complex and all-consuming industry-based
frameworks for almost all IT activities and functions including cybersecurity (Cao & Zhang,
2016; Oktadini & Surendro, 2014; Tajammul, & Parveen, 2017). These industry-based
frameworks for large enterprises have real world drawbacks for use by SMEs. These industry-
frameworks are incredibly expensive in time, training, and financial terms (Haufe, Dzombeta,
Bradnis, Stantchev, & Colomo-Palacios, 2018; Kovacsne, 2018; Lanz, 2015). The return on
investment (ROI) calculation for these types of endeavors is far too small for most SMEs
(Moral-Garcia, Moral-Rubio, Fernandez, & Fernandez-Medina, 2014; Schmidt, Wood, &
Grabski, 2016).
Researchers and practitioners may be able to adapt these industry-based frameworks to
effective and useful Cloud computing security research but these industry-based frameworks are
very rigid, and do not encourage change (Tajammul, & Parveen, 2017). Industry will only accept
changes proposed by academic research when versions change for ITIL, COBIT, or ISO 2700
(Atkinson & Aucoin, 2016; IT Process Maps, 2018). Because of the structure of these industry-
based frameworks, they are very antithetical to change, in fact, the design of these industry-based
frameworks encourage elimination of any change or diversion from strict standards wherever
possible (Betz & Goldenstern, 2017; Lawson, Muriel, & Sanders, 2017). Despite these
drawbacks there does appear to be some research is currently taking place on industry-based
framework-based solutions to let organizations perform accurate and useful risk analyses of
CSPs but not particularly for SMEs (Silva, Westphall, & Westphall, 2016; Karras, 2016; Cayrici,
Garaga, de Oliveira, & Roudier, 2016).
36
Leaving aside the issue of whether or not these industry-based frameworks for large
enterprises could effectively work for SMEs, these industry-based frameworks are not able to
keep up with the massive rate of change in Cloud computing (Cram, Brohman, Gallupe, 2016;
Lohe & Legner, 2014). For example, the previously mentioned COBIT saw seven years between
COBIT 4 and COBIT 5 releases. To stay effective and timely for SME risk analysis and
assessment of Cloud computing and Cloud computing security, COBIT would have to decrease
the time between releases to seven months (Tajammul & Parveen, 2017). Additionally, within
the proposed seven-month release cycle, the industry-based frameworks would have to be re-
engineered for SMEs. Unlike academic research, creators of industry-based frameworks for large
enterprises such as ITIL, COBIT, and ISO 2700 do so for-profit motives (Leclercq-
Vandelannoitte & Emmanuel, 2018). Printed documents, training, and certifications of
employees and organizations are very expensive to obtain and create large profits for the
certifying organization (Skeptic, 2009). SMEs are not good customers for these industry-based
frameworks as they do not have the budget for them in employee time or financial budgets
(Assante, Castro, Hamburg, & Martin, 2016Carcary, Doherty, & Conway, 2014; Senaratha,
Yeoh, Warren, & Salzman, 2016). Perhaps the only reasonable way for academic research to
start with an industry framework for large enterprises and end up with a solution for SMEs is to
focus a small part of an industry framework such as SLAs (Carvalho, Andrade, Castro,
Couitinho, & Agoulmine, 2017; Luna, Suri, Iorga, & Karmel, 2015; Na & Huh, 2014). A
separate theme discusses SLAs but find their antecedents in large enterprise agreements with
vendors such as CSPs.
37
SMEs
SMEs are present in almost every country in the world (Calvo-Manzano, Lema-Moreta,
Arcilla-Cobian, & Rubio-Sanchez, 2015) and in some countries employ more than 95% of the
total workforce (Fernando & Fernando, 2014). SMEs are responsible up to sixty per cent of all
employment, and up to forty per cent of all reported national income in emerging countries
(Ndiaye, Razak, Nagayev, & Ng, 2018). SMEs are a very large segment of the business world
and deserve a large amount of the industry-based solutions research and the academic research
for IT and Cloud security solutions (Robu, 2013; Seethamraju, 2014). While it is possible to
scale down some large enterprise solutions to SME size, as previously discussed, most cannot
(Kritikos, & Massonet, 2016; Parks & Wigand, 2014). SMEs need their own IT solutions
including Cloud security and Cloud risk assessments. These solutions will need to consider the
constraints that SMEs face such as financial and employee skill levels (Carcary, Doherty,
Conway, & McLaughlin, 2014; Hasheela, Smolander, & Mufeti, 2016). The academic research
for SME Cloud security risk assessments must accept these constraints to be useful, both in
potential industry-based solutions and for academic frameworks. There is no value in solutions
that have no chance of implementation. Even though many current SME solutions are not
probable in today’s business and industry settings, they are possible (Hussain, Hussain, Hussain,
Damiani, & Chang, 2017; Liu, Xia, Wang, & Zhong, 2017; Mohabbattalab, von der Heidt,
Mohabbattalab, 2014). As discussed previously, trying to adapt a large enterprise solution that
may cost more than the entire SME yearly budget to solve SME Cloud security problems is not
prudent (Cram, Broham, & Gallupe, 2016; Lawson, Muriel, & Sanders, 2017; Devos & Van de
Ginste, 2015). Adapting the large enterprise solutions will not yield useful results in the
incredibly fast-moving Cloud security industry or research field.
38
The same is true for academic solutions that require advanced mathematics or high levels
of skill in arcane academic fields to succeed (Potey, Dhote, & Sharma, 2016; Ren, Tan,
Sundaram, Wang, Ng, Chang, & Aung, 2016; Zibouh, Dali, & Drissi, 2016). A homomorphic
encryption solution that requires periodic changes to the cryptographic elements of the solution
are not a reasonable solution for SMEs (Feng & Xin, 2014; Khamsemanan, Ostrovsky, & Skeith,
2016; Wu, Cheng, Weng, 2016). Nor are solutions that require skill in an academic model unique
to a single or small group of papers (Deshpande et al, 2018; Kholidy, Erradi, Abelwahed, &
Baiardi, 2016; Nanavati, Colp, Aeillo, & Warfield, 2015). Unique models such as Security
threats management model (STMM) (Lai & Leu, 2015), or a data provenance model (Imran,
Hlavacs, Haq, Jan, Khan, & Ahmad, 2017), or even more common models such as using the
Software development life cycle framework to create a Software assurance reference dataset
(SARD) (Aljawarneh, Alawneh, & Jaradat, 2016) are very unlikely to be adopted by SMEs.
SMES are not just scaled down versions of large enterprises. SMEs have fundamentally
different designs and structures that require different approaches and reactions to new
technologies such as Cloud computing (Cheng & Lin, 2009; Diaz-Chao, Ficapal-Cusi, &
Torrent-Sellens, 2017; Lai, Sardakis, & Blackburn, 2015). Research attempting to discover or
create solutions for SMEs to securely adopt Cloud computing environments needs to be
fundamentally different also (Hussain, Hussain, Hussain, Damiani, & Chang, 2017; Moyo &
Loock, 2016; Wang & He, 2014). Research leading to academic and industry-based solutions for
SMEs need to focus on solutions that are closer to “turn-key” or ones SMEs can adopt without
special expertise. If an SME cannot implement a solution with its current staff, the SME is less
likely to adopt the solution (Bildosola, Río-Belver, Cilleruelo, & Garechana, 2015; Kumar,
Samalia, & Verma, 2017).
39
SMEs Local
Much of the current research on SMEs focuses on a geographically distinct group of
SMEs. While the focus on the research may be Cloud or Cloud security, the group under study is
usually in the same region of the world (Assante, Castro, Hamburg, & Martin, 2016; Bolek,
Lateckova, Romanova, Korcek, 2016; Carcary, Doherty, & Conway, 2014). After reading a large
number of research papers based on SMEs from the same state, country or continent, several
differences between regions become evident (Moyo & Loock, 2016; Senarathna, Yeoh, Warren,
& Salzaman, 2016). The differences between SMEs in one region versus another region as
regards Cloud computing adoption and Cloud security would make for a fascinating thesis by
themselves but for the purposes of this literature review and the research study it is enough to
differentiate SMEs geographically by World bank average income level groupings (Fosu, 2017).
There is an obvious and broad correlation between the SMEs in low, lower-middle, upper-middle
economies and the SMEs in high-income economies in terms of Cloud computing adoption.
SMEs in non-high-income countries do not appear to be adopting Cloud computing at a level
where they would need to do Cloud security risk assessments yet (Kumar, Samalia, & Verma,
2017; Moyo & Loock, 2016; Vasiljeva, Shaikhulina, & Kreslins, 2017). SMEs in high-income
countries, however, need Cloud security risk assessments and would find a validated risk
instrument to be a valuable commodity (Haines, Horowitz, Guo, Andrijicic, & Bogdanor, 2015;
Rahulamathavan, Rajarajan, Rana, Awan, Burnap, & Das, 2015; Sahmim & Gharsellaoui, 2017).
High income economy SMEs will be the assumed target of the research study unless otherwise
specified.
40
SMEs and Cloud
Cloud computing is a new and rapidly developing field of research (Khan & Al-Yasiri,
2016). SMEs and Cloud computing is an even newer subset of that field of research (Chiregi &
Navimipour, 2017). Even as a new subset of a new field of research there are interesting threads
developing in the field (Bildosola, Río-Belver, Cilleruelo, & Garechana, 2015; Hussain, Hussain,
Hussain, Damiani, & Chang, 2017; Mohabbattalab, von der Heidt, & Mohabbattalab, 2014).
SMEs are as competitive as large enterprises and look for potential business advantages such as
Cloud. Current research studies find that SMEs want to adopt Cloud computing for predicted
cost savings (Al-Isma’ili, Li, Shen, & He, 2016; Chatzithanasis & Michalakelis, 2018; Shkurti &
Muca, 2014), business process improvement (Chen, Ta-Tao, & Kazuo, 2016; Papachristodoulou,
Koutsaki, & Kirkos, 2017; Rocha, Gomez, Araújo, Otero, & Rodrigues, 2016), or to reach new
customer bases (Ahani, Nilashi, & Ab Rahim, 2017; George, Gyorgy, Adelina, Victor, & Janna,
2014; Stănciulescu, & Dumitrescu, 2014). SMEs’ Cloud options are different than large
enterprise options (Gholami, Daneshgar, Low, & Beydoun, 2016; Salim, Darshana, Sukanlaya,
Alarfi & Maura, 2015; Yu, Li, Li, Zhao, & Zhao, 2018). With very few exceptions in current
research (Wang, Wang, & Gordes, 2018), SMEs do not have the financial means or staff
expertise to create and adopt private or hybrid Clouds (Hsu, Ray, & Li-Hsieh, 2014; Keung &
Kwok, 2012; Michaux, Ross, & Blumenstein, 2015), nor do SMEs want to focus on Cloud
operations as a core business practice. SMEs are more likely than large enterprises to be the
customer of a community based CSP, either non-profit or for profit based (Baig, R., Freitag, F.,
Moll, A., Navarro, L., Pueyo, R., Vlassov, V., (2015; Bruque-Camara, Moyano-Fuentes, &
Maqueira-Marin, 2016), although most SMEs will use a public Cloud offering (Buss, 2013;
Cong & Aiqing, 2014). Large enterprises are more likely to adopt a private Cloud or leverage
41
their large enterprise size to be a valued customer of one of the largest CSPs such as AWS,
Azure, or Google Cloud (Chalita, Zalila, Gourdin, & Merle, 2018; Persico, Botta, Marchetta,
Montieri, & Pescape, 2017; Vizard, 2016). SMEs cannot offer the scale of purchasing to receive
significant discounts from the large CSPs and are more likely to see value in a community Cloud
that understands the SMEs core business practices, or a smaller CSP that specializes in the
SMEs’ core business practices (Huang, et al, 2015; Wang & He, 2014).
SMEs and IaaS
A number of research studies show that SMEs should be more likely to adopt SaaS CSP
offerings than PaaS or IaaS CSP offerings but researchers need to do more work. One of the
issues with concluding that SMEs should use SaaS Cloud computing options is that the research
does not show that SMEs actually are using SaaS more than IaaS and PaaS (Achargui & Zaouia,
2017; Hasheela, Smolander, & Mufeti, 2016). While the arguments made by the researchers are
imminently logical, the same research does not show that they have not yet persuaded SMEs
with those arguments. The research study’s survey instruments and the validated risk instrument
associated with the research include SaaS Cloud computing.
There is research describing SMEs use or lack of use of PaaS Cloud computing
environments (Bassiliades, Symeonidis, Meditskos, Kontopoulos, Gouvas, & Vlahavas, 2017;
Ionela, 2014). The current research regarding SMEs and PaaS does not reach a reproducible
conclusion and the research tends to focus on PaaS offerings of very large business application
software suites (Bassiliades, Symeonidis, Meditskos, Kontopoulos, Gouvas, & Vlahavas, 2017;
Kritikos, Kirkham, Kryza, & Massonet, 2015; Papachristodoulou, Koutsaki, & Kirkos, 2017).
The recent research studies discussing high-income country-based SMEs and PaaS Cloud
computing environments tend to focus on the SMEs adoption and use of the large software
42
programs such as enterprise resource planning programs (ERP) or huge customer relationship
management programs (CRM) (Calvo-Manzano, Lema-Moreta, Arcilla-Cobián, & Rubio-
Sánchez, 2015; Rocha, Gomez, Araújo, Otero, & Rodrigues, 2016). Research based on lower
income-based country SMEs and PaaS Cloud computing environment offerings tend to focus on
the smaller SMEs and their adoption of the more individual customer-based PaaS Cloud
environments such as Google Gmail or Microsoft Office 365 (Chatzithanasis & Michalakelis,
2018; Hasheela, Smolander, & Mufeti, 2016).
Some research shows that SMEs tend to adopt Cloud paradigms that the SME’s current
staff is comfortable using (Assante, Castro, Hamburg, & Martin, 2016; Carcary, Doherty,
Conway, & McLaughlin, 2014). In many cases the simplest Cloud service to adjust to when first
adopting Cloud computing, is that of IaaS (Cong &Alquing, 2014; Fernando & Fernando, 2014;
Keung & Kwok, 2012). An SME’s IT staff can perform the same job duties on a virtual server in
an IaaS environment that they did on an on-premise computer server. While the IT staff will
have to become conversant with the CSP’s IaaS server provisioning process, all major CSPs
allow one to create a server and network online through a web page. The SME’s IT staff can also
create and destroy IaaS servers quickly and cheaply to learn the CSP’s process (Chalita, Zalila,
Gourdin, & Merle, 2018; Persico, Botta, Marchetta, Montieri, & Pescapé, 2017; Vizard, 2016).
The SME’s will need to learn the most cost-effective way to utilize the CSP’s IaaS environment,
but that holds true for the CSP’s PaaS and SaaS environments.
IaaS Cloud computing environments are a good choice for SMEs in several scenarios. If
an SME is ready to make a wholesale move to the Cloud, perhaps as part of the initial IT setup
and configuration, moving to an IaaS environment can offer a base virtual environment where
the SME’s IT team can setup its environment any way it deems best (Chalita, Zalila, Gourdin, &
43
Merle, 2018; Vizard, 2016). If an SME is moving an existing server room or data center into a
public or private Cloud, and the SME can outsource the forklift portion of adopting a Cloud
computing environment, the SME’s IT staff need to learn a smaller set of Cloud computing
specific skills and tasks (Fahmideh & Beydoun, 2018). IaaS Cloud computing environments are
the best choice for SMEs that have to move into a Cloud computing environment quickly (Al-
Isma’ili, Li, Shen, & He, 2016; Baig, Freitag, Moll, Navarro, Pueyo, Vlassov, 2015).
IaaS Cloud environments are attractive to SMEs in other scenarios too. If the SME
decides on a gradual move to the Cloud with a policy of all new servers created in a CSP hosted
environment, the SME’s IT staff can gradually learn the skills needed server by server
(Senarathna, Wilkin, Warren, Yeoh, & Salzman, 2018). A gradual move to a CSP environment
can coincide with other business processes within the SME such as amortization and cost write-
offs for servers and server room equipment, or technology life-cycle events such as aging out of
a specific server model (Gupta & Saini, 2017; Rocha, Gomez, Araújo, Otero, & Rodrigues,
2016). A gradual move based on business processes has the additional benefit of stronger buy-in
from other business units within the SME for continued Cloud operations (Kouatli, 2016; Raza,
Rashid, & Awan, 2017).
PaaS and SaaS CSP options can be strong options for SMEs in various scenarios. Very
small businesses, may only need a limited amount of IT perhaps one or two applications such as
email and document sharing and storage (Hasheela, Smolander, & Mufeti, 2016; Moyo & Loock,
2016). If a small enterprise only needs to use applications, not to build and change them, SaaS or
PaaS would be an appropriate choice (Musungwini, Mugoniwa, Furusa, & Rebanowako, 2016).
If a small enterprise’s IT needs do reach the level of individual servers or a server room, the
SME may not IT staff with competencies much higher than that of an IT Help-Desk. IF the SME
44
does not currently manage on premise servers, PaaS or SaaS would be a logical choice for a
Cloud computing environment (Bassiliades, Symeonidis, Meditskos, Kontopoulos, Gouvas, &
Vlahavas, 2017; Ionela, 2014). As the research study focuses on high-income SMEs that need a
validated risk instrument for adopting Cloud computing solutions securely, single customer-
based SaaS or PaaS solutions will not get much coverage.
SME Cloud Security
Just as in general Cloud security research, SME focused Cloud security research has
reached the point where researchers have done the general descriptive baselining of what a
current Cloud computing environment is (Kumar, Samalia, & Verma, 2017; Lacity & Reynolds,
2013), how Cloud security is different than on premise IT security (Liu, Xia, Wang, Zhong,
2017), and why Cloud security is important for SMEs (Mohabbattalab, von der Heidt, &
Mohabbattalab, 2014). Even research that does not start with the focus on SMEs can be very
informative when describing Cloud computing environments and the security needs of SMEs for
Cloud adoption (Shaikh & Sasikumar, 2015; Sun, Nanda, & Jaeger, 2015). Even though parts of
this literature review make large the differences between SMEs and large enterprises, at a basic
level, a Cloud computing environment has a basic structure that is the same for any size
company (Phaphoom, Wang, Samuel, Helmer, & Abrahamsson, 2015; Ray 2016). So too, Cloud
computing security starts the same for an organization whether they have one employee or fifty
thousand employees.
Differences between Large Enterprises and SMEs
The differences between SME on-premise IT security and potential Cloud security can
look fairly similar to the same comparison for large enterprises (Diogenes, 2017; Hussain,
Mehwish, Atif, Imran, & Raja Khurram, 2017). SMEs tend to have very different on-premises IT
45
security needs, budgets, and practices than large enterprises, but at the basic level, Cloud
computing is using someone else’s hardware (Daylami, 2015). This tends to be truer for SMEs
than large enterprises as financial budgets play a large role in whether or not an organization
decides to create its own Cloud environment such as a private Cloud or a hybrid Cloud
environment leading to more use of public Cloud offerings by SMEs (Shkurti, & Muça, 2014).
The lack of Cloud expertise held by SME’s IT staff would also tend to negate the possibilities of
an SME creating its own Cloud baseline. When a researcher starts to investigate actual business
practices and the details in Cloud adoption and Cloud security, SMEs start to differentiate
themselves from large organizations (Chatzithanasis, & Michalakelis, 2018; Rocha, Gomez,
Araújo, Otero, & Rodrigues, 2016).
Aside from decision making influences such as budget and IT staff expertise, the
importance of SME Cloud security can look similar to what a large enterprise considers
important in Cloud security when focusing on the details. In general, if a large enterprise in a
specific industry faces a security threat when adopting Cloud computing environments, SMEs
face similar concerns, just on a smaller scale. In specific cases, large enterprises do have greater
security concerns and concomitant practices to allaying those security concerns (Bahrami,
Malvankar, Budhraja, Kundu, Singhal, & Kundu, 2017; Yimam & Fernandez, 2016). Large
enterprises have access to solutions that SMEs do not such as creating new Cloud security teams,
or hiring CSP based security subject matter experts. As with differences between on-premises
and Cloud security between large enterprises and SMEs, the differences between SMEs and large
enterprises regarding Cloud computing security start to gain prominence when a researcher starts
to focus on details. Focusing on these and other details is a basic part of the case study-based
theory coding process and played an integral part of the research study.
46
As discussed, Cloud computing security for SMEs starts at a similar place to Cloud
security for larger enterprises. Cloud computing involves using someone else’s hardware and the
corresponding loss of control that giving up physical security involves (Daylami, 2015). While
SMEs have similar security concerns and all organizations would like to keep confidential
information secret, the size of an organization affects the way in which SMEs secure their data.
SME focused academic research solutions tend to be smaller scale and less expensive (Bildosola,
Río-Belver, Cilleruelo, & Garechana, 2015; Gastermann, Stopper, Kossik, & Katalinic, 2014
Lacity & Reynolds, 2013; Senarathna, Yeoh, Warren, & Salzman, 2016). Some current SME
Cloud computing security solutions are just lists of threats and how to remediate the threats,
which although very cost effective, are not forward-looking solutions frameworks (Lalev, 2017;
Preeti, Runni, & Manjula, 2016). Other current SME Cloud computing security solutions require
SMEs to create new teams or business processes (Haimes, Horowitz, Guo, Andrijcic, &
Bogdanor, 2015; Lai & Leu, 2015). The best of current academic SME solutions to Cloud
security do not create new processes or tools for SMEs to learn but instead help SMEs to
simplify their treatment of data and to reduce the SME’s attack surface (Carcary, Doherty,
Conway, & McLaughlin, 2014; Ertuk, 2017; Gritzalis, Iseppi, Mylonas, & Stavrou, 2018).
SME Using Cloud to Reduce Costs
SMEs are less likely to embrace the costs of creating and maintaining server rooms with
dedicated power and cooling than large enterprises (Tso, Jouet, & Pezaros, 2016). SMEs are
more likely to be based in one physical location making redundancy and failover more difficult
for the organization’s pre-Cloud IT infrastructure (Lent, 2016). As such, Cloud computing may
look more attractive for an SME than a large enterprise when the viewpoint is financial or
business process related (Bildosoia, Rio-Belver, Cillerueio, & Garechana, 2015; Carcary,
47
Doherty, Conway, & McLaughlin, 2014). In terms of a Cloud security solution, SMEs may be
able to bridge some of the gap between them and large enterprises in that SME Cloud security
solutions can include multi-Cloud or fully redundant solutions without major increases in cost or
effort (Ertuk, 2017; Salim, Darshana, Sukanlaya, Alarfi & Maura, 2015; Wang & He, 2014). An
SME that can adopt business processes or solutions normally restricted to large enterprises can
gain a competitive advantage (Hsu, Ray, & Li-Hsieh, 2014). Cloud computing can be a tool for
SMEs to adopt some large enterprise IT standards such as full redundancy and auto scaling of
organizational resources to customer demand (Buss, 2013; Huang, et al, 2015; Michaux, Ross, &
Blumenstein, 2015).
There are many ways for an organization to adopt Cloud computing and many different
ways to manage the risk of Cloud computing adoption. The differences between the solutions,
including security solutions can be much more than a result of “throwing more money at it” that
can dismissively explain the differences between SMEs and large enterprises when discussing
on-premises IT security solutions. SMEs may be able to adopt solutions such as multi-Cloud
(Zibouh, Dalli, Drissi, 2016, Cloud access security broker (CASB) (Paxton, 2016), or automation
of security controls (Tunc, et al, 2015) that if based on-premises would be solely the provenance
of large enterprises. These possible Cloud security solutions are still outside of the main stream
for SMEs, however, and SMEs need a way to assess the risk of using these or more standard
solutions. The research study is a start to providing SMEs a way to assess the risks of adoption a
Cloud computing solution, even if the solutions is one that the SME would never consider in an
on-premises environment (Albakri, Shanmugam, Samy, Idris, & Ahmed, 2014; Ngo.
Demchenko, & de Laat, 2016; Younis, Kifayat, & Merabti, 2014). The promise of gaining an
edge on their competition should have SMEs looking at the feasibility of these new solutions.
48
One important promise of Cloud computing for SMEs is that rather than be forced to
decide from a scaled down, reduced cost version of a large enterprise solution or a simpler, less
secure solution, SMEs will be able to choose from a larger selection of Cloud security solutions
if it is easier for SMEs to assess the risk of each solution. Due to previously discussed dual
constraints of smaller financial budgets and lower skill levels of IT staff, SMEs tend to
contemplate different priorities in Cloud computing risk calculations. If SMEs could reasonably
assess the risk of using new Cloud computing security solutions, SMEs could be much more
secure in the Cloud than they currently are on-premises (Chen, Ta-Tao, & Kazuo, 2016;
Seethamraju, 2014). SMEs cannot realize the great promise of Cloud computing adoption SMEs
if the SMEs cannot properly asses the risk of using a Cloud computing environment.
Cloud Security as an Improvement
An interesting difference between SMEs and large enterprises shows up in some SME
based research papers that indicates that for many SMEs, basic Cloud security is an improvement
over the SMEs existing IT security (Lacity & Reynolds, 2013; Mohabbattalab, von der Heidt, &
Mohabbattalab, 2014). The SMEs where basic CSP provided security is better than the SME in
house security, are most likely the smaller SMEs discussed earlier that have limited IT needs and
limited IT staffs (Mayadunne & Park, 2016; Senarathna, Yeoh, Warren, & Salzman, 2016; Wang
& He, 2014). Some SMEs have very limited in IT security. For those SMEs, the adoption of
Cloud computing environments is an improvement in the SMEs security posture. These SMEs
are among those that would find the greatest help from a validated risk instrument. An
inadequate cybersecurity budget almost certainly means, at the very least, the staff have little free
time to research Cloud security options.
49
While public CSPs have many whitepapers discussing their security and the risk
assessment attestations they have (Chalita, Zalila, Gourdin, & Merle, 2018; Persico, Botta,
Marchetta, Montieri, & Pescapé, 2017; Vizard, 2016), the documents and attestations do not
apply to the CSP’s customer’s security. Even if the SME’s cybersecurity team has been able to
read the CSPs explanations of how their Cloud security offerings work, the SME still needs to
work through how each solution fits the organization’s specific needs. Having said that, the
current academic research in SME Cloud security is moving towards a consensus that SMEs can
be more secure at a lower cost in the Cloud than on-premises (Al-Isma’ili, Li, Shen, & He, 2016;
Bassiliades, Symeonidis, Meditskos, Kontopoulos, Gouvas, & Vlahavas, 2017; Famideh &
Beydoun, 2018; Shkurti, & Muça, 2014). SMEs still need a way to ensure that the solution they
pick make the SME more secure, and the research study produced a validated risk instrument
that will help SMEs do so.
Risk
As discussed earlier in this literature review, professionals commonly define risk in IT
using an equation as shorthand. Risk = probability x impact / cost (Choo, 2014; Jouini & Rabai,
2016). Researchers have done good academic research on the risk involved with Cloud
computing adoption (Jouini & Rabai, 2016; Vijayakumar, & Arun, 2017). Researchers have
published less regarding the risk SMEs take in adopting Cloud computing and how SMEs
evaluate the risk (Assante, Castro, Hamburg, & Martin, 2016; Hussain, Hussain, Hussain,
Damiani, & Chang, 2017). The research study helps to fill the gap in academic research about
SME risk assessment processes when adopting Cloud computing, and the research study
generated a validated instrument that SMEs can use during a Cloud risk assessment. This
50
literature review followed the same general pattern for researching risk as the sections for SMEs
and Cloud security; starting broadly and narrowing down to the final topic.
Risk Descriptive
The first section of the literature review research on SME Cloud computing risk
assessments is the general field describing risk relating to Cloud computing. There is adequate
research on broad questions such as the differences between on-premises IT risk and Cloud
computing environment risks, with the simplest answer being that risk is different in the Cloud
(Li & Li, 2018; Rittle, Czerwinski, & Sullivan, 2016; Shackleford, 2016). More nuanced
analyses of how Cloud security presents different risks is also represented in the literature, from
highly detailed quantitative models (Hu, Chen, & We, 2016; Jouini & Rabai, 2016; Tanimoto et
al, 2014) to qualitative descriptive research papers (Iqbal et al, 2016; Khalil, Khreishah, &
Azeem, 2014; Hussain, Mehwish, Atif, Imran, & Raja Khurram, 2017) ) to presentations of
controls and responses that should be taken to ameliorate Cloud computing risk (Khan & Al-
Yasiri, 2016; Preeti, Runni, & Manjula, 2016).
Unfortunately, due to the newness of the field, many articles on Cloud computing risk are
more descriptive based rather than discovery focused (Mishra, Pilli, Varadharajan, & Tupakula,
2017; Singh, Jeong, & Park, 2016). While many of these research papers do a very good job of
describing Cloud computing risk at a high level (Choi & Lambert, 2017; Shackleford, 2016) and
some can be very informative at a lower level of Cloud risk (Casola, De Benedictis, Erascu,
Modic, & Rak, 2017; Lai & Leu, 2015), very few research papers reach the level that would have
other researchers want to expand or extend the research (Masky, Young, & Choe, 2015; Ngo,
Demchenko, & de Laat, 2016). It is logical that academic researchers have to fully describe a
new problem, environment, process, or framework before the important work of discovering how
51
to improve it. One cannot expect quality research from the academic field regarding Cloud
computing security risk until that risk is fully detailed, but the research study and this literature
review took steps in that direction.
The incredible rate of change in the Cloud computing industry is surprising even by IT
standards (Bayramusta & Nasir, 2016) The value of properly researched and peer reviewed
articles based on the details of specific risks involved in adopting Cloud computing such as
hypervisor attacks (Nanavati, Colp, Aeillo, & Warfield, 2014), or other specific CSP weaknesses
(Deshpande, et al. 2018) is minimal as the industry will have reacted before the research paper is
published (Kurpjuhn, 2015; Preeti, Runni, & Manjula, 2016). The current research available does
a better job describing the risks involved with using a Cloud computing IaaS environment than a
PaaS or SaaS Cloud computing environment (Gritzalis, Iseppi, Mylonas, & Stavrou, 2018; Wang
& He, 2014). This is predictable as an IaaS environment is closest to existing on-premises
environments and existing research models and paradigms for computing security. While some
researchers actively focus on risk in PaaS and SaaS Cloud computing environments (Gupta,
Gupta, Majumdar, & Rathore, 2016; Weintraub & Cohen, 2016) the ratio seems to be off. It is
reasonable to expect that just as on-premises computing is being supplanted by Cloud computing
for all the reasons discussed in this literature review, so too will be IaaS with PaaS, SaaS, and
new paradigms that have not been created yet (Kritikos, Kirkham, Kryza, & Massonet, 2015;
Priyadarshinee, Raut, Jha, & Kamble, 2017). As the industry and CSPs move to more dynamic
and complicated computing paradigms and environments such as containers (Bahrami,
Malvankar, Budhraja, Kundu, Singhal, & Kundu, 2017), micro-services (Sun, Nanda, & Jaeger,
2015), compute as a service (Qiang, 2015), security as a service (SecaaS) (Torkura, Sukmana,
52
Cheng, & Meinel, 2017), and eventually everything as a service (Sung, Zhang, Higgins, & Choe,
2016), academic research focused on just describing old models of Cloud risk will not be useful.
SME Risk Assessment
Organizations are rapidly moving to the Cloud (Khan & Al-Yasiri, 2016). The vast
majority of medium to large organizations have policies and procedures regarding adoption and
use of IT such as Cloud computing (Madria, 2016; Shackleford, 2016). The core of the research
project is discovering how organizations are approving the use of Cloud computing
environments based on a risk paradigm. At a high level, organizations can use existing risk
frameworks such as ISO2700, or COBIT (Devos & Van de Ginste, 2015), alter their current risk-
based procedures or alter the organization’s Cloud computing environments to match the
organization’s current risk requirements.
SMEs are not likely to use large industry-based IT control frameworks such as ITIL,
COBIT, or ISO2700 because of the cost of implementing the industry frameworks (Barton,
Tejay, Lane, & Terrell, 2016; Tisdale, 2016; Vijayakumar, & Arun, 2017). As discussed
previously, the cost of training staff and implementing such frameworks can be higher than an
SME’s entire IT budget (Atkinson & Aucoin, 2016; IT Process Maps, 2018). Perhaps the true
value of such frameworks is the ability to standardize IT processes across a global company and
across many business units (Cao & Zhang, 2016; Oktadini & Surendro, 2014; Tajammul, &
Parveen, 2017). Such standardization is not a business driver for SMEs and is usually a goal of
mature large enterprises.
SMEs are likely to alter their current risk procedures when adopting Cloud computing.
The alteration process is not that of a large enterprise, however. A large enterprise will have
entire teams dedicated to IT risk assessments and may even have separate teams based on the
53
type of risk (Zong-you, Wen-long, Yan-an, & Hai-too, 2017). Global enterprises may have
different IT risk assessment teams dedicated to applications, infrastructure, and new software
acquisitions (Damenu & Balakrishna, 2015). Large enterprises may reasonably treat a new Cloud
computing environment as any one of those types of risk assessment types and only require
minor alterations to the large enterprises business processes to finish a Cloud computing risk
assessment. SMEs have no such separate risk teams, and may have no internal audit or risk teams
whatsoever (Mahmood, Shevtshenko, Karaulova, & Otto, 2018). SMEs are more likely to
contract outside audit and risk firms, and only when required by law for financial audits and
other matters (Gupta, Misra, Singh, Kumar, & Kumar, 2017). SMEs may use consultants for a
Cloud computing risk assessment but would save money by using something similar to the
validated risk instrument that is an output of the research study.
As with altering their current risk assessment procedures, large enterprises are most likely
to require constraints and limitations on a Cloud computing environment before approving
moving to the Cloud. Large enterprises have the resources, both financially and in qualified staff
to create a private Cloud limited to a single organization if they wish (Gupta, Misra, Singh,
Kumar, & Kumar, 2017). More commonly, large organizations will use their greater resources to
design a Cloud environment to their specifications that will pass an internal risk assessment
(Chang & Ramachandran, 2016). A very simple example is that a large organization can separate
confidential and non-confidential data to prevent the storage of data in the Cloud. A large
organization may also leverage the size of their Cloud deployment to secure changes and
discounts from the CSP (Gupta, Misra, Singh, Kumar, & Kumar, 2017). These are all examples
of changes to a Cloud environment that SMEs are not able to do. SMEs are far more likely to
have to accept what a CSP offers as the SME has no leverage with the CSP to enact changes.
54
One choice SMEs do have in their favor is the choice of CSP or combinations of CSPs. A
validated risk instrument that will allow SMEs to make effective choices between various public
CSPs will be a very useful tool.
Cloud Risk Solutions
A thorough review and understanding of current risk assessment and acceptance policies
and procedures is critical to this research project. This section of the literature review focuses on
how academic research is writing about Cloud computing risk. Many authors have done good
work in identifying Cloud computing risk factors (Jouini & Rabai, 2016; Hu, Chen, & We, 2016;
Alali & Yeh, 2012). One can see many challenges found in on-premises computing risk
assessments also identified in Cloud computing environments along with many risk factors that
are distinct to the Cloud (Cayirci, Garaga, de Oliveira, & Roudier, 2016; Madria, 2016). From a
focus on those specific threats and corresponding security controls (Sen & Madria, 2014;
Albakri, Shanmugam, Samy, Idris, & Ahmed, 2014; Ramachandran & Chang, 2016) to higher
level risk analysis focuses (Brender & Markov, 2013; Shackleford, 2016; Gupta, Gupta,
Majumdar, & Rathore, 2016), there is a wide range of published research.
Some of the proposed solutions are interesting, including a new access control model for
Cloud computing by Younis, Kifayat, and Merabti that incorporates features of mandatory access
control models (MAC), role-based access control models (RBAC), discretionary access control
models (DAC), and attribute-based access control models (ABAC) to create a new model of risk-
based access control (RBAC) (Younis, Kifayat, & Merabti, 2014). A focus on business process
modeling notation (BPMN) also looks promising (Ramachandran & Chang, 2016). There are
proposals to use fuzzy decision theory (de Gusmao, e Silva, Silva, Poleto, & Costa, 2015), and
55
extensible access control markup language (XACML) (dos Santos, Marinho, Schmitt, Westphall,
& Wesphall, 2016) as the basis of solutions to Cloud computing risk.
SME Cloud Risk Solutions
Unfortunately, none of these solutions are a good fit for SMEs. Based on previously
discussed constraints of financial and staff skill constraints, complicated additional skill needed
processes will not help SMEs properly assess the risk involved in adopting Cloud computing.
One of the main attractions of the Cloud for SMEs is that the promise for SMEs that they will
have to do less work and spend less money than with on-premises solutions (Bayramusta, &
Nasir, 2016; Lalev, 2917; Raza, Rashid, & Awan, 2017). Adding very complicated and complex
controls based on RBAC or BPMN or fuzzy decision theory is a non-starter for most SMEs
(Ramachandran & Chang, 2016). One of the promises of Cloud computing for SMEs is that
different paradigms and solutions will emerge in the high rate of change within the Cloud
computing field. A successful solution for SMEs is one that SMEs can easily understand and
adopt. At the present time, this means that the solution or paradigm has to be based on a
currently understood model such as the classic equation of risk, risk = probability x impact / cost
(Choo, 2014; Jouini & Rabai, 2016).
Research is lacking on proposed more traditional risk assessment instruments for Cloud
computing but portions of the research papers that are public has some very good ideas and
potential avenues to research (Gritzalis, Iseppi, Mylonas, & Stavrou, 2018). Building a risk
instrument based upon a simple to understand industry tool such as the common vulnerability
scoring system (CVSS) has promise for SMEs but real-world examples are lacking (Maghrabi,
Pfluegel, & Noorji, 2016). Several groups of authors are presenting research papers that are
attempting to provide validated risk instruments similar to the research paper, including
56
RAClouds based on ISO27001 (Silva, Westphall, & Westphall, 2016) and risk instruments based
on the ISO 31000 risk management framework (Viehmann, 2014) but the results are not easy to
use.
Some researchers are investigating making Cloud risk assessments more accessible to
SMEs but solutions are not complete (Damenu & Balakrishna, 2015; Djuraev & Umirzakov,
2016; El-Attar, Awad, & Omara, 2016). Other approaches to understanding and properly
assessing Cloud computing risk get complicated very quickly even though they propose
interesting solutions. If research concepts such as reducing Cloud computing risk assessments to
simple business process evaluations (Goettlemann, Dalman, Gateau, Dubois, & Godart, 2014), or
Cloud risk assessments based on Markov models (Karras, 2017); or Cloud risk assessments
based on vertical stacking of groups of SMEs (Mahmood, Shevtshenko, Karaulova, & Otto,
2018) come to fruition, perhaps validated risk instruments will not be as important as they are
today. More optimistic researchers are positing theories based on getting CSPs to change and
offer more services such as Security SLAs or more access to the CSPs internal workings
(Rasheed, 2014; Razumnikov, Zakharova, & Kremneva, 2014; Tang, Wang, Yang, & Wang,
2014; Weintraub & Cohen, 2016). The research study provides a validated risk instrument that
can help SMEs assess the risk of adopting Cloud computing in a simple and rational way that
will work without proposing radical changes in the way CSPs conduct business. While large
enterprises can force changes on CSPs due to the large amounts of money they spend, SMEs do
not have that leverage.
Summary
The theoretical framework discussed in this literature review is that SMEs have different
needs than large enterprises regarding Cloud computing environment risk assessments, and
57
academic research has not answered those needs yet (Haimes, Horowitz, Guo, Andrijcic, &
Bogdanor, 2015; Gritzalis, Iseppi, Mylonas, & Stavrou, 2018; Moncayo, & Montenegro, 2016).
The process to researching potential solutions for SME Cloud computing risk assessments started
with broad searches involving cybersecurity (Anand, Ryoo, & Kim, 2015; Ho, Booth, & Ocasio-
Velasquez, 2017; Paxton, 2016), Cloud computing (Chen, Ta-Tao, & Kazuo, 2016;
Ramachandra, Iftikhar, & Khan, 2017), and Cloud security (Coppolino, D’Antonio, Mazzeo, &
Romano, 2017; Ring, 2015; Singh, Jeong, & Park, 2016). There is research that investigates
industry-based framework solutions for large enterprises as potential solutions but they were
found not to be appropriate for SMEs (Al-Ruithe, Benkhelifa, & Haneed, 2016; Bildosola, Rio-
Belver, Cilleruelo, & Garechana, 2015; Elkhannoubi & Belaissaoui, 2016; Moyo & Loock,
2016; Seethamraju, 2014). SMEs have their own requirements and constraints for most IT
solutions (Gholami, Daneshgar, Low, & Beydoun, 2016; Salim, Darshana, Sukanlaya, Alarfi &
Maura, 2015; Yu, Li, Li, Zhao, & Zhao, 2018) and for adopting Cloud computing such as cost
savings (Al-Isma’ili, Li, Shen, & He, 2016; Chatzithanasis & Michalakelis, 2018; Shkurti &
Muca, 2014) or business process improvements (Chen, Ta-Tao, & Kazuo, 2016;
Papachristodoulou, Koutsaki, & Kirkos, 2017; Rocha, Gomez, Araújo, Otero, & Rodrigues,
2016). SME specific Cloud computing risk assessments that do not use old and outdated on-
premises paradigms are not evident in the literature (Baig, R., Freitag, F., Moll, A., Navarro, L.,
Pueyo, R., Vlassov, V., (2015; Bruque-Camara, Moyano-Fuentes, & Maqueira-Marin, 2016;
Buss, 2013; Cong & Aiqing, 2014). The research study creates a validated risk assessment
instrument, and advances the academic field of SME Cloud computing which is lacking in
research focused solely on SME Cloud computing risk assessments (Aljawarneh, Alawneh, &
58
Jaradat, 2016; Assante, Castro, Hamburg, & Martin, 2016; Feng & Yin, 2014; Hasheela,
Smolander, & Mufeti, 2016).
59
Chapter 3: Research Method
The problem the researcher addressed with this study is that there is no commonly
understood and adopted best practice standard for small to medium sized enterprises (SMEs) on
how to specifically assess security risks relating to the Cloud. The purpose of this qualitative
case study research study was to discover an underlying framework for research in SME risk
analysis for Cloud computing and to create a validated instrument that SMEs can use to assess
their risk in Cloud adoption. In this chapter, the researcher presents the research methodology
and design in detail, including population, sample, instrumentation, data collection and analysis,
assumptions, and limitations. Collecting data using a Delphi technique with three rounds
provided the researcher with enough information from multiple case studies regarding SME
Cloud computing risk assessments. Using a Delphi technique is more successful if the sample is
from a population of subject matter experts. Using subject matter experts informed the
limitations, assumptions, and ethical practices in this research study.
SMEs have a different relationship with risk in general (Assante, Castro, Hamburg, &
Martin, 2016) and Cloud adoption risk in particular (Lacity & Reynolds, 2013; Qian, Baharudin,
& Kanaan-Jeebna, 2016; Phaphoom, Wang, Samuel, Helmer, & Abrahamsson, P. 2015). While
many SMEs see Cloud adoption as an avenue to increase their overall security posture
(Bildosola, Río-Belver, Cilleruelo, & Garechana, 2015; Mohabbattalab, von der Heidt, &
Mohabbattalab, 2014; Wang & He, 2014), they do not have the skilled staff or requisite expertise
to create the business and IT processes and procedures to ensure a more secure result (Carcary,
M., Doherty, Conway, & McLaughlin, 2014; Hasheela, Smolander, & Mufeti, 2016). It is
standard practice for medium to large enterprises to use risk assessments before adopting new
computing environments and SMEs should follow the same process (Cayirci, Garaga, Santana de
60
Oliveira, & Roudier, 2016; Jouini & Rabai, 2016). SMEs, however, cannot generally create their
own security procedures and need a process or validated instrument such as a risk assessment to
determine if they should move to the Cloud (Bildosola, Rio-Belver, Cilleruelo, & Garechana,
2015; Carcary, Doherty, & Conway, 2014; Hasheela, Smolander, & Mufeti, 2016). Current
research does not provide a commonly used strategy by SMEs to identify and address Cloud
security risks (Carcary, Doherty, Conway, & McLaughlin, 2014; Kumar, Samalia, & Verma,
2017).
Research Methodology and Design
The decision to approach this research topic on a qualitative case study basis with a
Delphi instrument was based on several factors (Chan, & Mullick, 2016; Flostrand, 2017;
Ogden, Culp, Villamaria, & Ball, 2016). The primary factor is that the product of this research
study is a new approach to SME Cloud computing risk assessments and a validated risk
assessment tool that SMEs can use going forward. With this research study the researcher is not
building on theories from previous studies but looking to discover a thesis and answers from
analyzing what SMEs are currently doing. The most appropriate way to generate new theses and
answers from research based on what organizations are currently practicing is through the use of
case studies (Leung, Hastings, Keefe, Brownstein-Evans, Chan, & Mullick, 2016; Waterman,
Noble, & Allan, 2015). There are currently no easily adaptable tools for SMEs to use as they
decide to adopt Cloud computing (Huang, Hou, He, Dai, & Ding, 2017; Kritikos, Kirkham,
Kryza, & Massonet, 2015; Lacity & Reynolds, 2013). This case study-based research study used
an appropriate Delphi technique to harness the expertise of a large group of subject matter
experts and to synthesize the output of that expertise into a useable product (Flostrand, 2017;
Ogden, Culp, Villamaria, & Ball, 2016). Using case study-based recordation and analyzation
61
techniques on the replies of the subject matter experts that belong to a local ISACA chapter was
a unique chance to create new theory and validated risk instruments.
In other academic fields, a researcher may do well with a grounded theory-based
methodology. A grounded theory design would be an appropriate choice in similar circumstances
except for several major flaws. If there are SMEs risk teams that have created a validated risk
instrument or have answered the research questions in this study, they have not shared them
(Chiregi & Navimipour, 2017; Lacity & Reynolds, 2013; Liu, Xia, Wang, T., Zhong, 2017). If a
researcher built a study on grounded theory coding techniques of SMEs that are in the process of
solving the research questions, the SMEs would not share the additional and ancillary
information critical to grounded theory coding processes due to security concerns (Korte, 2017;
Ring, 2015). The same reticence on the part of cybersecurity professionals rules out quantitative
approaches in general. Quantitative based research study methodologies such as experimental,
quasi-experimental, descriptive, or correlational are not appropriate for two main reasons. The
subject population that can provide answers posed by the survey instruments of this research
study are a very select group and a very small portion of the general population. Random
selection is not possible given the specific knowledge required of the participants of this research
study. A second major concern that obviates the ability to use quantitative methods is that
cybersecurity professionals are not able to share specific details of their work or their
organizations’ challenges (Lalev, 2017; Ring, 2015). Ethnographic, narrative, and
phenomenological methodologies did not fit the focus of this research study. Perhaps the most
important reason for a qualitative case study approach for this research study is that the answers
are not evident and reporting and proving the answers is a useful addition to the field of
academic research and standard industry practices.
62
This research study was based on the replies from a group of risk subject matter experts
to a multi-round web-based survey. The publishing of a web link to the first round of the survey
on the home page of the greater Washington D.C. chapter of ISACA is the way the subject
matter experts accessed the survey instrument. All respondents received a random identification
number based on the order in which they responded to the survey.
The first round of the web-based survey contained general demographic questions such as
the respondent’s risk background, professional role, and size of organization that employs the
respondent. Careful consideration is important on the demographic based questions for both
ethical grounds and security grounds. Cybersecurity professionals have rarely gained permission
to share any information that may identify weaknesses within their organization (Wilson &
Wilson, 2011). The second section of the first web-based survey included general questions
about risk assessments, Cloud security, and SMEs. Sample web-based survey questions included
those similar to the following. How long have you worked in an IT risk-based field? Have you
created or used risk-based tools to assed your organization adopting Cloud computing? What are
some of the deficiencies you have witnessed in using risk assessments created for on-premises
computing environments? Analysis of the differences and similarities of these responses should
be the major driver in creating the questions based on the second-round web-based survey
(Mustonen-Ollila, Lehto, & Huhtinen, 2018). As researcher used the Delphi technique in this
research study, the second web-based survey asked the participants to assess the results of the
first web-based survey and create new subjects or concepts (Greyson, 2018). Continued analysis
of the answer took place with the second-round results and led to the creation of the third round
of questions (Mustonen-Ollila, Lehto, & Huhtinen, 2018).
63
Population and Sample
The population for this research study was the approximately three thousand strong
current membership of the greater Washington D.C. chapter of ISACA. ISACA is a nonprofit
global association that authors COBIT, a framework for IT governance, and certifications for
audit, governance, and risk professionals (da Silva & Manotti, 2016). A paid membership in
ISACA is a strong indicator that the subject is interested enough in a risk assessment related field
to spend approximately $200 a year to be a member. Membership in an ISACA chapter by itself
is an indicator that the member is not only a risk professional but an expert in the field (Lew,
2015). The sample used in this research study was self-selecting based on members of the D.C.
area chapter of ISACA that respond to the advertisement of this research study. The board of
directors for the local chapter gave permission to place a short article advertising this research
study on the main page of the chapter’s website and granted informal site permission.
The estimated number of members that could have responded to the study ranged from
approximately one hundred replies based on the local chapter’s board of directors estimates to
approximately twenty members based on research on web-based survey tools (Bickart &
Schmittlein, 1999; Brüggen & Dholakia, 2010). There are indications from previous Delphi
studies that much lower numbers such as ten to twenty respondents can be effective in reaching
saturation (Gill, Leslie, Grech, & Latour, 2013; Ogden, Culp, Villamaria, & Ball, 2016). A
response rate of less than one per cent of the local ISACA chapter satisfied a twenty to thirty
subject count. If the response rate was less than one per cent, there is a potential to include other
ISACA chapters or LinkedIn groups in the population to increase respondent counts.
64
Materials and Instrumentation
The main instrument for this research study was an online or web-based survey
instrument with three rounds. The responses to each round played a major role in the creation of
the next round of questions through the review and analysis of the responses process (Mustonen-
Ollila, Lehto, & Huhtinen, 2018). The first round of questions included general demographic
questions and open-ended multiple-choice questions that directly tie back to the research study
questions. The structure of the survey was such that respondents answer questions from the
general IT risk domain and then progress to the specific Cloud risk field. While the survey
questions were new, the questions only used standard terms and paradigms in the IT risk
framework. As part of the case study-based review and analysis process, the first round of
questions was the basis for the researcher to inductively generate the next round of questions and
the discovery of a theory that describes how SMEs can secure Cloud computing environments.
Appendix A includes sample survey questions in a spreadsheet.
The researcher created the survey using the software SurveyMonkey (SurveyMonkey,
2018). SurveyMonkey is a comprehensive solution with sample validated questions and
instructions for creating effective survey questions, however, the questions for this research
study were new to this research study. This research study included the definition of standard
industry terms as part of the survey questions. No questions used non-standard industry terms.
ISACA’s COBIT 5, a framework for the governance and management of enterprise IT provided
any definitions of standard terms needed (da Silva Antonio & Manotti, 2016). It is reasonable to
expect risk subject matter experts to either be familiar with COBIT 5 terms or be able to
reference them as needed.
65
The researcher used standard and recommended design elements such as the use of a
five-point Likert scale, free form text boxes and checkboxes. The collection of this type of
quantitative data allowed the researcher to discover where there is a consensus of ideas and
provide an opportunity to hone in on a unifying theory and validated risk instrument (O’Malley
& Capper, 2015). Future researchers will be able to change the text of any question while
remaining in standard design formats such as Likert scales. The survey questions did not follow a
particular framework such as Technology Organization Environment theory (TOE) but instead
the focus of the questions was on fact finding and straightforward response generation.
The result of this research study includes a validated risk instrument that is freely
available and usable by the general SME community. The participants of the web-based survey
and other members of the local ISACA chapter may help validate the risk instrument at some
future point. Publication of the finished risk instrument product will take place on the web page
of the local ISACA chapter and comments requested. As the risk instrument should be usable by
SME staff that may not be expert risk professionals, there may be a need for validation of the
finished risk instrument by outside review groups as the local ISACA chapter members may
have a bias towards validating the Cloud risk instrument because they contributed to its creation.
The use of the appropriate SME LinkedIn groups should provide the necessary sample on non-
risk experts if needed.
Study Procedures
A short article including a link to a web survey hosted on the SurveyMonkey site
appeared on the front page of the Greater Washington D.C. ISACA chapter. Review and analysis
of all responses to the survey that answer the majority of the questions took place. Once
analyzed, incorporation of the first-round responses into the study took place and the potential
66
respondents received a second link based on the SurveyMonkey website. A similar review and
analysis procedure took place for all responses to the second-round survey that answer the
majority of the questions. Once analyzed, incorporation of the second-round responses into the
study followed and the potential respondents received a third link based on the SurveyMonkey
website.
Once the researcher performed a case study-based review and analysis of the survey
results, the researcher identified a consensus on what is working, and creation of a risk
instrument that is usable by SMEs was the next step. The risk instrument consists of a web-based
SurveyMonkey survey. The finished risk instrument has the same simple branching question
format as the one used by this research study. The risk instrument has distinct sections of
demographic, IT related, and CSP related questions. The risk instrument IT and CSP questions
are adaptive based on the demographic responses. If the risk instrument user indicates that their
organization is a particular size in a particular industry, the following questions for the
organization’s IT staff changes. The same is true for following questions for a potential CSP. For
example, if the risk instrument user indicates that they are a small enterprise in the health care
industry, the risk instrument branch to a set of questions for potential CSPs that ask pertinent
questions for such an organization.
Review of the finished risk instrument by the risk experts in the local ISACA chapter and
non-risk expert SME employees as discussed previously may take place based on voluntary
participation. A successful validated risk instrument must be usable by non-risk experts. The
projected audience of the risk instrument includes SME IT teams and SME accounting
departments. Many SMEs have no dedicated cybersecurity personnel and rely on general IT staff
for all related IT and cybersecurity functions (Wang & He, 2014). SMEs commonly do not have
67
dedicated risk or audit teams and rely on members of the accounting team to evaluate financial
costs and risks for new expenditures such as adopting Cloud computing (Assante, Castro,
Hamburg, & Martin, 2016; Gritzalis, Iseppi, Mylonas, & Stavrou, 2018; Hasheela, Smolander, &
Mufeti, 2016). Evaluation of the final risk instrument by a representative group of SME IT and
accounting staff is a primary step for validation. The intent of the validated risk instrument is to
allow non-Cloud expert SME staff to ask and answer the appropriate questions for their
organization. The risk instrument will also help SMEs decide the appropriate control sets to use
for their organization. While approval and use of the risk instrument by subject matter experts
and general business users is no substitute for academic testing and validation, acceptance by the
IT risk assessment community will be a useful data point for future academic research.
Data Collection and Analysis
The researcher used commonly accepted case study techniques and processes to analyze
the data gathered from the subjects of the local ISACA chapter. Although the collection of the
information takes place via web-based surveys, the subjects were able to respond to many of the
questions in a free form text manner that emulates responding to interview questions. This
allowed the subjects to respond in as much detail as they wish and were allowed to by their
organizations (Mustonen-Ollila, Lehto, & Huhtinen, 2018). Although there were few respondent
comments, the use of a memoing technique after return of the responses took place along with
electronic recordation and formatting for long term storage of all subject answers. Common field
interview drawbacks such as logistics were not be a factor in this research study. Creswell’s data
analysis spiral is a visual representation of how this research study processed and analyzed the
data collected from the web-based surveys (Creswell, 2007).
68
Figure 1. Creswell data analysis spiral. This figure visualizes the data analysis process.
(Creswell, 2007).
After collection of the first round of survey responses, the researcher created preliminary
categories (Ogden, Culp, Villamaria, & Ball, 2016). The researcher repeated this process after
each succeeding round, including deciding which category shows the most consensus among
respondents either for what works or what has failed in the Cloud security risk assessment
process (Parekh, DeLatte, Herman, Oliva, Phatak, Scheponik, Sherman, 2018). Based on the
appropriateness or usefulness of the central category, alteration of the succeeding round of
questions took place to produce a new central theme. For example; if a particular security
concern or the use of a specific technique emerges from the first round of questions, then the
second round of questions would have focused on that security concern. The new and focused
categories would have included which part of the risk field the respondent is most experienced in
and other demographic categories (Ogden, Culp, Villamaria, & Ball, 2016). The categories also
include on which type of risk assessments or analyses the subject has based their response. For
example, respondents that have vast experience in compliance-based audits will focus on
different aspects of Cloud computing environment risks than a respondent that commonly uses a
tool such as the Center for Internet Security (CIS) benchmarks as the basis for their assessments
69
(Center for Internet Security, 2018). After completing review and analysis of the second round of
responses, the researcher conducted a final phase of review and analysis (Ogden, Culp,
Villamaria, & Ball, 2016). As one of the end goals of this research was to develop a validated
risk instrument for SMEs, the entire process was focused on the end goal of presenting a useful
tool for SMEs.
Once the review and analysis process ended, the next step was to create a simple risk
instrument from the results. The risk instrument starts with demographic questions to branch into
the questions for the organization’s IT staff and potential CSPs. The business decision makers
and risk assessors of SMEs now have a simple tool to ask the correct questions of their IT staff
and potential CSPs based on the results of this research study. The first part of the research study
generated a new thesis and the second part packages that thesis and knowledge into a useful tool
for SMEs. The validated risk instrument allows SMEs to properly evaluate Cloud computing
adoption risk by showing the SMEs which questions they need to have answered. As discussed
previously, the risk instrument will help the SMEs identify what they need to know from their IT
teams and potential CSPs.
Assumptions
Research methodological assumptions are particularly important in qualitative case
study-based research and perhaps more so for case study research using a Delphi panel (Parekh,
DeLatte, Herman, Oliva, Phatak, Scheponik, Sherman, 2018; Wiesche, Jurisch, Yetton, &
Krcmar, 2017). The researcher is developing new theses from case study data collection and
assumptions play a large part in the results of the research study. During the review and analysis
of data in this research study, one may embrace different realities or ontological viewpoints
(Parekh, DeLatte, Herman, Oliva, Phatak, Scheponik, Sherman, 2018). One goal of the
70
researcher for this research study was to find a solution and create a validated risk instrument, so
the researcher took care to make sure that the data supports any reality or paradigm discovery
from the data review and analysis process. It was tempting for the researcher to base acceptance
of a paradigm with a solution rather than a paradigm the truly fits the results of the data coding.
The researcher was open to the possibility that there is no current solution to assessing Cloud
security risks for SMEs. The research properly done, led to the methodological assumptions
being paramount. The design of the review and analysis process of this qualitative case study-
based research study was to start with discrete facts and to weave them into an overarching
theory that explains the connections between the details (Glasser, 2016). Each succeeding round
of the Delphi technique elucidated additional results that inductively got closer to a true solution
and a workable validated risk assessment instrument.
Epistemological assumptions of this research paper focus on alternative ways in which to
gain subjective knowledge from the participants of the web-based survey (Guba & Lincoln,
2008). The cybersecurity field is a very difficult one in which to practice field research and to get
close to subjects (Lalev, 2017; Ring, 2015). This research study attempted to ameliorate the
negative results of a lack of subjective closeness between the researcher and the subjects through
the use of a Delphi technique (Johnson, 2009). On the positive side, being that all contact
between the researcher and the subject population was through responses to a web-based survey,
the researcher assumed that it took less effort to increase the distance between the researcher and
the subjects. (Parekh, DeLatte, Herman, Oliva, Phatak, Scheponik, Sherman, 2018).
Axiological research assumptions include the biases, predilections, and beliefs of the
researcher (Denzin, 2001). Axiological research assumptions for this research study include the
researcher’s industry and practical experience as a multi-decade long member of a cybersecurity
71
team. When discussing whether a Cloud computing environment is secure, the default and
immediate reaction of a cybersecurity team member is “No” (Aljawarneh, Alawneh, & Jaradat,
2016; Kholidy, Erradi, Abdelwahed, & Baiardi, 2016; Lai, & Leu, 2015). As the researcher has
the assumptions and biases of a cybersecurity team member, data coding was more rigorous and
exhaustive than it might be if the researcher had a different background. The researcher
anticipates that there is a need to take care to avoid rejecting paradigms that successfully explain
the coded data because they provide a way to make Cloud computing secure.
Limitations
General limitations include funding, time, and access to the subject population. This
research study does not require funding as the expected costs include only a temporary paid
SurveyMonkey account. Limitations to this study did not include time pressures. Although the
limited time period for completing the research phase of this study, the design of this research
study led to completion in a timely manner primarily through the use of easily and quickly
accessed web-based surveys. Access to the subject population was the greatest possible
limitation to this research study. The researcher has spent several years volunteering, teaching,
and working with the board of the local ISACA chapter. The researcher has already obtained site
permission to reach the members of the local ISACA chapter and is continuing to build bonds
with the local ISACA chapter governing bodies.
Cybersecurity professionals do not have permission to share detailed information
regarding what their organizations do to combat threats (Beauchamp, 2015; Lalev, 2017; Ring,
2015). This study attempts to mitigate this limitation by using a Delphi technique with web-
based survey questions that do not require disclosure of specific identifiable information.
Another potential limitation is the risk of subject drop-outs on later rounds of the Delphi
72
technique (Ogden, Culp, Villamaria, & Ball, 2016). Starting with a large number of participants
should ameliorate this risk. If the number of first round respondents is too low, additional steps
such as using LinkedIn groups to increase the number of subjects may take place. The design of
this research study as a qualitative case study-based research study is a potential limitation and
perhaps also a delimitation. Instead of statistically verifiable experimental results, this research
study depends on the inductive reasoning process of the researcher as he reviews and analyzes
the responses (Guba & Lincoln, 2008). While the researcher is an experienced industry
practitioner, the researcher is not yet an experienced expert academic researcher. With the help
of this research study’s dissertation committee, the researcher expects to reach the required level
of academic expertise.
Delimitations
A major delimitation of this research study was the choice of participants (Gomes et al.,
2018). The population for this research study was only those subscribing members of a local
ISACA chapter. This selection criteria was central to the design of this research study and the
research questions. A qualitative case study-based theory research project using a Delphi
technique needs experts (Strasser, 2017). The use of experts implies a limited population.
Additional rounds of a Delphi technique supply additional data needed to complete the review
and analysis process (Strasser, 2017). The design of the problem statement and purpose
statement presumes answers by experts. The problem stated in this research study is a narrow
and specific one that a random sample of any particular population cannot answer. The
researcher has designed answerable research questions for a group of experts with the
aforementioned constraint that they not share specific data on their organization’s cybersecurity
activities including risk assessments of Cloud computing environments.
73
The research decision to investigate solutions to SME risk assessments of Cloud
computing environments had direct ties to the availability of a large subject matter expert sample
of risk professionals that are members of a local ISACA chapter. The researcher has spent
several years working with these experts and the opportunity to collect data from such an expert
group was too great to pass up on. Once the researcher made the decision to use the risk subject
matter experts belonging to the local ISACA chapter, a Delphi technique seems obvious. The use
of a Delphi technique to answer the research questions lead to the selection of a qualitative case
study-based theory approach. Multiple levels of coding enhanced the power of a large group of
experts focused on the research questions of this study (Trevelyan & Robinson, 2015).
Ethical Assurances
The researcher received approval from Northcentral University’s Institutional Review
Board (IRB) prior to data collection. The risk to participants was minimal. No collection of
personally identifiable information (PII) took place. The researcher assumed that access to and
participation on the local ISACA chapter website is proof of expertise. The limited demographic
data collected is not able to identify participants. The intent was to design the web-based survey
questions as generic enough that a bad actor cannot identify respondents’ organizations. For
example, questions relating to the industry or size of a respondent’s organization offers responses
in broad categories and not specific numbers. Only sharing of the data from the responses in
aggregate numbers will take place.
The storage of data from the study including all rounds of the Delphi technique using a
web-based survey instrument follow the Northcentral University’s requirements. Compression
and encryption of the data will take place. The researcher will then upload the data to a free
Gmail account. Storage of the encryption key will be in a LastPass password manager account.
74
Magnification of the researcher’s role in this research study can happen by the qualitative
multiple case study-based theory framework. The researcher reviewed and analyzed data from
multiple sources during three phases and personal and professional biases could have easily
influenced the theory discovery process based on the researcher’s view of the data. Personal
biases are a lesser concern for this research study as collection of the data uses a web-based
survey instrument. There was no personal interaction with the subject population. The research
study did not collect demographic data regarding race, sex, nationality, or any other factor that
could play into personal bias by the researcher.
Professional experience bias is a greater concern. As the researcher has spent decades on
cybersecurity teams, the concept that one can never fully secure data stored on someone else’s
computer is a truism. The long professional career of the researcher has also deeply ingrained the
idea of rapid change in IT. The researcher understands that Cloud computing adoption is rampant
and increasing at a rapid rate (Bayramusta & Nasir, (2016). The tone of this research study is
deeply optimistic. The goal is to find a solution to SMEs’ adoption of Cloud computing securely.
While the researcher’s professional experience is likely to cause greater scrutiny on data coding
results that appear to offer solutions, that is a feature of good research
Summary
Professionals in SMEs need proven ways to evaluate risk in adopting loud computing
environments and this qualitative case study-based theory research study has produced a
validated risk instrument for that purpose. The research methodology and design of this study
included web-based survey instruments, and a Delphi technique. Review and analysis of the data
collected from the three rounds of web-based surveys used case study-based theory techniques
and the results formed the basis of a freely available Cloud computing risk assessment
75
instrument for SMEs. The rarely available subject population is the key to this research study and
the basis for all design and methodology decisions. The design of this research study intends to
yield maximum results from a large group of IT risk subject matter experts based on membership
in a local chapter of ISACA.
76
Chapter 4: Findings
The researcher’s purpose with this qualitative case study was to discover an underlying
framework for research in SME risk analysis for cloud computing and to create a validated
instrument that SMEs can use to start assessing their risk in cloud adoption. To determine if they
are ready to transition to cloud computing, SMEs need a process or validated instrument such as
a risk assessment (Bildosola, Rio-Belver, Cilleruelo, & Garechana, 2015; Carcary, Doherty, &
Conway, 2014; Hasheela, Smolander, & Mufeti, 2016). Current research does not show that
SMEs using a risk-based approach have reached a consensus on how to identify and address
cloud security risks. (Carcary, Doherty, Conway, & McLaughlin, 2014; Kumar, Samalia, &
Verma, 2017). In this chapter, the researcher describes the ways in which this research study
achieved trustworthiness of the data. Also, in this chapter, the researcher presents the results of
the research including how the researcher answered each of the four research questions.
Trustworthiness of the Data
The researcher confirmed the trustworthiness of the qualitative data gathered in this
research project by prolonged engagement, triangulation, transferability, dependability, and
confirmability. Prolonged engagement for this research study involved the researcher being a
member of the local chapter of ISACA (GWDC) for over five years and volunteering for over
one hundred hours of conference events hosted by the local chapter. The researcher worked hard
to build a close and effective volunteer relationship with the local chapter officers. This research
study was the first one supported by GWDC and the first that GWDC allowed to use the local
chapter email list and chapter events to promulgate the three web surveys. Prolonged
engagement with GWDC by the researcher led to the researcher gaining the trust of the subject
population of risk experts. Familiarity with the field of risk assessment and the expert
77
practitioners of risk assessments by the researcher helped to make sure the survey questions were
based on pertinent risk assessment practices. The research project was a three round Delphi panel
with anonymous participation. The survey questions were specific to the risk assessment field
and required expert knowledge of the field to answer coherently.
As the research was based on anonymous surveys, the primary type of triangulation was
that of data triangulation. The same group of respondents may have completed each survey or a
totally different group each time. The researcher designed each of the surveys to ask questions
related to each of the four research questions. The design of several questions in each of the three
surveys intended to elicit consistent responses to those asked in the other two surveys. For
example; survey one, question eleven that asks “What IT security control standards do you see
SMEs using?” and survey three, question thirteen asking “Once controls have been identified for
the SME’s Cloud environment, what effect do they have on existing SME IT controls?” were
purposely asked in separate surveys rather than one right after the other as a way to increase data
triangulation.
The use of three web surveys is a simple form of method triangulation. A GWDC
newsletter announced each survey, and the researcher used the SurveyMonkey web-based tool
for each survey so the method triangulation was weak but each survey presented questions
differently than the other surveys. For example; survey one used two questions for each major
point, such as question ten “For SMEs that are planning to adopt Cloud computing, do you see
SMEs using IT security control standards?” and question eleven “What IT security control
standards do you see SMEs using” would be one question in survey two or three. Survey two had
questions that directly referenced the results of survey one such as question ten “100% of
respondents to survey 1 have seen recommendations to outsource the transition to a Cloud
78
environment. Which portions of a transition to a Cloud environment have you seen
recommended to be outsourced?” There was only one researcher so investigator triangulation
was not possible. During the coding portion of the data analysis the researcher used a simple
form of theory triangulation when grouping results of the survey questions under each research
question.
There are limits to the transferability of the data due to the very specific field that the
questions focused on. Within the field of Cloud computing risk assessments, however, the
transferability of the data is very strong due to the consistent format of the surveys as web based
with questions predominately presented as multiple choice. The researcher does not need to
provide thick description where the researcher “provides a robust and detailed account of their
experiences during data collection” (Statistics Solutions, 2019) for this research project as the
data is three series of questions with multiple choice answers. The use of a Delphi technique
specifically reduces the subject population to subject matter experts in a particular topic (Choi &
Lee, 2015; El-Gazzar, Hustad, & Olsen, 2016; Johnson, 2009). By reducing the participants to
risk subject matter experts, the researcher greatly lessened most of the concerns about
transferability due to broader social or cultural concerns regarding data collection or participants’
biases. Further reducing the subject population to those experts that are members of a local
geographical based chapter of an international risk professional association helps to reduce
potential cultural biases when answering the survey questions. Risk assessments are fact finding
exercises and risk assessment results are statements of success and failure (He, Devine, &
Zhuang, 2018). Risk assessments and audits avoid emotional or culturally based descriptors or
modifiers. (King et al, 2018).
79
The researcher can state many of the results of this research project in simple declarative
statements such as “100% of risk experts surveyed found that SMEs have gotten
recommendations to transit to Cloud computing operations”. While the interpretation of how
various questions within each of the three surveys relate to each other may change when viewed
by other researchers, the basic information gained by surveying D.C. area risk subject matter
experts on specific Cloud computing risk assessment topics is clear and transferable with high
fidelity to the original results. One can never completely eliminate bias on the part of the
researcher or participants but the use of multiple-choice questions based solely on a fact-based
profession that requires declarative statements as the work product goes a long way to reducing
any potential bias (de Bruin, McCambridge, & Prins, 2015).
The researcher’s design of this research study is that of a qualitative case study approach
with a Delphi technique. A qualitative approach using a case study methodology is the best
solution for dependability when trying to elucidate data from cybersecurity professionals
regarding potentially confidential processes. A problem solved by using a qualitative case study
approach is that the subject population of risk-based Cloud computing research experts were able
to respond with qualitative data but not quantitative numbers to avoid compromising their
organization’s security (Glaser, 2014). Using a three-round survey with multiple choice
questions allows cybersecurity professionals to answer questions regarding Cloud transition risk.
This improves the dependability as future researchers can ask the same questions without
concern that respondents will not be able to answer.
The researcher’s use of a Delphi technique, in the case of this research study, improves
the dependability of the data gathered in this research study. Limiting the subject population to
experts in the risk field reduces the variability of potential subjects for both good and bad (Lu,
80
2018). In the case of dependability, reduced variability makes the research study easier to
replicate if one uses the same strictures on respondents. The use of a Delphi technique allowed
the researcher to pose specific questions about a very narrow field. The more specific the
questions, the more easily a future researcher can replicate the study. The use of multiple-choice
answers also increases the ease in which a future researcher may be able to replicate this study. If
the future researcher wants to add new choices based on recent technology or a different research
focus, they will be able to just add more choices to existing questions. While there is always the
chance that questions reworded to add or remove bias may gather different answers in a future
research study, short simple answer choices remove some of that problem (Bard & Weinstein,
2017).
The dependability of the data from this research study is very strong aside from one
hurdle. If a future researcher gains access to GWDC, then the researcher could simply replicate
the study completely by posting the same three surveys. Based on the local chapter’s board with
this research project, they have verbally agreed to similar efforts in the future. The international
chapter of ISACA is pushing to have greater student involvement and research projects such as
this would help further that goal. Future researchers attempting to replicate this research study
would most likely find approval from the local chapter board if the researcher was a member of
the chapter and a student. If a future researcher wished to replicate this study without being a
member of the local chapter, they would need to increase efforts to reach risk experts. The future
researcher would also have to devise a way to make sure that the respondents were actual risk
experts. One of the benefits of limiting the population to members of the local ISACA chapter is
that it is reasonable to conclude that only risk professionals would agree that paying international
81
and local dues in the current amount of one hundred and sixty-five dollars a year to ISACA is
worth doing.
The researcher used a straightforward approach to address the confirmability of the data
(Korstjens & Moser, 2018). This research study is based on a Delphi technique and surveys risk
experts using multiple choice questions. The data received from the surveys is clear and easily
summarized by each question in simple to read tables. Presentation of pertinent tables takes place
when discussing the research questions. Suspected bias in answering multiple choice questions
can be determined by looking at the survey results broken down by respondents. After looking at
results grouped by respondents, the researcher discovered no such bias and readers can find the
results by respondent in the Appendix. Readers may check for bias by the researcher in this
project by reading the multiple-choice questions and potential answers. The field of research is
very narrow and focused on risk assessments related to a SME transitioning to a Cloud
computing environment. The use of loaded terms with emotional overtones is not apparent in the
questions or the answer choices. The researcher took care to change the question style between
surveys to eliminate unconscious attempts to lead respondents to a particular answer. For
example; survey one generally asked two questions for each area of focus. Survey one, question
eighteen “Do you see SMEs adopting Cloud security controls?” and question nineteen “What
Cloud security controls do you see SMEs adopting?” are an example of this. Some of survey two
questions had multiple sentences in the question and did state assumptions in the question but the
assumptions did not include emotional or bias elements. For example; Survey two, question 7
“Most SMEs have Cloud operations in progress. Which scenarios have you seen and which have
you seen audited by SMEs?” states the assumption that most SMEs have current Cloud
operations.
82
Results
With this research study, the researcher used a Delphi technique with three rounds of
surveys to ask risk assessment experts questions about cloud computing adoption by SMEs and
the risk assessment process involved with the SME’s transition to the cloud. The presentation of
survey instrument questions follows the four research questions in the study. The researcher
gathered data for each of the four research questions and presents and organizes the results are by
research question in this chapter. The survey instrument questions had multiple choice answers.
Design of the questions differ in some ways to elicit accurate and consistent answers.
RQ1. What are the current frameworks being leveraged in Cloud specific risk
assessments? When answering survey questions related to this RQ, respondents described several
frameworks showing common use. RQ2. What are the primary categories of concern presently
being addressed in Cloud specific risk assessments? Respondents answered this RQ with
multiple concerns with one concern very prevalent. RQ3. What are the commonly used and
tailored security controls in Cloud specific risk assessments? Answering this RQ happened at a
high level with several control families predominant. RQ4. What are the commonly
recommended mitigations in Cloud specific risk assessments? Respondents also answered this
RQ with several mitigations currently in use. This chapter organizes survey instrument questions
and answers by research question.
The participants were anonymous. There was no requirement for participants to give their
names. The research process involved three web surveys hosted by SurveyMonkey. The survey
instrument did not track or record of the IP addresses of the respondents. GWDC shared the
surveys’ web links via the Washington D.C. chapter of ISACA (GWDC) weekly newsletter, the
GWDC website and at GWDC one day conferences. This had a purposeful effect of limiting the
83
population to members of ISACA in general and GWDC specifically. Participants did not have
to have a membership in GWDC but the limited dissemination of the web links worked to limit
the population to GWDC members for the most part.
Aside from probable membership in GWDC which assumes the subject population is
based in the D.C. geographic area, the demographic details of the respondents are not known in
great detail. To take part in each of the surveys the respondents had to say that they were
eighteen years or older and that they had at least five years of risk experience. The researcher
decided not to collect further demographic details of the respondents. The researcher determined
that it was sufficient for the respondents to give their expert opinion on Cloud computing risk.
Age other than adult, gender, or nationality do not impact the respondents’ replies to the
multiple-choice questions in the surveys.
The designs of the surveys included strong efforts to let respondents be as anonymous as
possible and to not require demographic detail due to the sensitive nature of the survey questions.
The survey questions are not personally sensitive to the respondents but the questions would be
sensitive if the question involved a specific SME. The professional cybersecurity population
rarely has permission to discuss their SME’s cybersecurity efforts in any detail due to SME
concerns about giving adversaries damaging information. Cybersecurity risk professionals face
the same constraints. The design of the surveys focused on making sure that there was no
possible identification of respondents or the SME that employs them from any possible
combination of the data gathered in this research study.
As the survey questions are predominately multiple choice, the coding process for this
research study is seemingly straightforward. Complex and complicated coding was not an
effective option when the survey respondents were truly anonymous but some themes still
84
emerged. The age, ethnicity, gender, or life experiences of the respondents to the three surveys in
this research study cannot be determined. Survey questions were very focused on a narrow field
of expertise and this research study does not require demographic details of the respondents. On
the other hand, as this research study uses a Delphi technique to survey a group of experts, even
small differences in results can lead to new themes discoveries.
Because the recruitment of respondents took place through a Washington D.C. chapter of
a professional risk association, government experience is likely for many of the respondents. One
can find confirmation of this government experience in some of the responses to certain survey
questions. For example; survey one, question eleven asked about IT security controls. The
responses look at least partially tilted to NIST security control standards that the U.S. federal
government uses. Response to survey one, question thirteen offers further confirmation of the
federal background of some of the respondents. The top three choices by respondents to survey
one, question thirteen are Federal government based. DoD, DISA, and FedRAMP Cloud security
baselines. The Federal government IT frameworks and Cloud security controls are based on a
compliance paradigm. To remove this potential Federal government bias, surveys two and three
questions avoided potential compliance-based questions for the most part.
An additional research question one related theme, is that current frameworks in use are
not as current as they might be. More respondents reported seeing existing frameworks and
guidelines in use that either are not Cloud focused or have creation dates well before Cloud
computing was predominant. Although respondents see SMEs accepting CSP attestations and
SLAs, they are not using the CSPs advanced security tools. Even SMEs, more nimble and more
easily able to change than large enterprises, do not keep up with the dramatic changes in Cloud
85
computing. This strongly relates to the predominant theme discovered by replies related to
research questions two and three.
The biggest theme, and the one that most of the coding process led to, is that SMEs do
not have Cloud capable staff. SMEs respond to research question two with a resounding
uniformity. Lack of properly trained IT staff is the major theme of research question two results.
SMEs’ lack of Cloud trained staff affects every research question in this study. By far, the
consensus of the risk experts surveyed is that SMEs need outside help with Cloud transitions.
When SMEs have the choice of either investing in their IT staff, or outsource or contract out
work involved in the SMEs Cloud transition, risk experts recommend outsourcing. In relation to
research question three, the risk experts recommend controls that rely on third parties.
Commonly recommended mitigations, research question four, also relied heavily on third-parties
or outsourcing.
Research question 1. What are the current frameworks being leveraged in Cloud
specific risk assessments?
Tables present pertinent survey questions and the respondents’ answers below for clarity.
The researcher designed several survey questions related to research question one to be
exploratory and level setting to make sure the subject population was the appropriate group to
answer the other survey questions. The purpose of some survey questions was to cross-check
previous survey questions. All survey question tables are in appendix B.
Survey one, question nine asked which IT related frameworks are SMEs adopting. This
question directly addresses research question one. Respondents reported three common IT
frameworks as commonly used by SMEs with COBIT, ITIL, and ISO/IEC 38500 each receiving
eleven of nineteen replies. The responses to this question helped direct the focus of surveys two
86
and three. The answers to this survey question help to identify a common theme of SMEs not
using current or best practice frameworks.
Table 1
Survey 1. Q9: What IT related frameworks (partially or completely) do you see SMEs adopting
Answer Choices Responses Count
COBIT (Control Objectives for Information and Related
Technologies)
61.11% 11
ITIL (formerly Information Technology Infrastructure Library) 61.11% 11
TOGAF (The Open Group Architecture Framework for enterprise
architecture)
27.78% 5
ISO/IEC 38500 (International Organization for
Standardization/International Electrotechnical Commission Standard
for Corporate Governance of Information Technology)
61.11% 11
COSO (Committee of Sponsoring Organizations of the Treadway
Commission)
38.89% 7
Other 16.67% 3
A large proportion of respondents to survey one have seen Cloud security configuration
baselines used by SMEs. Survey one respondents identified many Cloud security configuration
baselines in use by SMEs with no one baseline predominant. Respondents choose the federal
government-based Cloud security configuration baselines at a higher rate than normal for a more
general risk expert population. As the subjects were members of a D.C. based risk organization,
87
a bias towards government-based examples the researcher should have expected this result.
Identification of this minor theme occurred early in the coding process, and the design of surveys
two and three mitigated its effects.
Table 2
Survey 1, Q 13: What Cloud security configuration baselines have you seen used by SMEs?
Please select all that apply.
Answer Choices Responses Count
DoD Cloud Security requirements guides (Department of Defense) 62.5% 10
DISA/IASE Security requirements guide (Defense Information
Systems Agency Information Assurance Support Environment)
56.25% 9
CSA Cloud security guidance (Cloud Security Alliance) 31.25% 5
FedRAMP Cloud security baselines (Federal Risk and Authorization
Management Program)
68.75% 11
AWS SbD (Amazon Web Services Security by Design) 50% 8
CIS Cloud baselines (Center for Internet Security) 50% 8
Other 0% 0
A majority of survey two respondents do not see the use of current frameworks changed
as a result of Cloud transitions. This directly applies to research question one and is related to a
theme discovered in this research study. If Cloud specific risk assessments are not changing the
framework used by a SME, perhaps a Cloud environment does not need a new or tailored
framework. Most likely the theme of SMEs not using the best frameworks for a Cloud transition,
88
however, is directly related to the major theme of this research study; that SMEs do not have
properly trained Cloud personnel.
89
Table 3
Survey 3, Q14: Have you seen Cloud risk assessments change other previously completed SME
risk assessments in the ways listed below? Please select all that apply.
Answer Choices Responses Count
Previous risk assessments changed because of CSP location. 6.25% 1
Previous risk assessments changed because of new legal or
regulatory requirements based on Cloud usage.
37.50% 6
Previous risk assessments changed because of new financial
requirements based on Cloud usage.
6.25% 1
Previous risk assessments changed because of new insurance
requirements based on Cloud usage.
6.25% 1
Previous risk assessments changed because of new market
requirements based on Cloud usage.
0% 0
Previous risk assessments changed because of new operational
requirements based on Cloud usage.
37.5% 6
Previous risk assessments changed because of new strategic
requirements based on Cloud usage.
6.25% 1
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
90
As a refutation to the conclusion that Cloud specific risk assessments are not changing
the framework used by a SME the results of survey three, question fifteen show that Cloud
transitions are changing SME risk and audit teams. Most respondents to survey three see an
increased work load and rate of change for SME risk assessment teams. This is slightly
tangential to research question one, but does indicate that there is an increased use of
frameworks. In the coding process, the results of this survey question indicate that there may be a
valid counterpoint to assuming that current frameworks are not changing.
Table 4
Survey 3, Q15: Cloud transitions almost always promise cost saving and Cloud operations
usually require less effort than on-premise IT operations. Cloud transitions, however, increase
the risk and audit team’s responsibilities, knowledge, and skills requirements. How do you see
SMEs changing their risk and audit teams to adapt to Cloud environments? Please select all that
apply.
Answer Choices Responses Count
Increase size and budget of risk and audit teams. 41.18% 7
Reorganize or change structure of risk and audit teams 64.71% 11
Increase outsourcing or use of consultants to perform Cloud risk and
audit duties.
47.06% 8
Increase workload of existing risk and audit teams. 76.47% 13
91
Research question 2. What are the primary categories of concern presently being
addressed in Cloud specific risk assessments?
Every respondent to survey one saw non-technical areas of concern and IT (not security)
areas of concern for SMEs transitioning to the Cloud. Several survey questions directly
addressed research question two, including survey one, questions fourteen, fifteen, sixteen, and
seventeen. The two tables following, show the responses to questions fifteen and seventeen.
Every respondent to survey one saw non-technical areas of concern for SMEs transitioning to the
Cloud, with the majority of those concerns being privacy, business process, governance,
financial, or legal related. Respondents see more than one non-technical area of concern for
SMEs. Majorities of survey one respondents saw IT team knowledge and skills, IT audit results,
type of Cloud to use, network path to Cloud, backup and restore, and cost as primary categories
of concern in Cloud risk assessments. A plurality of respondents to survey one, question fifteen
and question seventeen selected every response except other.
Respondents to the second survey found a large number of non-IT related concerns for
SMEs when transitioning to the Cloud. While there was no one specific concern with a majority
of respondents, there were ten concerns with a third or more respondents selecting them. Survey
two, question thirteen added additional choices of concerns including business process and risk
assessment. Survey two, question thirteen also included choices specific to outsourcing the
concerns listed in survey one, questions fourteen through seventeen. These results reinforce the
major them of this research study. SMEs need more Cloud expertise and if it is not present in
existing staff, one solution is to use a competent third-party to help.
92
Table 5
Survey 1, Q15: What non-technical areas of concern do you see when SMEs are contemplating
Cloud adoption?
Answer Choices Responses Count
Governance 80% 16
Business Process 85% 17
Financial (non-technical) 70% 14
Privacy 85% 17
Legal 55% 11
Other 15% 3
Any additional comments (We want your expertise)? 15% 3
93
Table 6
Survey 1, Q17: What IT (non-security) areas of concern do you see for SMEs as they adopt
Cloud computing? Please select all areas of concern that you have seen.
Answer Choices Responses Count
Backup and Restore 60% 12
IT Audit Results 75% 15
Transition Process to Cloud 100% 20
Type of Cloud to use IaaS (Infrastructure as a Service), PaaS
(Platform as a service), SaaS (Software as a service)
70% 14
IT Team Knowledge and Skills 75% 15
Network Path to Cloud (redundant paths, multiple Internet service
providers)
65% 13
Cost 55% 11
Psychological Barriers/Concerns 50% 10
Other 0% 0
Other (please specify) 5% 1
Based on what SMEs pay attention to when starting the transition to the Cloud,
respondents to survey two report that a strong majority see the choice of a CSP and then the
choice of the type of Cloud infrastructure as primary concerns. SMEs make these choices before
the SME would normally conduct a risk assessment process. Researchers would need to conduct
94
further research before concluding that the SME risk assessment team was involved with
choosing a particular Cloud vendor or Cloud computing infrastructure. Almost half of survey
two respondents see choice of security controls and choice of Cloud security baselines as
primary concerns.
Table 7
Survey 2, Q8: When starting to plan a transition to a Cloud environment, what have you seen
SMEs start with before risk assessments or collections of requirements? Please select all that
apply.
Answer Choices Responses Count
Choice of CSP (Cloud service provider). 86.96% 20
Choice of infrastructure such as IaaS (Infrastructure as a Service),
PaaS (Platform as a Service), or SaaS (Software as a Service).
69.57% 16%
Choice of IT framework such as COBIT, ITIl, or ISO/IEC 38500. 30.43% 7%
Choice of security control standards such as NIST SP 800-53 or
CSF, HIPAA, or PCI-DSS.
47.83% 11
Choice of Cloud security baselines such as FedRAMP, CIS, or CSA. 47.83% 11
Automation tools such as DevOps or DevSecOps. 26.09% 6
Other 4.35% 1
The responses to survey three, question ten indicate that a lack of current SME IT staff
expertise is a major concern for SMEs in survey one. In survey two, the researcher asked
participants several questions in an attempt to identify the cause of a lack of staff expertise and
95
possible solutions. In response to survey two, question eleven a majority of respondents
identified multiple causes including IT staffs that are undersized, budget deficiencies,
governance and management issues, and SME business structure. Again, this points to the
predominant theme of this research; SMEs do not have enough Cloud expertise on staff.
In response to survey two, question twelve risk experts identified several solutions with a
preponderance of choices using third parties or outside consulting. If a SME is outsourcing its
operations and on-premises hardware to a CSP, it may make sense to include all facets of a
Cloud computing operation (Fahmideh & Beydoun, 2018). Outsourcing is certainly an option for
SMEs in other business operations (Al-Isma’ili, Li, Shen, & He, 2016; Baig, Freitag, Moll,
Navarro, Pueyo, Vlassov, 2015), outsourcing a Cloud transition may be the best way to address
Cloud specific risk concerns (Fahmideh & Beydoun, 2018). Survey two, question fifteen
attempted to correlate SMEs choices of CSP to help address research question two. The choice
of CSP could help inform primary categories of concern because CSPs offer different services,
security offerings and control choices.
Respondents to survey two, however, selected CSPs at a rate very similar to the general
publics’ usage of CSPs and Cloud offerings. No respondent picked a specialty CSP aside from
Oracle Cloud. Responses to survey two, question fifteen may be related to the lack of IT staff
training and knowledge shown in responses to survey two questions. Further study on this topic
may be fruitful. Respondents to survey three showed by their choices to answer question ten that
SMEs were making changes to address primary categories of concern identified in previous
survey questions even if the choice of CSP does not indicate a meaningful trend. Majorities of
respondents choose new IT controls, Cloud security guides, IT governance frameworks, and CSP
recommended practices as ways in which SMEs were addressing primary categories of concern.
96
A researcher can show that the responses to this correlate with previous questions that show
outsourcing as a primary tool to alleviate a lack of staff Cloud expertise but that is not a
conclusion drawn from this research.
Table 8
Survey 3, Q10: When assessing risk of Cloud environments, do you see SMEs changing their
process in the ways listed below? Please select all that apply.
Answer Choices Responses Count
Using CSP recommended practices 55.56% 10
Using any IT governance frameworks not previously used by the
SME.
61.11% 11
Using any IT controls not previously used by the SME. 77.78% 14
Using any Cloud security control guides not previously used by the
SME.
61.11% 11
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
Research question 3. What are the commonly used and tailored security controls in
Cloud specific risk assessments?
For the purpose of this research study the definition of Cloud security controls does not
have great specificity. Although security control catalogues abound, and include great detail in
every part of applying and using a particular security control, the goal of this research study is
not to pick individual controls. As a practical matter, asking survey respondents to go through
97
thousands of individual controls was not feasible. Asking respondents about control families is
the proper level of detail for this research study.
Almost all respondents to survey one have seen security controls used in Cloud risk
assessments and almost all respondents to survey one see SMEs adopting Cloud security
controls. These are similar questions with a difference in tense. The underlying requirement for
research question three is that SMEs are using security controls in Cloud computing
environments. A large majority of respondents to survey one selected all choices for IT security
controls by large majorities except for CIS top twenty controls with just over fifty per cent
selection and two control families; IEC 62443 and ENISA with less than sixteen per cent. Based
on the percentages of selection by respondents, SMEs are using many different security controls.
98
Table 9
Survey 1, Q11: What IT security control standards do you see SMEs using? Please select the
standards from the list below.
99
Answer Choices Responses Count
CIS (Center for Internet Security) top 20 controls 52.63% 10
NIST SP 800-53 (National Institute of Standard and Technology
Special Publication 800-53 Security and Privacy Controls for
Information Systems and Organizations)
84.21% 16
NIST Cybersecurity Framework (National Institute of Standard and
Technology)
84.21% 16
ISO/IEC 27001 (International Organization for
Standardization/International Electrotechnical Commission
Information Security Management Systems)
73.68% 14
IEC 62443 (International Electrotechnical Commission Industrial
Network and System Security)
5.26% 1
ENISA NCSS (European Union Agency for Network and
Information Security National Cyber Security Strategies)
15.79% 3
HIPAA (Health Insurance Portability and Accountability Act) 78.95% 15
PCI-DSS (Payment Card Industry Data Security Standard) 68.42% 13
GDPR (General Data Protection Regulation) 78.95% 15
Other 5.26% 1
Respondents to survey one selected all choices for Cloud security controls in question
nineteen. There is a clear split with newer control choices such as CASB or SecaaS under fifty
100
per cent, and older tools such as virtual firewalls and physical security devices receiving closer to
seventy per cent. This is a very interesting question for future research. As DevOps and
DevSecOps becomes more prevalent in IT, this balance may change (Betz & Goldenstern, 2017).
Cloud computing cannot realize its full power until SMEs start adopting Cloud specific tools and
paradigms. The use of DevOps and DevSecOps would help address the main theme of this
research. If SMEs adopted newer Cloud processes and procedures, SME IT staff would be able
to raise their Cloud expertise.
101
Table 10
Survey 1, Q 19: What Cloud security controls do you see SMEs adopting? Please select all
Cloud security controls that you have seen.
102
Answer Choices Responses Count
Data storage 68.42% 13
VMs (Virtual Machines) 57.89% 11
Micro services (Docker, Kubernetes, etc.) 31.58% 6
Networks 52.63% 10
Virtual security devices (for example; virtual Firewalls or Amazon
Web Services (AWS) security groups)
73.68% 14
Physical security devices (for example; a Hardware Security
Module (HSM))
57.89% 11
CASB (Cloud Access Security Broker) 21.05% 4
Encryption at rest 78.95% 15
Encryption in transit 89.47% 17
Encryption during compute (homomorphic encryption) 31.58% 6
Backup 52.63% 10
SecaaS (Security as a Service) 31.58% 6
SecSLA (Security Service Level Agreement) 15.79% 3
IAM (Identity and Access Management) 63.16% 12
MultiCloud 15.79% 3
Other 0% 0
103
Common across the three surveys, respondents agree with the idea that outsourcing or
using a third party for all or parts of a SME’s Cloud transition is a proper solution in many cases,
again supporting the major theme of this research. Outsourcing the entire Cloud transition is a
common way for SMEs to enact Cloud controls. Survey respondents also see recommendations
to outsource the planning of transferring data to the Cloud as a commonly used security step for
Cloud transitions. When looking at moving further down into the process of transitioning to a
Cloud computing environment, the pattern of outsourcing continues. For example; less than half
of the respondents have seen recommendations to have the SME IT team execute specific Cloud
security controls, Similar to the concept of DevOps and DevSecOps transforming Cloud
environments and Cloud security tools, a future research project may find outsourcing diminish
as SME IT teams become more conversant in DevOps and DevSecOps (Fahmideh & Beydoun,
2018). As outsourcing or the use of a third party is so prevalent, SMEs may not focus on
tailoring and using Cloud security tools.
104
Table 11
Survey 2, Q10: 100% of respondents to Survey 1 have seen recommendations to outsource the
transition to a Cloud environment. Which portions of a transition to a Cloud environment have
you seen recommended to be outsourced? Please select all that apply.
Answer Choices Responses Count
Entire transition including choice of CSP (Cloud Service
Provider), type of virtual environment, and transfer of data.
15.79% 3
Selecting CSP and type of infrastructure such as IaaS, PaaS, or
SaaS.
47.37% 9
Creating and executing data transfer plan to Cloud environment. 68.42% 13
Creating and executing security controls in Cloud environment. 42.11% 8
Managed or professional services including ongoing management
of SME data and IT operations.
73.68% 14
Managed security services including scheduled audits or
penetration testing.
42.11% 8
Other. 0% 0
Respondents to survey two, question sixteen show a similar pattern to survey two,
question six in that many SMEs are using Cloud tools but a smaller percentage are also auditing
the tools. Survey three, question seven respondents recognize new hazards that need controls by
a large margin for CSP environments. This raises the issue of how SMEs are using and tailoring
105
Cloud security controls again. If SMEs are not auditing or risk assessing Cloud tools and Cloud
environments, then SMEs are most likely not tailoring controls based on specific threats.
As shown by the replies to survey three, question eleven, SMEs see a shift in standard
risk practice when transitioning to the Cloud. Respondents see risk assigned to business owners
less than forty per cent of the time. Respondents see the SME security team assigned the risk
almost as often. This is a change from usual SME practice (Brender & Markov, 2013). Perhaps
this shift is a result of more outsourcing and use of third parties but only more research can
confirm this hypothesis. This may be a tangential theme to that of Cloud outsourcing or just an
indication of SMEs not truly understanding Cloud computing.
The responses to survey three, question twelve split evenly on accepting CSP based
controls. A strong plurality or respondents see SMEs using new IT governance controls and
Cloud security control guides. SMEs are using new security controls as they transition to a Cloud
environment but this research study results do not show that SMEs have reached the point where
SMEs are tailoring Cloud security controls for specific risks. Again, as SMEs adopt Cloud
specific paradigms and tools such as DevOps and DevSecOps, this may change.
A majority of survey three, question thirteen respondents see SMEs integrating new
Cloud security controls into existing control catalogues. Over a third of respondents are reporting
that SMEs are keeping Cloud controls separate. Perhaps the use of “tailoring” in research
question three seems imprecise or used too early in the general SME Cloud adoption process.
Some current research suggests that DevOps and DevSecOps, among other changes, may
revolutionize Cloud security control changes and allow continuous security control changes
(Betz & Goldenstern, 2017). Revisiting research question three in five to ten years may show
very interesting results.
106
Table 12
Survey 3, Q 13: Once controls have been identified for the SME’s environment, what effect do
they have on existing SME IT controls? Please select all that apply.
Answer Choices Responses Count
New Cloud controls are kept separate from existing control
catalogues.
35.29% 6
New Cloud controls are combined with existing controls to form
larger control catalogues.
64.71% 11
New Cloud controls promise to replace or reduce existing control
catalogues spurring increased Cloud transitions.
17.65% 3
New Cloud controls appear onerous and reduce Cloud transitions
due to increased difficulty.
5.88% 1
Other (Please describe) or any additional comments (We want
your expertise)?
5.88% 1
Research question 4. What are the commonly recommended mitigations in Cloud
specific risk assessments?
As shown by the responses to survey one questions, all respondents have seen
recommendations to outsource at least a portion of the SMEs transition to the Cloud. Survey one
respondents have not seen a risk assessment recommendation to avoid Cloud computing. A
majority of survey two respondents see SMEs receive recommendations to mitigate Cloud risk
107
by outsourcing the transition or planning the details of the data transition. A majority of survey
respondents have seen multiple recommendations such as accept CSP attestations, accept CSP
SLAs, and outsourcing of Cloud operations. These results reinforce the main theme of this
research. If SMEs do not have enough well-trained Cloud staff, the SME may make poor
decisions such as blindly accepting CSPs’ initial SLAs and attestations.
Survey two respondents did not converge on any particular mitigation to non-IT related
concerns for a Cloud transition with most respondents selecting several concerns. A majority of
respondents to survey three, question eight and nine see Cloud risk assessments changing SME
mitigation and risk avoidance procedures with changes in Cloud mitigation and risk strategies
and procedures predominant. There does not appear to be an actionable recommendation from
these two questions rather than devout more attention to GDPR. While one could argue that
perhaps a SME would not need to worry about the effects of GDPR on their business if the SME
did not adopt Cloud computing, this does not appear to be a Cloud specific mitigation.
If one considers accepting CSP attestations, SLAs, and guidelines as third-party
guidance, a preponderance of survey respondents report seeing recommendations to outsource at
least part of the SME’s Cloud transition. Responses to survey three, question six show that a
majority of respondents to survey three have accepted that contracting outside help is an
appropriate mitigation. Mitigating Cloud risk involves adding Cloud expertise to the SME or the
SME should consider outsourcing Cloud risk assessments. Again, an almost overwhelming
amount of coding done in this research study leads to the central theme of SMEs lacking
competent Cloud staff.
Discussions related to research question two and three detail some of the reasons for the
outsourcing or consulting recommendations. Only a small portion of respondents to survey two,
108
question ten have seen recommendations to outsource the entire Cloud transition process. A large
majority of respondents, however, have seen recommendations to use managed or professional
services including ongoing management of SME data and IT operations. A similar majority have
seen recommendations for outsourcing the creation and execution of a data transfer plan to the
SMEs Cloud environment. Close to half of the respondents have seen recommendations for
SMEs to outsource the selection of a CSP and type of infrastructure such as IaaS, PaaS, or SaaS.
Almost half responded affirmatively to the use of managed security services including scheduled
audits or penetration testing. Reinforcing the central theme of the research results, it seems clear
by the data collected in this research study that the most commonly recommended mitigation in
Cloud specific risk assessments is to outsource at least part of the transition to the Cloud process.
109
Table 13
Survey 2, Q10: 100% of respondents to Survey 1 have seen recommendations to outsource the
transition to a Cloud environment. Which portions of a transition to a Cloud environment have
you seen recommended to be outsourced? Please select all that apply.
Answer Choices Responses Count
Entire transition including choice of CSP (Cloud Service Provider),
type of virtual environment, and transfer of data.
15.79% 3
Selecting CSP and type of infrastructure such as IaaS, PaaS, or
SaaS.
47.37% 9
Creating and executing data transfer plan to Cloud environment. 68.42% 13
Creating and executing security controls in Cloud environment. 42.11% 8
Managed or professional services including ongoing management
of SME data and IT operations.
73.68% 14
Managed security services including scheduled audits or
penetration testing.
42.11% 8
Other. 0% 0
Evaluation of the Findings
The results of this research study both agree and extend current research in the field yet
disagree in some instances. SMEs understand what Cloud computing is and show a good
knowledge of what different types of Cloud services are available. As previous research has
110
found, SMEs are not well prepared for secure transitions to the Cloud (Lacity & Reynolds, 2013;
Mohabbattalab, von der Heidt, & Mohabbattalab, 2014). While previous research has done a
good job identifying security issues for SMEs adopting Cloud computing, most of the proposed
solutions are not currently in use by SMEs based on the respondents to this research study. This
study shows that SMEs are not yet using well prepared or defined plans to mitigate Cloud
computing risks.
Regarding research question one, this research shows no convergence in attempts by
SMEs to use large enterprise solutions such as recognized IT frameworks, Cloud security
baselines, or Cloud control guidelines or families. This research study confirms earlier research
indicating that SMEs have taken a piece meal approach to Cloud computing with most SMEs
using at least one Cloud service without necessarily conducting a risk assessment on that service
(Al-Isma’ili, Li, Shen, & He, 2016; Bassiliades, Symeonidis, Meditskos, Kontopoulos, Gouvas,
& Vlahavas, 2017; Famideh & Beydoun, 2018; Shkurti, & Muça, 2014). Based on this research,
SMEs are not doing a good job auditing or risk assessing new Cloud environments. A finding
from this research is that SMEs are more likely to outsource or use third parties to conduct Cloud
transitions than previous research has shown. This research study expands on current research by
showing that a lack of competent Cloud trained staff is the genesis of most of these behaviors.
Until SMEs have more in-house Cloud expertise, their use of Cloud related frameworks will be
lacking.
Regarding research question two, this research study agrees with earlier research that
shows SMEs have a variety of concerns with Cloud computing that are non-technical based
(Senarathna, Wilkin, Warren, Yeoh, & Salzman, 2018). This research shows a lack of SME staff
preparedness and training budget for transitioning to the Cloud is the primary concern for SMEs,
111
building upon earlier research that lists this as one of a number of concerns (Fahmideh &
Beydoun, 2018). Again, this study shows SMEs turning to outsourcing or third parties to solve
this issue. The results of this study show that SME risk teams are trying to adapt to Cloud
computing in a variety of ways but the risk teams see an increasing work load in almost all cases.
This research study is one of the first to report on how risk teams are changing due to
new Cloud computing environments. Results of this research show that SMEs and SME risk
teams are not keeping up with the new demands of Cloud computing and the required
mitigations. This study shows that aside from outsourcing or using third parties to perform parts
or all of the SME transition to the Cloud, SMEs are not showing proper oversight of their Cloud
environments. SMEs are accepting CSP attestations and SLAs in large percentages, something
they would not allow their risk teams to do with other vendors. This research extends previous
research that shows SMEs are not prepared for a transition to the Cloud with more insight on the
details (Kumar, Samalia, & Verma, 2017; Moyo & Loock, 2016; Vasiljeva, Shaikhulina, &
Kreslins, 2017). The primary theme of this research study’s data is the lack of well-trained SME
Cloud teams, this seems to include the risk and audit teams also.
Regarding research question three, this research study shows that SMEs are not using
best practice or Cloud specific security tools in any large margin. Most respondents are either
using old non-Cloud specific security control guidelines or using third parties to select and apply
controls. The central theme of a lack of well-trained Cloud IT staff presents itself in these results
too. Perhaps a Cloud feedback loop of SMEs using true Cloud tools and controls such as DevOps
or DevSecOps will produce skilled Cloud staff who will then use more Cloud specific tools and
controls.
112
Regarding research question four, this research study follows the central theme that SMEs lack
proper Cloud trained staff which affects the recommended mitigations from SME Cloud risk
assessments. Based on the lack of internal Cloud staff, a large majority of risk professional
respondents have seen mitigation recommendations to outsource part or all of a Cloud transition.
This includes using managed or professional services for data transfer plans, CSP selection,
infrastructure selection, ongoing management of SME data and IT operations, even the risk
assessments and audits themselves.
Summary
Data collection for this qualitative research study consisted of a three-round survey of
GWDC risk experts. This chapter has established the trustworthiness of the data including how
credibility, dependability, and confirmability. This chapter has described the reasons and
assumptions made that led to keeping participation in the survey instruments anonymous and
collecting very little demographic detail. This chapter has organized the results of the research
study by research question. The questions in the survey instruments were multiple choice and
presentation of the data in this chapter includes tables as appropriate. The use of a Delphi
technique based three round survey instrument has resulted in data that answers the four research
questions of this study.
The answer to research question one is that SMEs using insufficient or non-Cloud
focused frameworks when risk assessing Cloud computing. The answer question to research
question two is that the primary concern for SMEs in Cloud specific risk assessments is the lack
of qualified SME Cloud and Cloud security teams. The answer question to research question
three is that SMEs commonly use a wide range of security controls and SMEs have not
converged on a particular process or set of controls to secure Cloud computing environments.
113
The answer question to research question four is that SMEs do not yet have a common set of
recommended mitigations for SME Cloud computing risk assessments. SMEs are still relying on
CSPs and existing frameworks, security guides and control families for mitigation
recommendations. SMEs are not at the point where the SME risk team can produce specific clear
and effective mitigation steps.
An additional result of this research study is a validated survey instrument that SMEs can
use to gauge their risk and needed next steps in the SMEs transition to the Cloud. The instrument
will be a freely available survey on SurveyMonkey. The page logic of the survey will help guide
SMEs to consider the answers to this research studies questions and how the SME can move
forward securely. While the survey instrument will not replace a full-fledged risk assessment, the
survey instrument will help guide SMEs to making more informed decisions at the start of the
SMEs’ Cloud transition. The survey questions and link to the survey are in the Appendix E.
114
Chapter 5: Implications, Recommendations, and Conclusions
The researcher used this qualitative cased study based research project to address the
problem that there is no commonly understood and adopted best practice standard for small to
medium sized enterprises (SMEs) on how to specifically assess security risks relating to the
Cloud (Coppolino, D’Antonio, Mazzeo, & Romano, 2016; El Makkaoui, Ezzati, Beni-Hssane, &
Motamed, 2016; Raza, Rashid, & Awan, 2017). Research in IT fields has a hard time keeping up
with real world applications due to the high rate of change in the industry. This issue increases
almost exponentially when one focuses on Cloud computing security. Many research studies
have taken the first step and identified risk-based organizational concerns with Cloud computing
security, and a few authors have proposed novel solutions. Evidence of what organizations are
doing to satisfy their risk requirements in Cloud computing adoption is not clear.
The purpose of this qualitative case study-based research study was to discover an
underlying framework for research in SME risk analysis for Cloud computing and to create a
validated instrument that SMEs can use to assess their risk in Cloud adoption. Unlike SMEs, the
vast majority of medium to large enterprises use risk assessments before adopting new
computing environments (Cayirci, Garaga, Santana de Oliveira, & Roudier, 2016; Jouini &
Rabai, 2016). SMEs need a process or validated instrument such as a risk assessment to
determine if they should move to the Cloud (Bildosola, Rio-Belver, Cilleruelo, & Garechana,
2015; Carcary, Doherty, & Conway, 2014; Hasheela, Smolander, & Mufeti, 2016). Research
shows that SMEs using a risk-based approach have not reached a consensus on how to identify
and address Cloud security risks (Carcary, Doherty, Conway, & McLaughlin, 2014; Kumar,
Samalia, & Verma, 2017). The creation of a new framework for academic treatment of SME
115
Cloud computing risk, and the creation of a validated instrument that SMEs can use to assess
their risk in Cloud adoption were the reasons for this research study.
A qualitative approach using a case study methodology was the best solution as the
theory relating to a successful Cloud computing risk assessment does not yet exist. A problem
avoided by using a qualitative case study approach is that the subject population of risk-based
Cloud computing research experts were able to respond with qualitative data but not quantitative
numbers to avoid compromising their organization’s security (Glaser, 2014). Making sure to
limit the subject population to subject matter experts allowed the researcher to create very
specific survey instruments. This helped eliminate potential areas of bias or confusion for
participants. Even though the audience for this research study commonly works in quantitative
ways, the audience will find value in qualitative case study research on this topic (Liu, Chan, &
Ran, 2016).
A survey with a Delphi technique of industry experts was an effective way to both
resolving those concerns of SMEs adopting Cloud computing and was a good step to increasing
the knowledge in the academic field of Cloud security. The RAND Corporation created the
Delphi technique to facilitate the collation and distillation of expert opinions in a field (Hsu &
Sanford, 2007). The Delphi technique seems well designed for the Internet with current
researchers using “eDelphi” based web surveys (Gill, Leslie, Grech, & Latour, 2013). Although
Cloud security is a very new field, some illustrative research is evident in the field using Delphi
techniques (Choi & Lee, 2015; El-Gazzar, Hustad, & Olsen, 2016; Liu, Chan, & Ran, 2016).
These studies use the Delphi technique in different manners, but similar to this proposed research
study, all rely on electronic communications with groups of experts.
116
The guiding framework of this research study was that the risk assessment process for
Cloud computing environments is fundamentally different for SMEs than large enterprises and
the primary data collection instrument is a web survey of risk experts with a Delphi technique.
The population for this research study has constraints on security information that they can share.
A qualitative case study-based theory approach was an effective way for the researcher to gather
the data needed to propose a unifying theory for SME Cloud computing risk assessment. As the
state of research in SME risk assessment tools and procedures is still in the nascent stages, case
study-based theory is the correct framework to advance the field and to create a validated
instrument for SME Cloud computing risk assessments.
The design of this research study evolved from the need to find out what was actually
happening in Cybersecurity Cloud risk assessments, a very specialized and secretive field. A
qualitative case study approach using a Delphi instrument was the research design chosen for this
research study. The researcher created a web-based survey with three rounds composed primarily
of multiple-choice questions to work around the strictures normally placed on cybersecurity
professionals. The researcher choose a very focused and small population of cybersecurity risk
experts in the Washington D.C. area. The researcher asked subjects to participate in three web-
based surveys over a period of five months. The researcher posted links to the surveys on the
GWDC web site and promulgated through GWDC emails and conferences.
The three rounds of responses from cybersecurity risk experts provided the researcher
with answers to the research questions posed by this study. The researcher was able to identify
current frameworks being leveraged in Cloud computing risk assessments. The researcher has
determined the primary areas of concern for SMEs as they transition to the Cloud. The researcher
has generally identified the commonly used security controls recommended in Cloud computing
117
risk assessments. The researcher has brought to light mitigations that risk professionals associate
with a SME transition to Cloud computing.
Limitations of this research study are based on the secrecy of information in the
cybersecurity field and the limited subject population that can provide useful information.
Organizations do not share security data including defense designs, breaches, and policies and
procedures. Potential survey questions for this research had to balance the need for pertinent
information and the limited ability of respondents to share specific information. The subject
population for this research question was a very small subset of IT professionals and the
researcher needed to do a large amount of preliminary work to gain access to an appropriately
sized group of respondents.
In this chapter, the researcher reiterates the problem statement, purpose statement,
methodology, design, results, and limitations. The researcher continues with the implications
from the results of this research study organized around the research questions. Following the
discussion of implications, the researcher presents recommendations for practice and future
research. The last section of this chapter is a conclusion.
Implications
The implications derived from this research study are best discussed by research
questions. The researcher focused research question one on the current state of Cloud computing
risk assessments and what frameworks SMEs are currently using. Based on the response to the
survey questions, predominately survey two questions, the current state of Cloud risk
assessments has not kept up with the changes in business and IT brought on by Cloud
computing. This is consistent with most Cloud transition research (Madria, 2016; Shackleford,
2016). Almost uniformly, SMEs are using Cloud computing without preforming complete risk
118
assessments on the Cloud tools and offerings as indicated by survey two, questions seven and ten
results. Previous research has not focused on this issue. Response to survey one, questions eight
and nine indicate that some SMEs are using large enterprise frameworks such as COBIT and
ITIL but SMEs are not showing a consensus on choice of frameworks based on survey. One
could infer this from current research but not definitely state it (Barton, Tejay, Lane, & Terrell,
2016; Tisdale, 2016; Vijayakumar & Arun, 2017). Government based Cloud security
configurations are being adopted more frequently than public sector ones, showing that if a SME
is compliance based, they are more likely to follow predetermined policies and procedures for
Cloud computing transition as per survey one, questions twelve and thirteen. This research study,
specifically survey two, question ten and survey three, question fifteen does indicate that SMEs
have almost uniformly considered outsourcing or using a third party to adapt a framework to
their Cloud transition. This is a strong amplification of previous research efforts that have
mentioned third parties or outsourcing as an option (Gupta, Misra, Singh, Kumar, & Kumar,
2017).
The researcher focused research question two on the primary areas of concern for SMEs
as they perform Cloud computing risk assessments. Most of the SMEs referenced by the survey
respondents do not start with a blank state. Responses to survey two questions seven and eight
show that most SMEs select a CSP and a type of Cloud infrastructure before the SME begins the
Cloud transition risk assessment process. This helps narrow the primary areas of concern for
SMEs to general IT concerns such as backups or network paths, and a wide array of non-
technical concerns that confirms previous research in the field (Cheng & Lin, 2009; Diaz-Chao,
Ficapal-Cusi, & Torrent-Sellens, 2017; Lai, Sardakis, & Blackburn, 2015). Responses to survey
one, questions fourteen and fifteen shows that every respondent reports non-technical concerns
119
for SMEs that they work with. Almost all respondents indicate that governance, business
process, financial, and privacy concerns affect SMEs that are transitioning to the Cloud.
In Survey 2, responses to questions ten, eleven and twelve support the finding that by far
the biggest primary concern reported by this research study participants is that of the SMEs IT
staff knowledge levels and Cloud readiness. SMEs are also concerned with the reasons that the
SME IT teams are not ready, including; lack of training, IT staff budget, and IT staff resistance
to Cloud computing environments as per the responses to survey two, question eleven. Survey
one, question twenty-one and survey two, question ten shows that SMEs are overwhelmingly
outsourcing IT tasks related to the SME’s Cloud transition or using third parties for their Cloud
transitions.
The researcher asked with research question three; what are the commonly used security
controls used in Cloud risk assessments. Almost all respondents to survey one, question ten,
eleven, and eighteen see SMEs use security controls specific to Cloud environments. This
confirms earlier research in the field (Haines, Horowitz, Guo, Andrijicic, & Bogdanor, 2015;
Rahulamathavan, Rajarajan, Rana, Awan, Burnap, & Das, 2015; Sahmim & Gharsellaoui, 2017).
Responses to survey one, question nineteen bounds the type of controls being used by SMEs.
Newer Cloud specific controls and tools such as CASB, SecaaS, and multi-Cloud are not in
widespread use by SMEs while older security controls are. Previous research in the field confirm
these results by not generally discussing modern controls and discussing a lack of SME focus on
best practices for a Cloud transition (Gholami, Daneshgar, Low, & Beydoun, 2016; Salim,
Darshana, Sukanlaya, Alarfi, & Maura, 2015; Yu, Li, Li, Zhao, & Zhao, 2018). Responses to
survey two, question sixteen indicate that whatever Cloud tools SMEs are using, they are not
being fully audited leading to the conclusion that SMEs need more controls. Research in the field
120
indicate a lack of preparation by SMEs for Cloud transitions including the use of security
controls but this research helps shed light on the details of SMEs’ lack of preparation (Huang et
al., 2015, 2015; Wang & He, 2014). Responses to survey three, question seven show that SMEs
still need a lot of work in this area with only seventy-one per cent of the risk experts seeing a
change in Cloud risk assessment hazards identification. Responses to survey three, question
twelve and thirteen indicate that the SMEs realize they need new security controls even if they
are not using them yet. Previous research supports this conclusion with several studies reporting
that many SMEs see a Cloud transition as a way to increase the SMEs’ IT security (Lacity &
Reynolds, 2013; Mohabbattalab, von der Heidt, & Mohabbattalab, 2014).
The researcher asked with research question four; what are the commonly recommended
mitigations in Cloud specific risk assessments. The design of this question intended to elicit
slightly different responses than just controls or control families. The assumption made by the
researcher was that all respondents would see specific recommendations made to SMEs but
respondents to survey one, question twenty show only seventy-five per cent have seen
recommendations. Previous research in the field supports this conclusion but this research study
is the first to quantify the number of SMEs seeing specific Cloud recommendations (Assante,
Castro, Hamburg, & Martin, 2016; Hussain, Hussain, Hussain, Damiani, & Chang, 2017).
Responses to survey one, question twenty-one help detail the specific mitigations recommended
to SMEs with eighty per cent of respondents saying that they have seen recommendations to
outsourcing or use third parties. This is a reoccurring theme in the data collected in this research
study. Previous research hints at the use of outsourcing by SMEs but this research shows how
prevalent it has become (Fahmideh & Beydoun, 2018). Overall, survey respondents show that
SMEs are adopting mitigations specific to the Cloud environments and changes to the mitigation
121
process with risk assessment changes as indicated by responses to survey three, questions six,
eight and nine. While accepting that Cloud computing environments require new mitigations and
new mitigation processes may seem obvious, previous research has not focused on this to any
great detail (Mohabbattalab, von der Heidt, & Mohabbattalab, 2014). Responses to survey three,
questions six and eleven, indicate that changes in the make-up of risk assessment teams and SME
risk responsibility assignment will affect recommended mitigations.
The primary factor that may have influenced the interpretation of the results of this
research study is that all significant results emerged from responses to multiple choice questions.
Perhaps the researcher did not include important choices in the answer choices. The researcher
has included all survey questions and responses in an appendix so the reader can decide. The
researcher presents the survey question answers in percentages so interpretation of the results
gathered is straightforward.
Recommendations for Practice
The researcher has encapsulated recommendations for practice in the validated risk
assessment instrument in the appendix. The primary recommendation is that SMEs need to spend
more time preparing for a Cloud transition. Even though responses to survey one, question eight,
nine, and twelve show a strong majority of SMEs transitioning to the Cloud do some planning,
SMEs need to do much more planning. Current research supports this finding but does not offer
many details (Bildosola, Río-Belver, Cilleruelo, & Garechana, 2015; Gastermann, Stopper,
Kossik, & Katalinic, 2014; Lacity & Reynolds, 2013; Senarathna, Yeoh, Warren, & Salzman,
2016). The validated risk instrument presented in the appendix does not get to the level of
specific controls but focuses on the decisions that SMEs must make before moving servers or
data to a CSP. Responses to survey three, questions seven, twelve, and thirteen indicate SMEs
122
realize they need to put more effort into the Cloud transition process and a validated risk
instrument with a series of fairly simple questions should help shape that effort. Existing
research shows that many SMEs see a Cloud transition as a way to increase security (Lacity &
Reynolds, 2013; Mohabbattalab, von der Heidt, & Mohabbattalab, 2014), this research study
helps show how the SMEs can achieve that goal. The findings from this research study do not
solve all SMEs’ problems with Cloud transitions, but if used as indicated by the validate risk
instrument, SMEs should have a smoother and more secure Cloud transition effort.
Recommendations for Future Research
The primary recommendation for future study is that more research should focus on how
SMEs plan for Cloud transitions. Current research does a good job of identifying why or why not
SMEs are adopting Cloud computing but current research does not identify how SMEs should
transition to Cloud computing. This research study has taken the first steps and identified the
current frameworks and primary areas of concern for SMEs as they adopt Cloud computing.
Extending the results of this research study, however, will require much more research. Adopting
Cloud computing is much more than just changing IT vendors or changing the type of servers
used by the SME. Adopting Cloud computing is a major paradigm shift for many SMEs that will
fundamentally change how SMEs do business and interact with each other and customers.
Based on results from this research study, specific fields of interest deserving more
research include several cross-domain topics. This research has shown that a primary concern for
SMEs during Cloud transitions is staffing. SMEs are trying to decide if it make sense to
outsource part or all of a move to Cloud computing. Pursuing research to help answer this
question may involve management theory, employee training, business risk analysis, and IT
among other research fields. Cloud computing is primarily a field in which IT and cybersecurity
123
researchers work. The results of this research study indicate several promising avenues of
research in IT and IT security fields. SMEs are not using modern Cloud based tools yet.
Research investigating what would it take to get SMEs to adopt a Cloud tool such as DevOps or
DevSecOps may show interesting results. If, as this research shows, SMEs are heavily using
third parties and outsourcing for the transition to the Cloud, further research on how that will
affect the IT and IT security fields will produce many topics. If SMEs adopt Cloud computing
with third parties in control, research on how the day to day operations of the SME would change
and could bear useful results. This research study indicates that the field of IT risk is changing as
a result of Cloud transitions. Research on how the field of IT risk adapts to Cloud computing
would be a very interesting research topic.
Conclusions
The researcher has only identified the issues for Cloud computing adoption by SMEs
from the perspective of risk experts. This research study has not identified ways in whcich those
risk experts can make the SME decision makers adopt these findings. Future research will need
to identify the answers and solutions that SMEs will adopt. Cloud computing is a very technical
field but this research study shows that SMEs’ biggest problems with transitioning to the Cloud
is human based, not technical. This research study builds on prior research and points the way for
future research. Earlier research has shown that SMEs show hesitation when moving to the
Cloud. Previous research studies have not identified the major concerns and road blocks for
SMEs as they transition to the Cloud. This research illuminates the major issues for SMEs
adopting Cloud computing.
This research study shows that SMEs are adopting Cloud computing in a piece-meal and
unorganized way. Future research on whether or not SMEs converge on best practices and
124
standards will be important work. As the use of Cloud computing is becoming an inflection point
for SMEs, SMEs need more research on both the process and the results. Cloud computing is
fundamentally changing the daily pace of business, and this research study shows that SMEs
have not kept pace. SMEs would greatly benefit from more research to help them adopt Cloud
computing securely and effectively.
125
References
Ab Rahman, N. H., & Choo, K. R. (2015). A survey of information security incident handling in the
Cloud. Computers & Security, 49, 45-69. doi:10.1016/j.cose.2014.11.006
Achargui, A., & Zaouia, A. (2017). Hosted, Cloud and SaaS, off-premises ERP systems adoption by
Moroccan SMEs: A focus group study. 2017 International Conference on Information and
Digital Technologies (IDT). doi:10.1109/dt.2017.8012125
Ahani, A., Rahim, N. Z., & Nilashi, M. (2017). Forecasting social CRM adoption in SMEs: A combined
SEM-neural network method. Computers in Human Behavior, 75, 560-578.
doi:10.1016/j.chb.2017.05.032
Ahmed, N., & Abraham, A. (2013). Modeling security risk factors in a Cloud computing
environment. Journal of Information Assurance & Security, 8(6), 279-289. Retrieved from
http://www.mirlabs.org/jias/
Aich, A., Sen, A., & Dash, S. R. (2015). A Survey on Cloud Environment Security Risk and Remedy.
2015 International Conference on Computational Intelligence and Networks.
doi:10.1109/cine.2015.45
Alali, F. A., & Chia-Lun, Y. (2012). Cloud Computing: Overview and Risk Analysis. Journal of
Information Systems, 26(2), 13-33. doi:10.2308/isys-50229
Alassafi, M. O., Alharthi, A., Walters, R. J., & Wills, G. B. (2017). A framework for critical security
factors that influence the decision of Cloud adoption by Saudi government agencies. Telematics
and Informatics, 34(7), 996-1010. doi:10.1016/j.tele.2017.04.010
Albakri, S. H., Shanmugam, B., Samy, G. N., Idris, N. B., & Ahmed, A. (2014). Security risk
assessment framework for Cloud computing environments. Security & Communication
Networks, 7(11), 2114-2124. doi:10.1002/sec.923
126
Alcantara, M., & Melgar, A. (2016). Risk Management in Information Security: A Systematic
Review. Journal of Advances in Information Technology, 7(1). Retrieved from:
http://www.jait.us/
Aldorisio, J. (2018). What is security as a service? A definition of SECaaS, benefits, examples, and
more. Retrieved from https://digitalguardian.com/blog/what-security-service-definition-secaas-
benefits-examples-and-more
Ali, M., Khan, S. U., & Vasilakos, A. V. (2015). Security in Cloud computing: Opportunities and
challenges. Information Sciences, 30(55) 357-383. doi:10.1016/j.ins.2015.01.025
Aljawarneh, S. A., Alawneh, A., & Jaradat, R. (2016). Cloud security engineering: Early stages of
SDLC. Future Generation Computer Systems, doi:10.1016/j.future.2016.10.005
Al-Ruithe, M., Benkhelifa, E., & Hameed, K. (2016). A conceptual framework for designing data
governance for Cloud computing. Procedia Computer Science, 94, 160-167.
doi:10.1016/j.procs.2016.08.025
Al-Anzi, F., S., Yadav, S., K., Soni, J., (2014) Cloud computing: Security model comprising
governance, risk management and compliance. 2014 International Conference on Data Mining
and Intelligent Computing (ICDMIC). doi:10.1109/ICDMIC.2014.6954232
Al-Ismaili, S., Li, M., & Shen, J. (2016). Cloud Computing Adoption Decision Modelling for SMEs:
From the PAPRIKA Perspective. Lecture Notes in Electrical Engineering, 597-615.
doi:10.1007/978-981-10-0539-8_59.
Anand, P., Ryoo, J., Kim, H., (2015) Addressing security challenges in Cloud computing — A pattern-
based approach. 2015 1st International Conference on Software Security and Assurance
(ICSSA), 13. doi:10.1109/ICSSA.2015.013
127
Assante, D., Castro, M., Hamburg, I., & Martin, S. (2016). The use of Cloud computing in SMEs.
Procedia Computer Science, 83 The 7th International Conference on Ambient Systems, Networks
and Technologies (ANT 2016), 1207-1212. doi:10.1016/j.procs.2016.04.250
Atkinson, S., & Aucoin, R. F. (2015). Adopting COBIT 5 in a government entity. COBIT Focus, 1(6).
Retrieved from: https://www.isaca.org/COBIT/focus/Pages/FocusHome.aspx
Bahrami, M., Malvankar, A., Budhraja, K., K., Kundu, C., Singhal, M., & Kundu, A., (2017)
Compliance-Aware provisioning of containers on Cloud. 2017 IEEE 10th International
Conference on Cloud Computing (CLOUD), 696. doi:10.1109/CLOUD.2017.95
Baig, R., Freitag, F., Moll, A., Navarro, L., Pueyo, R., Vlassov, V., (2015). Community network Clouds
as a case for the IEEE InterCloud standardization. 2015 IEEE Conference on Standards for
Communications and Networking (CSCN), Standards for Communications and Networking
(CSCN), 2015 IEEE Conference on, 269. doi:10.1109/CSCN.2015.7390456
Bard, G., & Weinstein, Y. (2017). The effect of question order on evaluations of test performance: Can
the bias dissolve? The Quarterly Journal of Experimental Psychology, 70(10), 2130–2140.
https://www.tandfonline.com/loi/pqje20
Bassiliades, N., Symeonidis, M., Meditskos, G., Kontopoulos, E., Gouvas, P., & Vlahavas, I. (2017). A
semantic recommendation algorithm for the PaaSport platform-as-a-service marketplace. Expert
Systems with Applications, 203-227. doi:10.1016/j.eswa.2016.09.032
Barrow, P., Kumari, R., & Manjula, R. (2016). Security in Cloud computing for service delivery models:
Challenges and solutions. Journal of Engineering Research and Applications.
doi:10.1016/j.cose.2016.02.007
128
Barton, K. A., Tejay, G., Lane, M., & Terrell, S. (2016). Information system security commitment: A
study of external influences on senior management. Computers & Security, 599-625.
doi:10.1016/j.cose.2016.02.007
Bayramusta, M., & Nasir, V. A. (2016). A fad or future of IT?: A comprehensive literature review on
the Cloud computing research. International Journal of Information Management, 36, 635-644.
doi:10.1016/j.ijinfomgt.2016.04.006
Betz, C. T., & Goldenstern, C. (2017). The New World: What do Agile and DevOps mean for ITSM and
ITIL – Kepner-Tregoe. Retrieved from https://www.kepner-tregoe.com/knowledge-
center/articles/technical-support-improvement/the-new-world-what-does-agile-and-devops-
mean-for-itsm-and-itil/
Beauchamp, P. (2015, March 26). Cyber security for professional service agencies: How to safeguard
your clients? Intellectual property and trade secrets. Retrieved from
http://www.inguard.com/blog/cyber-security-for-professional-service-agencies-how-to-
safeguard-your-clients-intellectual-property-and-trade-secrets
Bhattacharya, S., Kumar, C., S., From threats subverting Cloud security to a secure trust paradigm.
(2017). Inventive Communication and Computational Technologies (ICICCT), 2017
International Conference on, 510. doi:10.1109/ICICCT.2017.7975252
Bickart, B., & Schmittlein, D. (1999). The distribution of survey contact and participation in the United
States: constructing a survey-based estimate. Journal of Marketing Research (JMR), 36(2), 286-
294. Retrieved from:
https://www.ama.org/publications/JournalOfMarketingResearch/Pages/current-issue.aspx
Bieber, K., Grivas, S. G., & Giovanoli, C. (2015). Cloud Computing Business Case Framework:
Introducing a Mixed-Model Business Case Framework for Small and Medium Enterprises to
129
Determine the Value of Cloud Computing. 2015 International Conference on Enterprise Systems
(ES), 161. doi:10.1109/ES.2015.22
Bildosola, I., Río-Belver, R., Cilleruelo, E., & Garechana, G. (2015). Design and implementation of a
Cloud computing adoption decision tool: Generating a Cloud road. Plos One, 10(7),
doi:10.1371/journal.pone.0134563
Bojanc, R., & Jerman-Blazic, B. (2008). An economic modelling approach to information security risk
management. International Journal of Information Management, 28, 413-422.
doi:10.1016/j.ijinfomgt.2008.02.002
Brender, N., & Markov, I. (2013). Risk perception and risk management in Cloud computing: Results
from a case study of Swiss companies. International Journal of Information Management, 33(5),
726-733. Retrieved from: https://www.journals.elsevier.com/international-journal-of-
information-management
Brüggen, E., & Dholakia, U. M. (2010). Determinants of participation and response effort in web panel
surveys. Journal of Interactive Marketing, 24, 239-250. doi:10.1016/j.intmar.2010.04.004
Bruque-Camara, S., Moyano-Fuentes, J., & Maqueira-Marín, J. M. (2016). Supply chain integration
through community Cloud: Effects on operational performance. Journal of Purchasing and
Supply Management, 22, 141-153. doi:10.1016/j.pursup.2016.04.003
Bulgurcu, B., Cavusoglu, H., & Benbasat, I., (2016). Hopes, fears, and software
obfuscation. Communications of the ACM, 59(3), 88-96. doi:10.1145/2757276
Bunkar, R. K., & Rai, P. K. (2017). Study on security model in Cloud computing. International Journal
of Advanced Research in Computer Science, 8(7), 841. doi:10.26483/ijarcs.v8i7.4350
Buss, A. (2013). SMEs open to public Cloud services. Computer Weekly, 14. Retrieved from:
https://www.computerweekly.com/
130
Calvo-Manzano, J. A., Lema-Moreta, L., Arcilla-Cobián, M., & Rubio-Sánchez, J. L. (2015). How small
and medium enterprises can begin their implementation of ITIL?. Revista Facultad De
Ingenieria Universidad De Antioquia, 127. doi:10.17533/udea.redin.n77a15
Cao, X., Moore, C., O’Neill, M., O’Sullivan, E., & Hanley, N., Optimised multiplication architectures
for accelerating fully homomorphic encryption. (2016). IEEE Transactions on Computers, IEEE
Trans. Comput, (9), 2794. doi:10.1109/TC.2015.2498606
Cao, J & Zhang, S., (2016). IT Operation and Maintenance Process improvement and design under
virtualization environment. 2016 IEEE International Conference on Cloud Computing and Big
Data Analysis (ICCCBDA), Cloud Computing and Big Data Analysis (ICCCBDA), 2016 IEEE
International Conference on, 263. doi:10.1109/ICCCBDA.2016.7529568
Carcary, M., Doherty, E., & Conway, G. (2014). The adoption of Cloud computing by Irish SMEs — An
exploratory study. Electronic Journal of Information Systems Evaluation, 17(1), 3-14. Retrieved
from: http://www.ejise.com/main.html
Carcary, M., Doherty, E., Conway, G., & McLaughlin, S. (2014). Cloud computing adoption readiness
and benefit realization in Irish SMEs—An exploratory study. Information Systems Management,
31(4), 313-327. doi:10.1080/10580530.2014.958028
Carvalho, C. d., Andrade, R. C., Castro, M. d., Coutinho, E. F., & Agoulmine, N. (2017). State of the art
and challenges of security SLA for Cloud computing. Computers and Electrical Engineering, 59,
141-152. doi:10.1016/j.compeleceng.2016.12.030
Casola, V., De Benedictis, A., Modic, J., Rak, M., & Villano, U., Per-service security SLA: A new
model for security management in Clouds. (2016). 2016 IEEE 25th International Conference on
Enabling Technologies: Infrastructure for Collaborative Enterprises (WETICE), 2014 IEEE
23rd International, 83. doi:10.1109/WETICE.2016.27
131
Casola, V., De Benedictis, A., Rak, M., & Rios, E. (2016). Security-by-design in Clouds: A security-
SLA driven methodology to build secure Cloud applications. Procedia Computer
Science, 97(2nd International Conference on Cloud Forward: From Distributed to Complete
Computing), 53-62. doi:10.1016/j.procs.2016.08.280
Cayirci, E., Garaga, A., Santana de Oliveira, A., & Roudier, Y. (2016). A risk assessment model for
selecting Cloud service providers. Journal of Cloud Computing, 5(1), 1. doi:10.1186/s13677-
016-0064-x
Charif, B., & Awad, A. I. (2016). Feature: Towards smooth organisational adoption of Cloud computing
a customer-provider security adaptation. Computer Fraud & Security, 2016, 7-15.
doi:10.1016/S1361-3723(16)30016-1
Center for Internet Security. (2018). CIS benchmarks. Retrieved from https://www.cisecurity.org/cis-
benchmarks/
Chalita, S., Zalila, F., Gourdin, C., & Merle, P., (2018). A precise model for Google Cloud platform.
2018 IEEE International Conference on Cloud Engineering (IC2E), Cloud Engineering (IC2E),
2018 IEEE International Conference on, IC2E, 177. doi:10.1109/IC2E.2018.00041
Chang, Y., Chang, P., Xu, Q., Ho, K., Halim, W., (2016). An empirical investigation of switching
intention to private Cloud computing in large enterprises. 2016 22nd Asia-Pacific Conference on
Communications (APCC), 323. doi:10.1109/APCC.2016.7581451
Chang, V, & Ramachandran, I. (2016). Towards achieving data security with the Cloud computing
adoption framework. IEEE Transactions on Services Computing, Services Computing, IEEE
Transactions on, IEEE, 138. doi:10.1109/TSC.2015.2491281
132
Charif, B., & Aswad J. A. (2016). Towards smooth organisational adoption of Cloud computing – A
customer-provider security adaptation. Computer Fraud & Security, 2016(2), 7-15.
doi:10.1016/S1361-3723(16)30016-1
Chatman, C. (2010). How Cloud computing is changing the face of health care information technology.
Journal of Health Care Compliance, 12(3), 37-70. Retrieved from
http://www.healthcarecompliance.us/journal-of-health-care-compliance.html
Chatzithanasis, G., & Michalakelis, C. (2018). The Benefits of Cloud Computing: Evidence from
Greece. International Journal of Technology Diffusion (IJTD), 9(2), 61. Retrieved form:
https://www.igi-global.com/journal/international-journal-technology-diffusion/1135
Chen, T., Ta-Tao, C., & Kazuo, N. (2016). The Perceived Business Benefit of Cloud Computing: An
Exploratory Study. Journal of International Technology & Information Management, 25(4), 101-
121. Retrieved from: http://scholarworks.lib.csusb.edu/jitim/
Cheng, H., & Lin, C. Y. (2009). Do as the large enterprises do? Expatriate selection and overseas
performance in emerging markets: The case of Taiwan SMEs. International Business Review,
18, 60-75. doi:10.1016/j.ibusrev.2008.12.002
Chiregi, M., & Jafari Navimipour, N. (2017). Review: Cloud computing and trust evaluation: A
systematic literature review of the state-of-the-art mechanisms. Journal of Electrical Systems and
Information Technology, doi: 10.1016/j.jesit.2017.09.001
Choi, M., & Lee, C. (2015). Information security management as a bridge in Cloud systems from private
to public organizations. Sustainability, 7(9) 12032-12051 (2015), (9), 12032.
doi:10.3390/su70912032
Cong, C., & Aiqing, C. (2014). Cost analysis of public Cloud IaaS access of SMEs. Applied Mechanics
& Materials, 19(6) 631-632. doi:10.4028/www.scientific.net/AMM.631-632.196
133
Coppolino, L., D’Antonio, S., Mazzeo, G., & Romano, L. (2017). Cloud security: Emerging threats and
current solutions. Computers and Electrical Engineering, 59, 126-140.
doi:10.1016/j.compeleceng.2016.03.004
Cram, W. A., Brohman, M. K., & Gallupe, R. B. (2016). Hitting a moving target: a process model of
information systems control change. Information Systems Journal, 26(3), 195-226.
doi:10.1111/isj.12059
Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five traditions.
Thousand Oaks: Sage.
da Silva Antonio, F., & Manotti, A. (2016). Using COBIT 5: Enabling Information to Perform an
Information Quality Assessment. COBIT Focus, 1-4. Retrieved from:
https://ww7w.isaca.org/COBIT/focus/Pages/FocusHome.aspx
Damenu, T. K., & Balakrishna, C. (2015). Cloud security risk management: A critical review. 2015 9Th
International Conference on Next Generation Mobile Applications, Services & Technologies,
370. doi:10.1109/NGMAST.2015.25
Dasgupta, S., & Pal, S.K. (2016). Design of a polynomial ring based symmetric homomorphic
encryption scheme. Perspectives in Science, Vol 8, 692-695. doi:10.1016/j.pisc.2016.06.061
Daylami, N. (2015). The origin and construct of Cloud computing. International Journal of the
Academic Business World, 9(2), 39-45. Retrieved from: http://jwpress.com/IJABW/IJABW.htm
de Bruin, M., McCambridge, J., & Prins, J. M. (2015). Reducing the risk of bias in health behaviour
change trials: Improving trial design, reporting or bias assessment criteria? A review and case
study. Psychology & Health, 30(1), 8–34. Retrieved from:
https://www.tandfonline.com/loi/gpsh20.
134
de Gusmão, A. H., e Silva, L. C., Silva, M. M., Poleto, T., & Costa, A. S. (2016). Information security
risk analysis model using fuzzy decision theory. International Journal of Information
Management, 36, 25-34. doi:10.1016/j.ijinfomgt.2015.09.003
Demirkan, H., & Goul, M. (2011). Taking value-networks to the Cloud services: Security services,
semantics and service level agreements. Information Systems and E-Business
Management, 11(1), 51-91. Retrieved from https://link.springer.com/journal/10257
Denzin, N. K. (2001). Interpretive interactionism (Vol. 16). Sage.
Deshpande, P., Sharma, S.C., Peddoju, S.K. et al. (2018) Security and service assurance in Cloud
environment. International Journal of System Assurance Engineering Management (2018) 9:
194. Retrieved from https://link.springer.com/journal/13198
Devos, J., & Van de Ginste, K. (2015). Towards a Theoretical Foundation of IT Governance – The
COBIT 5 case. Electronic Journal of Information Systems Evaluation, 18(2), 95. Retrieved from:
http://www.ejise.com/main.html
Dhingra, A., K., Rai, D., Evaluating risks in Cloud computing: Security perspective. (2016). 2016 5th
International Conference on Reliability, Infocom Technologies and Optimization (Trends and
Future Directions) ICRITO, 533. doi:10.1109/ICRITO.2016.7785013
Díaz-Chao, Á., Ficapal-Cusi, P., & Torrent-Sellens, J. (2017). Did small and medium enterprises
maintain better jobs during the early years of the recession? Job quality multidimensional
evidence from Spain. European Management Journal, 35, 396-413.
doi:10.1016/j.emj.2016.06.006
Diogenes, Y. (2017). Embracing Cloud computing to enhance your overall security posture. ISSA
Journal, 15(5), 36-41. Retrieved from: https://issa-cos.org/issajournal
135
Djuraev, R., X., & Umirzakov, B., M., (2016) Model of assessment of risks of information security in
the environment of Cloud computing. 2016 International Conference on Information Science
and Communications Technologies (ICISCT), 1. doi:10.1109/ICISCT.2016.7777391
Doherty, N. F., & Tajuddin, S. T. (2018). Towards a user-centric theory of value-driven information
security compliance. Information Technology & People, 31(2), 348. doi:10.1108/ITP-08-2016-
0194
dos Santos, D. R., Marinho, R., Schmitt, G. R., Westphall, C. M., & Westphall, C. B. (2016). A
framework and risk assessment approaches for risk-based access control in the Cloud. Journal of
Network and Computer Applications, 74, 86-97. doi:10.1016/j.jnca.2016.08.013
Elhoseny, M., Elminir, H., Riad, A., & Yuan, X., (2016). A secure data routing schema for WSN using
elliptic curve cryptography and homomorphic encryption. Journal of King Saud University:
Computer and Information Sciences, 28(3), 262-275. Retrieved from:
https://www.journals.elsevier.com/journal-of-king-saud-university-computer-and-information-
sciences
Elsayed, M., & Zulkernine, M. (2016). IFCaaS: Information flow control as a service for Cloud security.
Availability, Reliability and Security (ARES), 2016 11th International Conference on. 211-216.
Retrieved from;
https://ieeexplore.ieee.org/xpl/tocresult.jsp?isnumber=7784494&filter%3DAND%28p_IS_Numb
er%3A7784494%29&pageNumber=3
Elvy, S. (2018). Commodifying consumer data in the era of the Internet of things. Boston College Law
Review, 59(2), 424-522. Retrieved from: https://www.bc.edu/bc-web/schools/law/academics-
faculty/law-reviews/bclr.html
136
El-Attar, N., E., Awad, W., A., & Omara, F., A., Empirical assessment for security risk and availability
in public Cloud frameworks. (2016). 2016 11th International Conference on Computer
Engineering & Systems (ICCES), 17. doi:10.1109/ICCES.2016.7821969
El-Gazzar, R., Hustad, E., & Olsen, D. H. (2016). Understanding Cloud computing adoption issues: A
Delphi study approach. The Journal of Systems & Software, 118, 64-84. doi:
10.1016/j.jss.2016.04.061
El-Makkaoui, K., Ezzati, A., Beni-Hssane, A., Motamed, C. (2016). Cloud security and privacy model
for providing secure Cloud services. 2016 2nd International Conference on Cloud Computing
Technologies and Applications (CloudTech), 81. doi:10.1109/CloudTech.2016.7847682
Erturk, E. (2017). An incremental model for Cloud adoption: Based on a study of regional organizations.
TEM Journal, 6(4), 868-876. doi:10.18421/TEM64-29
Fahmideh, M., & Beydoun, G. (2018). Reusing empirical knowledge during Cloud computing
adoption. The Journal of Systems & Software, 138, 124-157. doi:10.1016/j.jss.2017.12.011
Feng, C., & Xin, Y. (2014). Fast key generation for Gentry-style homomorphic encryption. The Journal
of China Universities of Posts and Telecommunications, 21, 37-44. doi:10.1016/S1005-
8885(14)60343-5
Fernandes, D., Soares, L., Gomes, J., Freire, M., & Inácio, P. (2014). Security issues in Cloud
environments: A survey. International Journal of Information Security, 13(2), 113-170.
doi:10.1007/s10207-013-0208-7
Fernando, B. & Fernando, R., (2014). Strategy, innovation and internationalization in SMEs: The
implementation issue. Proceedings of the European Conference on Innovation &
Entrepreneurship, 77. Retrieved from: https://www.academic-conferences.org/conferences/ecie/
137
Ferdinand, J. (2015). Building organisational cyber resilience: A strategic knowledge-based view of
cyber security management. Journal of Business Continuity & Emergency Planning, 9(2), 185-
195. Retrieved from https://www.henrystewartpublications.com/jbcep
Flostrand, A. (2017). Finding the future: Crowdsourcing versus the Delphi technique. Business
Horizons, 60, 229-236. doi:10.1016/j.bushor.2016.11.007
Fosu, A. K. (2017). Growth, inequality, and poverty reduction in developing countries: Recent global
evidence. Research in Economics, 71, 306-336. doi:10.1016/j.rie.2016.05.005
Funk, J. L. (2015). Thinking about the future of technology: Rates of improvement and economic
feasibility. Futures, 73, 163-175. doi:10.1016/j.futures.2015.08.003
Furfaro, A., Gallo, T., Garro, A., Sacca, D., & Tundis, A., (2016). Requirements specification of a Cloud
service for cyber security compliance analysis. 2016 2nd International Conference on Cloud
Computing Technologies and Applications (CloudTech), 205.
doi:10.1109/CloudTech.2016.7847700
Gartner. (2014, October 2). What is SMB? – Gartner defines small and midsize businesses. Retrieved
from https://www.gartner.com/it-glossary/smbs-small-and-midsize-businesses
Gastermann, B., Stopper, M., Kossik, A., & Katalinic, B. (2014). Secure implementation of an on-
premises Cloud storage service for small and medium-sized enterprises. Annals of DAAAM &
Proceedings, 25(1), 574-583. doi:10.1016/j.proeng.2015.01.407
George, S., Gyorgy, T., Adelina, O., Victor, S., & Janna, C. (2014). Cloud computing and big data as
convergent technologies for retail pricing strategies of SMEs. Challenges of the Knowledge
Society, 4(1), 1044-1052. Retrieved from: http://cks.univnt.ro/
https://www.henrystewartpublications.com/jbcep
138
Gill, F. J., Leslie, G. D., Grech, C., & Latour, J. M. (2013). Using a web-based survey tool to undertake
a Delphi study: Application for nurse education research. Nurse Education Today, 13(28) 1322-
1328. doi:10.1016/j.nedt.2013.02.016
Gholami, M. F., Daneshgar, F., Low, G., & Beydoun, G. (2016). Cloud migration process—A survey,
evaluation framework, and open challenges. The Journal of Systems & Software, 12, 31-69.
doi:10.1016/j.jss.2016.06.068
Glaser, B. G. (2010). The future of grounded theory. Grounded Theory Review, 9(2). Retrieved from:
Glaser, B. G. (2014). Choosing grounded theory. Grounded Theory Review, 13(2). Retrieved from:
Glaser, B. G. (2016). The grounded theory perspective: Its origins and growth. Grounded Theory
Review, 15(1), 4-9. Retrieved from: http://groundedtheoryreview.com/
Gleeson, N., & Walden, I. (2014). ‘It’s a jungle out there’?: Cloud computing, standards and the law.
European Journal of Law & Technology, 5(2), 1. Retrieved from: http://ejlt.org/
Goettlemann, E., Dahman, K., Gateau, B., Dubois, E., & Godart, C., A security risk assessment model
for business process deployment in the Cloud. 2014 IEEE International Conference on Services
Computing, Services Computing (SCC), 307. doi:10.1109/SCC.2014.48
Gomes, D. E., Guedes dos Santos, J. L., Pereira Borges, J. W., Pedroso Alves, M., de Andrade, D. F., &
Erdmann, A. L. (2018). Theory of the response to the item of research on public health. Journal
of Nursing UFPE / Revista De Enfermagem UFPE, 12(6). doi:10.5205/1981-8963-
v12i6a234740p1800-1812-2018
139
Greyson, D. (2018). Information triangulation: A complex and agentic everyday information practice.
Journal of the Association for Information Science & Technology, 69(7), 869-878.
doi:10.1002/asi.24012
Gritzalis, D., Iseppi, G., Mylonas, A., & Stavrou, V. (2018). Exiting the risk assessment maze: A meta-
survey. ACM Computing Surveys, 51(1), 11-30. doi:10.1145/3145905
Guba, E. G., & Lincoln, Y. S. (2008). Paradigmatic controversies, contradictions, and emerging
confluences. In N. K. Denzin, Y. S. Lincoln, The landscape of qualitative research (pp. 255-
286). Thousand Oaks, CA, US: Sage Publications, Inc.
Gupta S, Saini A. (2018). Cloud adoption: Linking business needs with system measures. Global
Journal of Enterprise Information System 9(2) 42-49. Retrieved from:
https://library.harvard.edu/services-tools/business-source-complete
Haimes, Y. Y., Horowitz, B. M., Guo, Z., Andrijcic, E., & Bogdanor, J. (2015). Assessing systemic risk
to Cloud-computing technology as complex interconnected systems of systems. Systems
Engineering, 18(3), 284-299. doi:10.1002/sys.21303
Halabi, T., & Bellaiche, M. (2018). A broker-based framework for standardization and management of
Cloud Security-SLAs. Computers & Security, 75, 59-71. doi:10.1016/j.cose.2018.01.019
Hanclova, J., Rozehnal, P., Ministr, J., & Tvridkova, M. (2015). The determinants of IT adoption in
SMEs in the Czech-Polish border areas. Information Technology for Development, 21(3), 426.
doi:10.1080/02681102.2014.916249
Hare, S. (2016). For your eyes only: U.S. technology companies, sovereign states, and the battle over
data protection. Business Horizons, 59, 549-561. doi:10.1016/j.bushor.2016.04.002
140
Hasheela, V. T., Smolander, K., & Mufeti, T. K. (2016). An investigation of factors leading to the
reluctance of SaaS ERP adoption in Namibian SMEs. African Journal of Information Systems,
8(4), 1-13. Retrieved from: https://digitalcommons.kennesaw.edu/ajis/
Haufe, K., Dzombeta, S., Bradnis, K., Stantchev, V., & Colomo-Palacios, R., (2018). Improving
transparency and efficiency in IT security management resourcing. IT Professional, IT Prof, (1),
53. doi:10.1109/MITP.2018.011291353
He, M., Devine, L., & Zhuang, J. (2018). Perspectives on Cybersecurity Information Sharing among
Multiple Stakeholders Using a Decision‐Theoretic Approach. Risk Analysis: An International
Journal, 38(2), 215–225. Retrieved from: https://www.sra.org/aggregator/sources/1
Ho, S. M., Booth, C., & Ocasio-Velazquez, M. (2017). Trust or consequences? Causal effects of
perceived risk and subjective norms on Cloud technology adoption. Computers & Security, 70,
581-595 Retrieved from https://www.journals.elsevier.com/computers-and-security
Hosseinian-Far, A., Ramachandran, M., & Sarwar, D. (2017). Strategic engineering for Cloud
computing and big data analytics. Cham, Switzerland: Springer.
Hsu, P., Ray, S., & Li-Hsieh, Y. (2014). Examining Cloud computing adoption intention, pricing
mechanism, and deployment model. International Journal of Information Management, 34, 474-
488. doi:10.1016/j.ijinfomgt.2014.04.006
Hsu, C., & Sanford, B., A., (2007). The Delphi technique: Making sense of consensus. Practical
Assessment, Research & Evaluation, 12(10), 1. Retrieved from: http://pareonline.net/
Huang, C., Hou, C., He, L., Dai, H., & Ding, Y., (2017) Policy-Customized: A new abstraction for
building security as a service. 2017 14th International Symposium on Pervasive Systems,
Algorithms and Networks & 2017 11th International Conference on Frontier of Computer
file:///C:/Users/meersmanm/Downloads/Retrieved
https://www.sra.org/aggregator/sources/1
https://www.journals.elsevier.com/computers-and-security
141
Science and Technology & 2017 Third International Symposium of Creative Computing
(ISPAN-FCST-ISCC), doi:10.1109/ISPAN-FCST-ISCC.2017.57
Huang, L., Shen, Y., Zhang, G., & Luo, H. (2015). Information system security risk assessment based on
multidimensional Cloud model and the entropy theory. 2015 IEEE Conference on Computer
Vision & Pattern Recognition (CVPR), 11. doi:10.1109/ICEIEC.2015.7284476
Hu, K. H., Chen, F. H., & We, W. J. (2016). Exploring the key risk factors for application of Cloud
computing in auditing. Entropy, 18(8), 401. Retrieved from:
http://www.mdpi.com/journal/entropy
Hussain, W., Hussain, F. K., Hussain, O. K., Damiani, E., & Chang, E. (2017). Formulating and
managing viable SLAs in Cloud computing from a small to medium service provider’s
viewpoint: A state-of-the-art review. Information Systems, 71, 240-259.
doi:10.1016/j.is.2017.08.007
Hussain, S., A., Mehwish, F., Atif, S., Imran, R., & Raja Khurram, S. (2017). Multilevel classification of
security concerns in Cloud computing. Applied Computing and Informatics, 13(1) 57-65.
doi:10.1016/j.aci.2016.03.001
Imran, M., Hlavacs, H., Haq, I. U., Jan, B., Khan, F. A., & Ahmad, A. (2017). Provenance based data
integrity checking and verification in Cloud environments. Plos One, 12(5).
doi:10.1371/journal.pone.0177576
Ionela, B. (2014). Cloud computing services: Benefits, risks and intellectual property issues. Global
Economic Observer, 230-242. Retrieved from: https://www.questia.com/library/p439761/global-
economic-observer
142
Iqbal, S., Mat Kiah, M. L., Dhaghighi, B., Hussain, M., Khan, S., Khan, M. K., & Raymond Choo, K.
(2016). Review: On Cloud security attacks: A taxonomy and intrusion detection and prevention
as a service. Journal of Network and Computer Applications, 74, 98-120.
doi:10.1016/j.jnca.2016.08.016
ISACA GWDC. (2018). About – ISACA greater Washington, D.C. chapter. Retrieved from https://isaca-
gwdc.org/about/
Islam, S., Fenz, S., Weippl, E., & Mouratidis, H. (2017). A risk management framework for Cloud
migration decision support. Journal of Risk & Financial Management, 10(2), 1.
doi:10.3390/jrfm10020010
Issa M., K., Abdallah, K., & Muhammad, A. (2014). Cloud computing security: A survey. Computers, 3.
doi:10.3390/computers3010001
IT Process Maps Gb. (2018). What is ITIL? Retrieved from https://en.it-processmaps.com/itil/about-
itil.html
Jaatun, M. G., Pearson, S., Gittler, F., Leenes, R., & Niezen, M. (2016). Enhancing accountability in the
Cloud. International Journal of Information Management. Retrieved from
https://www.journals.elsevier.com/international-journal-of-information-management
Jeganathan, S. (2017). Enterprise security architecture: Key for aligning security goals with business
goals. ISSA Journal, 15(1), 22-29. Retrieved from http://www.issa.org/?page=ISSAJournal
Johnson, A. M. (2009). Business and security executives’ views of information security investment
drivers: Results from a Delphi study. Journal of Information Privacy & Security, 5(1), 3-27.
Retrieved from: https://www.tandfonline.com/loi/uips20
143
Jouini, M., & Ben Arfa Rabai, L. (2016). Comparative study of information security risk assessment
models for Cloud computing systems. Procedia Computer Science, 831084.
doi:10.1016/j.procs.2016.04.227
Lawson, B. P., Muriel, L., & Sanders, P. R. (2017). Regular Paper: A survey on firms’ implementation
of COSO’s 2013 Internal control–integrated framework. Research in Accounting Regulation, 29,
30-43. doi:10.1016/j.racreg.2017.04.004
Lohe, J., & Legner, C. (2014). Overcoming implementation challenges in enterprise architecture
management: a design theory for architecture-driven IT Management (ADRIMA). Information
Systems & E-Business Management, 12(1), 101-137. doi:10.1007/s10257-012-0211-y
Luna, J., Suri, N., Iorga, M., & Karmel, A. (2015). Leveraging the potential of Cloud security service-
level agreements through standards. IEEE Cloud Computing, 2(3), 32-40.
Kaaniche, N., Mohamed, M., Laurent, M., & Ludwig, H., Security SLA Based Monitoring in Clouds.
(2017). 2017 IEEE International Conference on Edge Computing (EDGE), 90.
doi:10.1109/IEEE.EDGE.2017.20
Kalaiprasath, R., Elankavi, R., & Udayakumar, R. (2017). Cloud security and compliance – A semantic
approach in end to end security. International Journal on Smart Sensing & Intelligent Systems.
Retrieved from http://s2is.org/
Karras, D. A. (2017). On Scalable and Efficient Security Risk Modelling of Cloud Computing
Infrastructure based on Markov processes. ITM Web of Conferences 9 3-6. EDP Sciences.
Retrieved from: https://www.itm-conferences.org/
Keung, J., & Kwok, F. (2012). Cloud deployment model selection assessment for SMEs: Renting or
Buying a Cloud. 2012 IEEE Fifth International Conference on Utility & Cloud Computing, 21.
doi:10.1109/UCC.2012.29
144
Khalil, I. M., Khreishah, A., & Azeem, M. (2014). Cloud computing security: A survey. Computers,
3(1) 1-35. doi:10.3390/computers3010001
Khamsemanan, N., Ostrovsky, R., Skeith, W., E. (2016). On the black-box use of somewhat
homomorphic encryption in noninteractive two-party protocols. SIAM Journal on Discrete
Mathematics, 30(1), 266. doi:10.1137/110858835
Khan, N., & Al-Yasiri, A. (2016). Identifying Cloud security threats to strengthen Cloud computing
adoption framework. Procedia Computer Science, 94, 485-490. doi:10.1016/j.procs.2016.08.075
Khan, S., Nicho, M., & Takruri, H. (2016). IT controls in the public Cloud: Success factors for
allocation of roles and responsibilities. Journal of Information Technology Case & Application
Research, 18(3), 155. Retrieved from https://www.tandfonline.com/loi/utca20
Kholidy, H., Erradi, A., Abdelwahed, S., & Baiardi, F. (2016). A risk mitigation approach for
autonomous Cloud intrusion response system. Computing, 98(11), 1111-1135.
doi:10.1007/s00607-016-0495-8
Korstjens, I., & Moser, A. (2018). Series: Practical guidance to qualitative research. Part 4:
Trustworthiness and publishing. The European Journal of General Practice, 24(1), 120–124.
https://www.tandfonline.com/loi/igen20
Kouatli, I. (2016). Managing Cloud computing environment: Gaining customer trust with security and
ethical management. Procedia Computer Science, 91(Promoting Business Analytics and
Quantitative Management of Technology: 4th International Conference on Information
Technology and Quantitative Management (ITQM 2016), 412-421.
doi:10.1016/j.procs.2016.07.110
https://www.tandfonline.com/loi/igen20
145
Kovacsne, L., A., M., (2018). Reducing IT costs and ensuring safe operation with application of the
portfolio management. Serbian Journal of Management, 12(1) 143-155. Retrieved from:
http://www.sjm06.com/
Korte, J. (2017). Mitigating cyber risks through information sharing. Journal of Payments Strategy &
Systems, 203–214. Retrieved from https://www.henrystewartpublications.com/jpss
Kritikos, K., Kirkham, T., Kryza, B., & Massonet, P. (2015). Security enforcement for multi-Cloud
platforms – The case of PaaSage. Procedia Computer Science, 68,1st International Conference
on Cloud Forward: From Distributed to Complete Computing, 103-115.
doi:10.1016/j.procs.2015.09.227
Kumar, D., Samalia, H. V., & Verma, P. (2017). Factors influencing Cloud computing adoption by
small and medium-sized enterprises (SMEs) in India. Pacific Asia Journal of the Association for
Information Systems, 9(3), 25. Retrieved from: http://aisel.aisnet.org/pajais/
Lacity, M. C., & Reynolds, P. (2013). Cloud services practices for small and medium-sized enterprises.
Mis Quarterly Executive, 13(1), 31-44. Retrieved from:
http://misqe.org/ojs2/index.php/misqe/index
Lai, S., & Leu, F. (2015). A security threats measurement model for reducing Cloud computing security
risk. 2015 Second International Conference on Advances in Computing & Communication
Engineering, 414. doi:10.1109/IMIS.2015.64
Lai, Y., Sardakis, G., & Blackburn, R. (2015). Job stress in the United Kingdom: Are small and
medium-sized enterprises and large enterprises different?. Stress & Health: Journal of The
International Society for the Investigation of Stress, 31(3), 222-235. Retrieved from
https://onlinelibrary.wiley.com/journal/15322998
146
Lalev, A. (2017). Methods and instruments for enhancing Cloud computing security in small and
medium sized enterprises. Business Management / Biznes Upravlenie, (2), 38-53 Retrieved from:
http://bm.uni-svishtov.bg/
Lanz, J. (2015). Conducting information technology risk assessments. CPA Journal, 85(5), 6-9.
Retrieved from: https://www.cpajournal.com/
Leclercq-Vandelannoitte, A., & Emmanuel, B. (2018). From sovereign IT governance to liberal IT
governmentality? A Foucauldian analogy. European Journal of Information Systems, 27(3), 326.
doi:10.1080/0960085X.2018.1473932
Lee, J., Kim, Y., S., Kim, J., H., & Kim, I., K., (2017). Toward the SIEM architecture for Cloud-based
security services. 2017 IEEE Conference on Communications and Network Security (CNS),
Communications and Network Security (CNS), 2017 IEEE Conference on, 398.
doi:10.1109/CNS.2017.8228696
Lent, R. (2016). Evaluating the cooling and computing energy demand of a datacentre with optimal
server provisioning. Future Generation Computer Systems, 57, 1-12.
doi:10.1016/j.future.2015.10.008
Leung, R., Hastings, J. F., Keefe, R. H., Brownstein-Evans, C., Chan, K. T., & Mullick, R. (2016).
Building mobile apps for underrepresented mental health care consumers: A grounded theory
approach. Social Work in Mental Health, 14(6), 625. doi:10.1080/15332985.2015.1130010
Lew, D. (2015). ISACA’s COBIT conference provides training and insights for all levels of expertise.
COBIT Focus, 1-2. Retrieved from: https://www.isaca.org/COBIT/focus/Pages/FocusHome.aspx
Li, J., & Li, Q. (2018). Data security and risk assessment in Cloud computing. ITM Web of Conferences.
EDP Sciences. Retrieved from https://www.itm-conferences.org/
147
Lian, J., Yen, D., & Wang, Y. (2014). An exploratory study to understand the critical factors affecting
the decision to adopt Cloud computing in Taiwan hospital. International Journal of Information
Management, 34, 28-36. doi:10.1016/j.ijinfomgt.2013.09.004
Liu, S., Chan, F. T., & Ran, W. (2016). Decision making for the selection of Cloud vendor: An
improved approach under group decision-making with integrated weights and
objective/subjective attributes. Expert Systems with Applications, 5537-47.
doi:10.1016/j.eswa.2016.01.059
Liu, X., Xia, C., Wang, T., Zhong, L., (2017) CloudSec: A novel approach to verifying security
conformance at the bottom of the Cloud. 2017 IEEE International Congress on Big Data
(BigData Congress), Big Data (BigData Congress), 569. doi:10.1109/BigDataCongress.2017.87
Llave, M. R. (2017). Business intelligence and analytics in small and medium-sized enterprises: A
systematic literature review. Procedia Computer Science, CENTERIS/ProjMAN/HCist 2017),
194-205. doi:10.1016/j.procs.2017.11.027
Lu, P. (2018). Structural effects of participation propensity in online collective actions: Based on big
data and Delphi methods. Journal of Computational and Applied Mathematics, 344, 288–300.
https://www.journals.elsevier.com/journal-of-computational-and-applied-mathematics
Lynn, T., van der Werff, L., Hunt, G., & Healy, P. (2016). Development of a Cloud trust label: A Delphi
approach. Journal of Computer Information Systems, 56(3), 185-193. Retrieved from:
https://www.tandfonline.com/loi/ucis20
Madria, S. K. (2016). Security and risk assessment in the Cloud. Computer, 49(9), 110-113. Retrieved
from: https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=2
Maghrabi, L., Pfluegel, E., & Noorji, S., F., (2016) Designing utility functions for game-theoretic Cloud
security assessment: A case for using the common vulnerability scoring system. 2016
https://www.journals.elsevier.com/journal-of-computational-and-applied-mathematics
148
International Conference on Cyber Security and Protection of Digital Services (Cyber Security),
doi:10.1109/CyberSecPODS.2016.7502351
Mahmood, K., Shevtshenko, E., Karaulova, T., & Otto, T. (2018). Risk assessment approach for a
virtual enterprise of small and medium sized enterprises. Proceedings of the Estonian Academy
of Sciences, 67(1), 17-27. doi:10.3176/proc.2017.4.27
Mangiuc, D. (2017). Accountants and the Cloud – Involving the professionals. Accounting &
Management Information Systems / Contabilitate Si Informatica De Gestiune, 16(1), 179-198.
Retrieved from http://jamis.ase.ro/
Martin, V. B. (2017). Formal grounded theory: Knowing when to come out of the rain. Grounded
Theory Review, 16(1), 35-37. Retrieved from: http://groundedtheoryreview.com/
Masky, M., Young, S. S., & Choe, T. (2015). A novel risk identification framework for Cloud
computing security. 2015 2nd International Conference on Information Science & Security
(ICISS), 1. doi:10.1109/ICISSEC.2015.7370967
Mayadunne, S., & Park, S. (2016). An economic model to evaluate information security investment of
risk-taking small and medium enterprises. International Journal of Production Economics,
182519-530. doi:10.1016/j.ijpe.2016.09.018
Mell, P., & Grance, T. (2011). The NIST definition of Cloud computing. Retrieved from:
https://csrc.nist.gov/publications/sp
Mengxi, N., Peng, R., & HaoMiao, Y. (2016). Efficient multi-keyword ranked search over outsourced
Cloud data based on homomorphic encryption. MATEC Web of Conferences, 561.
doi:10.1051/matecconf/20165601002
149
Michaux, S., Ross, P. K., & Blumenstein, M. (2015). Cloud computing as a facilitator of SME
entrepreneurship. Technology Analysis & Strategic Management, 27(1), 87-101. Retrieved from:
https://www.tandfonline.com/loi/ctas20
Mishra, P., Pilli, E. S., Varadharajan, V., & Tupakula, U. (2017). Review: Intrusion detection techniques
in Cloud environment: A survey. Journal of Network and Computer Applications, 77, 18-47.
doi:10.1016/j.jnca.2016.10.015
Mohabbattalab, E., von der Heidt, T., & Mohabbattalab, B. (2014). The perceived advantages of Cloud
computing for SMEs. GSTF Journal on Computing, 4(1), 61-65. doi:10.5176/2251-3043_4.1.309
Molken, R. v., & Wilkins, P. (2017). Implementing oracle integration Cloud service. Birmingham, UK:
Packt Publishing.
Moncayo, D., & Montenegro, C. (2016). 2016 6Th International Conference on Information
Communication & Management (ICICM), 115. doi:10.1109/INFOCOMAN.2016.7784226
Moral-García, S., Moral-Rubio, S., Fernández, E. B., & Fernández-Medina, E. (2014). Enterprise
security pattern: A model-driven architecture instance. Computer Standards & Interfaces, 36,
748-758. doi:10.1016/j.csi.2013.12.009
Moyo, M., & Loock, M., (2016) South African small and medium-sized enterprises’ reluctance to adopt
and use Cloud-based business intelligence systems: A literature review. 2016 11th International
Conference for Internet Technology and Secured Transactions (ICITST), 250.
doi:10.1109/ICITST.2016.7856706
Mustonen-Ollila, E., Lehto, M., & Huhtinen, A. (2018). Hybrid information environment: Grounded
theory analysis. Proceedings of the International Conference on Cyber Warfare & Security, 412-
419. Retrieved from: https://www.academic-conferences.org/conferences/iccws/
150
Musungwini, S., Mugoniwa, B., Furusa, S. S., & Rebanowako, T. G. (2016). An analysis of the use of
Cloud computing among university lecturers: A case study in Zimbabwe. International Journal
of Education and Development Using Information and Communication Technology, 12(1), 53-
70. Retrieved from: http://ijedict.dec.uwi.edu/
Na, S. & Huh, E., (2014) A methodology of assessing security risk of Cloud computing in user
perspective for security-service-level agreements. Fourth edition of the International Conference
on the Innovative Computing Technology (INTECH 2014), 87. Retrieved from:
https://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6916662&filter%3DAND%28p_
IS_Number%3A6927737%29&pageNumber=2
Ndiaye, N., Razak, L. A., Nagayev, R., & Ng, A. (2018). Demystifying small and medium enterprises’
(SMEs) performance in emerging and developing economies. Borsa Istanbul Review,
doi:10.1016/j.bir.2018.04.003
Ngo, C., Demchenko, Y., & de Laat, C. (2016). Multi-tenant attribute-based access control for Cloud
infrastructure services. Journal of Information Security and Applications, 65-84.
doi:10.1016/j.jisa.2015.11.005
Ogden, S. R., Culp, J. C., Villamaria, F. J., & Ball, T. R. (2016). Developing a checklist: Consensus via
a modified Delphi technique. Journal of Cardiothoracic and Vascular Anesthesia, 30855-858.
doi:10.1053/j.jvca.2016.02.022
Oktadini, N., R., & Surendro, K., (2014). SLA in Cloud computing: Improving SLA’s life cycle
applying six sigmas. 2014 International Conference on Information Technology Systems and
Innovation (ICITSI), doi:10.1109/ICITSI.2014.7048278
O’Malley, M. P., & Capper, C. A. (2015). A measure of the quality of educational leadership programs
for social justice: Integrating LGBTIQ identities into principal preparation. Educational
151
Administration Quarterly, 51(2), 290-330. Retrieved from:
http://journals.sagepub.com/home/eaq
Overby, S., Greiner, L., & Paul, L. G. (2017, July 5). What is an SLA? Best practices for service-level
agreements. Retrieved from https://www.cio.com/article/2438284/outsourcing/outsourcing-sla-
definitions-and-solutions.html
Nanavati, M., Colp, P., Aeillo, B., & Warfield, A. (2014). Cloud Security: A Gathering
Storm. Communications of the ACM, 57(5), 70-79. doi:10.1145/2593686
Papachristodoulou, E., Koutsaki, M., & Kirkos, E. (2017). Business intelligence and SMEs: Bridging the
gap. Journal of Intelligence Studies in Business, 7(1), 70-78. Retrieved from: https://ojs.hh.se/
Papadopoulos, G., Rikama, S., Alajaasko, P., Salah-Eddine, Z., Airaksinen, A., & Loumaranta, H.
(2018). Statistics on small and medium-sized enterprises – Statistics explained. Retrieved from
http://ec.europa.eu/eurostat/statisticsexplained/index.php/Statistics_on_small_and_medium-
sized_enterprises#Basic_structures:_employment_size_class_breakdown_in_Structural_Business
_Statistics
Parekh, G., DeLatte, D., Herman, G., L., Oliva, L., Phatak, D., Scheponik, T., Sherman, A., T., (2018).
Identifying core concepts of cybersecurity: Results of two Delphi processes. IEEE Transactions
on Education, (1), 11. doi:10.1109/TE.2017.2715174
Parks, R. F., & Wigand, R. T. (2014). Organizational privacy strategy: Four quadrants of strategic
responses to information privacy and security threats. Journal of Information Privacy &
Security, 10(4), 203. doi:10.1080/15536548.2014.974435
Paxton, N., C., (2016) Cloud security: A review of current issues and proposed solutions. 2016 IEEE
2nd International Conference on Collaboration and Internet Computing (CIC), 452.
doi:10.1109/CIC.2016.066
152
Persico, V., Botta, A., Marchetta, P., Montieri, A., & Pescapé, A. (2017). On the performance of the
wide-area networks interconnecting public-Cloud datacenters around the globe. Computer
Networks, 112, 67-83. doi:10.1016/j.comnet.2016.10.013
Phaphoom, N., Wang, X., Samuel, S., Helmer, S., & Abrahamsson, P. (2015). A survey study on major
technical barriers affecting the decision to adopt Cloud services. The Journal of Systems &
Software, 103, 167-181. doi:10.1016/j.jss.2015.02.002
Potey, M. M., Dhote, C., & Sharma, M. H. (2016). Homomorphic encryption for security of Cloud
data. Procedia Computer Science, 79 (Proceedings of International Conference on
Communication, Computing and Virtualization (ICCCV) 2016), 175-181.
doi:10.1016/j.procs.2016.03.023
Preeti, B., Runni, K., & Prof. Manjula, R. (2016). Security in Cloud computing for service delivery
models: Challenges and solutions. International Journal of Engineering Research and
Applications, 6(4), 76-85. Retrieved from: http://www.ijera.com/
Priyadarshinee, P., Raut, R. D., Jha, M. K., & Kamble, S. S. (2017). A Cloud computing adoption in
Indian SMEs: Scale development and validation approach. Journal of High Technology
Management Research, 28, 221-245. doi:10.1016/j.hitech.2017.10.010
Qian, L., Y., Baharudin, A. S., & Kanaan-Jeebna, A. (2016). Factors affecting the adoption of enterprise
resource planning (ERP) on Cloud among small and medium enterprises (SMEs) in Penang,
Malaysia. Journal of Theoretical & Applied Information Technology, 88(3), 398. Retrieved
from: https://www.jatit.org/
Qiang, D. (2015). Modeling and performance analysis for composite network–compute service
provisioning in software-defined Cloud environments. Digital Communications and Networks,
1(3), 181-190. doi:10.1016/j.dcan.2015.05.003
153
Quigley, K., Burns, C., & Stallard, K. (2015). ‘Cyber Gurus’: A rhetorical analysis of the language of
cybersecurity specialists and the implications for security policy and critical infrastructure
protection. Government Information Quarterly, 32, 108-117. doi:10.1016/j.giq.2015.02.001
Rao, R. V., & Selvamani, K. (2015). Data Security Challenges and Its Solutions in Cloud
Computing. Procedia Computer Science, 48 (International Conference on Computer,
Communication and Convergence (ICCC 2015), 204-209. doi:10.1016/j.procs.2015.04.171
Rasheed, H. (2014). Data and infrastructure security auditing in Cloud computing
environments. International Journal of Information Management, 34, 364-368.
doi:10.1016/j.ijinfomgt.2013.11.002
Ray, D. (2016). Cloud adoption decisions: Benefitting from an integrated perspective. Electronic
Journal of Information Systems Evaluation, 19(1), 3-22. Retrieved from
http://www.ejise.com/main.html
Raza, N., Rashid, I., & Awan, F. (2017). Security and management framework for an organization
operating in Cloud environment. Annals of Telecommunications, 72(5), 325. Retrieved from
https://link.springer.com/journal/12243
Razumnikov, S. V., Zakharova, A. A., & Kremneva, M. S. (2014). A model of decision support on
migration of enterprise IT-applications in the Cloud environment. Applied Mechanics and
Materials. Trans Tech Publications. Retrieved from: https://www.scientific.net/AMM
Rebello, O., Mellado, D., Fernández-Medina, E., & Mouratidis, H. (2015). Empirical evaluation of a
Cloud computing information security governance framework. Information and Software
Technology, 58, 44-57. doi:10.1016/j.infsof.2014.10.003
154
Ren, S. Q., Tan, B. M., Sundaram, S., Wang, T., Ng, Y., Chang, V., & Aung, K. M. (2016). Secure
searching on Cloud storage enhanced by homomorphic indexing. Future Generation Computer
Systems, 65(Special Issue on Big Data in the Cloud), 102-110. doi:10.1016/j.future.2016.03.013
Ring, T. (2015). Feature: Cloud security fears: Fact or FUD? Network Security, 10-14.
doi:10.1016/S1353-4858(15)30058-1
Rittle, J., Czerwinski, J., & Sullivan, M. (2016). Auditing the Cloud. Internal Auditor, 73(4), 43-48.
Retrieved from: https://na.theiia.org/periodicals/Pages/Internal-Auditor-Magazine.aspx
Robu, M., (2013). The dynamic and importance of SMEs in economy. USV Annals of Economics and
Public Administration, Vol 13, Issue 1(17), Pp 84-89 (2013), (1(17), 84. Retrieved from:
http://www.seap.usv.ro/annals/ojs/index.php/annals
Rocha, L., Gomez, A., Araújo, N., Otero, C., & Rodrigues, D. (2016). Cloud management tools for
sustainable SMEs. Procedia CIRP, 40 (13th Global Conference on Sustainable Manufacturing –
Decoupling Growth from Resource Use), 220-224. doi:10.1016/j.procir.2016.01.106
Rojas, M. T., Gonzalez, N. M., Sbampato, F. V., Redígolo, F. F., Carvalho, T., Ullah, K. W., & …
Ahmed, A. S. (2016). A framework to orchestrate security SLA lifecycle in Cloud
computing. CISTI (Iberian Conference on Information Systems & Technologies / Conferência
Ibérica De Sistemas E Tecnologias De Informação) Proceedings, 1414. Retrieved from:
http://www.worldcat.org/title/information-systems-and-technologies-proceedings-of-the-11th-
iberian-conference-on-information-systems-and-technologies-cisti-2016-gran-canaria-spain-
june-15-18-2016/oclc/1010053680
Sahmim, S., & Gharsellaoui, H. (2017). Privacy and security in Internet-based computing: Cloud
computing, Internet of things, Cloud of things: a review. Procedia Computer Science, 112,
Knowledge-Based and Intelligent Information & Engineering Systems: Proceedings of the 21st
155
International Conference, KES-20176-8 September 2017, Marseille, France, 1516-1522.
doi:10.1016/j.procs.2017.08.050
Salapura, V., Harper, R., (2018) Virtual machine resiliency management system for the Cloud. IEEE
Cloud Computing, Cloud Computing, IEEE, (3), 55. doi:10.1109/MCC.2018.032591617
Schmidt, P. J., Wood, J. T., & Grabski, S. V. (2016). Business in the Cloud: Research Questions on
Governance, Audit, and Assurance. Journal of Information Systems, 30(3), 173-189.
doi:10.2308/isys-51494
Seethamraju, R., (2014). Adoption of software as a service (SaaS) enterprise resource planning (ERP)
systems in small and medium sized enterprises (SMEs). Information Systems Frontiers. 17, 3,
475-492, ISSN: 13873326. Retrieved from: https://link.springer.com/journal/10796
Senarathna, I., Wilkin, C., Warren, M., Yeoh, W., & Salzman, S., (2018) Factors that influence adoption
of Cloud computing: An empirical study of Australian SMEs. Australasian Journal of
Information Systems, 22(0), doi:10.3127/ajis.v22i0.1603
Senarathna, I., Yeoh, W., Warren, M., & Salzman, S. (2016). Security and privacy concerns for
Australian SMEs Cloud adoption: Empirical study of metropolitan vs regional
SMEs. Australasian Journal of Information Systems, 201-20. Retrieved from:
http://journal.acs.org.au/index.php/ajis
Shackleford, D. (2016). It’s time to clarify ownership of Cloud risk. Information Security, 18(10), 23-25.
Retrieved from: https://www.tandfonline.com/toc/uiss20/current
Shaikh, R., & Sasikumar, M. (2015). Trust model for measuring security strength of Cloud computing
service. Procedia Computer Science, 45 (International Conference on Advanced Computing
Technologies and Applications (ICACTA), 380-389. doi:10.1016/j.procs.2015.03.165
156
Shao, Z., Cao, Y., & Cheng, B., (2014). A quantitative evaluation method based on hierarchical
architecture model of Cloud service availability. Applied Mechanics & Materials, 571-572
doi:10.4028/www.scientific.net/AMM.571-572.11. Retrieved from:
https://www.scientific.net/AMM
Sherman, A. T., DeLatte, D., Neary, M., Oliva, L., Phatak, D., Scheponik, T., & … Thompson, J. (2018).
Cybersecurity: Exploring core concepts through six scenarios. Cryptologia, 42(4), 337.
doi:10.1080/01611194.2017.1362063
Shkurti, R., & Muça, E. (2014). An analysis of Cloud computing and its role in accounting industry in
Albania. Journal of Information Systems & Operations Management, 8(2), 1. Retrieved from:
http://jisom.rau.ro/forward.html
Singh, S., Jeong, Y., & Park, J. H. (2016). A survey on Cloud computing security: Issues, threats, and
solutions. Journal of Network and Computer Applications, 75200-222.
doi:10.1016/j.jnca.2016.09.002
Salim, S., A., Darshana, S., Sukanlaya, S., Abdulrahman Hamad E, A., & Maura, A. (2015). Moving
from evaluation to trial: How do SMEs start adopting Cloud ERP?. Australasian Journal of
Information Systems, 19(0) doi:10.3127/ajis.v19i0.1030
Skeptic. (2009, June 20). The real cost of ITIL V3 expert certification | The IT Skeptic. Retrieved from
http://www.itskeptic.org/real-cost-itil-v3-expert-certification
Soubra, M., & Tanriover, Ö. Ö. (2017). An assessment of recent Cloud security measure proposals in
comparison to their support by widely used Cloud service providers. Mugla Journal of Science &
Technology, 3(2), 122. doi:10.22531/muglajsci.355273
157
Souza, S. M., & Puttini, R. S. (2016). Client-side encryption for privacy-sensitive applications on the
Cloud. Procedia Computer Science, 97 (2nd International Conference on Cloud Forward: From
Distributed to Complete Computing), 126-130. doi:10.1016/j.procs.2016.08.289
Srinivasan, S. (2013). Is security realistic in Cloud computing? Journal of International Technology &
Information Management, 22(4), 47-66. Retrieved from http://scholarworks.lib.csusb.edu/jitim/
Stănciulescu, G. C., & Dumitrescu, F. (2014). Optimizing the IT structures of tourism SMEs using
modern applications and resources (Cloud). Procedia Economics and Finance, 15(Emerging
Markets Queries in Finance and Business (EMQ 2013), 1769-1778. doi:10.1016/S2212-
5671(14)00653-4
Statistics Solution. (2019, January 23). What is transferability in qualitative research and how do we
establish it?. Retrieved April 1, 2019, from https://www.statisticssolutions.com/what-is-
transferability-in-qualitative-research-and-how-do-we-establish-it/
Strasser, A. (2017). Delphi Method variants in information systems research: Taxonomy development
and application. Electronic Journal of Business Research Methods, 15(2), 120-133. Retrieved
from: http://www.ejbrm.com/main.html
Sun, Y., Nanda, S., & Jaeger, T., (2015). Security-as-a-service for microservices-based Cloud
applications. 2015 IEEE 7th International Conference on Cloud Computing Technology and
Science (CloudCom), 50. doi:10.1109/CloudCom.2015.93
Sung, C., Zhang, B., Higgins, C., Y., & Choe, Y., (2016). Data-Driven Sales Leads Prediction for
Everything-as-a-Service in the Cloud. 2016 IEEE International Conference on Data Science and
Advanced Analytics (DSAA), 557. doi:10.1109/DSAA.2016.83
SurveyMonkey. (2018). It’s all about powering the curious | SurveyMonkey. Retrieved from
https://www.surveymonkey.com/mp/aboutus/?ut_source=footer
158
Szadeczky, T. (2016). Risk management of new technologies. AARMS: Academic & Applied Research
in Military & Public Management Science, 15(3), 279-290. Retrieved from: https://www.uni-
nke.hu/kutatas/egyetemi-folyoiratok/aarms/journal-home
Tajammul, M., & Parveen, R., (2017). Comparative analysis of big ten ISMS standards and their effect
on Cloud computing. (2017). 2017 International Conference on Computing and Communication
Technologies for Smart Nation (IC3TSN), 362. doi:10.1109/IC3TSN.2017.8284506
Tang, C., & Liu, J. (2015). Selecting a trusted Cloud service provider for your SaaS program.
Computers & Security, 5060-73. doi:10.1016/j.cose.2015.02.001
Tang, Y., Wang, L., Yang, L., & Wang, X. (2014). Information security risk assessment method based
on Cloud model. IET Conference Proceedings. The Institution of Engineering & Technology. doi
10.1049/cp.2014.0695
Tanimoto, S., Sato, R., Kato, K., Iwashita, M., Seki, Y., Sato, H., Kanai, A., (2014). A study of risk
assessment quantification in Cloud computing. 17th International Conference on, Network-
Based Information Systems (NBiS), 426. doi:10.1109/NBiS.2014.11
Tisdale, S. M. (2016). Architecting a cybersecurity management framework. Issues in Information
Systems, 17(4), 227. Retrieved from: https://www.iacis.org/iis/iis.php
Togan, M. (2015). Aspects of security standards for Cloud computing. MTA Review, 25(1), 31-44.
Retrieved from: https://www.journal.mta.ro/
Torkura, K., A., Sukmana, M., I., H., Cheng, F., & Meinel, C., (2017). Leveraging Cloud native design
patterns for security-as-a-service applications. 2017 IEEE International Conference on Smart
Cloud, 90. doi:10.1109/SmartCloud.2017.21
159
Trapero, R., Modic, J., Stopar, M., Taha, A., & Suri, N. (2017). A novel approach to manage Cloud
security SLA incidents. Future Generation Computer Systems, 72193-205.
doi:10.1016/j.future.2016.06.004
Trevelyan, E. G., & Robinson, P. N. (2015). Research paper: Delphi methodology in health research:
how to do it? European Journal of Integrative Medicine, 7(Diagnostic Techniques and Outcome
Measures for Integrated Health), 423-428. doi:10.1016/j.eujim.2015.07.002
Tso, F. P., Jouet, S., & Pezaros, D. P. (2016). Network and server resource management strategies for
data centre infrastructures: A survey. Computer Networks, 106, 209-225.
doi:10.1016/j.comnet.2016.07.002
Tunc, C., Hariri, S., Merzouki, M., Mahmoudi, C., de Vaulx, F., Chbili, J., Boh, R., Battou, A., Cloud
Security Automation Framework. (2017). 2017 IEEE 2nd International Workshops on
Foundations and Applications of Self Systems (FASW), 307. doi:10.1109/FAS-W.2017.164
vizardUnited States Small Business Administration. (2017). Table of size standards. Retrieved from
https://www.sba.gov/document/support–table-size-standards
Van Till, S. (2017). Five Cloud-based physical security measures for healthcare organizations. Journal
of Healthcare Protection Management, 33(1), 15-18. Retrieved from:
https://www.iahss.org/page/Journal
Vasiljeva, T., Shaikhulina, S., & Kreslins, K. (2017). Cloud Computing: Business Perspectives, Benefits
and Challenges for Small and Medium Enterprises (Case of Latvia). Procedia
Engineering, 178(RelStat-2016: Proceedings of the 16th International Scientific Conference
Reliability and Statistics in Transportation and Communication October 19-22, 2016. Transport
and Telecommunication Institute, Riga, Latvia), 443-451. doi:10.1016/j.proeng.2017.01.087
160
Viehmann, J., (2014) Risk management for outsourcing to the Cloud: Security risks and safeguards as
selection criteria for extern Cloud services. 2014 IEEE International Symposium on Software
Reliability Engineering Workshops, 293. doi:10.1109/ISSREW.2014.80
Vijayakumar, K., & Arun, C. (2017). Analysis and selection of risk assessment frameworks for Cloud-
based enterprise applications. Biomedical Research. Retrieved from: http://www.biomedres.info/
Vizard, M. (2016). Taking a look inside the AWS Public Cloud. Channel Insider, 1-2. Retrieved from:
https://www.channelinsider.com/cp/bio/Michael-Vizard/
Wakunuma, K., & Masika, R. (2017). Cloud computing, capabilities and intercultural ethics:
Implications for Africa. Telecommunications Policy, 41(ICT developments in Africa –
infrastructures, applications and policies), 695-707. doi:10.1016/j.telpol.2017.07.006
Wang, F., & He, W. (2014). Service strategies of small Cloud service providers: A case study of a small
Cloud service provider and its clients in Taiwan. International Journal of Information
Management, 34406-415. doi:10.1016/j.ijinfomgt.2014.01.007
Wang, Z., Su, X., Diao, Y., Wang, P., & Ge, S. (2015). Study of data security risk relevance about
Cloud computing for small and medium-sized enterprises. Application Research of Computers /
Jisuanji Yingyong Yanjiu, 32(6), 1782-1786. doi:10.3969/j.issn.1001-3695.2015.06.040
Wang, X. V., Wang, L., & Gordes, R. (2018). Interoperability in Cloud manufacturing: a case study on
private Cloud structure for SMEs. International Journal of Computer Integrated Manufacturing,
31(7), 653-663. doi:10.1080/0951192X.2017.1407962
Wang, C., Wood, L. C., Abdul-Rahman, H., & Lee, Y. T. (2016). When traditional information
technology project managers encounter the Cloud: Opportunities and dilemmas in the transition
to Cloud services. International Journal of Project Management, 34, 371-388.
doi:10.1016/j.ijproman.2015.11.006
161
Waterman, M., Noble, J., & Allan, G. (2015). How Much Up-Front? A Grounded theory of Agile
Architecture. 2015 IEEE/ACM 37Th IEEE International Conference on Software Engineering,
347. doi:10.1109/ICSE.2015.54
Weintraub, E., & Cohen, Y. (2016). Security risk assessment of Cloud computing services in a
networked environment. International Journal of Advanced Computer Science and Applications,
7(11) 79-90. doi:10.14569/IJACSA.2016.071112
Wiesche, M., Jurisch, M. C., Yetton, P. W., & Krcmar, H. (2017). Grounded theory methodology in
information sciences research. MIS Quarterly, 41(3), 685-A9. Retrieved from:
https://www.misq.org/
Wilson, A., & Wilson, C. (2011). The effects of U.S. government security regulations on the
cybersecurity professional. Allied Academies International Conference: Proceedings of The
Academy of Legal, Ethical & Regulatory Issues (ALERI), 15(2), 5-12. Retrieved from:
http://www.alliedacademies.org/affiliate-academies-aleri.php
Wu, X., Chen, B., & Weng, J. (2016). Reversible data hiding for encrypted signals by homomorphic
encryption and signal energy transfer. Journal of Visual Communication and Image
Representation, 41, 58-64. doi:10.1016/j.jvcir.2016.09.005
Yimam, D., Fernandez, E., B., Building compliance and security reference architectures (CSRA) for
Cloud Systems. (2016). 2016 IEEE International Conference on Cloud Engineering (IC2E), ic2e,
147. doi:10.1109/IC2E.2016.16
Younis, Y., Kifayat, K., & Merabti, M. (2014). An access control model for Cloud computing. Journal
of Information Security and Applications, 19, 45-60. doi:10.1016/j.jisa.2014.04.003
162
Yu, Y., Li, M., Li, X., Zhao, J. L., & Zhao, D. (2018). Effects of entrepreneurship and IT fashion on
SMEs’ transformation toward Cloud service through mediation of trust. Information &
Management, 55(2), 245-257
Zong-you, D., Wen-long, Z., Yan-an, S., & Hai-too W., (2017) The application of Cloud matter —
Element in information security risk assessment. 2017 3rd International Conference on
Information Management (ICIM), 218. doi:10.1109/INFOMAN.2017.7950379
Zibouh, O., Dalli, a., & Drissi, H. (2016). Cloud computing security through parallelizing fully
homomorphic encryption applied to multi-Cloud approach. Journal of Theoretical & Applied
Information Technology, 87(2), 300. Retrieved from https://www.jatit.org/
Zissis, D., & Lekkas, D. (2012). Addressing Cloud computing security issues. Future Generation
Computer Systems, 28, 583-592. doi:10.1016/j.future.2010.12.006
163
Appendix A Survey Answers Aggregate
Table 1
Survey 1, Q6: What types of SMEs have you performed or been involved in risk assessments
for? Please select all that apply.
Answer Choices Responses Count
Primary (mining, farming,
fishing, etc)
20% 4
Secondary (manufacturing) 25% 5
Tertiary (service, teaching,
nursing, etc)
25% 5
Quaternary 75% 15
Other 20% 4
Any additional comments (We
want your expertise)?
20% 4
164
Table 2.
Survey 1, Q7: What types of risk assessments have you been involved with? Please select all that
apply.
Answer Choices Responses Count
Financial 50% 10
IT (Information Technology) 95% 19
Cloud computing 65% 13
Internal 70% 14
External 80% 16
Qualitative 50% 10
Quantitative 40% 8
Other 10% 2
Any additional comments (We
want your expertise)
0% 0
165
Table 3.
Survey 1, Q8: For SMEs that are planning to adopt Cloud computing, do you see SMEs adopting
IT related frameworks (partially or completely)?
Answer Choices Responses Count
Yes 85% 17
No 15% 30
166
Table 4.
Survey 1. Q9: What IT related frameworks (partially or completely) do you see SMEs adopting?
Answer Choices Response
s
Count
COBIT (Control Objectives for Information and Related
Technologies)
61.11% 11
ITIL (formerly Information Technology Infrastructure Library) 61.11% 11
TOGAF (The Open Group Architecture Framework for enterprise
architecture)
27.78% 5
ISO/IEC 38500 (International Organization for
Standardization/International Electrotechnical Commission Standard
for Corporate Governance of Information Technology)
61.11% 11
COSO (Committee of Sponsoring Organizations of the Treadway
Commission)
38.89% 7
Other 16.67% 3
167
Table 5.
Survey 1, Q10: For SMEs that are planning to adopt Cloud computing, do you see SMEs using
IT security control standards?
Answer Choices Response
s
Count
Yes 95% 19
No 5% 1
168
Table 6.
Survey 1, Q11: What IT security control standards do you see SMEs using? Please select the
standards from the list below.
169
Answer Choices Response
s
Count
CIS (Center for Internet Security) top 20 controls 52.63% 10
NIST SP 800-53 (National Institute of Standard and Technology
Special Publication 800-53 Security and Privacy Controls for
Information Systems and Organizations)
84.21% 16
NIST Cybersecurity Framework (National Institute of Standard and
Technology)
84.21% 16
ISO/IEC 27001 (International Organization for
Standardization/International Electrotechnical Commission
Information Security Management Systems)
73.68% 14
IEC 62443 (International Electrotechnical Commission Industrial
Network and System Security)
5.26% 1
ENISA NCSS (European Union Agency for Network and
Information Security National Cyber Security Strategies)
15.79% 3
HIPAA (Health Insurance Portability and Accountability Act) 78.95% 15
PCI-DSS (Payment Card Industry Data Security Standard) 68.42% 13
GDPR (General Data Protection Regulation) 78.95% 15
Other 5.26% 1
170
Table 7.
Survey 1, Question 12: Have you seen Cloud security configuration baselines used by SMEs?
Answer Choices Response
s
Count
Yes 84.21% 16
No 15.79% 3
171
Table 8.
Survey 1, Q 13: What Cloud security configuration baselines have you seen used by SMEs?
Please select all that apply.
Answer Choices Response
s
Count
DoD Cloud Security requirements guides (Department of Defense) 62.5% 10
DISA/IASE Security requirements guide (Defense Information
Systems Agency Information Assurance Support Environment)
56.25% 9
CSA Cloud security guidance (Cloud Security Alliance) 31.25% 5
FedRAMP Cloud security baselines (Federal Risk and Authorization
Management Program)
68.75% 11
AWS SbD (Amazon Web Services Security by Design) 50% 8
CIS Cloud baselines (Center for Internet Security) 50% 8
Other 0% 0
172
Table 9.
Survey 1, Q14: Do you see any non-technical areas of concern when SMEs are contemplating
Cloud adoption?
Answer Choices Response
s
Count
Yes 100% 20
No 0% 0
173
Table 10.
Survey 1, Q15: What non-technical areas of concern do you see when SMEs are contemplating
Cloud adoption?
Answer Choices Response
s
Count
Governance 80% 16
Business Process 85% 17
Financial (non-technical) 70% 14
Privacy 85% 17
Legal 55% 11
Other 15% 3
Any additional comments (We want your expertise)? 15% 3
174
Table 11.
Survey 1, Q16: Do you see any IT (not security) areas of concern for SMEs as they adopt Cloud
computing?
Answer Choices Response
s
Count
Yes 100% 20
No 0% 0
175
Table 12.
Survey 1, Q17: What IT (non-security) areas of concern do you see for SMEs as they adopt
Cloud computing? Please select all areas of concern that you have seen.
Answer Choices Response
s
Count
Backup and Restore 60% 12
IT Audit Results 75% 15
Transition Process to Cloud 100% 20
Type of Cloud to use IaaS (Infrastructure as a Service), PaaS
(Platform as a service), SaaS (Software as a service)
70% 14
IT Team Knowledge and Skills 75% 15
Network Path to Cloud (redundant paths, multiple Internet service
providers)
65% 13
Cost 55% 11
Psychological Barriers/Concerns 50% 10
Other 0% 0
Other (please specify) 5% 1
176
Table 13.
Survey 1, Q18: Do you see SMEs adopting Cloud security controls?
Answer Choices Response
s
Count
Yes 95% 19
No 5% 1
177
Table 14.
Survey 1, Q 19: What Cloud security controls do you see SMEs adopting? Please select all
Cloud security controls that you have seen.
178
Answer Choices Response
s
Count
Data storage 68.42% 13
VMs (Virtual Machines) 57.89% 11
Micro services (Docker, Kubernetes, etc.) 31.58% 6
Networks 52.63% 10
Virtual security devices (for example; virtual Firewalls or Amazon
Web Services (AWS) security groups)
73.68% 14
Physical security devices (for example; a Hardware Security Module
(HSM))
57.89% 11
CASB (Cloud Access Security Broker) 21.05% 4
Encryption at rest 78.95% 15
Encryption in transit 89.47% 17
Encryption during compute (homomorphic encryption) 31.58% 6
Backup 52.63% 10
SecaaS (Security as a Service) 31.58% 6
SecSLA (Security Service Level Agreement) 15.79% 3
IAM (Identity and Access Management) 63.16% 12
MultiCloud 15.79% 3
179
Other 0% 0
Table 15.
Survey 1, Q20: Have you seen specific recommendations made to SMEs regarding Cloud
computing adoption.
Answer Choices Response
s
Count
Yes 75% 15
No 25% 5
180
Table 16.
Survey 1, Q21: What specific recommendations have you seen made to SMEs regarding Cloud
computing adoption? Please select all recommendations that you have seen.
Answer Choices Response
s
Count
Accept CSP’s (Cloud Service Provider) attestations such as SAS 70
(Statement of Auditing Standards #70) as proof of compliance
73.33% 11
Accept CSP’s (Cloud Service Provider) SLAs (Service Level
Agreement) or SecSLAs (Security Service Level Agreement)
60% 9
Outsource or contract Cloud operations 80% 12
Do not adopt Cloud 0% 0
Partial Cloud adoption (for example: no sensitive data allowed in
Cloud)
73.33% 11
Other 6.67% 1
181
Survey Two
Table 17.
Survey 2, Q6: How well defined is the term Cloud? Do you see a distinction in the risk analysis
and auditing between the terms below? Please select all that apply.
Answer Choices Response
s
Count
Distinction between colocation and Cloud. 73.91% 17
Distinction between IaaS (Infrastructure as a Service) and cPanel
controlled hosting.
47.83% 11
Distinction between VMware environment and a private Cloud. 56.52% 13
PaaS (Platform as a Service) and SaaS (Software as a Service). 82.61% 19
Other 0% 0
182
Table 18.
Survey 2, Q7: Most SMEs have Cloud operations in progress. Which scenarios have you seen
and which have you seen audited by the SMEs? Please select all that apply.
Answer Choices Response
s
Count
Shadow IT. Cloud spending not officially budgeted, audited, or
documented.
69.57% 16
Test or development environments in Cloud. 65.22% 15
SMEs auditing and securing Cloud test or development
environments.
39.13% 9
One off solutions. For example; a business group using DropBox for
its own files.
52.17% 12
SMEs auditing and securing one off solutions. 39.13% 9
Business group or team using non-standard CSP. For example; SME
has picked AWS but enterprise exchange team is using Azure.
43.48% 10
SMEs auditing and securing non-standard CSP. 30.43% 7
Other. 4.35% 1
183
Table 19.
Survey 2, Q8: When starting to plan a transition to a Cloud environment, what have you seen
SMEs start with before risk assessments or collections of requirements? Please select all that
apply.
Answer Choices Response
s
Count
Choice of CSP (Cloud service provider). 86.96% 20
Choice of infrastructure such as IaaS (Infrastructure as a Service),
PaaS (Platform as a Service), or SaaS (Software as a Service).
69.57% 16%
Choice of IT framework such as COBIT, ITIl, or ISO/IEC 38500. 30.43% 7%
Choice of security control standards such as NIST SP 800-53 or
CSF, HIPAA, or PCI-DSS.
47.83% 11
Choice of Cloud security baselines such as FedRAMP, CIS, or CSA. 47.83% 11
Automation tools such as DevOps or SecDevOps. 26.09% 6
Other 4.35% 1
184
Table 20.
Survey 2, Q9: Do you see SMEs effectively plan their Cloud usage and growth? Please select all
that apply.
Answer Choices Response
s
Count
The SME has a unified plan for their Cloud transition including
auditing and documenting the process.
4.76% 1
The SME has previously adopted Cloud tools and environments as
solutions to single problems. For example; a business group has
adopted Google Docs to share documents.
57.14% 12
The SME views Cloud solutions as solutions to single problems. 33.33% 7
The SME has a Cloud audit team or subject matter expert. 47.62% 10
The SME has separate controls for Cloud environments. 42.86% 9
The SME has BC / DR plans for CSP failures. 33.33% 7
The SME has security procedures for transferring data from on-
premises to Cloud environment.
38.10% 8
The SME moves IT infrastructure to CSPs as servers, IT equipment,
or data centers reach end of life or leases expire.
66.67% 14
Other. 0% 0
185
Table 21.
Survey 2, Q10: 100% of respondents to Survey 1 have seen recommendations to outsource the
transition to a Cloud environment. Which portions of a transition to a Cloud environment have
you seen recommended to be outsourced? Please select all that apply.
Answer Choices Response
s
Count
Entire transition including choice of CSP (Cloud Service Provider),
type of virtual environment, and transfer of data.
15.79% 3
Selecting CSP and type of infrastructure such as IaaS, PaaS, or
SaaS.
47.37% 9
Creating and executing data transfer plan to Cloud environment. 68.42% 13
Creating and executing security controls in Cloud environment. 42.11% 8
Managed or professional services including ongoing management of
SME data and IT operations.
73.68% 14
Managed security services including scheduled audits or penetration
testing.
42.11% 8
Other. 0% 0
186
Table 22.
Survey 2, Q11: Most survey 1 respondents identified a lack of current SME IT staff expertise
and/or desire as an issue in transition to the Cloud. Are there specific staff issues that you have
seen? Please select all that apply.
Answer Choices Response
s
Count
IT staff not sized appropriately. 89.47% 17
Budget for IT staff training in Cloud environments is lacking. 78.95% 15
IT staff resistant to transition to Cloud environments. 63.16% 12
Governance or management structure not adequate for transition to
Cloud environments, For example; IT is a silo and makes its own
decisions.
78.95% 15
SME business structure or processes not conducive to Cloud
operations. For example; each business unit has distinct IT staff and
IT budget.
63.16% 12
Other 5.26% 1
187
Table 23.
Survey 2, Q12: What solutions have you seen SMEs use to remedy a lack of staff Cloud
training? Please select all that apply.
Answer Choices Response
s
Count
Internal ad-hoc training. For example; a CSP account for staff use. 57.89% 11
General Cloud and Cloud security training courses. For example;
SANS courses.
36.84% 7
Specific CSP training. For example; AWS architect training. 42.11% 8
Hiring of additional personnel. 42.11% 8
Outsourcing Cloud related work to a third party. 73.68% 14
Hiring consultants or professional services to complement SME
staff.
78.95% 15
Other. 0% 0
188
Table 24.
Survey 2, Q13: Survey 1 respondents listed a variety of non-IT related concerns with a transition
to a Cloud environment. Which concerns have you seen outsourced and risk assessed by SMEs?
Please select all that apply.
189
Answer Choices Response
s
Count
Privacy. 38.89% 7
Outsourced privacy procedures risk assessed by SMEs. 33.33% 6
Legal. 44.44% 8
Outsourced legal procedures risk assessed by SMEs. 38.89% 7
Governance. 33.33% 6
Outsourced governance procedures risk assessed by SMEs. 27.78% 5
Business process. 44.44% 8
Outsourced business process procedures risk assessed by SMEs. 27.78% 5
Business continuity / Disaster recovery. 44.44% 8
Outsourced BC / DR procedures risk assessed by SMEs. 11.11% 2
Risk assessment. 44.44% 8
Outsourced risk assessment procedures risk assessed by SMEs. 38.89% 7
Outsourced other procedures risk assessed by SMEs. 33.33% 6
Other. 5.56% 1
190
Table 25.
Survey 2, Q14: What are the important factors for a SME when choosing a CSP? Please select all
that apply.
Answer Choices Response
s
Count
Cost. 80% 16
East of use. 60% 12
Auditing and logging capabilities. 60% 12
Security tools. 60% 12
Automation tools (DevOps, SecDevOps). 50% 10
Stability and reliability. 80% 16
Professional or management services. 30% 6
Industry specific tools. For example: For example a CSP that
specializes in HIPAA or PCI-DSS controls.
30% 6
SME IT team familiarity with CSP tools. For example: a MS
Windows IT shop selecting Azure as a CSP.
35% 7
Other. 5% 1
191
Table 26.
Survey 2, Q 15: Which CSPs have you seen used by SMEs? Please select all that apply.
192
Answer Choices Response
s
Count
AWS (Amazon Web Services) 94.74% 18
Microsoft Azure 68.42% 13
Google Cloud platform 47.37% 9
IBM Cloud 42.11% 8
Rackspace 21.05% 4
GoDaddy 10.53% 2
Verizon Cloud 15.79% 3
VMware 52.63% 10
Oracle Cloud 21.05% 4
1&1 5.26% 1
Digital Ocean 10.53% 2
MageCloud 0% 0
InMotion 0% 0
CloudSigma 0% 0
Hyve 0% 0
Ubiquity 0% 0
Hostinger 0% 0
193
Togglebox 0% 0
Alantic.net 0% 0
Navisite 0% 0
Vultr 0% 0
SIM-Networks 0% 0
GigeNet 0% 0
VEXXHOST 0% 0
E24Cloud 0% 0
ElasticHosts 0% 0
LayerStack 0% 0
Other 5.26% 1
194
Table 27.
Survey 2, Q16: Many SMEs use several different Cloud based IT tools. Which tools have you
seen in use, and have you seen them audited? Please select all that apply.
195
Answer Choices Response
s
Count
Cloud email. For example; Gmail. 72.22% 13
Cloud file storage. For example DropBox. 61.11% 11
Cloud office applications. For example o365 55.56% 10
Cloud chat / communications. For example; Slack. 55.56% 10
Cloud file storage audited by SME. 38.89% 7
Cloud CDN (content delivery network). For example; Akamai. 38.89% 7
Email audited by SME. 33.33% 6
Cloud CRM. For example; Salesforce. 33.33% 6
Cloud office applications audited by SME. 22.22% 4
Cloud chat / communications audited by SME. 16.67% 3
Cloud based backup. For example; Zetta. 16.67% 3
Cloud based backup audited by SME. 11.11% 2
Web hosting. For example; GoDaddy. 11.11% 2
Cloud CDN audited by SME. 11.11% 2
Cloud CRM audited by SME. 5.56% 1
Other 5.56% 1
Other audited by SME. 0% 0
196
197
Survey 3
Table 28.
Survey 3, Q6: Have you seen SMEs adapt their risk assessment process for Cloud environments
in any of the following ways? Please select all that apply.
Answer Choices Response
s
Count
Adding Cloud experts to the audit team. 61.11% 11
Outsourcing Cloud audits or risk assessments. 77.78% 14
Break Cloud audits into smaller processes. 33.33% 6
Limit Cloud audits or risk assessments to CSP attestations. 22.22% 4
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
198
Table 29.
Survey 3, Q7: Have you seen SMEs change how they identify and describe hazards in a Cloud
risk assessment in the ways listed below? Please select all that apply.
Answer Choices Response
s
Count
New hazards specific to CSP, infrastructure, platform, or service are
included.
70.59% 12
New hazards based on the network path between on-premises and
CSP are included.
41.18% 7
New hazards based on specific differences between on-premises and
CSP environments are included.
41.18% 7
No new hazards are included, existing on-premises definitions used. 17.65% 3
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
199
Table 30.
Survey 3, Q8: Do you see the results of Cloud environment risk assessments and audits changing
the way SMEs conduct business in a meaningful way as per the choices below? Please select all
that apply.
Answer Choices Response
s
Count
Large IT budget reductions. 17.65% 3
Large IT budget increases. 35.29% 6
Changes in risk mitigation costs or procedures. 82.35% 14
Changes in risk avoidance costs or procedures. 64.71% 11
Changes in risk transference costs or procedures. 47.06% 8
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
200
Table 31.
Survey 3, Q9: When deciding who might be harmed and how, do you see SMEs including new
Cloud based factors such as those listed below? Please select all that apply.
Answer Choices Response
s
Count
National or international norms based on where the CSP is based or
operates?
29.41% 5
National or international norms based on where the SME is based or
operates?
17.65% 3
Specific legal requirements for data such as GDRP. 52.94% 17
201
Table 32.
Survey 3, Q10: When assessing risk of Cloud environments, do you see SMEs changing their
process in the ways listed below? Please select all that apply.
Answer Choices Response
s
Count
Using CSP recommended practices 55.56% 10
Using any IT governance frameworks not previously used by the
SME.
61.11% 11
Using any IT controls not previously used by the SME. 77.78% 14
Using any Cloud security control guides not previously used by the
SME.
61.11% 11
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
202
Table 33.
Survey 3, Q11: Who do you see SMEs assigning risk ownership t regarding Cloud
environments? Please select all that apply.
Answer Choices Response
s
Count
SME IT team. 0% 0
SME security team. 33.33% 6
3rd party. 11.11% 2
Business owner 38.89% 7
SME does not change risk ownership procedures. 16.67% 3
203
Table 34.
Survey 3, Q12: When identifying controls to reduce risk in Cloud environments, do you see
SMEs changing their process in the ways listed below? Please select all that apply.
Answer Choices Response
s
Count
Primarily relying on CSP provided controls. 50% 9
Adapting new controls from any IT governance frameworks. 77.78% 14
Using any new non-Cloud specific IT security controls. 22.22% 4
Using any Cloud security control guides 66.67% 12
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
204
Table 35.
Survey 3, Q 13: Once controls have been identified for the SME’s environment, what effect do
they have on existing SME IT controls? Please select all that apply.
Answer Choices Response
s
Count
New Cloud controls are kept separate from existing control
catalogues.
35.29% 6
New Cloud controls are combined with existing controls to form
larger control catalogues.
64.71% 11
New Cloud controls promise to replace or reduce existing control
catalogues spurring increased Cloud transitions.
17.65% 3
New Cloud controls appear onerous and reduce Cloud transitions
due to increased difficulty.
5.88% 1
Other (Please describe) or any additional comments (We want your
expertise)?
5.88% 1
205
Table 36.
Survey 3, Q14: Have you seen Cloud risk assessments change other previously completed SME
risk assessments in the ways listed below? Please select all that apply.
Answer Choices Response
s
Count
Previous risk assessments changed because of CSP location. 6.25% 1
Previous risk assessments changed because of new legal or
regulatory requirements based on Cloud usage.
37.50% 6
Previous risk assessments changed because of new financial
requirements based on Cloud usage.
6.25% 1
Previous risk assessments changed because of new insurance
requirements based on Cloud usage.
6.25% 1
Previous risk assessments changed because of new market
requirements based on Cloud usage.
0% 0
Previous risk assessments changed because of new operational
requirements based on Cloud usage.
37.5% 6
Previous risk assessments changed because of new strategic
requirements based on Cloud usage.
6.25% 1
Other (Please describe) or any additional comments (We want your
expertise)?
0% 0
206
Table 37.
Survey 3, Q15: Cloud transitions almost always promise cost saving and Cloud operations
usually require less effort than on-premise IT operations. Cloud transitions, however, increase
the risk and audit team’s responsibilities, knowledge and skills requirements. How do you see
SMEs changing their risk and audit teams to adapt to Cloud environments? Please select all that
apply.
Answer Choices Responses Count
Increase size and budget of risl and audit teams. 41.18% 7
Reorganize or change structure of risk and audit teams 64.71% 11
Increase outsourcing or use of consultants to perform Cloud risk and
audit duties.
47.06% 8
Increase workload of existing risk and audit teams. 76.47% 13
207
Appendix B Survey one individual answers
Table 38
Survey One, Question One.
208
My name is Matthew Meersman. I am a doctoral student at Northcentral University. I am
conducting a research study on Cloud computing risk assessments for Small to Medium sized
enterprises (SMEs). I am completing this research as part of my doctoral degree. Your
participation is completely voluntary. I am seeking your consent to involve you and your
information in this study. Reasons you might not want to participate in the study include a lack
of knowledge in Cloud computing risk assessments. You may also not be interested in Cloud
computing risk assessments. Reasons you might want to participate in the study include a desire
to share your expert knowledge with others. You may also wish to help advance the field of
study on Cloud computing risk assessments. An alternative to this study is simply not
participating. I am here to address your questions or concerns during the informed consent
process via email. This is not an ISACA sponsored survey so there will be no CPEs awarded for
participation in this survey. PRIVATE INFORMATION Certain private information may be
collected about you in this study. I will make the following effort to protect your private
information. You are not required to include your name in connection with your survey. If you
do choose to include your name, I will ensure the safety of your name and survey by maintaining
your records in an encrypted password protected computer drive. I will not ask the name of your
employer. I will not record the IP address you use when completing the survey. Even with this
effort, there is a chance that your private information may be accidentally released. The chance is
small but does exist. You should consider this when deciding whether to participate. If you
participate in this research, you will be asked to:1. Participate in a Delphi panel of risk experts by
answering questions in three web-based surveys. Each survey will contain twenty to thirty
questions and should take less than twenty minutes to complete. Total time spent should be one
hour over a period of approximately six to eight weeks. A Delphi panel is where I ask you risk
209
experts broad questions in the first survey. In the second survey I ask you new questions based
on what you, as a group, agreed on. I do the same thing for the third round. By the end, your
expert judgement may tell us what works in Cloud risk assessments.Eligibility: You are eligible
to participate in this research if you: 1. Are an adult over the age of eighteen. 2. Have five or
more years of experience in the IT risk field.You are not eligible to participate in this research if
you: 1. Under the age of eighteen. 2. If you have less than five years of experience in the IT risk
field. I hope to include twenty to one hundred people in this research. Because of word limits in
Survey Monkey questions you read and agree to this page and the next page to consent to this
study.
Respondent
ID Response
10491254797 Agree
10490673669 Agree
10487064183 Agree
10485140823 Agree
10484829239 Agree
10484680154 Agree
10484537295 Agree
10475221052 Agree
10475220285 Agree
10471701090 Agree
10471688387 Agree
10471530810 Agree
10448076199 Agree
10447785436 Agree
10445940900 Agree
10431943351 Agree
10431854058 Agree
10427699337 Agree
10417386813 Agree
10412337046 Agree
10407510054 Agree
10407305328 Agree
10407277615 Agree
210
10402182952 Agree
Table 39
Survey One, Question Two.
Part 2 of the survey confidentiality agreement Risks: There are minimal risks in this
study. Some possible risks include: a third party figuring out your identity or your employer’s
identity if they are able to see your answers before aggregation of answers takes place.To
decrease the impact of these risks, you can skip any question or stop participation at any
time.Benefits: If you decide to participate, there are no direct benefits to you.The potential
benefits to others are: a free to use Cloud computing risk assessment tool. Confidentiality: The
information you provide will be kept confidential to the extent allowable by law. Some steps I
will take to keep your identity confidential are; you are not required to provide your name or
your employer’s name. I will not record your IP address.The people who will have access to your
information are myself, and/or, my dissertation chair, and/or, my dissertation committee. The
Institutional Review Board may also review my research and view your information.I will secure
your information with these steps: Encrypting all data received during this study during storage.
There will be no printed copies. There will be one copy of the data stored on an encrypted thumb
drive that is stored in my small home safe. There will be one copy of the data stored as an
encrypted archive in my personal Google G Drive folder.I will keep your data for 7 years. Then,
I will delete the electronic data in the G Drive folder and destroy the encrypted thumb drive.
Contact Information: If you have questions for me, you can contact me at: 202-798-3647 or
M.Meersman5121@o365.ncu.eduMy dissertation chair’s name is Dr. Smiley. He works at
Northcentral University and is supervising me on the research. You can contact him at:
Gsmiley@ncu.edu or 703.868.4819If you contact us you will be giving us information like your
211
phone number or email address. This information will not be linked to your responses if the
study is anonymous.If you have questions about your rights in the research, or if a problem has
occurred, or if you are injured during your participation, please contact the Institutional Review
Board at: irb@ncu.edu or 1-888-327-2877 ext 8014. Voluntary Participation: Your participation
is voluntary. If you decide not to participate, or if you stop participation after you start, there
will be no penalty to you. You will not lose any benefit to which you are otherwise
entitled.Future ResearchAny information or specimens collected from you during this research
may not be used for other research in the future, even if identifying information is removed.
AnonymityThis study is anonymous, and it is not the intention of the researcher to collect your
name. However, you do have the option to provide your name voluntarily. Please know that if
you do, it may be linked to your responses in this study. Any consequences are outside the
responsibility of the researcher, faculty supervisor, or Northcentral University. If you do wish to
provide your name, a space will be provided. Again, including your name is voluntary, and you
212
can continue in the study if you do not provide your
name.________________________________ (Your Signature only if you wish to sign)
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 Yes
10475220285 No
10471701090 Yes
10471688387 No
10471530810 No
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 No
10407305328 Yes
10407277615 Yes
10402182952 Yes
213
Table 40
Survey One, Question Three.
Are you between the ages of 18 to 65?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 Yes
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
214
Table 41
Survey One, Question Four
Do you have 5 or more years in the risk field (please include any postgraduate education)?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 Yes
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
Table 42
Survey One, Question Five
For this study we define small to medium enterprises (SMEs) by the European Commission
guidelines: Small (15 million or less in annual revenue) to medium (60 million or less in annual
revenue) sized enterprises that are not subsidiaries of large enterprises or governments, or wholly
215
or partially supported by large enterprises or governments. Please remember that you are free to
not answer any of the following questions that you wish. If a question is asking for information
you do not wish to share, do not answer it.
Respondent ID Response
10491254797 Agree
10490673669 Agree
10487064183 Agree
10485140823 Agree
10484829239 Agree
10484680154 Agree
10484537295 Agree
10475221052 Agree
10475220285 Null
10471701090 Agree
10471688387 Null
10471530810 Null
10448076199 Agree
10447785436 Agree
10445940900 Agree
10431943351 Agree
10431854058 Agree
10427699337 Agree
10417386813 Agree
10412337046 Agree
10407510054 Null
10407305328 Agree
10407277615 Agree
10402182952 Agree
216
Table 43
Survey One, Question Six.
What types of SMEs have you performed or been involved in risk assessments for? Please select
all that apply:
217
Respondent
ID
Primary
(mining,
farming,
fishing,
etc)
Secondary
(manufacturing)
Tertiary
(service,
teaching,
nursing,
etc)
Quaternary
(IT, research,
and
development,
etc) Other
Any additional
comments (We want
your expertise)?
10491254797 Yes Yes
10490673669 Yes
10487064183 Yes Yes Yes Yes Yes
10485140823 Yes
10484829239 Yes Yes
10484680154 Yes
10484537295
Primary – Info Tech
Secondary –
Intellectual
Property\R&D
Tertiary –
Media\Entertainment
Quaternary –
Manufacturing
10475221052 Yes
10475220285
10471701090 Yes Yes Yes
10471688387
10471530810
10448076199 Yes Yes Yes
10447785436 Yes
10445940900 Yes Yes Yes
10431943351 Yes Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
performing risk
assessments and
setting up a program
from scratch
10412337046 Yes
10407510054
218
10407305328 Yes
Your question is
unclear. I have
performed risk
assessments over a
24 year period across
17 different
industries, including
1) Entertainment &
Media, 2)
Technology, 3)
Financial Services –
Banks, 5) Financial
Services –
Broker/Dealers, 6)
Hospitality, 7)
Manufacturing, 8)
Retail, 9) Higher
Education, 10) Non-
Profit, 11)
Telecommunications,
12) Transportation
and Logistics, 13)
Healthcare, 14)
Pharmaceuticals, 15)
Mining, 16)
Financial Sevices –
Insurance, 17)
Financial Services –
Mortgage Banking
10407277615 Yes Federal Government
10402182952 Yes Yes
219
Table 44
Survey One, Question Seven.
What types of risk assessments have you been involved with? Please select all that apply:
220
Responde
nt ID
Finan
cial
IT
(Informa
tion
Technol
ogy)
Cloud
comput
ing
Inter
nal
Exter
nal
Qualitat
ive
Quantita
tive
Oth
er
Any
additio
nal
comme
nts (We
want
your
expertis
e)?
10491254
797 Yes Yes Yes Yes Yes Yes Yes
10490673
669 Yes Yes
10487064
183 Yes Yes Yes Yes Yes Yes Yes
10485140
823 Yes Yes Yes Yes Yes
10484829
239 Yes
10484680
154 Yes Yes
10484537
295 Yes Yes Yes Yes Yes Yes
10475221
052 Yes Yes Yes Yes Yes
10475220
285
10471701
090 Yes Yes Yes Yes
10471688
387
10471530
810
10448076
199 Yes Yes Yes Yes Yes Yes
10447785
436 Yes Yes Yes
10445940
900 Yes Yes Yes Yes Yes Yes Yes Yes
10431943
351 Yes Yes Yes Yes Yes
10431854
058 Yes Yes Yes Yes
10427699
337 Yes Yes
221
10417386
813 Yes Yes Yes Yes Yes
10412337
046 Yes Yes Yes Yes
10407510
054
10407305
328 Yes Yes Yes Yes Yes Yes Yes
10407277
615 Yes Yes Yes
10402182
952 Yes Yes Yes Yes Yes Yes
222
Table 44
Survey One, Question Eight.
What IT related frameworks (partially or completely) do you see SMEs adopting?
223
Respond
ent ID
COBIT
(Control
Objective
s for
Informati
on and
Related
Technolo
gies)
ITIL
(formerl
y
Informat
ion
Technol
ogy
Infrastru
cture
Library)
TOGAF
(The
Open
Group
Archite
cture
Framew
ork for
enterpri
se
architec
ture)
ISO/IEC 38500
(International
Organization for
Standardization/Int
ernational
Electrotechnical
Commission
Standard for
Corporate
Governance of
Information
Technology)
COSO
(Commit
tee of
Sponsori
ng
Organiza
tions of
the
Treadwa
y
Commiss
ion)
Oth
er
Any
addition
al
commen
ts (We
want
your
expertise
)?
1049125
4797 Yes Yes Yes
Ye
s
1049067
3669 Yes
1048706
4183 Yes
1048514
0823 Yes
Ye
s
FISMA /
NIST
and
FedRA
MP are
also
consider
ed to be
framewo
rks and
are
probably
the most
widely
enforced
standard
s
1048482
9239 Yes Yes Yes
1048468
0154 Yes
1048453
7295 Yes
1047522
1052
224
1047522
0285
1047170
1090 Yes Yes Yes
1047168
8387
1047153
0810
1044807
6199 Yes Yes
1044778
5436 Yes
1044594
0900 Yes Yes Yes Yes Yes
1043194
3351 Yes Yes Yes
1043185
4058 Yes Yes Yes
1042769
9337 Yes Yes Yes Yes Yes
1041738
6813
1041233
7046 Yes
1040751
0054
1040730
5328 Yes Yes Yes Yes Yes
Ye
s
NIST
800-53
and
other
NIST
framewo
rks
1040727
7615 Yes Yes Yes
1040218
2952 Yes Yes Yes
COBIT
l, COSO,
and ITIL
are used
in SME
environ
ments in
Europe
and the
Middle
East.
225
Table 46
Survey One, Question Nine.
For SMEs that are planning to adopt Cloud computing, do you see SMEs using IT security
control standards?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 No
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
226
Table 47
Survey One, Question Ten.
What IT security control standards do you see SMEs using? Please select the standards from the
list below.
227
Resp
onde
nt ID
CIS
(Ce
nter
for
Inte
rnet
Sec
urit
y)
top
20
con
trol
s
NIST
SP
800-
53
(Nati
onal
Instit
ute of
Stand
ard
and
Techn
ology
Speci
al
Publi
cation
800-
53
Secur
ity
and
Priva
cy
Contr
ols
for
Infor
matio
n
Syste
ms
and
Organ
izatio
ns)
NIST
Cybe
rsecu
rity
Fram
ewor
k
(Nati
onal
Instit
ute of
Stand
ard
and
Tech
nolog
y)
ISO/IEC
27001
(Internation
al
Organizatio
n for
Standardiza
tion/Internat
ional
Electrotech
nical
Commissio
n
Information
Security
Managemen
t Systems)
IEC
62443
(Intern
ational
Electr
otechn
ical
Comm
ission
Indust
rial
Netwo
rk and
Syste
m
Securi
ty)
ENI
SA
NCS
S
(Eur
opea
n
Unio
n
Age
ncy
for
Net
wor
k
and
Infor
mati
on
Secu
rity
Nati
onal
Cyb
er
Secu
rity
Strat
egie
s)
HIPA
A
(Healt
h
Insura
nce
Porta
bility
and
Acco
untabi
lity
Act)
PCI
–
DS
S
(Pa
ym
ent
Car
d
Ind
ustr
y
Dat
a
Sec
urit
y
Sta
nda
rd)
GDP
R
(Gen
eral
Data
Prot
ectio
n
Reg
ulati
on)
O
th
er
Any
additi
onal
com
ments
(We
want
your
exper
tise)?
1049
1254
797 Yes Yes Yes Yes Yes Yes Yes
1049
0673
669 Yes Yes Yes Yes Yes
1048
7064
183 Yes Yes Yes Yes Yes Yes Yes
Y
es
228
1048
5140
823 Yes Yes Yes Yes
1048
4829
239 Yes Yes Yes Yes Yes
1048
4680
154 Yes Yes Yes Yes
1048
4537
295 Yes Yes Yes Yes
1047
5221
052
1047
5220
285
1047
1701
090 Yes Yes Yes Yes Yes
1047
1688
387
1047
1530
810
1044
8076
199 Yes Yes Yes Yes Yes Yes Yes
1044
7785
436 Yes Yes Yes Yes Yes Yes Yes
1044
5940
900 Yes Yes Yes Yes Yes Yes Yes Yes Yes
1043
1943
351 Yes Yes Yes Yes Yes
1043
1854
058 Yes Yes Yes Yes Yes
SSAE
18 –
SOC
2
1042
7699
337 Yes Yes Yes Yes
229
1041
7386
813 Yes Yes Yes Yes Yes
1041
2337
046 Yes Yes Yes Yes
1040
7510
054
1040
7305
328 Yes Yes Yes Yes Yes Yes Yes
DHS
–
Cyber
Resili
ency
Fram
ewor
k
(CRR
)
1040
7277
615 Yes Yes Yes
230
1040
2182
952 Yes Yes Yes Yes Yes Yes
SMEs
will
apply
the
stand
ard
most
releva
nt in
the
US to
win
new
busin
ess to
meet
contr
actual
condi
tions
and
apply
other
int’l/
Europ
ean
depen
ding
if
they
do
busin
ess in
Europ
e.
231
Table 48
Survey One, Question Eleven.
Have you seen Cloud security configuration baselines used by SMEs?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 Null
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 No
10431854058 Yes
10427699337 Yes
10417386813 No
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 No
10402182952 Yes
232
Table 49
Survey One, Question Twelve.
What Cloud security configuration baselines have you seen used by SMEs? Please select all that
apply:
233
Responde
nt ID
DoD
Cloud
Security
requireme
nts guides
(Departm
ent of
Defense)
DISA/IAS
E Security
requireme
nts guide
(Defense
Informatio
n Systems
Agency
Informatio
n
Assurance
Support
Environm
ent)
CSA
Cloud
securit
y
guidan
ce
(Cloud
Securit
y
Allian
ce)
FedRAM
P Cloud
security
baselines
(Federal
Risk and
Authorizat
ion
Managem
ent
Program)
AWS
SbD
(Amaz
on
Web
Servic
es
Securit
y by
Design
)
C1IS
Cloud
baselin
es
(Cente
r for
Interne
t
Securit
y)
Oth
er
Any
addition
al
comme
nts (We
want
your
expertis
e)?
10491254
797 Yes Yes Yes Yes
10490673
669 Yes Yes Yes Yes
10487064
183 Yes Yes Yes Yes Yes
10485140
823 Yes Yes
10484829
239 Yes Yes Yes
10484680
154 Yes Yes Yes
10484537
295 Yes Yes
10475221
052
10475220
285
10471701
090 Yes
10471688
387
10471530
810
10448076
199 Yes Yes Yes Yes Yes
10447785
436 Yes Yes
10445940
900 Yes Yes Yes Yes Yes Yes
234
10431943
351
10431854
058 Yes
10427699
337 Yes Yes
10417386
813
10412337
046 Yes Yes Yes Yes Yes
10407510
054
10407305
328 Yes Yes Yes
10407277
615
10402182
952 Yes Yes Yes
235
Table 50
Survey One, Question Thirteen.
Do you see any non-technical areas of concern when SMEs are contemplating Cloud adoption?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 Yes
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
236
Table 51
Survey One, Question Fourteen.
What non-technical areas of concern do you see when SMEs are contemplating Cloud adoption?
Respondent
ID Governance
Business
Process
Financial
(non-
technical) Privacy Legal Other
Any
additional
comments
(We want
your
expertise)?
10491254797 Yes Yes Yes Yes Yes
10490673669 Yes Yes Yes
10487064183 Yes Yes Yes Yes Yes Yes
10485140823 Yes Yes Yes Yes Yes
10484829239 Yes Yes Yes Yes Yes
10484680154 Yes Yes Yes Yes
10484537295 Yes Yes Yes Yes Yes
10475221052 Yes Yes Yes Yes Yes Human
10475220285
10471701090 Yes Yes
10471688387
10471530810
10448076199 Yes Yes
10447785436 Yes Yes Yes Yes
10445940900 Yes Yes Yes Yes Yes
10431943351 Yes Yes Yes Yes
10431854058 Yes Yes
10427699337 Yes Yes
10417386813 Yes Yes Yes
10412337046 Yes Yes Yes Yes
10407510054
10407305328 Yes Yes Yes Yes Yes
Business
Continuity/
Disaster
Recovery
Third Party
risk
management
10407277615 Yes Yes Yes Yes Cost
10402182952 Yes Yes Yes
237
Table 52
Survey One, Question Fifteen.
Do you see any IT (not security) areas of concern for SMEs as they adopt Cloud computing?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 Yes
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
238
Table 53
Survey One, Question Sixteen.
What IT (not security) areas of concern do you see for SMEs as they adopt Cloud computing?
Please select all areas of concern that you have seen.
239
Respon
dent ID
Back
up
and
Rest
ore
IT
Aud
it
Res
ults
Transi
tion
Proces
s to
Cloud
Type of
Cloud to
use IaaS
(Infrastru
cture as a
Service),
PaaS
(Platform
as a
Service),
SaaS
(Softwar
e as a
Service)
IT
Team
Knowl
edge
and
Skills
Netwo
rk Path
to
Cloud
(redun
dant
paths,
multipl
e
Interne
t
service
provid
ers)
Co
st
Psychologic
al
Barriers/Co
ncerns
Other
(please
specify)
1049125
4797 Yes Yes Yes Yes Yes
Ye
s
1049067
3669 Yes Yes Yes Yes
1048706
4183 Yes Yes Yes Yes Yes Yes
Ye
s Yes
1048514
0823 Yes Yes Yes
1048482
9239 Yes Yes Yes Yes Yes Yes
Ye
s
1048468
0154 Yes Yes Yes Yes Yes Yes
1048453
7295 Yes Yes Yes Yes
Ye
s
1047522
1052 Yes Yes Yes Yes Yes
Ye
s Yes
1047522
0285
1047170
1090 Yes Yes
1047168
8387
1047153
0810
1044807
6199 Yes Yes Yes Yes Yes
Ye
s
1044778
5436 Yes Yes Yes Yes Yes Yes Yes
1044594
0900 Yes Yes Yes Yes Yes Yes
Ye
s Yes
240
1043194
3351 Yes Yes Yes Yes
1043185
4058 Yes Yes Yes Yes
1042769
9337 Yes Yes Yes Yes
1041738
6813 Yes Yes Yes Yes
1041233
7046 Yes Yes Yes Yes Yes
Ye
s
1040751
0054
1040730
5328 Yes Yes Yes Yes Yes Yes
Ye
s Yes
Adoptio
n of IT
processe
s in the
cloud –
such as
patch
and
vulnera
bility
manage
ment
and
SDLC
Asset
manage
ment –
includin
g
handlin
g end of
life
assets
that
cannot
transitio
n to the
cloud.
1040727
7615 Yes Yes Yes Yes Yes Yes
Ye
s
1040218
2952 Yes Yes Yes
Ye
s
241
Table 54
Survey One, Question Seventeen.
Do you see SMEs adopting Cloud security controls?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 Yes
10475221052 No
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 Yes
10445940900 Yes
10431943351 Yes
10431854058 Yes
10427699337 Yes
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
242
Table 55
Survey One, Question Eighteen. Part One of Two.
What Cloud security controls do you see SMEs adopting? Please select all Cloud security
controls that you have seen.
243
Responde
nt ID
Data
stora
ge
VMs
(Virtua
l
Machin
es)
Micro
services
(Docker
,
Kuberne
tes, etc.)
Netwo
rks
Virtua
l
securit
y
device
s (for
examp
le;
virtual
Firew
alls or
Amaz
on
Web
Servic
es
(AWS
)
securit
y
group
s)
Physic
al
securit
y
device
s (for
examp
le; a
Hardw
are
Securit
y
Modul
e
(HSM)
)
CAS
B
(Clou
d
Acce
ss
Secur
ity
Brok
er)
Encrypt
ion at
rest
Encrypt
ion in
transit
1049125
4797 Yes Yes Yes Yes Yes
1049067
3669 Yes Yes Yes Yes Yes Yes Yes
1048706
4183 Yes Yes Yes Yes Yes Yes Yes Yes
1048514
0823 Yes Yes Yes Yes Yes
1048482
9239 Yes Yes Yes Yes
1048468
0154 Yes Yes Yes Yes Yes
1048453
7295 Yes Yes Yes
1047522
1052
1047522
0285
1047170
1090 Yes Yes Yes Yes
1047168
8387
1047153
0810
244
1044807
6199 Yes Yes Yes Yes Yes Yes Yes
1044778
5436 Yes Yes Yes Yes Yes Yes
1044594
0900 Yes Yes Yes Yes Yes Yes Yes Yes Yes
1043194
3351 Yes Yes Yes Yes Yes Yes
1043185
4058 Yes Yes Yes Yes
1042769
9337 Yes Yes Yes Yes Yes
1041738
6813 Yes Yes Yes Yes Yes Yes
1041233
7046 Yes Yes Yes Yes
1040751
0054
1040730
5328 Yes Yes Yes Yes Yes Yes Yes Yes
1040727
7615 Yes Yes
1040218
2952 Yes Yes Yes
245
Table 56
Survey One, Question Eighteen. Part Two of Two.
What Cloud security controls do you see SMEs adopting? Please select all Cloud security
controls that you have seen.
246
Responde
nt ID
Encryption
during
compute
(homomor
phic
encryption
)
Back
up
Secaa
S
(Secur
ity as
a
Servic
e)
SecSLA
(Security
Service
Level
Agreeme
nt)
IAM
(Identity
and
Access
Managem
ent)
MultiCl
oud
Oth
er
Any
addition
al
commen
ts (We
want
your
expertis
e)?
Yes Yes Yes Yes
10491254
797 Yes Yes
10490673
669 Yes Yes Yes Yes Yes
10487064
183 Yes
10485140
823 Yes Yes
10484829
239 Yes Yes
10484680
154
10484537
295
10475221
052
10475220
285 Yes
10471701
090
10471688
387
10471530
810 Yes Yes Yes
10448076
199
10447785
436 Yes Yes Yes Yes Yes Yes
10445940
900 Yes Yes Yes
10431943
351 Yes
247
10431854
058 Yes
10427699
337 Yes Yes Yes
10417386
813 Yes Yes Yes Yes
10412337
046
10407510
054 Yes
10407305
328 Yes
10407277
615
I ONLY
see
SMEs
doing
these.
Others
will be
expensiv
e and
technica
lly
prohibiti
ve.
10402182
952
248
Table 57
Survey One, Question Nineteen.
Have you seen specific recommendations made to SMEs regarding Cloud computing adoption?
Respondent
ID Response
10491254797 Yes
10490673669 Yes
10487064183 Yes
10485140823 Yes
10484829239 Yes
10484680154 Yes
10484537295 No
10475221052 Yes
10475220285 Null
10471701090 Yes
10471688387 Null
10471530810 Null
10448076199 Yes
10447785436 No
10445940900 Yes
10431943351 No
10431854058 No
10427699337 No
10417386813 Yes
10412337046 Yes
10407510054 Null
10407305328 Yes
10407277615 Yes
10402182952 Yes
249
Table 58
Survey One, Question twenty.
What specific recommendations have you seen made to SMEs regarding Cloud computing
adoption? Please select all recommendations that you have seen.
250
Respondent
ID
Accept
CSP’s
(Cloud
Service
Provider)
attestations
such as
SAS 70
(Statement
of
Auditing
Standards
#70) as
proof of
compliance
Accept
CSP’s
(Cloud
Service
Provider)
SLAs
(Service
Level
Agreement)
or
SecSLAs
(Security
Service
Level
Agreement)
Outsource
or
contract
Cloud
operations
Outsource
or
contract
SecaaS
(Security
as a
Service)
MSSP
(Managed
Security
Service
Provider)
etc.
Do
not
adopt
Cloud
Partial
Cloud
adoption
(for
example:
no
sensitive
data
allowed
in
Cloud) Other
Any
additional
comments
(We want
your
expertise)?
10491254797 Yes Yes Yes
10490673669 Yes
10487064183 Yes Yes Yes Yes Yes
10485140823 Yes Yes Yes Yes
Partial
adoption
typically
takes the
form of a
particular
project or
system.
10484829239 Yes Yes
10484680154 Yes Yes
10484537295
10475221052 Yes Yes Yes Yes Yes Yes
10475220285
10471701090 Yes Yes
10471688387
10471530810
10448076199 Yes Yes Yes
10447785436
10445940900 Yes Yes Yes Yes Yes
10431943351
10431854058
10427699337
251
10417386813 Yes Yes Yes Yes
10412337046 Yes Yes
10407510054
10407305328 Yes Yes Yes
SAS70
was retired
several
years ago.
you should
be
referencing
AICPA
SOC1/2 –
SSAE18
10407277615 Yes Yes Yes Yes
10402182952 Yes Yes Yes Yes
252
Table 59
Survey One, Question Twenty-one.
Any additional comments or recommendations for the follow up survey?
Respondent
ID
10491254797
10490673669
10487064183
10485140823
10484829239
10484680154
10484537295
10475221052
10475220285
10471701090
10471688387
10471530810
10448076199
10447785436
10445940900 No
10431943351
10431854058
10427699337
10417386813
10412337046
10407510054
10407305328
I did or see anything your survey regarding a discussion of using a risk
based approach to managing cloud computing risk. In my experience, I
see too many SMEs looking to adopt a framework, implement the
framework with no perspective on the risks. The assumption that a
framework, if implemented, takes care of the risks is flawed. I look
forward to seeing more surveys.
10407277615 Nope
10402182952 Surveys oriented per industry.
253
Appendix C Survey two individual answers
Table 60
Survey Two, Question One.
https://www.surveymonkey.com/r/Meersman_2_of_3_preview. My name is Matthew Meersman.
I am a doctoral student at Northcentral University. I am conducting a research study on Cloud
computing risk assessments for Small to Medium sized enterprises (SMEs). I am completing this
research as part of my doctoral degree. Your participation is completely voluntary. I am seeking
your consent to involve you and your information in this study. Reasons you might not want to
participate in the study include a lack of knowledge in Cloud computing risk assessments. You
may also not be interested in Cloud computing risk assessments. Reasons you might want to
participate in the study include a desire to share your expert knowledge with others. You may
also wish to help advance the field of study on Cloud computing risk assessments. An alternative
to this study is simply not participating. I am here to address your questions or concerns during
the informed consent process via email. This is not an ISACA sponsored survey so there will be
no CPEs awarded for participation in this survey. PRIVATE INFORMATION Certain private
information may be collected about you in this study. I will make the following effort to protect
your private information. You are not required to include your name in connection with your
survey. If you do choose to include your name, I will ensure the safety of your name and survey
by maintaining your records in an encrypted password protected computer drive. I will not ask
the name of your employer. I will not record the IP address you use when completing the survey.
Even with this effort, there is a chance that your private information may be accidentally
released. The chance is small but does exist. You should consider this when deciding whether to
254
participate. If you participate in this research, you will be asked to: 1. Participate in a Delphi
panel of risk experts by answering questions in three web-based surveys. Each survey will
contain twenty to thirty questions and should take less than twenty minutes to complete. Total
time spent should be one hour over a period of approximately six to eight weeks. A Delphi panel
is where I ask you risk experts broad questions in the first survey. In the second survey I ask you
new questions based on what you, as a group, agreed on. I do the same thing for the third round.
By the end, your expert judgement may tell us what works in Cloud risk assessments.
Eligibility: You are eligible to participate in this research if you: 1. Are an adult over the age of
eighteen. 2. Have five or more years of experience in the IT risk field. You are not eligible to
participate in this research if you: 1. Under the age of eighteen. 2. If you have less than five years
of experience in the IT risk field. I hope to include twenty to one hundred people in this research.
255
Because of word limits in Survey Monkey questions you read and agree to this page and the next
page to consent to this study.
Respondent
ID Agree Disagree
10553721693 Agree
10544594567 Agree
10541108771 Agree
10540849349 Agree
10539799082 Agree
10539415359 Agree
10537530950 Agree
10535079594 Agree
10532895540 Agree
10532865651 Agree
10532402129 Agree
10531688057 Agree
10531608134 Agree
10531591833 Agree
10530967705 Agree
10530924512 Agree
10530913418 Agree
10530912179 Agree
10530872657 Agree
10530871673 Agree
10530844402 Disagree
10530837446 Agree
10530835085 Agree
10530534471 Agree
10530525458 Agree
10530496542 Agree
10530446115 Agree
10530171158 Agree
10517027813 Agree
10513614347 Agree
256
Table 61
Survey Two, Question Two.
Part 2 of the survey confidentiality agreement Risks: There are minimal risks in this
study. Some possible risks include: a third party figuring out your identity or your employer’s
identity if they are able to see your answers before aggregation of answers takes place.To
decrease the impact of these risks, you can skip any question or stop participation at any time.
Benefits: If you decide to participate, there are no direct benefits to you. The potential benefits to
others are: a free to use Cloud computing risk assessment tool. Confidentiality: The information
you provide will be kept confidential to the extent allowable by law. Some steps I will take to
keep your identity confidential are; you are not required to provide your name or your
employer’s name. I will not record your IP address. The people who will have access to your
information are myself, and/or, my dissertation chair, and/or, my dissertation committee. The
Institutional Review Board may also review my research and view your information. I will
secure your information with these steps: Encrypting all data received during this study during
storage. There will be no printed copies. There will be one copy of the data stored on an
encrypted thumb drive that is stored in my small home safe. There will be one copy of the data
stored as an encrypted archive in my personal Google G Drive folder. I will keep your data for 7
years. Then, I will delete the electronic data in the G Drive folder and destroy the encrypted
thumb drive. Contact Information: If you have questions for me, you can contact me at: 202-798-
3647 or M.Meersman5121@o365.ncu.eduMy dissertation chair’s name is Dr. Smiley. He works
at Northcentral University and is supervising me on the research. You can contact him at:
Gsmiley@ncu.edu or 703.868.4819If you contact us you will be giving us information like your
phone number or email address. This information will not be linked to your responses if the
257
study is anonymous. If you have questions about your rights in the research, or if a problem has
occurred, or if you are injured during your participation, please contact the Institutional Review
Board at: irb@ncu.edu or 1-888-327-2877 ext 8014.Voluntary Participation: Your participation
is voluntary. If you decide not to participate, or if you stop participation after you start, there will
be no penalty to you. You will not lose any benefit to which you are otherwise entitled. Future
Research Any information or specimens collected from you during this research may not be used
for other research in the future, even if identifying information is removed. Anonymity: This
study is anonymous, and it is not the intention of the researcher to collect your name. However,
you do have the option to provide your name voluntarily. Please know that if you do, it may be
linked to your responses in this study. Any consequences are outside the responsibility of the
researcher, faculty supervisor, or Northcentral University. If you do wish to provide your name, a
space will be provided. Again, including your name is voluntary, and you can continue in the
258
study if you do not provide your name.________________________________ (Your Signature
only if you wish to sign)
Respondent
ID Yes No
10553721693 Yes
10544594567 Yes
10541108771 Yes
10540849349 Yes
10539799082 Yes
10539415359 Yes
10537530950 Yes
10535079594 Yes
10532895540 Yes
10532865651 Yes
10532402129 Yes
10531688057 Yes
10531608134 Yes
10531591833 No
10530967705 Yes
10530924512 Yes
10530913418 Yes
10530912179 Yes
10530872657 No
10530871673 Yes
10530844402 No
10530837446 Yes
10530835085 Yes
10530534471 Yes
10530525458 Yes
10530496542 Yes
10530446115 Yes
10530171158 Yes
10517027813 Yes
10513614347 Yes
259
Table 62
Survey Two, Question Three.
Are you between the ages of 18 to 65?
Respondent
ID Yes No
10553721693 Yes
10544594567 Yes
10541108771 Yes
10540849349 Yes
10539799082 Yes
10539415359 Yes
10537530950 Yes
10535079594 Yes
10532895540 Yes
10532865651 Yes
10532402129 Yes
10531688057 Yes
10531608134 Yes
10531591833
10530967705 No
10530924512 Yes
10530913418 Yes
10530912179 Yes
10530872657
10530871673 Yes
10530844402
10530837446 Yes
10530835085 Yes
10530534471 Yes
10530525458 Yes
10530496542 Yes
10530446115 Yes
10530171158 Yes
10517027813 Yes
10513614347 Yes
260
Table 63
Survey Two, Question Four.
Do you have 5 or more years in the risk field (please include any postgraduate education)?
Respondent
ID Yes No
10553721693 Yes
10544594567 Yes
10541108771 Yes
10540849349 Yes
10539799082 Yes
10539415359 Yes
10537530950 Yes
10535079594 Yes
10532895540 Yes
10532865651 Yes
10532402129 Yes
10531688057 Yes
10531608134 Yes
10531591833
10530967705
10530924512 Yes
10530913418 Yes
10530912179 Yes
10530872657
10530871673 Yes
10530844402
10530837446 Yes
10530835085 Yes
10530534471 Yes
10530525458 Yes
10530496542 Yes
10530446115 Yes
10530171158 Yes
10517027813 Yes
10513614347 Yes
261
Table 64
Survey Two, Question Five.
For this study we define small to medium enterprises (SMEs) by the European Commission
guidelines: Small (15 million or less in annual revenue) to medium (60 million or less in annual
revenue) sized enterprises that are not subsidiaries of large enterprises or governments, or wholly
or partially supported by large enterprises or governments. Please remember that you are free to
262
not answer any of the following questions that you wish. If a question is asking for information
you do not wish to share, do not answer it.
Respondent
ID Agree Disagree Any additional comments (We want your expertise)?
10553721693 Agree
10544594567 Agree
10541108771 Agree
10540849349 Agree
10539799082 Agree
10539415359 Agree
10537530950 Agree
10535079594 Agree Name field does not work….
10532895540 Agree
10532865651 Agree
10532402129 Agree
10531688057 Agree
10531608134 Agree
10531591833
10530967705
10530924512 Agree
10530913418 Agree
10530912179 Agree
10530872657
10530871673 Agree
10530844402
10530837446 Agree
10530835085 Agree
10530534471 Agree
10530525458 Agree
10530496542 Agree
10530446115 Agree
10530171158 Agree
10517027813 Agree
10513614347 Agree
263
Table 65
Survey Two, Question Six.
How well defined is the term Cloud? Do you see a distinction in the risk analysis and auditing
between the terms below? Please select all that apply.
264
Respondent
ID
Distinction
between
colocation
and Cloud.
Distinction
between IaaS
(infrastructure
as a Service)
and cPanel
controlled
hosting.
Distinction
between
VMware
environment
and a
private
Cloud.
PaaS
(Platform as
a Service)
and SaaS
(Software as
a Service). Other.
Any additional
comments? (We
want your expertise).
10553721693 Yes Yes
10544594567 Yes Yes
10541108771 Yes Yes Yes Yes
10540849349 Yes Yes Yes Yes
10539799082 Yes Yes
10539415359 Yes Yes Yes Yes
10537530950 Yes Yes Yes
cPanel is not IaaS
interface, it is for
web sites, more PaaS
or SaaS
10535079594 Yes Yes Yes
10532895540
This question and the
answers to select
make little sense to
me. You ask two
questions, and
provide one set of
answers. If you’re
going to do PhD-
level work, you need
to do much, much
better than this.
Harsh feedback,
perhaps, but you
need to hear it. I
have no idea how to
answer this question
in a manner that
makes sense.
10532865651 Yes Yes Yes Yes
10532402129 Yes Yes Yes Yes
10531688057 Yes Yes Yes
10531608134 Yes Yes
10531591833
10530967705
10530924512 Yes Yes Yes
10530913418 Yes
265
10530912179 Yes Yes
10530872657
10530871673 Yes Yes Yes Yes
10530844402
10530837446 Yes Yes Yes
10530835085 Yes
10530534471 Yes Yes
10530525458 Yes Yes
10530496542 Yes Yes
10530446115 Yes
10530171158 Yes Yes
10517027813
10513614347
266
Table 66
Survey Two, Question Seven. Part One of Two.
Most SMEs have Cloud operations in progress. Which scenarios have you seen and which have
you seen audited by the SMEs? Please select all that apply.
267
Respondent
ID
Shadow IT.
Cloud
spending not
officially
budgeted,
audited, or
documented.
Test or
development
environments
in Cloud.
SMEs
auditing and
securing
Cloud test or
development
environments.
One off
solutions.
For
example; a
business
group using
DropBox for
its own files.
SMEs
auditing and
securing
one off
solutions.
10553721693 Yes Yes Yes
10544594567 Yes Yes
10541108771 Yes Yes Yes Yes
10540849349
10539799082 Yes Yes Yes
10539415359 Yes
10537530950 Yes Yes Yes
10535079594 Yes Yes Yes Yes
10532895540 Yes Yes Yes
10532865651 Yes Yes Yes Yes Yes
268
10532402129 Yes Yes
10531688057 Yes Yes Yes Yes Yes
10531608134 Yes Yes Yes Yes Yes
10531591833
10530967705
10530924512
10530913418 Yes
10530912179 Yes
10530872657
10530871673 Yes Yes
10530844402
10530837446 Yes Yes Yes
10530835085 Yes
10530534471 Yes Yes Yes
10530525458 Yes Yes
10530496542 Yes Yes
10530446115 Yes Yes Yes Yes
10530171158 Yes Yes
10517027813
10513614347
269
Table 67
Survey Two, Question Seven. Part Two of Two.
Most SMEs have Cloud operations in progress. Which scenarios have you seen and which have
you seen audited by the SMEs? Please select all that apply.
270
Respondent
ID
Business group or
team using non-
standard CSP. For
example; SME has
picked AWS but
enterprise exchange
team is using Azure.
SMEs auditing
and securing
non-standard
CSP. Other.
Any additional
comments? (We want
your expertise).
10553721693 Yes
10544594567
10541108771 Yes
10540849349 Yes Yes
10539799082 Yes
10539415359
10537530950 Yes
10535079594 Yes Yes
If this counts,
different email
provider
10532895540 Yes Yes
What is the
difference between
answers 2 and 3, 6
and 7? Sorry, but
you REALLY need
to focus on cogent
language and clear
thought. I’m
answering this survey
to help you, but
becoming more and
more irritated with its
sloppiness.
10532865651
271
10532402129 Yes
10531688057 Yes Yes
10531608134 Yes Yes
10531591833
10530967705
10530924512
10530913418
10530912179
10530872657
10530871673 Yes
10530844402
10530837446 Yes
10530835085
10530534471
10530525458 Yes
10530496542
10530446115
10530171158
10517027813
10513614347
Table 68
Survey Two, Question Eight.
When starting to plan a transition to a Cloud environment, what have you seen SMEs start with
before risk assessments or collection of requirements? Please select all that apply;
272
Responde
nt ID
Choice
of CSP
(Cloud
service
provide
r).
Choice of
infrastruct
ure such as
IaaS
(Infrastruc
ture as a
Service),
PaaS
(Platform
as a
Service),
or SaaS
(Software
as a
Service).
Choice
of IT
framew
ork
such as
COBIT,
ITIL or
ISO/IE
C
38500.
Choice
of
securit
y
control
standar
ds
such as
NIST
SP
800-53
or
CSF,
HIPA
A, or
PCI-
DSS.
Choice
of Cloud
security
baseline
s such
as
FedRA
MP,
CIS, or
CSA.
Automati
on tools
such as
DevOps
or
SecDevO
ps.
Othe
r.
Any
addition
al
comme
nts (We
want
your
expertis
e)?
10553721
693 Yes Yes Yes
10544594
567 Yes Yes Yes Yes
10541108
771 Yes Yes Yes Yes
10540849
349 Yes Yes
10539799
082 Yes Yes Yes Yes
10539415
359 Yes Yes Yes Yes Yes Yes
10537530
950 Yes Yes Yes
10535079
594 Yes
Not
been
involve
d
10532895
540 Yes Yes Yes Yes Yes
Finally,
a
question
that
makes
sense.
10532865
651 Yes Yes Yes
10532402
129 Yes Yes
273
10531688
057 Yes Yes Yes
10531608
134 Yes Yes Yes Yes
10531591
833
10530967
705
10530924
512
10530913
418 Yes Yes Yes Yes Yes
10530912
179 Yes
10530872
657
10530871
673 Yes
10530844
402
10530837
446 Yes Yes Yes
10530835
085 Yes
10530534
471 Yes Yes
10530525
458 Yes Yes Yes Yes
10530496
542 Yes Yes Yes Yes Yes
10530446
115 Yes Yes Yes
10530171
158 Yes Yes Yes
10517027
813
10513614
347
274
Table 69
Survey Two, Question Nine Part One of Two.
Do you see SMEs effectively plan their Cloud usage and growth? Please select all that apply.
275
Respondent
ID
The SME
has a unified
plan for their
Cloud
transition
including
auditing and
documenting
the process.
The SME has
previously
adopted
Cloud tools
and
environments
as solutions
to single
problems.
For example;
a business
group has
adopted
Google Docs
to share
documents.
The SME
views
Cloud
solutions
as
solutions
to
individual
problems.
The SME
has a
Cloud
audit
team or
subject
matter
expert.
The SME has
separate
controls for
Cloud
environments.
10553721693 Yes Yes
10544594567 Yes
10541108771 Yes Yes
10540849349 Yes
10539799082 Yes Yes
10539415359 Yes
10537530950 Yes Yes Yes
10535079594 Yes Yes Yes
10532895540
10532865651 Yes Yes
276
10532402129
10531688057 Yes Yes Yes
10531608134 Yes Yes Yes
10531591833
10530967705
10530924512
10530913418 Yes Yes Yes
10530912179
10530872657
10530871673 Yes Yes
10530844402
10530837446 Yes Yes
10530835085 Yes
10530534471 Yes
10530525458 Yes Yes
10530496542 Yes
10530446115 Yes Yes Yes
10530171158 Yes
10517027813
10513614347
277
Table 70
Survey Two, Question Nine Part Two of Two.
Do you see SMEs effectively plan their Cloud usage and growth? Please select all that apply.
278
Respondent
ID
The SME has
BC / DR
plans for
CSP failures.
The SME has
security
procedures
for
transferring
data from on-
premises to
Cloud
environment.
The SME
moves IT
infrastructure
to CSPs as
servers, IT
equipment, or
data centers
reach end of
life or leases
expire. Other.
Any
additional
comments
(We want
your
expertise)?
10553721693 Yes
10544594567
10541108771 Yes
10540849349 Yes Yes Yes
10539799082 Yes
10539415359
10537530950 Yes Yes Yes
10535079594
10532895540
* Sigh * You
ask a Yes/No
question,
then provide
other types of
answers. I
give up.
10532865651 Yes Yes Yes
10532402129 Yes Yes
10531688057 Yes Yes Yes
10531608134 Yes Yes Yes
10531591833
10530967705
10530924512
10530913418 Yes Yes
279
10530912179
10530872657
10530871673 Yes
10530844402
10530837446 Yes Yes
10530835085
10530534471 Yes
10530525458 Yes
10530496542 Yes
10530446115
10530171158 Yes
10517027813
10513614347
280
Table 71
Survey Two, Question Ten part One of Two.
100% of respondents to Survey 1 have seen recommendations to outsource the transition to a
Cloud environment. Which portions of a transition to a Cloud environment have you seen
recommended to be outsourced? Please select all that apply.
281
Respondent
ID
Entire
transition
including
choice of
CSP (Cloud
Service
Provider),
type of
virtual
environment,
and transfer
of data.
Selecting
CSP and
type of
infrastructure
such as IaaS,
PaaS, or
SaaS.
Creating and
executing
data transfer
plan to
Cloud
environment.
Creating and
executing
security
controls in
Cloud
environment.
10553721693 Yes
10544594567 Yes
10541108771 Yes Yes Yes Yes
10540849349 Yes
282
10539799082 Yes Yes
10539415359 Yes
10537530950 Yes
10535079594 Yes
10532895540
10532865651 Yes
10532402129 Yes Yes
10531688057 Yes Yes Yes
10531608134
10531591833
10530967705
10530924512
10530913418 Yes
10530912179
10530872657
10530871673 Yes Yes
10530844402
10530837446 Yes Yes Yes Yes
283
10530835085 Yes
10530534471 Yes
10530525458
10530496542 Yes Yes
10530446115 Yes Yes Yes
10530171158 Yes
10517027813
10513614347
284
Table 72
Survey Two, Question Ten part Two of Two.
100% of respondents to Survey 1 have seen recommendations to outsource the transition to a
Cloud environment. Which portions of a transition to a Cloud environment have you seen
recommended to be outsourced? Please select all that apply.
285
Respondent
ID
Managed or professional
services including ongoing
management of SME data
and IT operations.
Managed security
services including
scheduled audits or
penetration testing. Other.
Any additional
comments (We want
your expertise)?
10553721693 Yes
10544594567 Yes
10541108771 Yes Yes
10540849349 Yes
10539799082 Yes
10539415359 Yes Yes
10537530950 Yes Yes
10535079594 Yes Yes
10532895540
See previous
response.
10532865651 Yes
10532402129 Yes
10531688057 Yes Yes
10531608134
10531591833
10530967705
10530924512
10530913418 Yes Yes
10530912179
10530872657
10530871673 Yes
10530844402
10530837446
10530835085
10530534471 Yes
10530525458
10530496542 Yes
10530446115
10530171158 Yes
10517027813
10513614347
286
Table 73
Survey Two, Question Eleven.
Most survey 1 respondents identified a lack of current SME IT staff expertise and/or desire as an
issue in transition to the Cloud. Are there specific staff issues that you have seen? Please select
all that apply.
287
Responde
nt ID
IT staff
not sized
appropriat
ely.
Budget for
IT staff
training in
Cloud
environme
nts
lacking.
IT staff
resistant to
transition
to Cloud
environme
nts.
Governanc
e or
manageme
nt
structure
not
adequate
for
transition
to Cloud
environme
nts. For
example;
IT is a silo
and makes
its own
decisions.
SME
business
structure
or
processe
s not
conduci
ve to
Cloud
operatio
ns. For
example
: each
business
unit has
distinct
IT staff
and IT
budget.
Othe
r.
Any
additional
comment
s (We
want your
expertise)
?
10553721
693 Yes Yes
10544594
567 Yes Yes Yes Yes
10541108
771 Yes Yes Yes Yes Yes
10540849
349 Yes Yes Yes
10539799
082 Yes Yes
10539415
359 Yes Yes Yes Yes Yes
288
10537530
950 Yes Yes Yes
Othe
r.
The
number
of
individual
s
possessin
g cloud
expertise
is limited
and the
skills are
in
demand.
Those
who work
on
gaining
the
expertise
seek
employm
ent
requiring
those
skills;
therefore
existing
IT staff
typically
do not
have
cloud
expertise.
10535079
594 Yes Yes Yes Yes Yes
10532895
540
10532865
651 Yes Yes Yes Yes Yes
10532402
129 Yes Yes Yes Yes
10531688
057 Yes Yes Yes
10531608
134
289
10531591
833
10530967
705
10530924
512
10530913
418 Yes Yes Yes Yes
10530912
179
10530872
657
10530871
673 Yes Yes Yes Yes
10530844
402
10530837
446 Yes Yes Yes Yes Yes
10530835
085 Yes
10530534
471 Yes Yes Yes Yes Yes
10530525
458
290
10530496
542 Yes Yes Yes
We have
a third
party
vendor
helping
with the
migration
to the
cloud,
and we
have also
found a
lack of
deep
technical
knowledg
e and
managem
ent in
vendors
who
propose
their
expertise.
10530446
115 Yes Yes Yes Yes
10530171
158 Yes Yes Yes Yes
10517027
813
10513614
347
291
Table 74
Survey Two, Question Twelve.
What solutions have you seen SMEs use to remedy a lack of staff Cloud training? Please select
all that apply.
292
Respondent
ID
Internal
ad-hoc
training.
For
example;
a CSP
account
for staff
use.
General
Cloud
and
Cloud
security
training
courses.
For
example;
SANS
courses.
Specific
CSP
training.
For
example
AWS
architect
training.
Hiring of
additional
personnel.
Outsourcing
Cloud
related
work to a
third party.
Hiring
consultants
or
professional
services to
complement
SME staff. Other.
Any
additional
comments
(We want
your
expertise)?
10553721693 Yes Yes Yes
10544594567 Yes Yes Yes Yes Yes
10541108771 Yes Yes Yes Yes Yes Yes
10540849349 Yes Yes
10539799082 Yes Yes Yes Yes
10539415359
10537530950 Yes Yes Yes Yes Yes
10535079594 Yes Yes Yes Yes
10532895540
10532865651 Yes Yes
10532402129 Yes Yes Yes
10531688057 Yes Yes Yes Yes
10531608134 Yes
10531591833
10530967705
10530924512
10530913418 Yes Yes Yes Yes
10530912179
10530872657
10530871673 Yes Yes Yes
10530844402
10530837446 Yes Yes Yes
10530835085 Yes
10530534471 Yes Yes Yes Yes
10530525458
10530496542 Yes Yes Yes Yes Yes
Often
budgets
prevent
hiring
additional
staff.
293
10530446115 Yes Yes
10530171158 Yes Yes
10517027813
10513614347
Table 75
Survey Two, Question Thirteen Part One of Two.
Survey 1 respondents listed a variety of non-IT related concerns with a transition to a Cloud
environment. Which concerns have you seen outsourced and risk assessed by SMEs? Please
select all that apply.
294
Respondent
ID
Privacy
.
Outsource
d Privacy
Legal
.
Outsource
d Legal
procedure
s risk
assessed
by SMEs.
Governanc
e.
Outsource
d
governanc
e
procedure
s risk
assessed
by SMEs.
Busines
s
process.
1055372169
3 Yes.
1054459456
7 Yes. Yes Yes. Yes Yes Yes.
1054110877
1 Yes. Yes Yes. Yes Yes Yes. Yes
1054084934
9
1053979908
2 Yes. Yes
1053941535
9 Yes. Yes Yes. Yes Yes Yes. Yes
295
1053753095
0 Yes Yes.
1053507959
4 Yes. Yes
1053289554
0
1053286565
1
1053240212
9
1053168805
7 Yes. Yes Yes. Yes Yes Yes.
1053160813
4
1053159183
3
1053096770
5
1053092451
2
1053091341
8 Yes. Yes Yes Yes. Yes
1053091217
9
1053087265
7
296
1053087167
3 Yes
1053084440
2
1053083744
6 Yes Yes. Yes Yes
1053083508
5 Yes.
1053053447
1 Yes
1053052545
8
1053049654
2
1053044611
5 Yes Yes. Yes
1053017115
8 Yes.
1051702781
3
1051361434
7
Table 76
Survey Two, Question Thirteen Part Two of Two.
297
Survey 1 respondents listed a variety of non-IT related concerns with a transition to a Cloud
environment. Which concerns have you seen outsourced and risk assessed by SMEs? Please
select all that apply.
298
Respond
ent ID
Outsou
rced
busines
s
process
proced
ures
risk
assesse
d by
SMEs.
Busine
ss
contin
uity /
Disast
er
recove
ry.
Outsou
rced
BC /
DR
proced
ures
risk
assesse
d by
SMEs.
R isk
assessm
ent.
Outsourced
risk
assessment pro
cedures risk
assessed by
SMEs.
Outsou
rced
other
proced
ures
risk
assesse
d by
SMEs.
Oth
er.
Any
additio
nal
comme
nts (We
want
your
expertis
e)?
1055372
1693 Yes
1054459
4567 Yes Yes Yes Yes
1054110
8771 Yes Yes Yes Yes Yes
1054084
9349 Yes Yes
1053979
9082
1053941
5359 Yes Yes Yes Yes Yes Yes
1053753
0950 Yes Yes
1053507
9594
1053289
5540
1053286
5651
1053240
2129 Yes
1053168
8057 Yes Yes Yes Yes Yes
1053160
8134
1053159
1833
1053096
7705
1053092
4512
1053091
3418 Yes Yes
299
1053091
2179
1053087
2657
1053087
1673 Yes
Outsou
rced
help
desk
service
s.
1053084
4402
1053083
7446 Yes Yes Yes
1053083
5085
1053053
4471
1053052
5458
1053049
6542 Yes Yes
1053044
6115 Yes Yes
1053017
1158 Yes
1051702
7813
1051361
4347
Table 77
Survey Two, Question 14, part One of Two.
What are the important factors for a SME when choosing a CSP? Please select all that apply.
300
Respondent
ID Cost.
Ease
of
use.
Auditing
and logging
capabilities.
Security
tools.
Automation
tools
(DevOps,
SecDevOps).
Stability
and
reliability.
10553721693 Yes Yes Yes Yes
10544594567 Yes Yes Yes Yes Yes
10541108771 Yes Yes Yes Yes Yes Yes
10540849349 Yes Yes
10539799082 Yes Yes Yes Yes Yes
10539415359 Yes Yes Yes Yes Yes
301
10537530950
10535079594 Yes Yes Yes Yes Yes Yes
10532895540
10532865651 Yes Yes Yes Yes Yes Yes
10532402129 Yes Yes Yes Yes Yes
10531688057 Yes
10531608134 Yes Yes Yes Yes Yes Yes
10531591833
10530967705
10530924512
10530913418 Yes Yes Yes Yes Yes Yes
10530912179
10530872657
10530871673 Yes Yes
10530844402
10530837446 Yes Yes Yes Yes Yes
10530835085 Yes
10530534471 Yes Yes Yes
302
10530525458
10530496542 Yes Yes Yes Yes
10530446115 Yes Yes Yes Yes
10530171158 Yes Yes
10517027813
10513614347
Table 78
Survey Two, Question 14, part Two of Two.
What are the important factors for a SME when choosing a CSP? Please select all that apply.
303
Respondent
ID
Professional
or
management
services.
Industry
specific
tools. For
example a
CSP that
specializes
in HiPAA
or PCI-
DSS
controls.
SME IT
team
familiarity
with CSP
tools. For
example a
MS
Windows
IT shop
selecting
Azure as a
CSP. Other.
Any
additional
comments
(We want
your
expertise)?
10553721693
10544594567 Yes Yes
10541108771
10540849349
10539799082 Yes
10539415359
304
10537530950 Yes Other.
Finding
skilled
personnel
for the CSP.
Everyone
used Cisco
because
folks knew
Cisco – but
it was not
the best
choice in
terms of
costs and
performance
for many
applications.
10535079594 Yes
10532895540
10532865651 Yes
10532402129 Yes Yes
10531688057
10531608134 Yes
10531591833
10530967705
10530924512
10530913418 Yes Yes Yes
10530912179
10530872657
10530871673 Yes
10530844402
10530837446 Yes Yes
10530835085
10530534471 Yes Yes
305
10530525458
10530496542
10530446115 Yes Yes
10530171158
10517027813
10513614347
Table 79
Survey Two, Question Fifteen, part one of
Which CSPs have you seen used by SMEs? Please select all that apply.
306
Respondent
ID
AWS
(Amazon
Web
Services)
Microsoft
Azure
Google
Cloud
platform
IBM
Cloud Rackspace GoDaddy
10553721693 Yes Yes Yes
10544594567 Yes Yes
10541108771 Yes Yes Yes Yes Yes Yes
10540849349 Yes Yes
10539799082 Yes Yes Yes
10539415359 Yes Yes Yes Yes
10537530950 Yes Yes
10535079594 Yes Yes
10532895540
10532865651 Yes Yes
10532402129 Yes
10531688057 Yes Yes Yes Yes
10531608134
10531591833
10530967705
10530924512
10530913418 Yes Yes Yes
10530912179
10530872657
10530871673 Yes
10530844402
10530837446 Yes Yes Yes
10530835085 Yes Yes
10530534471 Yes Yes
10530525458
10530496542 Yes Yes Yes Yes
10530446115 Yes Yes Yes Yes
10530171158 Yes Yes Yes Yes
10517027813
10513614347
Table 80
Survey Two, Question Fifteen, part Two of Five.
307
Which CSPs have you seen used by SMEs? Please select all that apply.
Respondent
ID
Verizon
Cloud VMware
Oracle
Cloud 1&1 DigitalOcean
10553721693
10544594567
10541108771 Yes Yes Yes Yes Yes
10540849349
10539799082
10539415359 Yes Yes
10537530950 Yes
10535079594 Yes
10532895540
10532865651 Yes
10532402129 Yes Yes Yes Yes
10531688057
10531608134
10531591833
10530967705
10530924512
10530913418
10530912179
10530872657
10530871673 Yes
10530844402
10530837446 Yes
10530835085
10530534471 Yes
10530525458
10530496542
10530446115 Yes Yes
10530171158 Yes
10517027813
10513614347
Table 81
Survey Two, Question Fifteen, part Four of Five.
308
Which CSPs have you seen used by SMEs? Please select all that apply.
Respondent
ID MageCloud InMotion CloudSigma Hyve Ubiquity
10553721693
10544594567
10541108771
10540849349
10539799082
10539415359
10537530950
10535079594
10532895540
10532865651
10532402129
10531688057
10531608134
10531591833
10530967705
10530924512
10530913418
10530912179
10530872657
10530871673
10530844402
10530837446
10530835085
10530534471
10530525458
10530496542
10530446115
10530171158
10517027813
10513614347
Table 82
Survey Two, Question Fifteen, part Three of Five.
309
Which CSPs have you seen used by SMEs? Please select all that apply.
Respondent
ID Hostinger Togglebox Atlantic.net Navisite Vultr
SIM-
Networks
10553721693
10544594567
10541108771
10540849349
10539799082
10539415359
10537530950
10535079594
10532895540
10532865651
10532402129
10531688057
10531608134
10531591833
10530967705
10530924512
10530913418
10530912179
10530872657
10530871673
10530844402
10530837446
10530835085
10530534471
10530525458
10530496542
10530446115
10530171158
10517027813
10513614347
Table 83
Survey Two, Question Fifteen, part Five of Five.
310
Which CSPs have you seen used by SMEs? Please select all that apply.
311
Respondent
ID
GigeNe
t
VEXXHOS
T
E24Clou
d
ElasticHos
ts
LayerStac
k
Othe
r
Any
additional
comment
s *We
want your
expertise)
?
1055372169
3
1054459456
7
1054110877
1
1054084934
9
1053979908
2
1053941535
9
1053753095
0
1053507959
4
1053289554
0
1053286565
1
1053240212
9
1053168805
7
1053160813
4
1053159183
3
1053096770
5
1053092451
2
1053091341
8
1053091217
9
1053087265
7
312
1053087167
3
1053084440
2
1053083744
6
1053083508
5
1053053447
1 Yes
1053052545
8
1053049654
2
1053044611
5
1053017115
8
1051702781
3
1051361434
7
313
Respondent
ID
GigeNe
t
VEXXHOS
T
E24Clou
d
ElasticHos
ts
LayerStac
k
Othe
r
Any
additional
comment
s *We
want your
expertise)
?
1055372169
3
1054459456
7
1054110877
1
1054084934
9
1053979908
2
1053941535
9
1053753095
0
1053507959
4
1053289554
0
1053286565
1
1053240212
9
1053168805
7
1053160813
4
1053159183
3
1053096770
5
1053092451
2
1053091341
8
1053091217
9
1053087265
7
314
1053087167
3
1053084440
2
1053083744
6
1053083508
5
1053053447
1 Yes
1053052545
8
1053049654
2
1053044611
5
1053017115
8
1051702781
3
1051361434
7
Table 84
Survey Two, Question 16, One of Three.
Many SMEs use several different Cloud based IT tools. Which tools have you seen in use, and
have you seen them audited? Please select all that apply:
315
Respondent
ID
Cloud
Email.
For
example;
Gmail.
Email
audited
by
SME.
Cloud
file
storage.
For
example;
DropBox.
Cloud
file
storage
audited
by
SME.
Cloud office
applications.
For
example;
o365
Cloud
office
applications
audited by
SME.
10553721693 Yes Yes Yes Yes Yes Yes
10544594567 Yes
10541108771
10540849349 Yes
10539799082 Yes Yes Yes
10539415359 Yes Yes Yes Yes Yes Yes
10537530950 Yes Yes Yes Yes
10535079594 Yes Yes Yes Yes
10532895540
10532865651 Yes Yes Yes Yes
10532402129 Yes
10531688057 Yes Yes
10531608134
10531591833
10530967705
10530924512
10530913418 Yes Yes Yes
10530912179
10530872657
10530871673 Yes Yes
10530844402
10530837446 Yes Yes Yes Yes
10530835085 Yes
10530534471 Yes Yes
10530525458
316
10530496542
10530446115 Yes Yes Yes Yes
10530171158 Yes Yes Yes
10517027813
10513614347
Table 85
Survey Two, Question 16, Two of Three.
Many SMEs use several different Cloud based IT tools. Which tools have you seen in use, and
have you seen them audited? Please select all that apply:
317
Respondent
ID
Cloud chat /
communications
. For example;
Slack.
Cloud chat /
communication
s audited by
SME.
Cloud
based
backup.
For
example
; Zetta.
Cloud
based
backup
audite
d by
SME.
Cloud
CRM. For
example;
Salesforce
.
Cloud
CRM
audite
d by
SME.
1055372169
3 Yes Yes Yes
1054459456
7 Yes
1054110877
1 Yes
1054084934
9 Yes
1053979908
2 Yes
1053941535
9 Yes Yes Yes Yes
1053753095
0 Yes Yes
1053507959
4 Yes Yes
1053289554
0
1053286565
1 Yes
1053240212
9 Yes
1053168805
7 Yes Yes
1053160813
4
1053159183
3
1053096770
5
1053092451
2
318
1053091341
8 Yes Yes
1053091217
9
1053087265
7
1053087167
3 Yes Yes
1053084440
2
1053083744
6 Yes
1053083508
5
1053053447
1 Yes
1053052545
8
1053049654
2
1053044611
5
1053017115
8
1051702781
3
1051361434
7
Table 86
Survey Two, Question 16, Three of Three
Many SMEs use several different Cloud based IT tools. Which tools have you seen in use, and
have you seen them audited? Please select all that apply:
319
Respondent
ID
Web
hosting.
For
example;
GoDaddy.
Cloud
CDN
(content
delivery
network).
For
example;
Akamai.
Cloud
CDN
audited
by SME. Other.
Other
audited
by
SME.
Any
additional
comments
(We want
your
expertise)?
10553721693 Yes
10544594567
10541108771 Yes
I have seen
almost all of
these in use,
but only
seen CRMs
and CDNs
audited.
10540849349
10539799082
10539415359 Yes. Yes
10537530950 Yes Yes
10535079594
10532895540
10532865651 Yes Yes
10532402129
10531688057 Yes
10531608134
10531591833
10530967705
10530924512
10530913418 Yes
10530912179
10530872657
10530871673 Yes
10530844402
10530837446
10530835085 Yes
10530534471
10530525458
320
10530496542
I have seen
several of
these but I
have not
seen them
audited.
10530446115
10530171158
10517027813
10513614347
Table 87
Survey Two, Question Seventeen
Any additional comments or recommendations for the follow up survey?
321
Respondent
ID Comments
10553721693
10544594567
10541108771
10540849349
10539799082
10539415359
10537530950
10535079594
10532895540 This is simply not graduate level work. Sorry.
10532865651
10532402129
10531688057
10531608134
10531591833
10530967705
10530924512
10530913418
10530912179
10530872657
10530871673
10530844402
10530837446
10530835085
10530534471
I think there is very little guidance and I have seen very little assessment or
auditing.
10530525458
10530496542
10530446115
10530171158
10517027813
10513614347
322
Appendix D Survey Three Individual Answers
Table 88
Survey Three, Question One.
My name is Matthew Meersman. I am a doctoral student at Northcentral University. I am
conducting a research study on Cloud computing risk assessments for Small to Medium sized
enterprises (SMEs). I am completing this research as part of my doctoral degree. Your
participation is completely voluntary. I am seeking your consent to involve you and your
information in this study. Reasons you might not want to participate in the study include a lack
of knowledge in Cloud computing risk assessments. You may also not be interested in Cloud
computing risk assessments. Reasons you might want to participate in the study include a desire
to share your expert knowledge with others. You may also wish to help advance the field of
study on Cloud computing risk assessments. An alternative to this study is simply not
participating. I am here to address your questions or concerns during the informed consent
process via email. This is not an ISACA sponsored survey so there will be no CPEs awarded for
participation in this survey. PRIVATE INFORMATION Certain private information may be
collected about you in this study. I will make the following effort to protect your private
information. You are not required to include your name in connection with your survey. If you
do choose to include your name, I will ensure the safety of your name and survey by maintaining
your records in an encrypted password protected computer drive. I will not ask the name of your
employer. I will not record the IP address you use when completing the survey. Even with this
effort, there is a chance that your private information may be accidentally released. The chance is
small but does exist. You should consider this when deciding whether to participate. If you
participate in this research, you will be asked to:1. Participate in a Delphi panel of risk experts by
323
answering questions in three web-based surveys. Each survey will contain twenty to thirty
questions and should take less than twenty minutes to complete. Total time spent should be one
hour over a period of approximately six to eight weeks. A Delphi panel is where I ask you risk
experts broad questions in the first survey. In the second survey I ask you new questions based
on what you, as a group, agreed on. I do the same thing for the third round. By the end, your
expert judgement may tell us what works in Cloud risk assessments Eligibility: You are eligible
to participate in this research if you: 1. Are an adult over the age of eighteen. 2. Have five or
more years of experience in the IT risk field. You are not eligible to participate in this research if
you: 1. Under the age of eighteen.2. If you have less than five years of experience in the IT risk
field. I hope to include twenty to one hundred people in this research. Because of word limits in
324
Survey Monkey questions you read and agree to this page and the next page to consent to this
study.
Respondent ID Agree Disagree
10594631389 Agree
10592543726 Agree
10592389354 Agree
10588959572 Agree
10572611704 Agree
10571628613 Agree
10561345731 Agree
10559610688 Agree
10558924365 Agree
10558850146 Agree
10558685153 Agree
10557162374 Agree
10556647426 Agree
10556319920 Agree
10553788808 Agree
10552983398 Agree
10552074281 Agree
10550402764 Agree
10549771608 Agree
10548528015 Agree
10548469322 Agree
10548450420 Agree
10548449124 Agree
10543731948 Agree
Table 89
Survey Three, Question Two.
Part 2 of the survey confidentiality agreement Risks: There are minimal risks in this study. Some
possible risks include: a third party figuring out your identity or your employer’s identity if they
are able to see your answers before aggregation of answers takes place. To decrease the impact
325
of these risks, you can skip any question or stop participation at any time. Benefits: If you decide
to participate, there are no direct benefits to you. The potential benefits to others are: a free to use
Cloud computing risk assessment tool. Confidentiality: The information you provide will be kept
confidential to the extent allowable by law. Some steps I will take to keep your identity
confidential are; you are not required to provide your name or your employer’s name. I will not
record your IP address. The people who will have access to your information are myself, and/or,
my dissertation chair, and/or, my dissertation committee. The Institutional Review Board may
also review my research and view your information. I will secure your information with these
steps: Encrypting all data received during this study during storage. There will be no printed
copies. There will be one copy of the data stored on an encrypted thumb drive that is stored in
my small home safe. There will be one copy of the data stored as an encrypted archive in my
personal Google G Drive folder. I will keep your data for 7 years. Then, I will delete the
electronic data in the G Drive folder and destroy the encrypted thumb drive. Contact
Information: If you have questions for me, you can contact me at: 202-798-3647 or
M.Meersman5121@o365.ncu.eduMy dissertation chair’s name is Dr. Smiley. He works at
Northcentral University and is supervising me on the research. You can contact him at:
Gsmiley@ncu.edu or 703.868.4819If you contact us you will be giving us information like your
phone number or email address. This information will not be linked to your responses if the
study is anonymous. If you have questions about your rights in the research, or if a problem has
occurred, or if you are injured during your participation, please contact the Institutional Review
Board at: irb@ncu.edu or 1-888-327-2877 ext 8014.Voluntary Participation: Your participation
is voluntary. If you decide not to participate, or if you stop participation after you start, there will
be no penalty to you. You will not lose any benefit to which you are otherwise entitled. Future
326
Research: Any information or specimens collected from you during this research may not be
used for other research in the future, even if identifying information is removed. Anonymity:
This study is anonymous, and it is not the intention of the researcher to collect your name.
However, you do have the option to provide your name voluntarily. Please know that if you do, it
may be linked to your responses in this study. Any consequences are outside the responsibility of
the researcher, faculty supervisor, or Northcentral University. If you do wish to provide your
name, a space will be provided. Again, including your name is voluntary, and you can continue
327
in the study if you do not provide your name.________________________________ (Your
Signature only if you wish to sign)
Respondent
ID Yes No
10594631389 No
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613 Yes
10561345731 Yes
10559610688 Yes
10558924365 No
10558850146 Yes
10558685153 Yes
10557162374 Yes
10556647426 Yes
10556319920 Yes
10553788808 Yes
10552983398 Yes
10552074281 Yes
10550402764 Yes
10549771608 Yes
10548528015 Yes
10548469322 Yes
10548450420 Yes
10548449124 Yes
10543731948 Yes
328
Table 90
Survey Three, Question Three.
Are you between the ages of 18 to 65?
Respondent
ID Yes No
10594631389
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613 Yes
10561345731 Yes
10559610688 Yes
10558924365
10558850146 Yes
10558685153 Yes
10557162374 Yes
10556647426 Yes
10556319920 Yes
10553788808 Yes
10552983398 Yes
10552074281 Yes
10550402764 Yes
10549771608 Yes
10548528015 Yes
10548469322 Yes
10548450420 Yes
10548449124 Yes
10543731948 Yes
329
Table 91
Survey Three, Question Four.
Do you have 5 or more years in the risk field (please include any postgraduate education)?
Respondent
ID Yes No
10594631389
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613 Yes
10561345731 Yes
10559610688 Yes
10558924365
10558850146 Yes
10558685153 Yes
10557162374 Yes
10556647426 Yes
10556319920 Yes
10553788808 Yes
10552983398 Yes
10552074281 Yes
10550402764 Yes
10549771608 Yes
10548528015 Yes
10548469322 Yes
10548450420 Yes
10548449124 Yes
10543731948 Yes
Table 92
Survey Three, Question Five.
For this study we define small to medium enterprises (SMEs) by the European Commission
guidelines: Small (15 million or less in annual revenue) to medium (60 million or less in annual
revenue) sized enterprises that are not subsidiaries of large enterprises or governments, or wholly
330
or partially supported by large enterprises or governments. Please remember that you are free to
not answer any of the following questions that you wish. If a question is asking for information
you do not wish to share, do not answer it.
Respondent
ID Agree Disagree
10594631389
10592543726 Agree
10592389354 Agree
10588959572 Agree
10572611704 Agree
10571628613 Agree
10561345731 Agree
10559610688 Agree
10558924365
10558850146 Agree
10558685153 Agree
10557162374 Agree
10556647426 Agree
10556319920 Agree
10553788808 Agree
10552983398 Agree
10552074281 Agree
10550402764 Agree
10549771608 Agree
10548528015 Agree
10548469322 Agree
10548450420 Agree
10548449124 Agree
10543731948 Agree
331
Table 93
Survey Three, Question Six.
Have you seen SMEs adapt their risk assessment process for Cloud environments in any of the
following ways? Please select all that apply:
332
Respondent
ID
Adding
Cloud
experts
to the
audit
team
Outsourcing
Cloud
audits or
risk
assessments
Break
Cloud audits
or risk
assessments
into smaller
processes
Limit
Cloud
audits or
risk
assessments
to CSP
attestations
Other
(Please
describe)
or any
additional
comments
(We want
your
expertise)?
10594631389
10592543726 Yes Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613
10561345731 Yes Yes
10559610688 Yes Yes Yes
10558924365
10558850146
10558685153 Yes
10557162374 Yes Yes
10556647426 Yes Yes
10556319920 Yes
10553788808 Yes Yes
10552983398 Yes Yes Yes
10552074281 Yes Yes
10550402764 Yes
10549771608 Yes Yes
10548528015 Yes Yes Yes
10548469322 Yes Yes Yes
10548450420 Yes Yes Yes
10548449124
10543731948
333
Table 94
Survey Three, Question Seven.
Have you seen SMEs change how they identify and describe hazards in a Cloud risk assessment
in the ways listed below? Please select all that apply:
334
Respondent
ID
New hazards
specific to
CSP,
infrastructure,
platform, or
service are
included
New
hazards
based on
the
network
path
between
on-
premises
and CSP
are
included
New hazards
based on
specific
differences
between on-
premises and
CSP
environments
are included
No new
hazards
are
included,
existing
on-
premises
definitions
used
Other
(Please
describe)
or any
additional
comments
(We want
your
expertise)?
10594631389
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613
10561345731 Yes Yes Yes
10559610688 Yes Yes
10558924365
10558850146
10558685153 Yes
10557162374 Yes
10556647426 Yes Yes Yes
10556319920 Yes Yes Yes
10553788808 Yes
10552983398
10552074281 Yes Yes Yes
10550402764 Yes Yes
10549771608 Yes Yes
10548528015 Yes
10548469322 Yes Yes
10548450420 Yes
10548449124
10543731948
335
Table 95
Survey Three, Question Eight.
Do you see the results of Cloud environment risk assessments and audits changing the way
SMEs conduct business in a meaningful way as per the choices below? Please select all that
apply.
336
Respondent
ID
Large
IT
budget
reducti
ons
Large
IT
budget
increas
es
Changes in
risk
mitigation
costs or
procedures
Changes
in risk
avoidance
costs or
procedure
s
Changes
in risk
transferen
ce costs or
procedure
s
Changes
in risk
accepta
nce
costs or
procedu
res
Other
(Please
describe)
or any
additional
comments
(We want
your
expertise)
?
105946313
89
105925437
26 Yes Yes
105923893
54 Yes Yes Yes
105889595
72 Yes Yes Yes Yes Yes
105726117
04 Yes Yes
105716286
13
105613457
31 Yes Yes Yes Yes
105596106
88 Yes Yes
105589243
65
105588501
46
105586851
53 Yes
105571623
74 Yes Yes
105566474
26 Yes Yes Yes Yes Yes
105563199
20 Yes Yes Yes Yes
105537888
08 Yes Yes Yes
105529833
98 Yes Yes Yes
105520742
81 Yes Yes Yes Yes Yes
105504027
64 Yes Yes Yes
337
105497716
08 Yes
105485280
15 Yes Yes
105484693
22 Yes Yes Yes Yes
105484504
20
105484491
24
105437319
48
338
Table 96
Survey Three, Question Nine.
When deciding who might be harmed and how, do you see SMEs including new Cloud based
factors such as those listed below? Please select all that apply.
Respondent
ID
National or
international
norms based
on where
the CSP is
based or
operates
National or
international
norms based
on where
the SME is
based or
operates
Specific
legal
requirements
for data such
as GDRP
10594631389
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613
10561345731 Yes
10559610688 Yes
10558924365
10558850146
10558685153
10557162374 Yes
10556647426 Yes
10556319920 Yes
10553788808 Yes
10552983398 Yes
10552074281 Yes
10550402764 Yes
10549771608 Yes
10548528015 Yes
10548469322 Yes
10548450420 Yes
10548449124
10543731948
339
Table 97
Survey Three, Question Ten
When assessing risk of Cloud environments, do you see SMEs changing their process in the
ways listed below? Please select all that apply
340
Respondent
ID
Using CSP
recommended
practices
Using any
IT
governance
frameworks
not
previously
used by the
SME
Using any
IT security
controls not
previously
used by the
SME
Using any
Cloud
security
control
guides not
previously
used by
the SME
Other?
(Please
describe)
Any
additional
comments
(We want
your
expertise)?
10594631389
10592543726 Yes Yes Yes
10592389354 Yes Yes Yes
10588959572 Yes Yes
10572611704 Yes Yes
10571628613
10561345731 Yes Yes Yes Yes
10559610688 Yes Yes
10558924365
10558850146
10558685153 Yes
10557162374 Yes
10556647426 Yes Yes
10556319920 Yes Yes Yes Yes
10553788808 Yes Yes Yes Yes
10552983398 Yes Yes
10552074281 Yes Yes
10550402764 Yes Yes Yes
10549771608 Yes Yes Yes
10548528015 Yes Yes Yes
10548469322 Yes
10548450420 Yes Yes Yes Yes
10548449124
10543731948
341
Table 98
Survey Three, Question Eleven.
Who do you see SMEs assigning risk ownership to regarding Cloud environments? Please select
all that apply.
Respondent
ID
SME
IT
team
SME
security
team
3rd
party
Business
owner
SME does
not change
risk
ownership
procedures
10594631389
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613
10561345731 Yes
10559610688 Yes
10558924365
10558850146
10558685153 Yes
10557162374 Yes
10556647426 Yes
10556319920 Yes
10553788808 Yes
10552983398 Yes
10552074281 Yes
10550402764 Yes
10549771608 Yes
10548528015 Yes
10548469322 Yes
10548450420 Yes
10548449124
10543731948
342
Table 99
Survey Three, Question Twelve.
When identifying controls to reduce risk in Cloud environments, do you see SMEs changing
their process in the ways listed below? Please select all that apply.
Respondent
ID
Primarily
relying
on CSP
provided
controls
Adapting
new
controls
from any
IT
governance
frameworks
Using
any
new
non-
Cloud
specific
IT
security
controls
Using
any
Cloud
security
control
guides
Other?
(Please
describe)
Any
additional
comments
(We want
your
expertise)?
10594631389
10592543726 Yes Yes
10592389354 Yes Yes Yes
10588959572 Yes Yes
10572611704 Yes
10571628613
10561345731 Yes Yes Yes
10559610688 Yes Yes Yes
10558924365
10558850146
10558685153 Yes
10557162374 Yes Yes
10556647426 Yes Yes
10556319920 Yes Yes Yes Yes
10553788808 Yes Yes Yes
10552983398 Yes Yes Yes
10552074281 Yes Yes
10550402764 Yes Yes
10549771608 Yes
10548528015 Yes Yes
10548469322 Yes
10548450420 Yes Yes
10548449124
10543731948
343
Table 100
Survey Three, Question Thirteen
Once controls have been identified for the SME’s Cloud environment, what effect do they have
on existing SME IT controls? Please select all that apply.
344
Respondent
ID
New
Cloud
controls
are kept
separate
from
existing
control
catalogs
New
Cloud
controls
are
combined
with
existing
controls
to form
larger
control
catalogues
New
Cloud
controls
promise
to replace
or reduce
existing
control
catalogs
spurring
increased
Cloud
transitions
New
Cloud
controls
appear
onerous
and
reduce
Cloud
transitions
due to
increased
difficulty
Other
(Please
describe) or
any
additional
comments
(We want
your
expertise)?
10594631389
10592543726 Yes
10592389354 Yes
10588959572 Yes
10572611704 Yes
10571628613
10561345731 Yes
10559610688 Yes
10558924365
10558850146
10558685153
10557162374 Yes
10556647426 Yes
New Cloud
controls
often
incompatible
with existing
controls.
10556319920 Yes Yes
10553788808 Yes Yes
10552983398 Yes
10552074281 Yes
345
10550402764 Yes Yes Yes
10549771608 Yes
10548528015 Yes
10548469322 Yes
10548450420 Yes
10548449124
10543731948
346
Table 101
Survey Three, Question Fourteen.
Have you seen Cloud risk assessments change other previously completed SME risk assessments
in the ways listed below? Please select all that apply.
347
Respond
ent ID
Previou
s risk
assessm
ents
change
d
because
of CSP
location
Previous
risk
assessm
ents
changed
because
of new
legal or
regulato
ry
require
ments
based
on
Cloud
usage
Previous
risk
assessm
ents
changed
because
of new
financial
require
ments
based
on
Cloud
usage
Previous
risk
assessm
ents
changed
because
of new
insuranc
e
require
ments
based
on
Cloud
usage
Previous
risk
assessm
ents
changed
because
of new
market
require
ments
based
on
Cloud
usage
Previous
risk
assessm
ents
changed
because
of new
operatio
nal
require
ments
based
on
Cloud
usage
Previous
risk
assessm
ents
changed
because
of new
strategic
require
ments
based
on
Cloud
usage
Other
(Please
describ
e) or
any
additio
nal
comm
ents
(We
want
your
experti
se)?
1059463
1389
1059254
3726 Yes
1059238
9354 Yes
1058895
9572 Yes
1057261
1704 Yes
1057162
8613
1056134
5731 Yes
1055961
0688 Yes
1055892
4365
1055885
0146
1055868
5153
1055716
2374 Yes
1055664
7426 Yes
1055631
9920 Yes
348
1055378
8808
1055298
3398 Yes
1055207
4281 Yes
1055040
2764 Yes
1054977
1608 Yes
1054852
8015 Yes
1054846
9322 Yes
1054845
0420 Yes
1054844
9124
1054373
1948
349
Table 102
Survey Three, Question Fifteen.
Cloud transitions almost always promise cost savings and Cloud operations usually require less
effort than on-premise IT operations. Cloud transitions, however, increase the risk and audit
teams’ responsibilities, knowledge and skills requirements. How do you see SMEs changing
their risk and audit teams to adapt to Cloud environments? Please select all that apply:
350
Respondent
ID
Increase
size and
budget
of risk
and
audit
teams
Reorganize
or change
structure
of risk and
audit
teams
Increase
outsourcing
or use of
consultants
to perform
Cloud risk
and audit
duties
Increase
workload
of
existing
risk and
audit
teams
Other
(Please
describe)
or any
additional
comments
(We want
your
expertise)?
10594631389
10592543726 Yes Yes
10592389354 Yes
10588959572 Yes Yes Yes
10572611704 Yes Yes Yes Yes
10571628613
10561345731 Yes Yes Yes
10559610688 Yes Yes Yes
10558924365
10558850146
10558685153 Yes
10557162374 Yes Yes Yes
10556647426 Yes Yes
10556319920 Yes Yes Yes Yes
10553788808 Yes Yes
10552983398 Yes
10552074281 Yes Yes Yes
10550402764 Yes
10549771608 Yes Yes
10548528015 Yes Yes Yes
10548469322 Yes
10548450420
10548449124
10543731948
351
Table 103
Survey Three, Question Sixteen.
Any additional comments or recommendations for the follow up survey?
Respondent
ID Answer
10594631389
10592543726
10592389354
10588959572
10572611704
10571628613
10561345731
10559610688
10558924365
10558850146
10558685153
10557162374
10556647426
10556319920
10553788808
10552983398
10552074281
This is some of the
best graduate level
work I have EVER
seen!
10550402764
10549771608
10548528015
10548469322
10548450420
10548449124
10543731948
352
Appendix E Validated survey instrument
SME Cloud Adoption Risk Guidance.
353
Table 104
Question One.
354
Question Choices
SMEs need help evaluating the
risks of Cloud adoption. This
survey is trying to help identify
risks that SMEs should take into
account as they move to the
Cloud. The audience for this
survey is not experienced risk
professionals. The information
provided in this survey is
intended to help business owners
and executives understand the
broad risks involved with
adopting Cloud computing.This
survey is free to use and may be
taken as many times as you like.
Each question has a comment
field if you would like to see any
changes or additions. Additional
comments or requests ca be sent
to SME_Cloud_Risk@
meersman.org. Each choice may
lead to a different
recommendation. This first
question is on the size of your
risk and audit team. This helps us
make realistic recommendations
for your organization. Your
organization has:
Any links or advice you would
like to share?
IT frameworks
or IT security
configuration
guidelines
CSP (Cloud
service provider)
choice
Type of Cloud
service or
environment
355
Method of
transition
Table 105
Question Two.
Question Choices
Your organization is large enough for a full-time internal audit team.
Your organization probably has an IT framework or IT configuration
standard in place. Your audit and risk team can help gauge the risk
associated with any early decisions that you make about a Cloud
transition. Your organization is large or in a heavily regulated field.
Your organization has well defined processes. Cloud adoption may
require modification of your current policies and procedures or
require new ones. Please select a decision to see more.
Any links or advice you would like to share?
A Full-time
internal risk and
audit team.
External audits
as needed,
including IT
audits.
Only financial
audits as
required.
No real audits
or risk
assessments.
356
Table 106
Question Three.
357
Question Choices
Your organization uses external
risk and audit teams as needed.
Your organization has
experience with providing
answers to a risk and audit
team. You organization may
use an IT standard or
configuration guide. Your
organization may have regular
IT audits. Your usual outside
audit and risk team may not
specialize in Cloud computing.
You may need to engage a new
IT audit and risk firm.
Any links or advice you would
like to share?
Engage
current vendor
for Cloud related
assessments.
Create RFP
for audit vendor
that specializes
in Cloud.
358
Use current
on-premises
guidance for new
Cloud
environment.
Table 107
Question Four.
Question Choices
Your organization has not yet
engaged a risk or audit team for
IT related matters. IT audits can
be part of a regular financial audit
or separate engagements. Your
organization may not have Cloud
expertise in your IT team, or you
may wish to get a less biased
opinion. Possible choices below.
Any links or advice you would
like to share?
Time to start
IT audits,
perhaps Cloud is
the first.
Your Cloud
risk assessment
will be done by
the internal IT
team.
Your Cloud
risk assessment
will be done by
the internal
audit team.
359
Table 108
Question Five.
360
Question Choices
Your organization is not
large enough to require
formal risk or audit
procedures. Unless you
think it is time for your
organization to adopt
formal risk procedures, an
ad-hoc approach to a
Cloud transition could
work. You should make
sure that you have the
requisite Cloud experience
either in your IT staff or
with an outside consultant.
Any links or advice you
would like to share?
Start
piecemeal,
Cloud app by
Cloud app.
Engage a
consultant or
third party for
your Cloud
transition.
Hire a Cloud
expert to join
your IT team.
361
Train
existing IT
team in Cloud.
Table 109
Question Six.
362
Question Choices
Many large enterprises
with full audit and risk
teams use one of the
below for IT governance
or IT security control
standards. If you
recognize one of the
choices below, there is a
standard for how your
risk team performs
evaluations. Each choice
listed leads to links that
can help describe the
Cloud risk assessment
process for that
framework.
Any links or advice you
would like to share?
COBIT
ITIL
ISO/IEC
NIST SP 800-53
NIST
Cybersecurity
Framework
HIPAA
PCI-DSS
GDPR
363
Table 110
Question Seven.
Question Choices
Useful links for an organization such as yours regarding Cloud
frameworks
includes:https://deloitte.wsj.com/riskandcompliance/2018/11/26/moving-
to-the-cloud-engage-internal-audit-
upfront/https://read.acloud.guru/cloud-risk-management-requires-a-
change-to-continuous-compliance-mindset-
bca7252eecd0?gi=10febf09c20chttps://www.icaew.com/technical/audit-
and-assurance/assurance/what-can-assurance-cover/internal-audit-
resource-centre/how-to-audit-the-
cloudhttps://www.corporatecomplianceinsights.com/wp-
content/uploads/2014/12/PwC-A-guide-to-cloud-audits-12-18-
14 https://www.pwc.com/us/en/services/risk-assurance/library.html
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
364
Table 111
Question Eight.
Question Choices
When deciding which CSP to use, the following links should
help.https://www.zdnet.com/article/how-to-choose-your-cloud-provider-
aws-google-or-
microsoft/https://www.researchgate.net/publication/323670366_Criteria_for
_Selecting_Cloud_Service_Providers_A_Delphi_Study_of_Quality-of-
Service_Attributeshttps://lts.lehigh.edu/services/explanation/guide-
evaluating-service-security-cloud-service-providershttps://nordic-
backup.com/blog/10-mistakes-choosing-cloud-computing-providers/
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
365
Table 112
Question Nine.
Question Choices
Besides choosing a Cloud provider(s), the type of Cloud
infrastructure is also important. The following links should
help.https://aws.amazon.com/choosing-a-cloud-
platform/https://www.cloudindustryforum.org/content/code-
practice-cloud-service-
providershttps://kirkpatrickprice.com/blog/whos-
responsible-cloud-
security/https://www.datamation.com/cloud-
computing/iaas-vs-paas-vs-saas-which-should-you-
choose.html
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
366
Table 113
Question Ten.
367
Question Choices
The transition to a Cloud environment can be done many
ways. Each way has different risks and risk assessment
procedures. The links below should
help.https://www.businessnewsdaily.com/9248-cloud-
migration-challenges.htmlhttps://medium.com/xplenty-
blog/how-to-transition-to-the-cloud-the-basics-
75e7ca06f959https://www.365datacenters.com/portfolio-
items/best-practices-to-transition-to-the-
cloud/https://serverguy.com/cloud/aws-migration/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
368
Table 114
Question Eleven.
Question Choices
Your organization may have some regulatory and framework
guidelines that will impact your Cloud transition. The links below
should help.https://www.ucop.edu/ethics-compliance-audit-
services/_files/webinars/10-14-16-cloud-
computing/cloudcomputing https://read.acloud.guru/cloud-risk-
management-requires-a-change-to-continuous-compliance-mindset-
bca7252eecd0?gi=10febf09c20chttps://www.infoq.com/articles/cloud-
security-auditing-challenges-and-emerging-
approacheshttps://www.pwc.com/us/en/services/risk-
assurance/library.html
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
369
Table 115
Question Twelve.
370
Question Choices
The choice of a CSP(s) depends on important risk choices. The links
below should help clarify the basis of those risk
choices.https://www.zdnet.com/article/how-to-choose-your-cloud-
provider-aws-google-or-
microsoft/https://www.brighttalk.com/webcast/11673/136325/cloud-
security-and-compliance-solution-for-
smbhttps://www.entrepreneur.com/article/226845https://lts.lehigh.edu/se
rvices/explanation/guide-evaluating-service-security-cloud-service-
providers
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
371
Table 116
Question Thirteen.
Question Choices
Not only are CSP choices based on important
risk decisions, so too are types of Cloud
computing environments. The links below
should help.https://aws.amazon.com/choosing-a-
cloud-
platform/https://kirkpatrickprice.com/blog/whos-
responsible-cloud-
security/https://www.fingent.com/blog/cloud-
service-models-saas-iaas-paas-choose-the-right-
one-for-your-
businesshttps://blog.resellerclub.com/saas-iaas-
paas-choosing-the-right-cloud-model-for-your-
business/
Any links or advice you would like to share?
Extremely helpful
Very helpful
Somewhat helpful
Not so helpful
Not at all helpful
372
Table 117
Question Fourteen.
373
Question Choices
Once the CSP(s) and type of Cloud computing
environment are chosen, the transition process from on-
premises to the Cloud has risks that need to be assessed.
The links below should
help.https://www.businessnewsdaily.com/9248-cloud-
migration-challenges.htmlhttps://medium.com/xplenty-
blog/how-to-transition-to-the-cloud-the-basics-
75e7ca06f959https://www.365datacenters.com/portfolio-
items/best-practices-to-transition-to-the-
cloud/https://serverguy.com/cloud/aws-migration/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
374
Not at all
helpful
375
Table 118
Question Fifteen.
376
Question Choices
Your organization most likely does not have specific IT frameworks or
configuration requirements. The links below should help determine what
framework type risks your organization has regarding a Cloud
transition.https://www.ucop.edu/ethics-compliance-audit-
services/_files/webinars/10-14-16-cloud-
computing/cloudcomputing https://assets.kpmg/content/dam/kpmg/ca/pd
f/2018/03/cloud-computing-risks-
canada https://www.pwc.com/us/en/services/risk-
assurance/library.htmlhttp://www.cloudaccess.com/smb/
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
377
Not at all
helpful
Table 119
Question Sixteen.
Question Choices
There are risks to choosing any CSP. Understanding the strengths and
weaknesses of various CSPs will help make clear the risk decisions that
your organization will need to make. The links below should
help.https://www.cloudindustryforum.org/content/8-criteria-ensure-
you-select-right-cloud-service-
providerhttps://searchcloudcomputing.techtarget.com/feature/Top-
considerations-for-choosing-a-cloud-
providerhttps://searchcloudsecurity.techtarget.com/essentialguide/How-
to-evaluate-choose-and-work-securely-with-cloud-service-
providershttps://www.entrepreneur.com/article/226845
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
378
Table 120
Question Seventeen.
Question Choices
Once a CSP is chosen, there are still
important details to consider such as what
type of Cloud computing environment your
organization plans to use in the CSP. The
risks for each type are well understood and
informed decisions can be made. The links
below should
help.https://kirkpatrickprice.com/blog/whos-
responsible-cloud-
security/https://www.fingent.com/blog/cloud-
service-models-saas-iaas-paas-choose-the-
right-one-for-your-
businesshttps://blog.resellerclub.com/saas-
iaas-paas-choosing-the-right-cloud-model-
for-your-
business/https://www.datamation.com/cloud-
computing/iaas-vs-paas-vs-saas-which-
should-you-choose.html
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
379
Table 121
Question Eighteen.
380
Question Choices
The way your organization chooses to move to the Cloud raises
several risk questions that should be resolved. The links below
should help.https://www.businessnewsdaily.com/9248-cloud-
migration-challenges.htmlhttps://cloudacademy.com/blog/cloud-
migration-benefits-
risks/https://visualstudiomagazine.com/articles/2018/05/01/moving-
to-the-cloud-a-piecemeal-
strategy.aspxhttps://support.office.com/en-gb/article/move-
completely-to-the-cloud-f46ff7c8-b09e-4cc5-8a37-184fcfec1aca
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
381
Not at all
helpful
Table 122
Question Nineteen.
Question Choices
Your organization has not had to deal with many audits or risk
assessment processes yet. Moving to the Cloud may be your
organization’s most important IT decision. The links below should help
provide general guidance for organizations similar to
yours.https://www.patriotsoftware.com/accounting/training/blog/small-
business-risk-analysis-assessment-
purpose/https://www.himss.org/library/health-it-privacy-
security/sample-cloud-risk-
assessmenthttps://www.isaca.org/Journal/archives/2012/Volume-
5/Pages/Cloud-Risk-10-Principles-and-a-Framework-for-
Assessment.aspxhttps://securityintelligence.com/smb-security-best-
practices-why-smaller-businesses-face-bigger-risks/
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
382
Table 123
Question Twenty.
Question Choices
The choice of CSP has important risk and cost ramifications.
While the best course may be to engage a consultant to help,
the links below should help you understand the choices
more clearly.https://www.businessnewsdaily.com/5851-
cloud-storage-
solutions.htmlhttps://www.cnet.com/news/best-cloud-
services-for-small-
businesses/https://www.insight.com/en_US/solve/small-
business-solutions/cloud-and-data-center-
transformation/cloud-
services.htmlhttps://www.cloudindustryforum.org/content/8-
criteria-ensure-you-select-right-cloud-service-provider
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
383
Table 124
Question Twenty-one.
Question Choices
Even after making the choice of which CSP to
use, you need to decide on the details of your
Cloud computing environment. As with all your
other business decisions, the details are
important. The links below should help you
understand the differences between the types of
Cloud.https://www.businessnewsdaily.com/5851-
cloud-storage-
solutions.htmlhttps://www.fossguru.com/iaas-
cloud-computing-small-
business//https://www.g2.com/categories/cloud-
platform-as-a-service-
paashttps://kirkpatrickprice.com/blog/whos-
responsible-cloud-security/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
384
Table 125
Question Twenty-two.
Question Choices
How your organization transitions to the Cloud may seem
straightforward but there are risks associated with any method you
choose. The links below should help you understand those risks and
make appropriate choices.https://lab.getapp.com/security-risks-of-
cloud-
computing/https://www.forbes.com/sites/theyec/2018/12/18/cloud-
computing-for-small-businesses-what-you-need-to-
know/https://www.businessnewsdaily.com/9248-cloud-migration-
challenges.htmlhttps://www.upwork.com/hiring/development/moving-
to-cloud-servers/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
385
Table 126
Question Twenty-three.
Question Choices
Your organization may follow the COBIT framework. Useful links
include:http://www.isaca.org/Knowledge-
Center/Research/Pages/Cloud.aspxhttps://www.researchgate.net/publication/31186
3817_COBIT_Evaluation_as_a_Framework_for_Cloud_Computing_Governanceh
ttps://cloudsecurityalliance.org/working-groups/cloud-controls-matrix/
Any links or advice you would like to share?
Extre
mely
helpful
Very
helpful
Some
what
helpful
Not so
helpful
Not at
all
helpful
386
Table 127
Question twenty-four.
Question Choices
Your organization may use the ITIL service delivery framework. Useful links
include:https://www.simplilearn.com/itil-key-concepts-and-summary-
articlehttps://www.informationweek.com/devops/itil-devops-whatever—the-
labels-dont-matter/d/d-
id/1332650https://www.itilnews.com/index.php?pagename=ITIL_and_Cloud_
Computing_by_Sumit_Kumar_Jhahttps://www.axelos.com/news/blogs/march-
2019/itil-4-and-cloud-based-services
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
387
Table 128
Question Twenty-five.
388
Question
Choice
s
Your organization may follow ISO/IES JTC 1 policies. ISO/IEC JTC 1/SC 38
is Cloud specific. Useful links
include:https://www.iso.org/committee/601355.htmlhttps://www.iec.ch/dyn/w
ww/f?p=103:22:0::::FSP_ORG_ID:7608https://www.itworldcanada.com/blog/
cloud-computing-standards-update-iso-jtc1sc38-2/380069
Any links or advice you would like to share?
Extre
mely
helpful
Very
helpful
Som
ewhat
helpful
389
Not
so
helpful
Not
at all
helpful
Table 129
Question Twenty-six.
Question Choices
PCI-DSS is a security standard for SMEs that handle credit card transactions.
Some useful links
include:https://www.pcisecuritystandards.org/https://www.bigcommerce.co
m/blog/pci-compliance/https://www.paymentsjournal.com/gdpr-and-pci-dss/
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
390
Table 130
Question Twenty-seven.
Question Choices
Your organization may have compliance based IT security policies.
NIST 800-53 and NIST CF are widely used. Useful links
include:https://www.nist.gov/programs-projects/nist-cloud-
computing-program-nccphttps://www.nist.gov/baldrige/products-
services/baldrige-cybersecurity-
initiativehttps://docs.aws.amazon.com/quickstart/latest/compliance-
nist/templates.htmlhttps://docs.microsoft.com/en-
us/azure/security/blueprints/nist171-paaswa-overview
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
391
Table 131
Question Twenty-eight.
392
Question Choices
The NIST CF provides a policy framework for Cybersecruity.
Useful links include:https://www.nist.gov/baldrige/products-
services/baldrige-cybersecurity-
initiativehttps://www.nist.gov/programs-projects/nist-cloud-
computing-program-
nccphttps://docs.aws.amazon.com/quickstart/latest/compliance-
nist/templates.htmlhttps://docs.microsoft.com/en-
us/azure/security/blueprints/nist171-paaswa-overview
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
393
Not at all
helpful
Table 132
Question Twenty-nine.
Question Choices
HIPAA has very distinct requirements for IT and Cloud usage.
Some useful links include:https://www.hhs.gov/hipaa/for-
professionals/special-topics/cloud-
computing/index.htmlhttps://aws.amazon.com/compliance/hipaa-
compliance/https://hosting.review/file-storage/hipaa-compliant-
cloud-
storage/https://hitinfrastructure.com/features/understanding-
hipaa-compliant-cloud-options-for-health-it
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
394
Table 133
Question Thirty.
Question Choices
GDPR is a new requirement for many SMEs. Some useful
links may be found below:https://gdpr-
info.eu/https://martechtoday.com/guide/gdpr-the-general-
data-protection-
regulationhttps://www.cloudindustryforum.org/content/cloud-
and-eu-gdpr-six-steps-
compliancehttps://www.paymentsjournal.com/gdpr-and-pci-
dss/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
395
Table 134
Question Thirty-one.
Question Choices
The Center for Internet Security has many good tools
and guides for Internet and Cloud Security
including:https://www.cisecurity.org/white-
papers/cis-controls-cloud-companion-
guide/https://www.cisecurity.org/cis-
benchmarks/https://www.cisecurity.org/cybersecurity-
best-practices/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
396
Table 135
Question Thirty-two.
Question Choices
The CSA has many useful procedures and practices. Some oft good ones
include:https://cloudsecurityalliance.org/https://aws.amazon.com/complianc
e/csa/https://www.microsoft.com/en-us/trustcenter/compliance/csa-self-
assessment
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
397
Table 136
Question Thirty-three.
Question Choices
There are many links to recommendations for AWS security, some of the
good ones include:https://aws.amazon.com/compliance/security-by-
design/https://d1.awsstatic.com/whitepapers/compliance/Intro_to_Securit
y_by_Design https://s3-us-west-2.amazonaws.com/uw-s3-cdn/wp-
content/uploads/sites/149/2018/12/28193639/Tim-
Sandage_Amazon_Secure-By-Design-%E2%80%93-Running-Compliant-
Workloads-on-AWS
Any links or advice you would like to share?
Extremely
helpful
Very helpful
Somewhat
helpful
Not so helpful
Not at all
helpful
398
Table 137
Question Thirty-four.
Question Choices
There are many links to recommendations for Azure security, some of the
good ones
include:https://www.cisecurity.org/benchmark/azure/https://docs.microsoft.co
m/en-us/azure/security/security-best-practices-and-
patternshttps://talkingazure.com/posts/exploring-azure-security-center-virtual-
machine-baseline/
Any links or advice you would like to share?
Extremely
helpful
Very
helpful
Somewhat
helpful
Not so
helpful
Not at all
helpful
Exploring the Strategies of Enhanced Organizational Learning in Small- and M edium-
Sized Enterprises
Dissertation
Submitted to Northcentral Uni
v
ersity
Graduate Faculty of the School of Business and Technology M anagement
in Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF PHILOSOPHY
by
KAREN A. B. COCHRAN
Prescott Valley, Arizona
March 2013
UMI Number: 35698
92
All rights reserved
INFORMATION TO ALL USERS
The quality o f this reproduction is dependent upon the quality o f the copy submitted.
In the unlikely event that the author did not send a com plete m anuscript
and there are missing pages, these will be noted. Also, if material had to be removed,
a note will indicate the deletion.
UMI
Dissertation PiiblishMig
UMI 3569892
Published by ProQuest LLC 2013. Copyright in the Dissertation held by the Author.
Microform Edition © ProQ uest LLC.
All rights reserved. This w ork is protected against
unauthorized copying under Title 17, United States Code.
ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, Ml 48106-1346
Copyright: 20
13
Karen A.B. Cochran
APPROVAL PAGE
Exploring the Strategies o f Enhanced Organizational Learning in Small and Medium-
Sized Enterprises
By
Karen A.B. Cochran
Approved by:
VP Academic Affairs: HeatherTrederick Ph.D. Date
Certified by:
< $ ' 7 - 2 0 / 3 School Dean: A. Lee Smith, Ph.D. Date
Abstract
Fluctuations in the global economy, transforming industries, and increased business
bankruptcies have compelled the leaders of industrial organizations to focus on
increasing organizational learning capacity. The problem studied was that a sound
strategy for leaders of small and medium-sized enterprises (SMEs) to increase
organizational learning did not exist. The qualitative multiple-case study referenced
complexity leadership theory (CLT) to explore and identify strategies that increased
organizational learning within the business acumen and subsequently aided SM E leaders
in sustaining economic competitive status. Data were collected through face-to-face
inter
vi
ews with twelve SME leaders representing four embedded case-study sites on the
east and west coasts of the United States. The SME leaders were representative of the
SME industrial manufacturing sector, which provides 86% o f employment in the United
States. The participants had no distinction between the tasks of establishing a strategic
plan and implementing a tactical plan; and were responsible for financial sustainability,
daily operations, and talent management. Analysis of data from the individual cases was
followed by a cross-case synthesis, resulting in four themes (a) communication, (b)
learning environment, (c) compensation, and (d) innovation. The study results revealed a
strategy which utilized a flat-lined organizational structure to enable rapid,
unembellished, and transparent communication, and cultivated an open learning
environment. Expressive ideas were welcomed, innovation generated new product
offerings, and market capabilities were expanded. SME leaders subsequently offered
increased compensation and consequently talent was retained. The organizational
structure, and the creation of open learning environments by the SME leaders promoted
the emergence of complex adaptive systems. The study results provided a theoretical
answer to the problem statement; met the intended purpose o f the study; and contributed
to the learning theory body of
knowledge.
The study advanced the knowledge base
regarding how leaders of SME companies approach increasing organizational learning to
sustain the business. CLT translated beyond the realm of education and contributed to
increasing organizational learning in the industrial manufacturing sector. Future studies
should be conducted to include the perspectives o f the SME workforce. Additionally
CLT should be examined in the services industries sector.
v
Acknowledgments
The stamina required to complete the journey was a gift from my father, the late Clarence
Patrick Buote, who taught me very early in life that I could do anything as long as I
focused my mind. Daddy, I am you, I love you, I miss you, and I know you are sharing
in this moment.
To my mother Rainey, thank you for the hugs, neck rubs, prayers, and having the faith in
me when I knew the impossible was winning. I would have been ABD if not for your
continuous love and encouragement.
To my other half Steve, thank you for the countless hours of our relationship you have
graciously surrendered while I buried my face on a computer screen. You watered and
fed me, and made me go to bed. It is now time for our lives together without
interruption, and alas the celebratory cruise.
To my son, James, I thank you for instigating this journey on my M BA graduation day
with your question, “So when are you going to be a doctor” ? You have been a
tremendous source of discussion, knowledge, encouragement, and love. You are an
inspiration to me and all of your students. I am so very proud of you; all my love, Mom.
To my chair, Dr. Steve Munkeby, from the marathon phone calls to the bar-b-que dinners
in Huntsville, your guidance, patience, and willingness to listen have enabled me to grow
academically and as a person. I am forever grateful for everything you have given me.
To Dr. Ying Liu, thank you for your critiques and support. I hope to see you in New
York sometime in the near future. To my editors, Toni Williams and A licia Clayton,
thank you for helping translate a large body of effort into a cohesive, smooth flowing
manuscript of meaningful jargon. I am blessed to have found all of you.
To the participants, thank you for your candor and willingness to share with me your
time, thoughts, and strategies. Thank you for what you do for our Nation and to protect
our Freedom. You are heroes. You have given me a gift that will never be replicated.
To Dr. R. Marion and Dr. M. Uhl-Bien, thank you for your fascinating translation of
biological complex adaptive systems to the theoretical existence within exploratory
fields. I have found many CAS on my doctoral journey and am excited to expand CLT. I
look forward to working with you in the future.
To my boss, Laura Truax, who would send me e-mails with a reference: D o not read this,
you should be working on your paper, the instantaneous generated smile was better than
any energy drink. Your time-management and multitasking examples got me to the end.
To many, many coworkers, family, and friends who have supported, encouraged, and
prayed for me, knowing I had to provide you with milestone check-ins kept me going.
Your belief in me provided light during the dark hours, and yes, you have to call me
doctor, at least for a few months. Finally, Donna and Harold Acosta— start the paella!
vi
Table of Contents
List of T a b le s………………………………………………………………………………………………………………..
ix
List of Figures……………………………………………………………………………………………………………….. x
Chapter 1: Introduction…………………………………………………………………………………………………..
1
Background………………………………………………………………………………………………………………2
Statement of the Problem………………………………………………………………………………………….
5
Purpose of the S tudy…………………………………………………………………………………………………
6
Theoretical Framework…………………………………………………………………………………………….
7
Research Questions…………………………………………………………………………………………………..
9
Nature of the S tudy………………………………………………………………………………………………… 10
Significance of the S tudy………………………………………………………………………………………..
12
Definition of Key Term s………………………………………………………………………………………… 13
Sum m ary……………………………………………………………………………………………………………….. 1
8
Chapter 2: Literature Review……………………………………………………………………………………….. 20
Documentation………………………………………………………………………………………………………. 20
Historical Perspective of Business M anagem ent…………………………………………………….
23
Individual Learning Theories………………………………………………………………………………….
26
Organizational Learning Theory……………………………………………………………………………..
30
Complexity T heory…………………………………………………………………………………………………38
Complexity Leadership Theory: The Nascent D isciplines………………………………………
40
Complexity Leadership Theory: The Context o f S M E s…………………………………………..
50
Complexity Leadership Theory: Framework L im itations……………………………………….
62
Competitive Edge……………………………………………………………………………………………………
64
Implementation of Change— R esisters……………………………………………………………………
65
Sum m ary………………………………………………………………………………………………………………..
68
Chapter 3: Research M ethod…………………………………………………………………………………………
70
Research Method and D e s ig n …………………………………………………………………………………
71
Population………………………………………………………………………………………………………………
74
Sample……………………………………………………………………………………………………………………
75
M aterials/Instruments……………………………………………………………………………………………. 77
Data Collection, Processing, and A nalysis……………………………………………………………..
79
Assumptions…………………………………………………………………………………………………………..
83
Lim itations……………………………………………………………………………………………………………. 83
Delimitations………………………………………………………………………………………………………….
84
Ethical Assurances………………………………………………………………………………………………….84
Sum m ary………………………………………………………………………………………………………………..
86
Chapter 4: Findings………………………………………………………………………………………………………
88
Results……………………………………………………………………………………………………………………
89
Evaluation of Findings………………………………………………………………………………………….
108
Sum m ary……………………………………………………………………………………………………………… 131
Chapter 5: Implications, Recommendations, and C onclusions…………………………………… 1
33
Implications…………………………………………………………………………………………………………..1
36
Recommendations………………………………………………………………………………………………… 1
45
Conclusions…………………………………………………………………………………………………………..
148
References………………………………………………………………………………………………………………….
150
Appendices
………………………………………………………………………………………………………………… 168
Appendix A: Permission to Conduct Research— Case A ……………………………………….1
69
Appendix B: Permission to Conduct Research— Case B ……………………………………….
170
Appendix C: Permission to Conduct Research— Case C ……………………………………….
171
Appendix D: Permission to Conduct Research— Case D ……………………………………….1
72
Appendix E: Various Contributions of Researchers to Learning T h eo ry………………..
173
Appendix F: Assessment Protocol………………………………………………………………………… 1
80
Appendix G: Source Map for Assessment Protocol Instrument……………………………..
183
Appendix H: Participant— Informed Consent F orm ……………………………………………….
187
Appendix I: Transcription Instructions………………………………………………………………….
189
viii
List of Tables
Table 1 Literature Review Synthesis: Span o f T im e ………………………………………………………
22
Table 2 Case-Study Site Candidate Performance H istories…………………………………………..
76
Table 3 Data Reduction and A nalysis……………………………………………………………………………81
Table 4 Research Question Q l: Emergent Them es…………………………………………………….
91
Table 5 Research Question Q l, Prominent Theme: Communication………………………….. 91
Table 6 Research Question Q2: Emergent Them es……………………………………………………..
93
Table 7 Research Question Q2, Prominent Theme: Innovation……………………………………
94
Table 8 Research Question Q3: Em ergent Them es……………………………………………………..
96
Table 9 Research Question Q3, Prominent Theme: Communication……………………………..96
Table 10 Research Question Q4: Emergent Them es……………………………………………………
97
Table 11 Research Question Q4, Prominent Theme: Compensation…………………………….
98
Table 12 Research Question Q5: Emergent T hem es…………………………………………………..
100
Table 13 Research Question Q5, Predominant Theme: Highest quality p r o d u c t
101
Table 14 Research Question Q6: Emergent T hem es…………………………………………………..
102
Table 15 Research Question Q6, Prominent Theme: Integrity…………………………………… 102
Table 16 Overarching Themes…………………………………………………………………………………….
104
Table 17 Strategies to Enhance Organizational Learning and Theoretical Inference…
130
ix
List of Figures
Figure 1. Literature search strategy.
……………………………………………………………………………
21
Figure 2. Conceptual model of traditional hierarchical organizational structure
43
Figure 3. Conceptual model of complex leadership theory organizational structure
44
Figure 4. Conceptual model of complex adaptive systems……………………………………….
49
Figure 5. Model of culture of competitiveness, knowledge development, and cycle time
performance in supply chains………………………………………………………………………………………. 65
Figure 6. Multiple-case-study design for the study that demonstrates three people were
interviewed in each case………………………………………………………………………………………………. 73
Figure 7. Methodology implementation flow ……………………………………………………………….74
1
Chapter 1: Introduction
The U.S. Census Bureau (2009) defined small and medium-sized enterprises
(SMEs) as companies with less than 500
employees.
The U.S. Department of Labor
Bureau of Labor Statistics (USDLBLS, 2009) indicated SMEs provided 86% of
employment in the United States in March 2008. More than half of the SMEs are
industrial manufacturing firms, and provided products to the aerospace and automotive
industries (Platzer,
2009).
Orders from larger corporations, such as General Motors, to
SMEs contribute in excess of $100 billion annually to the global economy (Gilbert,
Rasche, & Waddock, 2011; Thorton, 2010). Small and medium-sized enterprises are
therefore considered a business sector with significant impact to employment and the
global economy (American Bankruptcy Institute [ABI], 2009; Fulton & Hon, 2009;
Platzer, 2009; USDLBLS,
2009).
Large corporate and SME business leaders recognize the need to address the
dynamics of the current economic landscape to avoid the downward trend in productivity
and to protect the sustainability of their companies (ABI, 2009; Area & Prado-Prado,
2008). Traditional means of trimming budgets and reducing costs have created internal
savings (Prokopeak et al., 2011). However, the resultant savings are retained rather than
reinvested in hiring new workers and expanding the business (Prokopeak et al., 2011).
Consequently, 21st-century corporate and SME leaders are seeking alternative means of
internal growth for business sustainability and to gain competitive advantage in the
marketplace (Crawford, Hasan, W am e, & Linger, 2009).
Chapter 1 includes the background of the study, the problem and purpose
statement, and the theoretical framework. Also included in Chapter 1 are the research
questions; the nature and significance of the study; and definitions o f the terms used in
the study. The chapter ends with a summary.
Background
The ABI (2009) noted a downward trend in the domestic economy beginning in
early 2008. From the first quarter of 2008 to the first quarter of 2009, business
management in the United States posted a 64.3% increase in business bankruptcy filings
(ABI, 2009). By the third quarter of 2009, a record number o f U.S. companies (4,585)
had filed for protection under U.S. Bankruptcy Code Chapter 11 (ABI, 2009). The
increase in bankruptcies among larger corporations, the largest of which was General
Motors with $91 billion in assets, led to a disruption in orders to SMEs and subsequently
contributed to more than 42% of bankruptcies in manufacturing SMEs (ABI, 2009; Ben-
Ishai & Lubben, 2011; Lubben, 2009).
Unemployment in the United States continued to increase from 5.4% in January
2008, to 8.5% in January 2009, to 10.4% in January 2010 (“Notes on Current,” 2011;
USDLBLS, 2010a). Increased manufacturing costs were passed to consumers, and,
concurrent with the rise in u