
CareerEDGE Employability: Examining if a difference exists between online and traditional college students

 

Submitted by Edward J. Files

Chapter 1: Introduction to the Study

Introduction

This is a quantitative, causal-comparative study investigating whether a difference exists in scores on factor 2 (work & life experience skills) and factor 4 (generic skills) of the CareerEDGE Employability Development Profile (EDP) survey instrument between online college pathways students and traditional college pathways students in the Bachelor program at a large Western Christian university. Per a study by Dacre-Pool, Qualter and Sewell (2013), “There has been little empirical research conducted in relation to graduate employability and diagnostic tools available in this area are very limited” (p. 303).

A study from the U.S. Government Accountability Office (2012) and another by Hong, Polanin, Key and Choi (2014) suggest that there is a need for further research in employability development assessment. The significance of this study is that it will produce statistical data leading to a more accurate, essential and deeper understanding of the relationship between education and the application of employability skill sets of 2016 graduates. Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, on July 22, 2014, there is a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22).

The importance of this study is that it will produce statistical evidence to determine if a difference exists in the employability skills of online college pathways students versus traditional college pathways students. If a difference exists, this study will help define and explore its implications. Moreover, the two major educational delivery pathways, online and traditional college degree modes, are often investigated from only one perspective, such as that of online college pathways students (Allen & Seaman, 2011, 2012, 2013, 2014). This study will present evidence gathered from current literature describing both online college pathways student perspectives (Allen & Seaman, 2015; Essary, 2014) and traditional college pathways student perspectives (Cataldi, Siegel, Shepherd & Cooney, 2014; NCES, 2014; U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014). This study will use the CareerEDGE Employability Development Profile (EDP) survey instrument (Dacre-Pool & Sewell, 2007) to guide the research questions exploring the difference in employability skills between these two educational groups. Per the government’s (2014) report, “More evidence is needed to fill gaps in knowledge, improve job training programs, inform practitioners about adopting promising strategies, and expand proven models to address the needs of specific groups of workers, industries, communities and institutions” (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, pp. 21-22).

Chapter one of this study covers the introduction to the study, background of the study, the problem statement, the purpose of the study, the research questions and hypotheses, significance of the study, as well as the advancing scientific knowledge section. The rationale for the methodology, nature of the research design, definition of terms used in the study, assumptions, limitations and delimitations of the study and the summary and organization of the remainder of the study follow.

Background of the Study

This non-experimental, quantitative, causal-comparative research study will explore the differences in employability skills of students from online college pathways and traditional college pathways, as determined from the Bachelor students’ perspective in the academic year 2016 at a large Western Christian university. Per numerous national and international studies (Allen & Seaman, 2014; Chanco, 2015; Youhang & Dongmao, 2015), employers complain that institutional education manifests a disconnect between the educational skills taught and the employability skills employers say they need for a modern workforce (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a).

A recent study by Chanco (2015) suggests that college graduates’ employability skills in their field of study do not match those needed by executive hiring authorities. Per the qualitative articles and literature explored, a difference exists between the educational skill sets being taught by colleges and the employability skills employers say they need (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a). The literature broadly agrees that a mismatch exists between the skills taught by colleges and those needed by employers, but no studies distinguish between the two educational groups; past and current studies combine or lump both educational pathways together when determining whether a mismatch in skill sets exists. This study will provide a statistical answer to whether a difference exists between online college pathways students and traditional college pathways students. The importance to the field is significant: whether a single curriculum or two different curricula are needed to bring best practices back to education can only be determined once all variables have been examined separately.

Problem Statement

It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from the online college pathway and the traditional college pathway in the Bachelor program at a large Western Christian university. The possible mismatch between the employability skills of online college pathways students and traditional college pathways students has not been investigated separately (Allen & Seaman, 2011, 2012, 2013, 2014, 2015). Past and current studies, articles, and literature reviews either lump the two educational pathways leading to a Bachelor degree together or investigate only one of them (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a). This study will determine if a difference exists between the educational pathways, thus allowing practitioners from education and business to determine where changes can be made to produce optimal benefit to both institutions.

The affected population of approximately 21.596 million students in 2016 (21.266 million in 2015 plus approximately 330,000 added in 2016; NCES, 2017) adds an economic perspective: a national student loan debt crisis of $1.3 trillion (Chopra, 2013, Consumer Financial Protection Bureau). Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, on July 22, 2014, there is a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22).

The two educational delivery pathways, online college degrees and traditional college degrees, are often investigated from only one of the two perspectives, and comparison between them is limited to a single perspective in most cases (Allen & Seaman, 2012, 2013, 2014, 2015; Brungardt, 2011; Deepa & Seth, 2013; Robles, 2012; Sharma & Sharma, 2010; Weaver & Kulesza, 2014). This study will investigate both online college pathways and traditional college pathways, using a causal-comparative research design, to determine if a statistical difference exists between the two pathways. If a statistical difference exists, educational institutions will be able to update current curricula to focus on areas where employability skills development is lacking. Additionally, employers will be able to use the findings to better partner with educational institutions in developing internships and/or school-work programs that focus on skills development in these areas.

An example of institutions using skills evaluation to develop partnerships is the Lindsey and Rice (2015) study using a test known as the Situational Test of Emotion Management (STEM), whose findings showed that one or more online courses added to the traditional college pathways curriculum increased student scores. Their findings showed that students benefit from the “time, training, experience, and practice of interpersonal skills in an online environment” (p. 126). Statistical evidence such as that developed by the Lindsey and Rice (2015) study demonstrates how valuable the findings of this study, using CareerEDGE Employability Development Profile (EDP) instrument scores, could be. The CareerEDGE Employability Development Profile (EDP) instrument was specifically designed and developed, and is in current use in Canada, to place students with partnered institutions and employers through internships and/or school-work programs (Dacre-Pool, Qualter & Sewell, 2013).

Purpose of the Study

The purpose of this study is to investigate whether a difference exists in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, and factor 4, generic skills, of students from the online college pathway and the traditional college pathway in the Bachelor program at a large Western Christian university. This is a quantitative, causal-comparative research study using a general target population (N) totaling 43,725 students, of which 7,975 are traditional college pathways students and 35,750 are online college pathways students in the Bachelor program at a large Western Christian university. The independent variable group consists of online college pathways students and traditional college pathways students, and the dependent variable group consists of the two dependent variables, factors 2 and 4, that support the CareerEDGE Employability Development Profile (EDP) instrument skill set questions (Dacre-Pool & Sewell, 2007; Dacre-Pool, Qualter & Sewell, 2013).

This study will add to the body of knowledge called for in the U.S. Departments of Labor, Commerce, Education and Health & Human Services (2014, July 22) report, which calls for further research in several areas aimed at employability skills development. This study will suggest solutions to the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways in the Bachelor program at a large Western Christian university. Solutions to the problem and contributions to the field will lie in areas such as adjusting curriculum design (Hart Research Associates, 2015), authenticating that learning outcomes match the employability skills needed (Britt, 2015), and assuring that interpersonal skills development matches the employability skills needed by 2017 employers (Lindsey & Rice, 2015).

Research Question(s) and Hypotheses

This quantitative, causal-comparative research study will add to the current body of research by investigating whether a statistically significant difference exists between online college pathways students and traditional college pathways students in the Bachelor program at a large Western Christian university, using their CareerEDGE Employability Development Profile (EDP) instrument scores. The instrument used to frame the research questions for this survey questionnaire study is supplied by Dacre-Pool and Sewell (2007). The permission letter to use the instrument and the questionnaire (survey questions) are found in Appendix D (pp. 192-194). To assure reliability, a test-retest process was used in the development of the CareerEDGE Employability Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007; Dacre-Pool, Qualter & Sewell, 2013).

RQ1:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?

H1o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

H1a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

RQ2:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?

H2o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

H2a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

The independent variable groups measured are online college pathways students and traditional college pathways students. The dependent variable group consists of factor 2 (work & life experience skills) and factor 4 (generic skills) from the CareerEDGE Employability Development Profile (EDP) survey questionnaire (see Appendix D, p. 195). These variables align with the research questions, hypotheses, and theoretical foundations of this study (Becker, 1964; Collins, 1979; Durkheim, 1895; Schultz, 1961, in Walters, 2004; U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, July 22).

It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from the online college pathway and the traditional college pathway in the Bachelor program at a large Western Christian university. The two research questions relate directly to the two educational delivery methods available to all degree-seeking students: online college pathways and traditional college pathways. The hypotheses are worded to answer the central question of the study: does a statistically significant difference exist between online college pathways students and traditional college pathways students in the Bachelor program at a large Western Christian university? The data will be analyzed after both online and traditional college pathways students from the Bachelor degree-seeking populations under investigation for the educational year 2016 complete the 18 survey questions associated with factor 2, work & life experience skills (2 questions), and factor 4, generic skills (16 questions).
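To make the planned data reduction concrete, the following minimal sketch (in Python) shows how the 18 item responses could be summed into the two factor scores per respondent before group comparison. The column names, the example Likert values, and the sum-scoring approach are illustrative assumptions for this sketch, not the EDP’s published scoring protocol.

import pandas as pd

# Hypothetical responses: 2 items for factor 2 and 16 items for factor 4,
# plus the pathway group label (the independent variable). All values are
# placeholder illustrations, not study data.
df = pd.DataFrame({
    "pathway": ["online", "traditional", "online"],
    **{f"f2_q{i}": [4, 3, 5] for i in range(1, 3)},    # factor 2 items (2)
    **{f"f4_q{i}": [3, 4, 4] for i in range(1, 17)},   # factor 4 items (16)
})

# Sum each respondent's item responses to obtain the two factor scores.
df["factor2_score"] = df[[f"f2_q{i}" for i in range(1, 3)]].sum(axis=1)
df["factor4_score"] = df[[f"f4_q{i}" for i in range(1, 17)]].sum(axis=1)

# Group means preview the comparisons posed by RQ1 and RQ2.
print(df.groupby("pathway")[["factor2_score", "factor4_score"]].mean())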

Advancing Scientific Knowledge

Per literature reviews dating back to 2002 (Allen & Seaman, 2002, 2003) and continuing through current literature (U.S. Department of Education, 2014a; White House, 2015a), the new population of students will most likely graduate with the wrong workforce readiness skills for the jobs they seek (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015). This study will advance scientific knowledge by adding empirical statistical data to the body of literature, answering the question of whether a difference exists between online college pathways students and traditional college pathways students in the Bachelor program of a large Western Christian university in the academic year 2016.

Moreover, this quantitative study will narrow the sample size from the 805 (n) respondents in the original study by Dacre-Pool and Sewell (2007) to 210 (n) respondents, the sample size required for adequate statistical power and validity. This sampling population covers both traditional college pathways student respondents and online college pathways student respondents, making this the newest and most inclusive study measuring workforce readiness skills using the CareerEDGE Employability Development Profile (EDP) survey instrument (Dacre-Pool & Sewell, 2007).

The literature broadly agrees that a mismatch exists between the skills taught by colleges and those needed by employers (Allen & Seaman, 2011, 2012, 2013, 2014; Deming, Goldin & Katz, 2012), but no studies statistically distinguish between the two educational groups, online and traditional college degree students (Brungardt, 2011; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a). Most past and current studies either address the topic from one perspective or lump both educational pathways together when determining whether a mismatch in skill sets exists (Deepa & Seth, 2013; Kyllonen, 2013; Mannapperuma, 2015). This study will provide a statistical answer to whether a difference exists between online college pathways students’ and traditional college pathways students’ skill sets and offer contrasting views/perspectives (Amrein-Beardsley, Holloway-Libell, Cirell, Hays & Chapman, 2015; Britt, 2015; Cappelli, 2015; Soulé & Warrick, 2015). The importance to the field is significant: a single best-practices curriculum can only be developed once all variables have been examined separately (Lindsey & Rice, 2015).

This study will add to the body of knowledge requested in the U.S. Departments of Labor, Commerce, Education and Health & Human Services (2014, July 22) report, which calls for further research in several areas aimed at workforce readiness skills development. Per national and international studies, employers complain that institutional education manifests a disconnect between the workforce readiness skills taught and the job-ready skills employers say they need (Allen & Seaman, 2014, 2015; Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; Youhang & Dongmao, 2015). This study will statistically identify whether the need exists for further investigation into academic areas such as curriculum design (Sharma & Sharma, 2010), authentic learning outcomes (Britt, 2015) and interpersonal skills development (Lindsey & Rice, 2015). Mastering these skill areas is bedrock to any successful doctorate in education with concentrations in organizational and leadership development, as is being sought by this learner.

The theoretical foundation used to support this study is Human Capital Theory (HCT) (Becker, 1964; Schultz, 1961; in Walters, 2004), and the two major theories associated with Human Capital theory (HCT) are the Functionalist theory model of education (Durkheim, 1895), and the Credentialist theory model of education (Collins, 1971, 1979). These two major sub-theories of Human Capital Theory (HCT) will be discussed in detail as they directly relate to the theoretical foundation of this study by demonstrating operational models (Collins, 1971, 1979; Durkheim, 1895). These theories will explain possibilities for the disconnect between those skills being taught and the skills executive hiring authorities say they need from their entry-level workforce.

Significance of the Study

The significance of this study is that it will produce statistical data that will lead to a more accurate, essential and deeper understanding of the relationship between education and application of employability skill sets of 2017 graduates. Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, (2014, July 22), there is a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22).

Additionally, this study will inform the business perspective (Light & McGee, 2015) and community environments (Sidhu & Calderon, 2014), where skills gaps were identified by the U.S. Departments of Labor, Commerce, Education and Health & Human Services (2014, July 22) report. Per the federal government’s (2014) report, “More evidence is needed to fill gaps in knowledge, improve job training programs, inform practitioners about adopting promising strategies, and expand proven models to address the needs of specific groups of workers, industries, communities and institutions” (p. 21).

The need for this study is justified by the fact that college students, both domestic and international, are not graduating with the skill sets they need for employability as entry-level workers in 2016 organizations (Cai, 2013; Iuliana, Dragoș & Mitran, 2014; Maurer, 2015; Sibolski, 2012). Additionally, the two research questions and hypotheses each address an individual student pathway of the Bachelor degree program at a large Western Christian university. This research study will determine an answer to the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways in the Bachelor program at a large Western Christian university. Once it is determined whether, and to what degree, a difference exists between the two educational pathways, practical applications can be directed to the problem areas.

The potential practical applications of the findings of this study will directly affect the body of knowledge concerning educational administration and curriculum development (Iuliana, Dragoș & Mitran, 2014; Sibolski, 2012), as well as business workforce readiness skills development (Cai, 2013; Maurer, 2015) and government initiatives to increase knowledge and operational applications (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014; U.S. Departments of Labor and Education, 2015, April 16).

Rationale for Methodology

This quantitative, causal-comparative research study will add to the current body of research investigating the workforce readiness skills acquired by students from online college pathways and traditional college pathways by examining the differences between the two educational groups. This will be accomplished by examining results from the Bachelor program at a large Western Christian university, as determined by students’ CareerEDGE Employability Development Profile (EDP) instrument scores. A quantitative methodology using the survey research method (Brungardt, 2011) is the best methodology compared to others available, such as qualitative or mixed designs (Brasoveanu, 2012; Gay & Weaver, 2011; Gravetter & Forzano, 2012), because it will answer, statistically, the degree to which the researcher can defend findings that answer the research questions and hypotheses, as well as address the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways in the Bachelor program at a large Western Christian university.

Brungardt’s (2011) study of the emerging academic discipline of leadership studies suggests that the purpose of the survey research method is “to gather large amounts of data from groups of people demographically and geographically dispersed to maximize sample populations” (p. 5). Fray (1996) suggests that the first step in methodology selection should concern what information the investigator wishes to know, and then determine what research design method is needed to acquire that data in the fewest number of questions. Additionally, Fray (1996) suggests that the survey questionnaire method avoids the open-ended responses of qualitative designs, which lend themselves to respondent bias.

The rationale for selecting a quantitative methodology and a causal-comparative research method comes directly from the low respondent response rate faced in the Dacre-Pool and Sewell (2007) study and from this study’s need for a numerical accounting of its data. Additionally, the causal-comparative survey research method aligns with Dacre-Pool and Sewell’s (2007) original study (providing a comparison value across the eight years of recession between the studies) and increases the probability of a larger response rate through the increase in the general target population size (N) to 43,725 respondents. This will produce findings that can be generalized to a larger population in the fields of labor, commerce, education, and health and human services (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014; U.S. Departments of Labor and Education, 2015, April 16).

Nature of the Research Design for the Study

This study uses a causal-comparative research design, which investigates the relationships between two or more groups that differ on a variable of interest and compares them on another variable or variables without manipulating those variables (Airasian & Gay, 2003). Basically, causal-comparative designs use two groups with the intent of understanding the causes, or in some instances the effects, of the two groups being different (Grand Canyon University, 2016, CIRT-Basic Research Designs). Data collection will be conducted through a questionnaire (survey method) instrument (Dacre-Pool & Sewell, 2007). This researcher has permission from the authors to use the CareerEDGE Employability Development Profile (EDP) (2007) survey instrument for this study (Appendix D, pp. 182-183).

Because this is a study of volunteers, sampling only adult students from the Bachelor program at a large Western Christian university, a site authorization application will be submitted to the university’s Institutional Review Board (IRB) network prior to study initiation. The site authorization application will describe the purpose and scope of the research, duration of the study, target population, impact on operations and resources, data use, and potential benefit to the university and participants. The actual email survey instrument will be initiated by the large Western Christian university through its Email Survey Distribution system. All email survey requests (including initial requests, follow-ups and reminders) that have the necessary and/or appropriate site authorization and IRB approval will be distributed by the email survey distribution manager.

An invitation to participate in an online survey will be sent to all Bachelor program students from the online student pathway and the traditional student pathway of study for the academic year 2016. The invitation (e-mail) introduction letter will discuss the materials and instrument used in this study and describe the purpose and significance of the study to the student population and university. Additionally, the risks and benefits of student participation will be addressed, as will the sample selection rationale, protection of rights and well-being, maintenance of data security, the sample recruitment process, and the data collection instruments and approaches. A follow-up letter will be sent via e-mail five days after the letter of introduction to encourage full participation by the student population of the large Christian university. The survey instrument will be hosted by the university, and a link to the website will be provided to complete the survey. The participants’ informed consent form (Appendix B, pp. 185-188) will be included as required by the Institutional Review Board (IRB) of the university. The survey will consist of 18 survey questions from the CareerEDGE Employability Development Profile (EDP) instrument, from factor 2 (work & life experience skills) and factor 4 (generic skills), for measurement across the two independent variable groups, traditional student pathways and online student pathways, in the Bachelor program at a large Western Christian university.

The general target population size (N) for the study is 43,725 respondents, from which a sample size (n) of 210 respondents is required for adequate statistical power and validity, as determined using the G*Power instrument from Faul, Erdfelder, Lang and Buchner (2007a) (Figure 4, G*Power Distribution Plot, p. 195). This study will explore the difference between two populations, online college pathways students and traditional college pathways students, using the scores derived from the CareerEDGE Employability Development Profile (EDP) factor 2 (work & life experience skills) and factor 4 (generic skills) (Dacre-Pool & Sewell, 2007).

Definition of Terms

Traditional Education. Traditional education is defined as classes taken through traditional face-to-face contact with instructors in a set location, also referred to as brick-and-mortar schools (Raj & Al-Alawneh, 2010).

Online Learning. With the advent of online educational delivery systems in 1994 (Hill, 2012), online education delivers classes 24 hours per day, 7 days a week through the college or university’s portal system (intranet), which is connected to students through the Internet.

Workforce Readiness Skills. Skills or skill sets are defined as belonging to a set of both soft and hard abilities that the college graduate has acquired during matriculation in a specific career field or through personal or work experience (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015).

Soft skills (interpersonal, emotional, cognitive, spatial, etc.) describe what the learner acquires through education and interaction with instructors, fellow students, family and the community (Yin & Volkwein, 2010).

Hard skills are defined as job-related skills, those specific to the career domain of the degree field studied (Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a).

Independent Variable Group. The independent variable groups being measured are online college pathways students and traditional college pathways students (Hill, 2012; Raj & Al-Alawneh, 2010).

Dependent Variable Group. The dependent variables of interest are the two factors measured by the 18 (EDP) survey questions scored by members of the Bachelor program at a large Western Christian university. These variables align with the research questions and hypotheses, as well as the theoretical foundations of this study (Becker, 1964; Collins, 1979; Durkheim, 1895; Schultz, 1961, in Walters, 2004). The two CareerEDGE Employability Development Profile (EDP) factors measured by this study are Factor 2, Experience Work/Life Skills, and Factor 4, Generic Skills (Dacre-Pool & Sewell, 2007).

Assumptions, Limitations, Delimitations

The instrument is a self-report tool and can only measure students’ own perceptions of their employability; while there is research suggesting that many self-perceptions are associated with actual behavior (e.g., Bandura, 1995; Bandura et al., 2003; Compte & Postlewaite, 2004), scrutiny should be employed when interpreting instrument results. Accordingly, if the tool is used to evaluate interventions, or if further experimental studies are done, caution should be taken when interpreting those results.

In any study, there are multiple possible sources of limitations beyond the literature reviews themselves (Baruch & Holton, 2008; Branch, 2007; Podsakoff, MacKenzie & Podsakoff, 2012). While one can deflect the possibility of incorrect information and data from primary research studies and articles by closely checking one’s references, some forms of limitation are less identifiable. Limitations of this study may include a low response rate on the survey questions, non-completion of survey instrument questions, respondent bias toward online surveys, and methodology bias (the biasing effects that measuring two or more constructs with the same methodology may have on estimates of the relationships between them) (Podsakoff, MacKenzie & Podsakoff, 2012).

For the sake of this study, a delimitation is defined as a shortcoming of the study resulting from the researcher’s decision-making process. One such decision, using a research design that incorporates a convenience sampling frame, produces a delimitation concerning the population size (N) being restricted to just the Bachelor student degree program (Baruch & Holton, 2008). Using a low general population (N) size directly affects the sample size response rate (n) in this survey research study. The minimum requirement, per the large Western Christian university’s general rule of thumb on survey research, is 10 subjects per survey question, or 280 respondents for this study.

For this study, an a priori power analysis was conducted to justify the study sample size based on the anticipated effect size (d = 0.3), with critical t = 1.2857988, df = 200 and actual power = 0.8015396, which indicated that a sample of 210 (n) respondents was needed for statistical validity (G*Power, 2015) (Figure 4, p. 204). The survey instrument will be hosted by the large Western Christian university’s email survey system to minimize further delimitations in the Data Collection and Management section.
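As a cross-check of the reported G*Power output, the following sketch (in Python, using statsmodels) solves for the per-group sample size. The one-tailed test at alpha = 0.10 is an assumption inferred from the reported critical t and degrees of freedom; this sketch is offered only as an open-source approximation, not as the study’s official power computation.

from statsmodels.stats.power import TTestIndPower

# Assumed settings, inferred from the reported critical t (1.2858) and
# df (200): one-tailed test, alpha = 0.10, anticipated d = 0.3, power = 0.80.
n_per_group = TTestIndPower().solve_power(
    effect_size=0.3,       # anticipated Cohen's d
    alpha=0.10,            # significance level
    power=0.80,            # desired statistical power
    alternative="larger",  # one-tailed alternative
)
print(f"Required sample size per group: {n_per_group:.1f}")
# Roughly 101 per group (about 202-210 total respondents), consistent with
# the 210 respondents and df = 200 reported above.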

Summary and Organization of the Remainder of the Study

Chapter 1 has provided an overview of the problem, the background of the problem, the purpose of the study, the rationale for the methodology, the research questions, hypotheses, significance of the study, advancing scientific knowledge, definitions of terms, assumptions, limitations and delimitations, and the nature of the research design of the study (Hart Research Associates, 2015; Nye, 2015).

Chapter 2 presents a review of the current literature on the workforce readiness skills of students from both traditional college pathways and online college pathways, how to assess the workforce readiness skills, and the effect the workforce readiness skills taught have on education today (Bryant & Bates, 2015; Lysne & Miller, 2015).

Chapter 3 focuses on the research methodology used in this study. It covers the research design, general and sample respondent sizes, the setting, the research instrument, data analysis, validity and reliability, ethical considerations, and a summary (Cornacchione & Daugherty, 2013; Jonbekova, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015).

Chapter 4 details the raw data collected and the data analysis procedures completed. It also discusses any new limitations discovered during data collection and the procedures used to address them. This section consists of the results of the research, the analysis of the results, and the author’s sub-conclusions. The data will be illustrated in tables, figures, diagrams, charts and/or graphs, arranged so that specific groups of data correspond to the tables, figures, diagrams, charts or graphs used in the analysis of the results. This section will be as logical, cumulative, and simple as possible. Sub-conclusions and findings will always make clear which facts and/or published literature they are based on (Podsakoff, MacKenzie & Podsakoff, 2012).

Chapter 5 concludes the study with the discussion, conclusions, and any recommendations for future research, change, or both, based on the results of this research study.

Chapter 2: Literature Review

Introduction to the Chapter and Background to the Problem

This non-experimental, quantitative, causal-comparative research study will add to the current body of research investigating the workforce readiness skills acquired by students from online college pathways and traditional college pathways by examining the differences between the two educational groups. This examination will investigate the Bachelor program at a large Western Christian university, where students’ CareerEDGE Employability Development Profile (EDP) instrument scores will be collected (Dacre-Pool & Sewell, 2007). This is in line with the recent call from the U.S. Department of Labor, Department of Commerce, Department of Education and Department of Health and Human Services (2014, July 22) report on what works in job training. In addition, per the study that supplied the CareerEDGE Employability Development Profile (EDP) survey instrument questionnaire for this study, “There has been little empirical research conducted in relation to graduate employability…” (Dacre-Pool, Qualter & Sewell, 2013, p. 303).

This quantitative study will expand the general population size from the 805 (N) respondents originally selected to 43,725 (N) respondents by including all members of the Bachelor program at a large Western Christian university. The increase in general population size (N) will support a sample size (n) of approximately 210 respondents, the number needed for G*Power statistical validity (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b). The literature review will address the following sections to develop knowledge of the topic and will describe how these sections apply to the current literature through empirical primary and secondary resource investigation. The literature review opens with the introduction and background of the study. The next section, theoretical foundations, covers the conceptual framework of Human Capital Theory (HCT) and the two sub-theories directly affecting education, functionalist theory and credentialist theory (Becker, 1964; Collins, 1971, 1979; Durkheim, 1895; Schultz, 1961, in Walters, 2004, pp. 99-100). Additionally, the instrumentation’s theoretical design model, that of survey questionnaires (Dillman, Smyth, & Christian, 2014; Ornstein, 2013), is discussed as it applies to this study.

Survey instrument design is based in probability theory, specifically frequency probability (Brasoveanu, 2012), where (n) equals the sample size, which is not fixed, as it depends on a robust general population size (N), and the frequency probability is defined over the sample space (the set of all possible outcomes) (Brasoveanu, 2012; Penrose, 1995). The next section is an expanded view (185+ primary and secondary resources) and clarification of the literature reviewed, the variable groups, and the perspectives of the actors directly involved with the problem statement of this study. The rationale for the methodology is discussed, the dependent and independent variable groups are described, and the effect of leadership studies on both soft and hard skills development is addressed (Brungardt, 2011). An a priori power analysis was run to determine the sample size appropriate for statistical validity; that number was 210 (n) total respondents (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b; UCLA, 2015, Institute for Digital Research and Education) (Figure 5, p. 173). A section on instrumentation follows, then definitions of variables such as workforce readiness skills, soft skills and hard skills (Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a).

Following these sections is the core topic discussion, where perspectives on workforce readiness skills are presented from numerous viewpoints by prominent authors in labor, commerce, education, and health and human services (U.S. Department of Labor, Department of Commerce, Department of Education and Department of Health & Human Services, 2014, July 22). These sections also cover the global and government perspectives on workforce readiness skills development (Chanco, 2015; National Statistics Office of Philippines, 2015; U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, July 22). This section discusses three degree pathways: for-profit, non-profit, and state (Deming, Goldin & Katz, 2012). Additionally, traditional and online learning are defined and discussed from multiple perspectives (Lindsey & Rice, 2015). The next sections cover workforce readiness skills from the employers’, educational institutions’ and students’ perspectives (Deepa & Seth, 2013; Kyllonen, 2013; Mannapperuma, 2015). The final section of the literature review discusses contrasting views/perspectives on the topic from several seminal authors (Amrein-Beardsley, Holloway-Libell, Cirell, Hays & Chapman, 2015; Britt, 2015; Cappelli, 2015; Soulé & Warrick, 2015). Chapter 2 concludes with a summary highlighting the important points discussed.

This study will further extend past research by investigating the workforce readiness skills acquired by students from online college pathways and traditional college pathways, examining the differences between the two educational groups’ scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, and factor 4, generic skills, as determined in the Bachelor program at a large Western Christian university. Whether the workforce readiness skills being taught match those that 21st-century employers say they need from entry-level employees cannot be determined by lumping the two different educational pathways together. The two pathways must be investigated separately to determine if a difference exists in educational skill sets, and what may cause any such differences, before curricula can be designed to produce better-skilled graduates.

Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, on July 22, 2014, there is a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22). The call for further research concerning workforce readiness skills is exacerbated by the affected population of 21.266 million U.S. students who will be attending colleges and universities in 2015 (NCES, 2014, Table 1, 1990 through fall 2023). Per literature reviews dating back to 2002 (Allen & Seaman, 2002, 2003), this new population of students will most likely graduate with the wrong workforce readiness skills for the jobs they seek (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; Youhang & Dongmao, 2015; U.S. Department of Education, 2014a; White House, 2015a). Moreover, those figures are projected to increase by approximately 330,000 students each following year through 2023 (NCES, 2014, Table 1, 1990 through fall 2023) (Table 1, p. 174).

Per Allen and Seaman (2002, 2003) and international studies as recent as 2015 (Chanco, 2015; Săveanu & Buhaş, 2015), employers complain that institutional education manifests a disconnect between the workforce readiness skills taught and the job-ready skills employers say they need (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015). The issue of online college pathways and traditional college pathways students not having the workforce readiness skills employers want from their entry-level workforce is not solely endemic to the U.S. business environment (Chanco, 2015; Jianru & Yinan, 2015). Recent studies drawn from diverse demographic and geographic global literature strongly suggest that the mismatch between college students’ workforce readiness skills in their field of study and those needed by executive hiring authorities should be measured on a global scale (Chanco, 2015; Jianru & Yinan, 2015). Studies from such diverse geographic locations as China, Romania, Tajikistan, the Philippines and the European Union, to name but a few, show these countries facing the same issues with hiring qualified college students as their U.S. counterparts (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015). This supports the problem statement of this study: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways. The significance of this study is that it will produce statistical data leading to a more accurate, essential and deeper understanding of the relationship between education and the application of employability skill sets of new graduates.

Theoretical Foundations and/or Conceptual Framework

The conceptual framework for this non-experimental, quantitative, causal-comparative research study is based in Human Capital Theory (HCT) (Becker, 1964; Schultz, 1961; in Walters, 2004, pp. 99-100). This section postulates the theoretical foundations and theories associated with academic training directly relating to the development of workforce readiness skills (Becker, 1964; Collins, 1979; Durkheim, 1895; Schultz, 1961; in Walters, 2004, pp. 99-100). This theoretical foundation supports answering the research questions and hypotheses, as well as the problem statement derived from the background to the problem section, where literature reviews illustrate two pronounced but differing theories in education relating to the study topic.

Instrumentation Theoretical Model

The model upon which the study instrument is built is that of a questionnaire, normally regarded as a survey design (Dillman, Smyth, & Christian, 2014; Ornstein, 2013). The theory supporting this design is based on probability theory, specifically frequency probability (Brasoveanu, 2012), where (n) equals the sample size, which is not fixed, as it depends on a robust general population size (N), and the frequency probability is defined over the sample space (the set of all possible outcomes) (Blakstad, 2015; Brasoveanu, 2012; Penrose, 1959).
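To illustrate the frequency interpretation of probability underlying this sampling logic, the short simulation below (in Python, with an arbitrary, hypothetical proportion) shows the relative frequency of an event stabilizing toward its probability as the number of observations grows.

import random

random.seed(42)
p_true = 0.6  # hypothetical population proportion, chosen only for illustration

# As n grows, the observed relative frequency approaches p_true, which is
# the frequency-probability rationale for sampling from a large population (N).
for n in (10, 100, 1_000, 10_000):
    hits = sum(random.random() < p_true for _ in range(n))
    print(f"n = {n:>6}: relative frequency = {hits / n:.3f}")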

Theoretical Foundation Used

The theoretical foundation used to support this study is Human Capital Theory (HCT) (Becker, 1964; Schultz, 1961, in Walters, 2004), and the two major theories associated with Human Capital theory (HCT) are the Functionalist theory model of education (Durkheim, 1895), and the Credentialist theory model of education (Collins, 1971, 1979).

Human Capital Theory (HCT)

What is HCT? Human Capital Theory of education (Becker, 1964; Schultz, 1961; in Walters, 2004, pp. 99-100) is an economic theory closely related to the later technical-functional theory (Collins, 1971). The foundation of Human Capital Theory suggests that increases in education are directly related to increased demand for skilled labor. Hence, college students will continue to pursue additional educational degrees until the opportunity cost of procuring additional education surpasses the benefits it provides (Cornacchione & Daugherty, 2013; Walters, 2004).

How does HCT fit this study? HCT is directly aligned to this study, as it is bedrock to pedagogy; the teacher/student paradigm is prevalent in both online college pathways and traditional (ground campus) college pathways and in workforce readiness skills attainment for college students. Attached to this theory are operationalized theory models associated with the perspectives of educators and seminal authors in the fields of education, economics and social theory (e.g., Emile Durkheim, 1858-1917).

Theories Supporting Theoretical Foundation

Two major theories (sub-theories) of Human Capital Theory (HCT) will be discussed in detail as they directly relate to the theoretical foundation of this study by demonstrating operational models (Collins, 1979; Durkheim, 1895) that could explain possibilities for the disconnect between those skills being taught and the skills executive hiring authorities say they need from their entry-level workforce.

Functionalist Theory Operational Model

What is functionalism? One of the oldest theories most closely associated with HCT in education is functionalism (Durkheim, 1895). Functionalist theory in education posits “That it is the role of education to transmit core values and social control through the attributes that support the political and economic systems that fuel education” (Durkheim, 1892, in Walters, 2004, p. 99). Per functionalist theory, technological and economic innovations raise the skill levels required to perform the jobs of the future (Walters, 2004).

How does functionalist theory fit this study? The research topic for this study explores the differences in workforce readiness skills obtained by students from online college pathways and traditional college pathways, as determined from the Bachelor program students at a large Western Christian university. Functionalist theory (Durkheim, 1892) is directly aligned with the economic and technological innovation needed to raise the workforce readiness skill levels required to perform jobs in 2016 and beyond. Additionally, this study is a quantitative analysis of the workforce readiness skills being taught by colleges and universities to online and traditional college students, as determined by their CareerEDGE Employability Development Profile (EDP) instrument scores.

Credentialist Theory Operational Model

What is the credentialist theory model? Per the credentialist theory model, learning in 2016’s educational institutions is more about accepted standards of social norms than about contributory learning and cognitive skill sets (Collins, 1979). Collins’s (1979) credentialist theory model posits that employers use credentials (degrees, certifications, etc.) as a contributory factor in elevating educated workers to more lucrative pay and career positions.

How does the credentialist theory model fit this study? Currently, employability is judged on several factors, one of which is the credentials (degrees, certifications, etc.) an entry-level college graduate has attained in their chosen field of study. The credentials offered by entry-level graduate employees will determine the position they receive and the starting salary of that position (Collins, 1979). Comparing credentialist theory and functionalist theory is important to connecting theory and application.

Connecting Theory and Application

The subject of this study, whether a difference exists in skill sets between students of online college pathways and traditional college pathways, is bedrock to these two theories. Functionalist theory (Durkheim, 1895) supports education’s operational model for teaching: transmitting core values and social control through the attributes that support the political and economic systems. Credentialist theory supports business’s operational model of hiring, where credentials (degrees, certifications, etc.) must be supported by the workforce readiness skills that employers demand from graduate entry-level employees.

This study will explore, through statistical analysis of the two pertinent factors, factor 2 and factor 4, using the 18 CareerEDGE Employability Development Profile (EDP) skills questions (Dacre-Pool & Sewell, 2007) that are associated with both higher educational credentialing (Collins, 1979) and the teaching of economic and technological functionalism (Durkheim, 1895). The variables associated with this study relate directly to the workforce readiness skills development that entry-level college students need to attain employability upon graduation. The results of this study will add to the body of knowledge, as justified by the U.S. Departments of Education, Labor, Commerce, and Health and Human Services (2014, July 22) report calling for further research studies in workforce readiness skills development.

Variable Association to Theoretical Foundations

The two (EDP) factor variables (dependent variable group) being measured are:
Factor 2 – Experience Work/Life
Factor 4 – Generic Skills (Dacre-Pool & Sewell, 2007).

The rationale for using each theory in this study relates directly to the quantitative methodology and causal-comparative research design, which further address, statistically, the gap within the literature related to theory versus operationalized practice. This study will add to the body of knowledge of Human Capital Theory and both the functionalist and credentialist theories by determining whether there is a difference in the workforce readiness skills of online and traditional college students, and which specific skill sets each possesses; this could directly affect administration and curriculum design in education, as well as the employability of future hires. The scores derived from analysis of the survey instrument will aid in answering the research questions and hypotheses, as well as produce knowledge relevant to the areas of labor, commerce, education, and health and human services, as called for by the U.S. Departments of Labor, Commerce, Education and Health and Human Services (2014, July 22) report.

Review of the Literature

This non-experimental, quantitative, causal-comparative study will add to the current body of research investigating the workforce readiness skills acquired by students from online college pathways and traditional college pathways by examining the differences between the two educational groups, as determined from the Bachelor program at a large Western Christian university through their CareerEDGE Employability Development Profile (EDP) instrument scores. Employers are vigorously complaining that both online distance learning and face-to-face contact models of education manifest a disconnect between the educational skills taught and the job-ready skills employers need (Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a).

Because of the increased popularity of online classes, the total affected population from both online and traditional educational pathways is 21.266 million students who attended college in the academic year 2015 alone (NCES, 2014, Fast Facts). Whether the workforce readiness skills being taught match those that 21st-century employers say they need from entry-level employees cannot be determined by lumping the two different educational pathways together. The two pathways must each be investigated to determine if a difference exists in educational skill sets, and what may cause any such differences, before curricula can be designed to produce better-skilled graduates. The significance of this study is that it will produce statistical data leading to a more accurate, essential and deeper understanding of the relationship between education and the application of employability skill sets of future graduates.

This study will examine the gap in knowledge shown in literature reviews: a disconnect between the academic skills colleges and universities are teaching and the skills today’s organizations say they need from their entry-level workforce (Bessolo, 2011; Brungardt, 2011; Cai, 2013; Lindsey & Rice, 2015; Robles, 2012; Soulé & Warrick, 2015). This gap is supported by the findings of Sidhu and Calderon (2014), who report that “more than one-third (39%) of business leaders are not confident that U.S. college students are graduating with the skills and competencies that their businesses need” (p. 1). The literature review further supports this disconnect, as three-quarters (75%) of chief academic officers from educational institutions consider their pedagogical offering highly valued (Allen & Seaman, 2013).

Per recent studies drawn from global references, this disconnect between the skills taught and the skills organizations need from their entry-level workforce should be measured on a global scale (Chanco, 2015; Săveanu & Buhaş, 2015). Studies from China, Romania, Tajikistan, the Philippines and the European Union produce results showing they face the same issues with hiring talent as their U.S. counterparts (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015). Additionally, this problem is further exacerbated by the fact that 21.266 million U.S. students entered colleges and universities in 2015 (NCES, 2014, Fast Facts), and those figures are projected to increase by approximately 330,000 students each following year through 2023 (NCES, 2014, Table 1, p. 174). This disconnect between what college chief academic officers believe is a credible academic offering and what organizations, governments and global studies show demands further empirical study (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; Youhang & Dongmao, 2015; U.S. Department of Education, 2014a; White House, 2015a).

Background to the Problem

Numerous studies concerning workforce readiness skills (Brungardt, 2011; Deepa & Seth, 2013; Lindsey & Rice, 2015; Robles, 2012; Weaver & Kulesza, 2014) have been published in the eight years since Dacre-Pool and Sewell’s (2007) study. However, most of these studies are approached from one of five perspectives: (a) that of the employer (Iyengar, 2015; Kelly, 2015; Kyng, Tickle & Wood, 2013; Zhao, 2015), (b) that of the educational institutions (Iuliana, Dragoș & Mitran, 2014; Soulé & Warrick, 2015), (c) that of the student (Hart Research Associates, 2015; Iliško, Skrinda & Mičule, 2014), (d) that of government (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, July 22) and (e) that of the global community (Chanco, 2015; Youhang & Dongmao, 2015). The two major delivery paths, online college pathways and traditional college pathways, are often investigated from only one of these perspectives, and comparison between the two is minimal in most cases (Allen & Seaman, 2012, 2013, 2014). This study will investigate both online college pathways and traditional college pathways from all five perspectives.

Rationale for Methodology

This quantitative, causal-comparative research study will add to the current body of research on workforce readiness skills acquired by students from online college pathways and traditional college pathways by examining the differences between the two educational groups, drawn from the Bachelor degree program at a large Western Christian university, using their CareerEDGE Employability Development Profile (EDP) instrument scores. A quantitative methodology is best suited to this purpose (Brasoveanu, 2012; Gay & Weaver, 2011) because it will answer the research questions and hypotheses, as well as the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways. It is the best methodology compared to the alternatives, such as qualitative or mixed methodologies, because it provides statistical evidence with which the researcher can defend findings that answer the research questions and hypotheses (Gravetter & Forzano, 2012). By examining the acquired workforce readiness skills of both online and traditional college pathways students through their raw scores on the EDP survey instrument, a definitive answer to the question of whether a difference exists will be attained. This will be accomplished through statistical analysis using a two-tailed independent samples t-test (Blakstad, 2015; Light & McGee, 2015).
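
As a minimal sketch of this planned analysis, the Python fragment below runs a two-tailed independent samples t-test on two hypothetical score arrays; the scores and variable names are illustrative placeholders, not study data.

    # Hypothetical sketch of the planned two-tailed independent samples t-test.
    # The score lists below are illustrative placeholders, not study data.
    from scipy import stats

    online_scores = [14, 17, 15, 18, 16, 13, 17, 15]       # EDP factor scores, online pathway (hypothetical)
    traditional_scores = [15, 14, 16, 13, 15, 14, 16, 12]  # EDP factor scores, traditional pathway (hypothetical)

    # ttest_ind returns a two-tailed p-value by default
    t_stat, p_value = stats.ttest_ind(online_scores, traditional_scores)

    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
    if p_value < 0.05:
        print("Reject the null hypothesis: the group means differ.")
    else:
        print("Fail to reject the null hypothesis: no difference detected.")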

Dependent Variable Groups

Factor 2: Experience Work/Life Skills
Factor 4: Generic Skills

Independent Variable Groups

The independent variable groups being measured are online college and university pathways students and traditional college and university pathways students. These variables are aligned to the research questions and hypotheses, as well as the theoretical foundations, thus aligning with the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways.

Theoretical Foundations Association with Study

Instrumentation Theoretical Model

The model upon which the study instrument is built is that of a questionnaire, normally regarded as a survey research method design (Dillman, Smyth, & Christian, 2014; Ornstein, 2013). The theory supporting this design is Probability Theory, specifically Frequency Probability (Brasoveanu, 2012), in which the probability of an outcome is estimated by its relative frequency within a sample (n) drawn from a larger, robust population (N), with the sample space defined as the set of all possible outcomes (Blakstad, 2015; Brasoveanu, 2012; Penrose, 1995).
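
To illustrate the frequency-probability idea behind the instrument, the short simulation below shows the relative frequency of an outcome settling toward its underlying probability as the number of draws from the sample space grows; the probability value is a hypothetical placeholder, not a parameter of this study.

    # Frequency Probability illustration: the relative frequency of an event
    # approaches its underlying probability as the number of trials n grows.
    import random

    random.seed(42)
    true_probability = 0.4  # hypothetical probability of the outcome of interest

    for n in (10, 100, 1_000, 10_000, 100_000):
        hits = sum(random.random() < true_probability for _ in range(n))
        print(f"n = {n:>7,}: relative frequency = {hits / n:.4f}")
    # The printed frequencies settle near 0.4, the assumed probability.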

Human Capital Theory (HCT)

What is HCT? Human Capital Theory of education (Becker, 1964; Schultz, 1961; in Walters, 2004, pp. 99-100) is an economic variant of the Technical Functional Theory (Collins, 1971). The foundation of Human Capital Theory suggests that increases in education are directly related to increased demand for skilled labor. Hence, college students will continue to pursue additional educational degrees until the opportunity costs (Cornacchione & Daugherty, 2013) of procuring further education surpass the benefits it provides (Walters, 2004).

How it fits this study. HCT is directly aligned to this study, as it is bedrock to education; the teacher/student paradigm is prevalent in both online and traditional ground-campus education and in workforce readiness skills attainment for college students. Attached to this theory are operationalized theory models associated with the perspectives of educators and seminal authors in the fields of education, economics and social theory (e.g., Émile Durkheim, 1858-1917). Additionally, Bessolo (2011) suggests that there is “a growing recognition of, and value placed on human capital particularly the impact on higher education” (p. 2).

Theories Supporting Theoretical Foundation

Two major sub-theories of Human Capital Theory (HCT) will be discussed in detail, as they relate directly to the theoretical foundation of this study by providing operational models (Durkheim, 1895; Collins, 1979) that could explain the disconnect between the skills being taught and the skills executive hiring authorities say they need from their entry-level workforce.

Functionalist Theory Operational Model

What is Functionalism? One of the oldest theories most closely associated with HCT in education is Functionalism (Durkheim, 1895). Functionalist theory in education posits “that it is the role of education to transmit core values and social control through the attributes that support the political and economic systems that fuel education” (Durkheim, 1895, pp. 60-81). Per functionalist theory, “economic and technological innovation generally raises the skill levels required to perform jobs” (Walters, 2004, p. 99).

How a functionalist theory fits this study. The research topic for this study explores the differences in workforce readiness skills obtained by students from online and traditional colleges, drawn from the Bachelor program at a large Western Christian university, as determined by their CareerEDGE Employability Development Profile (EDP) instrument scores. Functionalist theory is directly aligned with the economic and technological innovation needed to raise the workforce readiness skill levels required to perform jobs, and this study is a quantitative analysis of those workforce readiness skills being taught by colleges and universities.

Credentialist Theory Operational Models

What is a Credentialist theory model? Per Credentialist theory models, learning in today’s educational institutions is more about accepted standards of social norms than about contributory learning and cognitive skill sets (Collins, 1979). Collins’s (1979) Credentialist theory model posits that employers use credentials (degrees, certifications, etc.) as a contributory factor in elevating educated workers to better, more lucrative jobs.

How Credentialist theory model fits this study. Currently, employability is judged on several factors, one being the credentials (degrees, certifications, etc.) an entry-level college graduate has attained in their chosen field of study. The credentials offered by entry-level graduate employees will help determine the position they receive and the starting salary of that position (Collins, 1979).

Education/Employer Collaborating and Credentialist Theory

Grant (2015) presents her own example of a solution to the skills gap through a Learning Blueprint, a set of five program requirements based on Credentialist Theory (Collins, 1979). The program guidelines require that employers and educational institutions collaborate (form partnerships) to enhance employee credentialing and training, as well as to increase workforce readiness skills development (Grant, 2015). The idea behind her partial solution to the problem, that the workforce readiness skills being taught by colleges and universities do not match what hiring authorities say they need, is partnership between business and educational institutions. The two-way communication would “allow present employees to earn college credits for the experiential theory and application of their work-product” (Grant, 2015, p. 76). Additionally, educational institutions would conduct an academic evaluation to determine college-credit equivalencies for said work-product. Educational institutions benefit from both the new influx of students earning college equivalencies and the input from partnering businesses concerning the workforce readiness skills they will need in the future (Grant, 2015). Businesses benefit by having employees gain academic credentialing, as well as by having direct input to the educational institution with which they are collaborating.

Leadership Studies and Soft Skills Development

Brungardt (2011) used literature reviews from nine major studies to assess the soft skills (also referred to as “teamwork skills” in the study) of business school students compared with leadership education figures from two other groups. The study emphasizes the need for “leadership training in all courses or degree paths as an aid to better soft skills development” (Brungardt, 2011, p. 14). Three hypotheses were tested: students with no leadership training (Ho1); students with two years of college and/or a certificate in leadership, i.e., those with 9 credit hours or less of leadership training (Ho2); and students with both business skills (‘hard skills’) education and leadership classes, i.e., those with 12 credit hours of leadership classes or more (Ho3). The findings suggest no significant difference for Ho1 and Ho2, but Ho3 showed a “significant difference in soft skills or teamwork skills development of those with leadership studies training” (Brungardt, 2011, p. 13). Brungardt’s (2011) study emphasizes that “leadership courses, such as Organizational Leadership and/or Leadership Development, added to any degree path, will contribute to the development of more effective soft skills development overall” (p. 14).

Sample Population Defined

The sample population size of n = 210 respondents (G*Power, 2015, Figure 5, p. 203) is drawn from a general population of N = 43,725 possible respondents from the Bachelor program online and traditional college pathways students at a large Western Christian university, as measured by their CareerEDGE Employability Development Profile (EDP) instrument scores. The statistical power analysis used a true effect size d = 0.5, alpha level α = 0.05, power (1 − β) = 0.95, allocation ratio N2/N1 = 1, critical t = 1.9714, and actual power = 0.9501, which yields the required sample size of n = 210 respondents (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b). This G*Power analysis is needed because an underpowered test of the hypotheses “can have severe consequences as Type I and Type II Errors are possible” (Faul, Erdfelder, Lang & Buchner, 2007b, p. 51). G*Power computes the power of the test, the probability of correctly rejecting the null hypothesis when the alternative hypothesis is true (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b).
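
The same a priori computation can be reproduced outside of G*Power. The sketch below, assuming Python with the statsmodels package, recomputes the required group sizes from the parameters reported above; only the rounding to whole respondents is added here.

    # A priori sample-size calculation for a two-tailed independent samples
    # t-test, mirroring the G*Power parameters reported above.
    import math
    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(
        effect_size=0.5,         # true effect size d
        alpha=0.05,              # alpha level
        power=0.95,              # desired power (1 - beta)
        ratio=1.0,               # allocation ratio N2/N1
        alternative='two-sided'  # two-tailed test
    )

    print(math.ceil(n_per_group))      # 105 respondents per group
    print(2 * math.ceil(n_per_group))  # 210 respondents in total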

Instrumentation

The EDP survey questionnaire instrument for this quantitative study was supplied by permission of Dacre-Pool and Sewell (2007) (see Appendix D, p. 188). The survey instrument was previously certified for validity and reliability through testing using Exploratory Factor Analysis and Confirmatory Factor Analysis (Dacre-Pool & Sewell, 2007; Dacre-Pool, Qualter & Sewell, 2013). Additionally, in the case of the original CareerEDGE Employability Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007), the reliability of the instrument was determined by a test-retest reliability procedure. The test-retest procedure examined the extent to which scores from one sample are stable over time from one test administration to another (Creswell & Plano Clark, 2011). The instrument was administered to 19 individuals, and the same instrument was administered to the same individuals several weeks later, with the arrangement of the questions changed for the second administration. The scores from Time 1 and Time 2 were used to compute the Pearson correlation coefficient, r(Time 1, Time 2), as the measure of the test-retest reliability of the instrument (Creswell & Plano Clark, 2011; Green, 2015).

EDP Factor Subscale

N = 19; values are means, with standard deviations in parentheses.

Subscale                                          Time 1          Time 2          t-test & p-value
Career Development Learning                       24.58 (4.51)    29.47 (3.47)    t = 4.483, p = .000
Experience Work/Life                               8.37 (2.93)     9.53 (2.20)    t = 2.226, p = .039
Degree Subject Knowledge                          25.32 (4.02)    28.37 (2.01)    t = 3.522, p = .002
Generic Skills                                    16.89 (2.40)    17.12 (2.26)    t = 0.482, p = .635
Emotional Intelligence (EI) & Self-Management     63.42 (6.89)    63.95 (8.31)    t = 0.446, p = .661

(Dacre-Pool, Qualter & Sewell, 2013, p. 309) (Appendix F, pp. 198-199).
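
A minimal sketch of that test-retest computation, assuming two hypothetical vectors of subscale scores from the same respondents at Time 1 and Time 2 (illustrative values, not the Dacre-Pool, Qualter and Sewell data), follows.

    # Test-retest reliability sketch: Pearson r between Time 1 and Time 2
    # scores from the same respondents, plus a paired t-test for mean change.
    # The score lists are hypothetical, not the published EDP data.
    from scipy import stats

    time1 = [24, 26, 22, 28, 25, 23, 27, 24, 26, 25]  # hypothetical Time 1 scores
    time2 = [26, 27, 24, 29, 27, 24, 28, 26, 27, 26]  # hypothetical Time 2 scores

    r, r_p = stats.pearsonr(time1, time2)   # stability across administrations
    t, t_p = stats.ttest_rel(time1, time2)  # change in mean between administrations

    print(f"Pearson r = {r:.3f} (p = {r_p:.3f})")
    print(f"Paired t = {t:.3f} (p = {t_p:.3f})")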

The use of the survey instrument in this study will contribute directly to answering the research questions and hypotheses through analysis of the instrument scores of the independent variable groups. The survey questionnaire consists of 18 EDP skills questions (the dependent variable groups) administered to the independent variable groups, online and traditional college pathways students. This study will explore the difference between the two populations in the scores derived from the EDP skills instrument (online students [O] = x1, and traditional students [T] = x2) for the Bachelor program students at a large Western Christian university.

Workforce Readiness Skills Definitions

Skills or skill sets are defined as belonging to either soft or hard skills, a “developed aptitude or ability” (Merriam-Webster Dictionary, 2015, p. 1) that the college graduate has acquired during matriculation in a specific career field and/or from life experience. The term workforce is used to pinpoint a domain relevant to employability, employers and/or employees.

Soft Skills Defined. Soft skills are interpersonal qualities, also known as people skills, and personal attributes that one possesses (Robles, 2012).

Hard Skills Defined. Hard skills are the technical expertise and knowledge needed for a job (Robles, 2012), defined as job-related skills specific to the career domain of the degree field studied (Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a).

For-Profit, Non-Profit and State Educational Pathways

Per the combined U.S. Departments of Labor, Commerce, Education and Health & Human Services (2014, July 22) report, educational student debt has reached an epidemic level of $1.3 trillion. This report and others posit that for-profit, non-profit and state educational pathways share the blame equally. Per Deming, Goldin and Katz (2012), for-profit institutions are U.S. higher education’s fastest growing sector, with enrollment increasing from “0.2 percent to 9.1 percent from 1970 through 2009” (Deming et al., 2012, p. 1), and “for-profits leave students with far larger student loan debt burdens” (Deming et al., 2012, p. 4). Additionally, for-profit students have higher unemployment rates and lower earnings compared with students from community colleges, state colleges and private non-profit institutions (Deming et al., 2012).

Moreover, “for-profit students incur far greater default rates due to lower starting salaries than these institutions suggest they will earn with their degree paths” (Deming et al., 2012, pp. 4-5; U.S. Department of Justice, 2015, p. 1). Additionally, “for-profit students self-report lower satisfaction with courses and are less likely to consider their education and loans worth the price-tag” (Deming et al., 2012, pp. 4-5). The for-profit sector disproportionately serves what The Executive Office of the President (2014, January) refers to as “low-income students” (pp. 2-47), those in the categories of “older students, women, African Americans, and Hispanics” (p. 8).

A Complete College America (2014) report suggests that students fail to graduate within four-year degree paths because for-profit, non-profit and state colleges and universities are “adding additional classes that extend graduation up to an additional 2 to 3 years in some cases” (p. 9). One statistic shows that 60% of bachelor’s degree recipients from these institutions change colleges at least once during matriculation, and nearly half of those transfer students lose most or all of their earned credits from the institution they leave because of “broken transfer policies” (p. 9). These facts account for “$600 Million dollars lost each year if only two courses per transfer student fail to transfer all credits earned” (p. 9). Moreover, the policy is determined at “each individual college, not at the Federal level, as to how many credits will transfer” (Complete College America, 2014, pp. 8-11).

Of the top 25 for-profit, non-profit and state colleges and universities that have produced the most college student debt, “the top 13 account for $109 Billion or almost 10% of all federal student loans” (Looney & Constantine, 2014, Brookings Institution Papers on Economic Activity, p. 2). The top 13 colleges consist of nine for-profit schools, #1 being the University of Phoenix-Phoenix campus ($35 billion) and #11 Grand Canyon University ($5.9 billion), and only four non-profit or state public institutions: #3 Nova Southeastern University ($8.7 billion), #8 New York University ($6.3 billion), #12 Liberty University ($5.7 billion) and #13 University of Southern California ($5.3 billion) (Looney & Constantine, 2014, Brookings Institution Papers on Economic Activity). (See Figure 6, this study, p. 199 for the complete list of the top 25.)

Additionally, per the Brookings Institution paper, “U.S. student loan debt has quadrupled in the past 12 years to the $1.3 Trillion now owed, with most of the debt being held by non-traditional borrowers attending for-profit and non-selective institutions” (Looney & Constantine, 2014, Brookings Institution Papers on Economic Activity, p. 2). Per the report, “In contrast, most borrowers at four-year public and private non-profit institutions have relatively low rates of default, solid earnings, and steady employability rates” (p. 2).

Traditional Education

Traditional education is defined as classes taken through traditional face-to-face contact between instructors and students in a set location, also referred to as “brick and mortar schools” (Raj & Al-Alawneh, 2010, p. 5). While technical and vocational institutions are showing increases in enrollment, traditional college pathway students are showing a steady decline in enrollment, which may be due to the greater increase in online education (Burns, 2011, p. 1). While this decline has several contributing factors, the overriding reason is high tuition cost, resulting in large student debt and long-term payments that are often tied to high interest rates on said loans (Burns, 2011). One reason for this increase in student debt is that non-profit institutions often change a student’s status from in-school deferment (if the student takes any break longer than two weeks) to forbearance, which allows the lending institution, Wells Fargo Bank, N.A. for example, to raise the interest rate from the original 2.5% to as high as 9.75% (Burns, 2011).

A study of accounting students in business schools (a traditional college pathway) by Weaver and Kulesza (2014) shows that “a persistent gap exists between what is taught and what skills employers expect in students” (p. 34). Per the authors, “A series of studies show that increasingly employers desire soft skills such as critical thinking, problem solving, and communication skills in addition to essential accounting education” (p. 34). This study will add to the body of knowledge concerning these soft skills through the scoring of the EDP skills survey questionnaire (the dependent variable groups), which will determine whether differences exist between online and traditional college pathways students.

Online Education

Online educational delivery systems started in 1994 (Hill, 2012). Online education delivers classes 24 hours per day, 7 days a week through the college or university’s portal system (intranet), connected to students through the Internet (Hill, 2012). The Allen and Seaman (2014) report tracking online education at for-profit, non-profit and state colleges and universities in the U.S. (an annual report since 2002) shows evidence that online learning is critical to institutions’ long-term strategies, and the numbers validate their findings: “2002 student online enrollment was 48.8% but by 2015 that number had mushroomed to a staggering 70.8%” (p. 4).

Whether online offerings are comparable to face-to-face traditional offerings has been a concern for numerous years in the data gathered for these reports (Allen & Seaman, 2014). Past studies found that “chief academic officers rated the learning outcomes for online education ‘as good as or better’ than those for face-to-face instruction until 2013” (p. 5). The newest report for 2015 (using 2013 and 2014 data) now suggests that trend is reversing. While 77.0% of academic leaders in 2012 believed their offering was as good as or better than face-to-face instruction, the 2013 and 2014 data show that figure dropped to 74.1% for two years running. The Allen and Seaman (2014) study uses data acquired each year from the U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), which collects data from the institutions eligible for Title IV financial aid.

Mannapperuma’s (2015) study of the online for-profit educational industry shows “little regulation and numerous accusations that it places profits before the interests of its students” (p. 541). The study addresses the distance learning industry from the legal and regulatory perspective. The author found that “proposed regional interstate compacts promise to standardize oversight of the for-profit distance learning industry; but it fails to include states that regulate the industry the least and thus fails to protect students who are most likely to need protection” (p. 541). In the years “2007-2008 nearly 4.3 Million undergraduate students (20% of all undergrad students), took at least one distance (online) educational course” (Mannapperuma, 2015, p. 543).

Deming, Goldin and Katz’s (2012) study of for-profit educational institutions (referred to as “chains” in their study because of the number of online students they enroll; the rule that “no more than 33% of students from one U.S. State is allowed” (p. 3) is not enforced) found that “for-profit institutions leave students with far greater student loan debt burdens” (p. 4). Lindsey and Rice’s (2015) study of the interpersonal abilities (soft skills) of online students versus traditional students, evaluated through their Emotional Intelligence (EI) using the Situational Test of Emotional Management (STEM), showed surprising results. Their findings showed that “students who completed at least one online course scored significantly higher on the test than students from a traditional college pathway (face-to-face instruction only)” (p. 134). The study did not differentiate between for-profit, non-profit or state educational pathways; “only student business majors and minors were surveyed with a sample population of 865 respondents” (p. 127).

Economic Perspective of Student Educational Debt

With student loan debt reaching $1.3 trillion (Carnevale, Strohl & Gulish, 2015; Complete College America, 2014; NCES, 2015, Education Statistics May 7), there is a Federal Government (2014) request for further research in the areas of “academic training, labor market outcomes, and economics (student educational debt)” (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, p. 21). This literature review would not be complete without briefly identifying some of the history and legal requirements (or lack thereof) that have led to the current situation.

Mannapperuma (2015) identifies the legal issues that provide factual support for the author’s conclusion that for-profit Education Management Organizations (EMOs) need greater regulation through “tying federal Title IV funds to the Interstate Compact System” (p. 589). Few protections are available to students attending for-profit distance learning institutions, the institutions that are driving the epidemic in student debt by continually raising tuition and fees because this educational sector is unregulated at the state and federal levels.

The regulation of education in this country is built on the same fragmented approach the Department of Education has used since 2010 under the “Program Integrity Rules” (Mannapperuma, 2015, p. 544). The issue is that the rule dealing with enforcement, “the Federal Online State Authorization Rule (FOSAR), is not actively enforced by the U.S. Department of Education, in any State” (Mannapperuma, 2015, p. 545). The FOSAR rule states “that higher education institutions offering distance-learning courses must obtain that State’s authorization through an accreditation process to do business within the State’s borders or risk losing federal funds under Title IV” (Mannapperuma, 2015, p. 545).

The federal Title IV program requires institutions to disclose information related to several areas relevant to consumer protection, “including institutional information and characteristics, student financial aid information, health and safety programs, student outcomes, athletic programs, and student loan information” (Commission on the Regulation of Postsecondary Distance Education, 2013, p. 24). The U.S. Department of Education decides eligibility for Title IV participation and assigns a financial responsibility composite score, which must be between 1.0 and 1.5, to the state educational institution or its Education Management Organization (EMO) (Commission on the Regulation of Postsecondary Distance Education, 2013). Despite the non-enforcement of the rule, if an institution’s eligibility score falls below that specified range, the “institution can continue to receive Title IV funds for up to three years before losing eligibility” (Commission on the Regulation of Postsecondary Distance Education, 2013, p. 5).

Per an article by Hentschke, Oschman and Snell (2002), “Education Management Organizations (EMOs) are for-profit firms that provide whole-school operation services to public school agencies” (p. 1). Despite objections within the education profession, EMOs have grown exponentially in the last two decades, reaching, by 2002 estimates, a total of approximately 36 companies operating in more than 24 states and affecting some 368 institutions of higher learning (Hentschke et al., 2002, pp. 1-16). This practice is referred to as “privatization of public schools” (p. 15), and since the article was published in 2002, the numbers in both the U.S. and abroad have expanded to the point that an accurate accounting is not available, as these EMOs now include elementary schools, charter schools and public school districts (Hentschke et al., 2002).

A stunning decision was reached in a landmark global settlement with Education Management Corp. (EDMC), the second-largest for-profit education management company (EMO) in the U.S. (U.S. Department of Justice, 2015, November 16). The 8-year-old case (begun in 2007) was settled with the Department of Justice and Pennsylvania’s Attorney General’s office when EDMC agreed to pay $95.5 million to settle claims it “illegally paid recruiters and exaggerated the career-placement abilities of its schools” (p. 1) located in the U.S. and Canada. In addition, EDMC, which runs 110 schools in 32 states and Canada, will forgive an additional $102.8 million in student loans it made to 80,000 former students, per U.S. Attorney General Loretta Lynch (Mandak & Tucker, 2015, Associated Press).

Global Perspective on Workforce Readiness Skills

The issue of online and traditional college graduates not having the skill sets employers need is not solely endemic to the U.S. business environment. Recent studies drawn from global references suggest that the mismatch between college students’ skill sets in their field of study and the skills organizations need should be measured on a global scale. Studies from China, Romania, Tajikistan, the Philippines and the European Union produce facts showing they are facing the same issues with hiring talent as their U.S. counterparts (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015). Chanco’s (2015) study, based on a National Statistics Office of the Philippines (2015, October) study, investigated the correlation of college students’ skill sets not matching the requirements for jobs with the unemployment rate for the Philippines and “found that 22.2 percent of the unemployed were college students with Master level degrees or higher” (p. 1).

Jackson’s (2014) study of factors influencing job attainment among recent Bachelor degree graduates in Australia “illustrate[s] the true significance of workforce readiness skills development or the lack thereof, on the global economic labor markets” (Jackson, 2014, p. 136). Using the Australian Graduate Survey (AGS), with sample populations of n = 28,246 in 2011 and n = 28,009 in 2012, the study found a 9% decline in graduate full-time employability since 2008 (Graduate Careers Australia, 2012b). This decline is attributed to two economic factors: “the global financial crisis and economic stagnation that is ongoing in the UK and U.S. labor markets” (Jackson, 2014, p. 136). Studies by Accenture (2013) and Purcell, Elias, Atfield, Behle, Ellison and Luchinskaya (2013), cited in Jackson (2014), suggest “that the decline in graduate full-time employability and lower starting salaries can be blamed on the economic stagnation in the UK and the U.S.” (p. 136).

Bondarenko’s (2015) study of the shortage of professional skills and qualities of workers in the Russian labor market found that the qualification structure of employees is not well balanced. Per Bondarenko (2015), “The problem of lack of balance is linked to, (1) the deficiency of workers’ qualifications (workers have qualifications that are lower than what is required by the employers), and (2) personnel’s over-qualification (qualifications that are higher than what is required)” (p. 120). This deficiency of workers’ qualifications (workforce readiness skills) was attributed to multiple factors, with the highest deficiency rating, 75%, assigned to workers not able to “solve unforeseen job problems and accomplished tasks on their own” (Bondarenko, 2015, p. 133).

Another study, completed in post-Soviet Tajikistan by Jonbekova (2015) of the University of Cambridge, examines employers’ perspectives on university graduates’ skills and preparation for employability. This thematic analysis of employers and secondary data “points to a 2-year decline in the quality of higher education in Tajikistan, thus widening the gap between learned workforce readiness skills and those required by employers” (p. 169). The findings of Jonbekova’s (2015) study answer the question of workforce readiness skills being taught not aligning with employers’ needs and make a valid argument for a global solution. Per Jonbekova (2015), “Employers’ perspectives suggest that the reform of the education sector without the creation of more decent job opportunities will likely exacerbate the current skills mismatch in Tajikistan” (p. 169).

Not only are recently formed nations having an issue with workforce readiness skills not matching what employers need; older civilizations, such as Turkey, are feeling the effects of the problem as well. Per a study by Alpaydin (2015), “There are various findings indicating that the mismatch between qualification (degrees) and skill is significantly high in Turkey” (p. 945). Turkish officials and employers are calling for further studies in three areas: “labor forecasting, skills need relating to said forecast and participation and contributions of parties in the educational process” (Alpaydin, 2015, p. 945).

Government Perspective of Workforce Readiness Skills

Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, on July 22, 2014, there is a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22). Per the Federal Government’s (2014) report, “More evidence is needed to fill gaps in knowledge, improve job training programs, inform practitioners about adopting promising strategies, and expand proven models to address the needs of specific groups of workers, industries, communities and institutions” (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, p. 21).

The need for this study is justified by the fact that college students, both domestic and international, may not be graduating with the skill sets they need for employability as entry-level workers in today’s organizations. This supports the purpose of this study, which is to investigate the differences in workforce readiness skills acquired by students from online college pathways and traditional college pathways, from the Bachelor program at a large Western Christian university, as determined by their CareerEDGE Employability Development Profile (EDP) instrument scores. This research study will investigate this disconnect and apparent gap in knowledge to define, to a measured degree, the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways. The Executive Office of the President (2014, January) has outlined four initiatives addressing barriers that low-income students face in entering college, which it suggests will increase college attendance and graduation for that class of student. The first initiative listed is “I. Connecting more low-income students to colleges where they can succeed and encouraging completion once they arrive on campus” (Executive Office of the President, 2014, January, p. 4). The language of the report suggests that students are to blame for their failure to graduate because they choose colleges whose students are smarter and better prepared than those coming from low-income families. Quoting the study, “many low-income students choose a college that does not match their academic ability” (Executive Office of the President, 2014, January, p. 4).

The second initiative is “II. Increasing the pool of students preparing for college” (The Executive Office of the President, 2014, January, p. 6); the report wants this pool of student customers to begin at the 8th-grade level. Quoting the study, “we also need to reach students earlier to increase the pool of low-income students ready for college” (pp. 6-7).

The third initiative is “III. Reducing inequalities in college advising and test preparation” (Executive Office of the President, 2014, January, pp. 7-8). The report states that counselors from schools serving low-income families see “1,000 students per counselor versus 470 students per counselor nationally” (Haskins, Holzer & Lerman, 2009, Economic Mobility Project, pp. 43-44).

The fourth initiative is “IV. Seeking breakthroughs in remedial education” (Executive Office of the President, 2014, January, pp. 8-9). The reasoning is that students from low-income families are going to “enter college underprepared to succeed, and remediation needs at four-year institutions are greatest for low-income students” (Executive Office of the President, 2014, January, pp. 8-9).

The White House has released numerous reports, initiatives and studies since President Obama sent Congress a Blueprint for Reform of the Elementary and Secondary Education Act (ESEA) (U.S. Department of Education, 2010, March 13), which Congress has not acted on to date. The ESEA blueprint was to address issues created by the No Child Left Behind Act of 2001. When Congress failed to act on the bill, the administration moved forward by providing states flexibility within the law, “as authorized by provisions in the law itself – to pursue comprehensive plans to improve educational outcomes for all students, close achievement gaps, and improve the quality of teaching” (The White House, 2014, Ready to Work, p. 1). To date, 43 states and the District of Columbia have received ESEA Flexibility (U.S. Department of Education, 2010). ESEA Flexibility has as its goal “State and local innovation aimed at increasing the quality of instruction and improving student academic achievement” (ESEA Flexibility, 2012, pp. 1-3) and includes numerous sections on the requirements for State Educational Agencies (SEAs) and Local Educational Agencies (LEAs) (U.S. Department of Education, 2012, June 7, pp. 4-35).

A U.S. Department of Labor, Department of Commerce, Department of Education and Department of Health & Human Services (2014, July 22) government-wide report, produced under the guidance of Vice President Joseph Biden, calls for further study to “determine what information is lacking and identify future research and evaluation that can be undertaken to ensure the Federal programs invest in effective practices” (p. 1). Some of the findings suggest that for adults, “a post-secondary educational degree related to jobs in demand, is the most important determinant in earnings and incomes” (p. 1). The closer training is related to actual job or occupational requirements, the better the results of said training, and employer and industry engagement strategies will help improve the alignment of training to employers’ needs.

Employers’ Perspective of Workforce Readiness Skills

Carnevale, Gulish and Strohl’s (2015) study breaks down the employers’ role in the $1.3 trillion spent annually by educational institutions and employers on formal and informal postsecondary education and training. Of that amount, “educational institutions spend $407 billion and employers spend $177 billion, up from $140 billion in 1994” (p. 3). “Employer spending on education and training increased by 26% since 1994 (or 1.238 percent increase each year)” (p. 3). By comparison, college and university spending rose by “82% in the same period” (p. 3). Additionally, “employers spend 58% of training dollars on Bachelor’s degree-holders, ages 25 to 54, which typically complements a traditional college education” (p. 5) and accounts for the core workforce. Annualizing these figures suggests that employers spend approximately $1.762 billion per year on upgrading the training of critical employees, but no breakdown was found of what type of training (hard or soft skills development) this amount covers in total training dollars.

Sidhu and Calderon’s (2014) study results show that “More than one-third of business leaders (39%) are not confident that U.S. college students are graduating with the skills and competencies that their businesses need” (p. 1). A separate 2013 Lumina/Gallup poll report, published February 25, 2014, “finds that 96% of chief academic officers at higher education institutions say their institution is ‘very or somewhat’ effective at preparing students for the world of work” (p. 1). The logical inference from these facts is that educational institutions believe they are teaching the 21.266 million students entering college in 2015 the correct workforce readiness skills needed for employability, yet roughly 40% of employers say that offering is not adequate (NCES, 2015, Education Statistics, May 7).

“Seventy-one percent (71%) of business leaders who participated in the Lumina/Gallup poll suggest that they would hire someone without a post-secondary degree or credentials over an individual with a degree, when prioritizing skills and knowledge in the hiring decision process” (Lumina/Gallup Poll, 2014, p. 23). The conclusions the authors draw from the survey suggest that higher educational institutions need to reexamine their degree and credentialing programs and bring them in line with the skill sets that businesses need most. Most business leaders “(71%) say they currently do not partner with any higher educational institution, with only 29% stating that they do have a partnership in place” (p. 25). When business leaders were asked what talent, skills and knowledge higher educational institutions should develop, the most popular answer was “internships and practical on-the-job experience for students at 14%” (Lumina/Gallup Poll, 2014, p. 30).

The issue, per the literature review of the Lumina/Gallup report, seems to be that employers most want students to have practical internships and/or on-the-job experience, yet 71% do not attempt to rectify the situation. Additionally, the report lists the “second most sought after skill sets as communication skills/English speaking and writing skills at 12%” (p. 30). These are the skills that employers suggest are necessary and must be taught by higher educational institutions to effect change in the current labor hiring market. In the Lumina/Gallup poll (2014), employers were asked, “What is your business currently doing to help increase the proportion of Americans to attain postsecondary degrees, certificates, or credentials?” (p. 31). The responses covered 10 items (see Lumina/Gallup poll, p. 31), with response proportions from “2% to 6% as to what employers were doing now, but the largest response was 58% of employers were doing nothing” (p. 31). “Only 1 in 10 employers provide tuition reimbursement, scholarships or internships/mentoring/training or certification opportunities for employees” (p. 31).

Parasuraman and Prasad’s (2015) extensive study of the acquisition of corporate employability skills found convincing evidence supporting the conclusions of Sidhu and Calderon (2014) and the Lumina/Gallup Poll (2014). The three studies suggest that employers and higher educational institutions must start collaborating and/or forming partnerships to ensure that newly graduated students have the experience and workforce readiness skills needed by today’s employers. Grant (2015) not only agrees with Parasuraman and Prasad (2015), Sidhu and Calderon (2014) and the Lumina/Gallup Poll (2014), but also presents her own example, a Learning Blueprint, a set of five program requirements based on Credentialist Theory (Collins, 1979). The program guidelines require that employers and educational institutions collaborate (form partnerships) to enhance employee credentialing and training, as well as to increase workforce readiness skills development (Grant, 2015).

Using the facts previously stated, one could extrapolate that if roughly 40% of employers will not hire from among the 21.266 million students who entered college in 2015, approximately 8.51 million graduates could be unable to secure employment because they lack the workforce readiness skills they are now required to pay for. This is in line with the statistics from the U.S. Department of Labor, Department of Commerce, Department of Education and Department of Health & Human Services (2014, July 22) report calling for further research in each department’s area of interest.
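
Restated as a quick arithmetic check (both input figures come from the sources cited above; only the multiplication is made explicit here):

    # Extrapolation check: roughly 40% of the 2015 entering cohort.
    entering_students_2015 = 21.266e6   # NCES enrollment figure cited above
    share_of_employers_doubting = 0.40  # rounded from the 39% in Sidhu & Calderon (2014)

    print(f"{entering_students_2015 * share_of_employers_doubting / 1e6:.2f} million")  # ~8.51 million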

Educations’ Perspective of Workforce Readiness Skills

Zhao’s (2015) study, A World at Risk: An Imperative for a Paradigm Shift to Cultivate 21st Century Learners, argues that the most popular education reforms in the U.S. have focused on fixing past mistakes, and that new strategies are simply “doing the wrong thing more right” (p. 1). The author suggests that to address the ever-changing global and technological environments of business, “a new educational paradigm must be initiated” (p. 2). One of the imperatives for a paradigm shift comes from Lindsey and Rice (2015), who collected data from 865 students and analyzed it using the Situational Test of Emotional Management (STEM), validating the findings with Item Response Theory and Latent Class Analysis (von Davier, 2014, as cited in Lindsey & Rice, 2015). Their findings suggest that students taking at least one online class scored significantly higher on the test and benefitted from the “time, training, experience and practice of interpersonal skills (soft skills) development” (p. 127).

Iuliana, Dragoș Mihai and Mitran (2014) discuss employability skills as a starting point for the redesign of educational curricula. The authors believe that university and college management, by developing curricula that concentrate on transversal skills (soft skills), could exert pressure on the representatives of the labor market (hiring authorities) “in the context of the policies and strategies targeted per social needs” (pp. 237-238). Their study analyzed eight national and international studies, dating from 1998 to 2012, to identify the soft skills that employers believe to be most important to their firms.

Iuliana et al. (2014) propose a list of 15 soft skills that employers felt were most important for newly graduated economics students, though these skills would aid all educational fields as well. Of the 15 soft skills listed, some split a single discipline into two categories, such as “TC 14 – ability to propose effective solutions; TC 15 – ability to generate effective decisions” (pp. 240-241). This study’s research questions cover nearly all of the Iuliana et al. (2014) list, as well as some additional workforce readiness skills and demographic questions not covered by Iuliana et al. (2014).

Shea (2015), Editor-in-Chief of the Journal of Online Learning and Teaching (JOLT), in the June 1, 2015 Online Learning Journal (OLJ) edition, examined several papers concerning collaborative online learning environments, such as a Community of Inquiry (CoI)-based instructional framework (Hayes, Smith & Shea, in Shea, 2015) and Transformational theory in curriculum design. Shea suggests that a productive conversation should be forthcoming among educational faculty and staff concerning the commonalities and distinctions between the two models, particularly in the context of online learning.

The issue of educational institutions not preparing students with the needed soft skills development has a history in multiple global literature sources. Sharma and Sharma’s (2010) study examined engineering students in India and determined that, despite being in one of the more sought-after professions, “these students are struggling with the problem of centralization and archaic examination systems at India’s educational institutions, which is detrimental to student learning” (p. 39). Sharma and Sharma (2010) suggest that curricula and methodologies need to be restructured at the institutional level by standardizing training content across India, so that the needs of employers will be met regardless of where the student comes from. The authors suggest that the design of new curricula must include soft skills such as “communication skills, interpersonal skills, group dynamics skills, teamwork skills, body language skills, business etiquette skills, selling skills, presentation skills and confidence building skills” (p. 41).

Essary’s (2014) study of how Athens State University (ASU), a small Alabama college, can gain competitive advantage shows that identifying external factors in education, such as “changing student demographics and students’ demand for flexibility, can increase enrollment, increase revenues, reduce cost and help small colleges and universities to remain competitive” (p. 134). ASU administrators, staff and faculty were included in the qualitative case study, which used interview questions to determine what areas in traditional and distance learning courses would secure competitive advantage at ASU. The findings suggest that “online distance learning courses are increasing due to non-traditional students need for flexibility, as non-traditional students (those 25 and older with either full or part-time jobs or family commitments) made up the majority (85.4%) of their Spring 2009 student body, up from 69.61% of their Fall 2009 enrollment figures” (p. 131).

Weng’s (2015) article, Eight Skills in Future Work, posits that there are eight future job skills students will need to attain employability in any industry or field of study. The eight skill sets are designed around the technologies and global interactions needed by a 21st-century workforce. These skills, per the author, will define cross-cultural competences to “(a) function effectively within a new cultural context and/or (b) interact effectively with people from different cultural backgrounds” (Wilson, Ward, and Fischer, as cited in Chiu, Lonner, Matsumoto & Ward, 2013, p. 844). The list of eight skills starts with technologies, which will be indispensable and have a dramatic influence on human life, per the author. The next two are computational thinking and new-media literacy, where math and social science skills are acquired. These are followed by “sense-making, developing three intelligences (SI, EI and CQ), design mindset, novel and adaptive thinking, and management of cognitive load” (p. 421).

Students’ Perspective of Workforce Readiness Skills

Per Brill, Gilfoil and Doll’s (2014) study, “minimal work has been done to develop and validate the tools that are needed to assess soft skills” (p. 175). Their study examined 40 graduate MBA students from nine courses using the McCann Soft Skills Assessment Tool (MSSAT) (McCann.edu, 2015) to evaluate the students in six soft skills areas: “leadership, teamwork, critical thinking, logical reasoning, communication, and holistic thinking” (Brill et al., 2014, p. 175). The written test examined students’ abilities in the above skill areas, which was verified against their instructors’ rating scores for the same student sample population. The results showed significant correlations between the test scores and the instructors’ ratings for leadership and communication, but empirical validation did not exist for the remaining four skill areas. This means that students believe they possess the teamwork, critical thinking, logical reasoning and holistic thinking skills needed for employability, but their instructors do not believe this is an accurate appraisal.

A study by Burns (2011) centering on the adult learner posits that an “estimated 76 million workers from the baby boomer’s generation will retire by the end of 2010” (p. 2). Per Reeves (2005, as cited in Burns, 2011), these retirements will shrink the pool of competent workers aged 35-44 by 19% while increasing workers aged 45-54 by 21%. The findings suggest that the more education individuals have, the more likely they are to be employed, findings substantiated by the U.S. Department of Labor (2010) report.

A study by Mitchell, Pritchett and Skinner (2013) of MBA students suggests that the integration of soft skills in the curriculum, particularly skills in communication (both written and oral), ethics, diversity, and leadership, was statistically significant for this population. Additionally, a study by Iyengar (2015) investigated MBA degree holders and the soft and hard skills that matter to employers. The author suggests that MBA holders should have hard skills pertinent to their chosen area of specialization, such as “creativity, quantitative analytical, as-well-as strategic skills and competencies to manage innovation and policy” (p. 10). Moreover, soft skills such as problem solving, communication skills (both written and oral), and the leadership abilities to inspire, guide, steer, and manage teams toward common goals and lead by example are expected by today’s organizations. The author suggests that requirements change as markets change, and educational institutions need their curricula to change to better reflect current trends and market demands.

In an article by Kyllonen (2013), a discussion panel of recent college graduates and two employers, hosted by CNN’s Christiane Amanpour, “suggest that testing for cognitive skills were not important to employers as much as they are to education” (p. 18). The results indicated an “increased awareness of non-cognitive skills, such as those associated with human-capital theory (Collins, 1979) are appearing more often in economics literature” (p. 18).

Iliško, Skrinda and Mičule (2014) investigated Latvian education as a ‘future-facing activity’ (Facer & Sandford, 2010, p. 74). The authors suggest that students play a significant role within the interconnected framework of cultural, economic, political and ecological dimensions. This interaction guides students when deciding what career path to study, and it should engage educators in developing more nuanced and alternative trajectories of preferable future scenarios by “defining responsibilities and consequences in one’s personal actions” (p. 91).

Mulig’s (2015) study of the high cost of graduate school loans presents troubling figures concerning whether a graduate degree is worth the increasing cost of attainment. The author suggests that these cost increases continue, even as enrollment soars (which should lower cost to students), due to a “controversial practice called differential tuition” (p. 21), in which educational institutions charge higher tuition for the courses that are more popular.

Contrasting Views

Per a study by Soulé and Warrick (2015), 21st-century learning must encompass core knowledge instruction and essential skills (soft skills) for success in today’s labor markets. The core knowledge and essential skills, “known collectively as the 4Cs: critical thinking and problem-solving, communication, collaboration, creativity and innovation” (p. 181), need to be incorporated into any educational framework for the future. The authors suggest that a 21st-century educational framework must have these disciplines “working together, not in isolation, which supports the teaching and learning of 21st century skill outcomes” (p. 183).

Additionally, Britt’s (2015) study of online education offers a contrasting view of how educational institutions can better equip students with 21st-century workforce readiness skills. The author suggests that educational institutions should “require the creativity and imagination of the instructor to redesign the learning experience and adapt it to the online platform” (p. 399). The article by Amrein-Beardsley, Holloway-Libell, Cirell, Hays and Chapman (2015) agrees with Britt (2015), even though their discussion panel’s focus was “teacher evaluation of rational rule-based teaching as promoting teacher expertise” (p. 3). Amrein-Beardsley et al. (2015) suggest that “current models of teacher evaluation do not fairly evaluate teacher behaviors that increase instructional flexibility, creativity, and risk-taking” (p. 3). Teacher and student testing are the two main standardized, quantifiable tools used to measure and evaluate teachers on “instructional design, pedagogy, educational outcomes, student learning and achievement” (p. 1). The issue with these observational rubrics is that teachers are measured on whether students’ test scores improve with each administration, while “no thought is given to whether the student actually cares what score they receive” (p. 2).

The second method is observation of teachers in practice, where issues are also found, as teacher qualities and practices are measured using tangible, measurable domains (e.g., preparation, organization, classroom, time management) (Amrein-Beardsley et al., 2015). The major issue with student and teacher testing is that rubrics do not take into consideration that teaching is a much more complex social practice and “not one lending itself to reductionism” (Amrein-Beardsley et al., 2015, p. 3).

An article by Cappelli (2015), concerning workforce readiness skills of students not matching what employers say they need, produces a startlingly contrasting view of the situation. Cappelli (2015) posits that there is no such skill gap, skill shortage, or skill mismatch of any type; rather, the issue is with “student over-education” (p. 251). Cappelli (2015) suggests that students, especially in K-12 public education, due to policy decisions, are not graduating with the basic skills they should have. He further suggests that the second complaint is with job-related skills associated with engineering and information technology (IT) specialists, and refers to this as a “skills shortage” (p. 252).

The final concern, which the author states is more common outside the United States, is that at any given time the supply of skills and the demand for skills could be disharmonious in one direction or the other: “oversupply or undersupply” (p. 252). Cappelli (2015) explains that this situation could occur with respect to “either labor markets or educational credentialing” (p. 252) and should be referred to as “skills mismatches” (p. 253). Cappelli’s (2015) discourse with the studies completed by other authors (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a) does not stop at workforce readiness skills alone, “but disputes the predictions of labor shortages occurring by 2010 as well” (p. 256).

Cappelli (2015) references several seminal authors, such as Carnevale (2006), who predicted that a labor shortage was coming. Cappelli (2015) dismisses these assertions as simply a misreading of the facts, suggesting that only the rate of increase in the labor force was expected to slow due to baby boomers retiring. The author references the Society for Human Resource Management (SHRM, 2003) report that large numbers of employers in the early 2000s were preparing for a “labor shortage predicted to occur by 2010” (p. 256). The author dismisses the facts presented by the SHRM organization (the organization that tracks labor market demographics as a part of its basic work product) as “projections that never came true” (p. 256).

Cappelli (2015) seems unable to discern the difference (or lack thereof) between personnel labor shortages and workforce readiness skills shortages, lumping the two into different categories without realizing that the two are synonymous in today’s labor market. The major issue with Cappelli’s (2015) article is that the author never discusses the workforce readiness skills employers say they need, nor does he distinguish between soft and hard skill sets. The author does present an exhaustive historical background to bolster his assumptions, as he believes them to be, concerning the issues of workforce readiness skills not matching what employers say they need in a 21st Century economy.

Summary

This non-experimental, quantitative methodology, causal-comparative research study is best because analyzing the survey questions should be accomplished through statistical analysis that produces numerical values, which will yield findings supported through statistical testing of all pertinent variables (Blakstad, 2015; Dillman, Smyth, & Christian, 2014; Ornstein, 2013). Per the literature reviews dating from 2002 to the present day (Allen & Seaman, 2002, 2003–2011, 2012, 2013, 2014, 2015), this new population of students will most likely graduate with the wrong workforce readiness skills for the jobs they seek. This position is further supported by more current local, state, federal and global studies (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a; Youhang & Dongmao, 2015).

Numerous studies concerning workforce readiness skills have been published in the 3 years since the Dacre-Pool, Qualter and Sewell (2013) study (Deepa & Seth, 2013; Lindsey & Rice, 2015; Weaver & Kulesza, 2014). However, most of these studies are approached from only one of five perspectives (Hart Research Associates, 2015; Iliško, Skrinda & Mičule, 2014; Iuliana, Dragoș & Mitran, 2014; Iyengar, 2015; Kelly, 2015; Kyng, Tickle & Wood, 2013; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Soulé & Warrick, 2015; U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014; Youhang & Dongmao, 2015; Zhao, 2015).

The call for further research concerning workforce readiness skills is exacerbated by the affected population of approximately 21.266 million U.S. students who attended colleges and universities in 2015 (NCES, 2014, Table 1, 1990 through fall 2024). Moreover, those figures are projected to increase by approximately 330,000 students each following year through 2024 (NCES, 2014, Table 1, 1990 through fall 2024, p. 174). This study will further extend past research through the investigation into employability skills of online college pathways students and traditional college pathways students, to determine if a difference exists between the two educational groups. This study will add to the body of knowledge called for in the U.S. Departments of Labor, Commerce, Education and Health & Human Services (2014, July 22) report, which illustrates a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22).

Recent studies drawn from diverse demographic and geographic global literature reviews strongly suggest that the issue of college students’ employability skills in their field of study not matching those needed by executive hiring authorities should be measured on a pandemic scale (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015). Studies from such diverse geographic locations as China, Romania, Tajikistan, the Philippines and the European Union, to name but a few, produce facts that show they are facing the same issues with hiring qualified college graduates as their U.S. counterparts (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015). This supports the problem statement of this study: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways.

Theoretical Foundations Association with Study
The model upon which the study instrument is built is that of a questionnaire, normally regarded as a survey research method design (Dillman, Smyth, & Christian, 2014; Ornstein, 2013). The theory supporting this design is based on probability theory, specifically frequency probability (Brasoveanu, 2012), under which the probability of an outcome is the relative frequency with which it occurs across the sample space (the set of all possible outcomes). Here the sample size (n) is drawn from a robust general population (N), so that observed sample frequencies can be treated as estimates of population frequencies (Blakstad, 2015; Brasoveanu, 2012; Penrose, 1995).
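To make the frequency-probability idea concrete, the brief sketch below (in Python, with an arbitrary outcome probability chosen purely for illustration; it is not part of the study) shows the relative frequency of an outcome converging toward its underlying probability as the number of trials grows.

    import random

    random.seed(42)
    true_p = 0.5  # hypothetical probability of the outcome of interest
    for trials in (100, 10_000, 1_000_000):
        # Count how often the outcome occurs, then compute its relative frequency.
        hits = sum(random.random() < true_p for _ in range(trials))
        print(f"{trials:>9} trials: relative frequency = {hits / trials:.4f}")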

Human Capital Theory (HCT)
What is HCT? Human Capital Theory of education (Becker, 1964; Schultz, 1961; in Walters, 2004, pp. 99-100) is an economic variant of the Technical Functional Theory (Collins, 1971). The foundation of Human Capital Theory suggests that increased investment in education is directly related to increased demand for skilled labor.

Theories Supporting Theoretical Foundation
Two major theories (sub-theories) of Human Capital Theory (HCT) were discussed in detail, as they directly relate to the theoretical foundation of this study by demonstrating operational models (Durkheim, 1895; Collins, 1979). The theories explain possible reasons for the disconnect between the skills being taught and the skills executive hiring authorities say they need from their entry-level workforce.

Functionalist Theory Operational Model
What is Functionalism? A functionalist theory in education posits “that it is the role of education to transmit core values and social control through the attributes that support the political and economic systems that fuel education” (Durkheim, 1982).

Credentialist Theory Operational Models
What is a Credentialist theory model? Collins’s (1979) Credentialist theory model posits that employers use credentials (degrees, certifications, etc.) as a contributory factor in elevating educated workers to better jobs, and that these more highly educated workers enjoy more lucrative jobs.

Sample Population

The general population size is 43,725 (N) respondents from the Bachelor program at a large Western Christian university. The true effect sample size for validity was 210 (n) respondents, determined using the G*Power 3.1.9.2 instrument (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b). G*Power tests the probability of correctly rejecting the null hypothesis when the alternative hypothesis is true (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b). This study will explore the difference between two populations, online college pathways students [O] = x1 and traditional college pathways students [T] = x2, by examining the CareerEDGE Employability Development Profile (EDP) survey instrument test scores of Bachelor program students at a large Western Christian university (Laerd Statistics, 2015).

Research Variables Defined

Dependent Variable Group. The two dependent variable groups being investigated, and the two groups directly analyzed in this study, are Factor 2, Experience Work/Life Skills, and Factor 4, Generic Skills (Dacre-Pool & Sewell, 2007) (Appendix F, pp. 186–187). The 18 CareerEDGE Employability Development Profile (EDP) survey instrument skills questions are listed in Appendix D (pp. 182–184).

Independent Variable Group. The independent variable groups being measured are online college and university pathways students and traditional college and university pathways students. These variables are aligned to the research questions, hypotheses, and theoretical foundations, thus aligning with the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways. The need for the study was determined to be of great importance after careful analysis of more than 185 studies and numerous article reviews dealing with the topic of employability and workforce readiness skills, which supports the argument for a current need to add to the body of knowledge associated with the study topic. After analysis of the literature concerning which employability skills are considered most important by employers (Carnevale, Gulish & Strohl, 2015), education (Lindsey & Rice, 2015; Zhao, 2015), students (Brill, Gilfoil & Doll, 2014), government (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014) and the global perspective (Jonbekova, 2015; Săveanu & Buhaş, 2015), the following research questions needed to be answered.

Research Questions and Hypotheses

RQ1:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?
H1o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
H1a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
RQ2:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?
H2o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
H2a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

The conclusions drawn from the survey by the Lumina/Gallup Poll (2014) suggest that higher educational institutions need to reexamine their degree and credentialing programs and bring them in line with the skill sets that businesses most need. When business leaders were asked what talent, skills and knowledge higher educational institutions should develop, the most popular answer was “internships and practical on-the-job experience for students at 14%” (Lumina/Gallup Poll, 2014, p. 30). The issue seems to be a lack of communication and action between educational institutions and hiring authorities, “as a majority (71%) of business leaders say they currently do not collaborate or partner with any higher educational institution, with only 29% stating that they do have a partnership in place” (Lumina/Gallup Poll, 2014, p. 25). The following section, Methodology, will illustrate and describe in greater detail the actual research design, quantitative methodology used, variables investigated, data collection processes, data analysis processes, research questions, survey questions, as well as the hypotheses used to answer the problem statement of this study.
 

Chapter 3: Methodology

Introduction

This is a quantitative, causal-comparative research study to investigate if a difference exists between online college pathways students’ skill sets and traditional college pathways students’ skill sets. Chapter 3 focuses on the research methodology used in this study. It defines the research design, sample population, the theoretical foundations (Becker, 1964; Schultz, 1961; in Walters, 2004, pp. 99-100), the research instrument (Dacre-Pool & Sewell, 2007), data analysis, validity and reliability, and any ethical considerations that should be under consideration by the large Western Christian university’s Institutional Review Board (IRB) network. The invitation (e-mail) introduction letter will discuss the large Western Christian university’s Institutional Review Board (IRB) and Informed Consent approval process. The individual steps used in the collection of the data are covered in the Data Collection Procedures section, and the Data Analysis Procedures section describes the details involved in the computation and analysis of the raw data (Blakstad, 2015; Brasoveanu, 2012; Boone & Boone, 2012).

Sidhu and Calderon’s (2014) study results show that “More than one-third of business leaders (39%) are not confident that U.S. college students are graduating with the skills and competencies that their businesses need” (p. 1). The need for this study is illustrated by the general population affected by this problem: approximately 21.266 million students attending American colleges and universities in the academic year 2015 (NCES, 2014). The purpose of this study is to investigate if a significant difference exists between students from online college pathways and traditional college pathways in the Bachelor program at a large Western Christian university, using their CareerEDGE Employability Development Profile (EDP) instrument scores (Dacre-Pool & Sewell, 2007). The significance of this study is that it will produce statistical data that will lead to a more accurate, essential and deeper understanding of the relationship between education and the application of employability skill sets of current graduates. Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, on July 22, 2014, there is a need to “expand and improve access to labor market, occupational, and skills data and continue basic research on labor markets and employment” (pp. 21-22).

The expectation (i.e., the prospect for new knowledge acquired from this study) is to determine if a statistically significant difference in skill sets exists between online and traditional college pathways students in the Bachelor program. The remainder of this chapter will discuss the statement of the current problem under investigation, and will list the research questions, hypotheses, variables and instrumentation that will be used to gain empirical supporting data.

Statement of the Problem

It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from the online college pathway and the traditional college pathway in the Bachelor program at a large Western Christian university. The pathways described are the major modes of delivery of the curriculum to the approximately 43,725 (N) general population of students in the Bachelor program of the large Western Christian university in 2017. The two different pathways must be investigated together to determine if there is an existing difference in the educational skill sets being taught. It is important to determine the extent of the difference before curricula can be designed to produce better skilled graduates.

The issue of online and traditional college students not having the employability skills executive hiring authorities want from recent graduates is not solely endemic to the U.S. business environment (Sidhu & Calderon, 2014). Per recent studies drawn from global references, the issue of college students’ employability skills in their field of study not matching those needed by the employer should be measured on a pandemic scale (Chanco, 2015; National Statistics Office of Philippines, 2015). A sampling of literature reviews (studies) from China, Romania, Tajikistan, the Philippines and the European Union produces facts that show they are facing the same issues with hiring talent as their U.S. counterparts (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; National Statistics Office of Philippines, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015).

This quantitative, causal-comparative research study will add to the current body of research by investigating whether a statistically significant difference exists between online college pathways students and traditional college pathways students from the Bachelor program at a large Western Christian university, using their CareerEDGE Employability Development Profile (EDP) instrument scores. The instrument used to determine the research questions for this survey questionnaire study is supplied by Dacre-Pool and Sewell (2007). The permission letter to use the instrument and the questionnaire (survey questions) are found in Appendix D (pp. 192–194).

The research questions align directly with the problem statement: It is not known if there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from online college pathways and traditional college pathways. The research questions also align with the purpose statement of this study: The purpose of this study is to investigate if a statistically significant difference exists between students from online college pathways and traditional college pathways in the Bachelor program at a large Western Christian university, using their CareerEDGE Employability Development Profile (EDP) instrument scores.

The research questions will be analyzed after completion of the survey questionnaire by both online college pathway students and traditional college pathway students separately. Each student will answer 18 survey questions spanning the two dependent variable groups evaluated for this study: factor 2, work & life experience skills, and factor 4, generic skills. The Work & Life Experience skills dependent variable group consists of two survey questions, and the Generic Skills dependent variable group consists of sixteen survey questions. All 18 survey questions will be answered by the volunteer sample population of this study, and the two dependent variable groups, Factor 2 and Factor 4, will be the focus of the hypotheses testing and analysis. The decision to use only two of the five factors for analysis comes from the gap in knowledge from literature reviews that show a disconnect between the academic skills colleges and universities are teaching (Cai, 2013; Iuliana, Dragoș & Mitran, 2014; Maurer, 2015) and the skills current organizations say they need from their entry-level workforce (Bessolo, 2011; Brungardt, 2011; Cai, 2013; Lindsey & Rice, 2015; Robles, 2012; Soulé & Warrick, 2015).

Research Questions

RQ1:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?
H1o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
H1a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
RQ2:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?
H2o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
H2a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

Operationalized Variables

The independent variable indicates whether a participant is an online college and/or university pathway student or a traditional college and/or university pathway student from the Bachelor program at a large Western Christian university. These variables align to the purpose statement, problem statement, research questions, hypotheses and theoretical foundations of this study. The two dependent variable groups, factors 2 and 4, associated with the CareerEDGE Employability Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007), are operationalized to produce a scaled score index. A scaled score is the total of the item responses (raw score) converted onto a consistent and standardized scale. “Scaled scores reflect the difficulty of the questions when reporting student results. Scale scores are meant to help with the interpretation of test results” (Katz & Warner, 2017).

Factors Under Investigation           Items      Raw Score Range    Scaled Score
Factor 2 – Experience Work & Life     6–7        2–14               165–153
Factor 4 – Generic Skills             10–25      16–112             139–43
Total Score                           1–26       26–196             _______

Research Methodology

A quantitative methodology was selected for this study because a statistical analysis of the differences in the dependent variable groups, Factor 2 and Factor 4, is needed. An association with the independent variable groups, online and traditional college and university pathways in the Bachelor degree program at a large Western Christian university, was needed to determine if a difference exists and the possible cause(s) for it. The two pathways must be investigated separately to determine if a statistically significant difference exists between the educational skill sets being taught and those 21st Century employers say they need.

A quantitative methodology is used when one needs to determine an answer relating to the quantity or amount of something. This is the best methodology compared to the others available, such as qualitative or mixed-methods designs, because it will determine if a statistically significant difference exists between the independent variable groups, online and traditional college or university pathways students. Fray (1996) suggested that the first step in methodology selection should concern what information the investigator wishes to know, and then determine what research design method is needed to acquire that data in the fewest number of questions. Additionally, Fray (1996) suggested that the survey questionnaire method avoids the qualitative responses of open-ended questions, which lend themselves to respondent bias. A further rationale for its use in this study is that it will allow the researcher to statistically defend results that answer the research questions using applied social research methods (Trochim, 2006).

Research Design

In selecting any research design, seven factors are considered: a) selection of the problem, b) selection of participants (sample size), c) selection of instrumentation, d) selection of study research design, e) selection of procedure(s), f) selection of data-analysis tool(s) and g) interpretation of the findings or results (Gay, 1987). This causal-comparative design is often referred to as an ‘ex post facto’ study design because the effect and the alleged cause have already occurred and must be studied in retrospect (Gay, Mills & Airasian, 2006). This is the most common research design in educational research, since it describes conditions that already exist (Gay et al., 2006). The basic approach starts with investigation, which involves the dependent variables (the EDP instrument survey Factors 2 and 4) and one (or more) independent variable groups, online and traditional college degree pathways (Gay et al., 2006).

This causal-comparative study design is the best choice for this study, as it will use pre-existing groups to investigate differences between or among those groups (Schenker & Rumrill, 2004). Additionally, the variables often examined in causal-comparative research studies “cannot (should not) be manipulated for practical or ethical reasons” (Schenker & Rumrill, 2004, p. 117). This is most important in cause-effect experimental studies, but it is equally important to a causal-comparative, non-experimental study such as this one, so that future research of an experimental type can build directly on the current research design, data collection applications, and data analysis applications using the two-tailed independent samples t-test.

The quantitative methodology, causal-comparative study design (Gay, Mills & Airasian, 2006; Schenker & Rumrill, 2004) is the best design for this study compared to the others available because it will determine statistically whether a significant difference exists between the two groups of individuals (independent variable groups) affected by the degree of workforce readiness skills (dependent variable groups). The two groups affected are traditional college and university pathways students and online college and university pathways students. The dependent variable groups are the two factors of the CareerEDGE Employability Development Profile (EDP) instrument (Dacre-Pool, Qualter & Sewell, 2013). The 18 survey instrument skills questions are those that “employers state are not being taught to the degree that employers need by today’s educational institutions” (NACE, 2014, p. 325). Causal-comparative research designs focus on determining if a cause-effect (or effect-cause) relationship exists between groups of individuals – the independent variables – and a second factor or set of factors – the dependent variables. This is the guiding rationale and purpose for the use of this study research design (Schenker & Rumrill, 2004).

The closest alternative study design to causal-comparative in quantitative research is the correlational research design (Cohen, Cohen, West & Aiken, 2013). Correlational research attempts to determine the relationship between two or more variables (Gay, 1987). Additional study designs often used after a non-experimental, causal-comparative study produces results are Experimental and Quasi-Experimental research designs (Gay, Mills & Airasian, 2006). With non-experimental research designs, there is no expected relationship assumed between variable groups. The reasoning for using hypotheses with this non-experimental, causal-comparative research design is for greater clarity in the study results section, chapter 4 of the final dissertation, and to make future research conducted using either Experimental or Quasi-Experimental research designs more insightful (Cohen, Cohen, West & Aiken, 2013; Schenker & Rumrill, 2004).

Population and Sample Selection

The total general population for this study is approximately 43,725 (N) students from the Bachelor program, using both the online college degree pathway and the traditional college degree pathway in 2017, at a large Western Christian university. With this large a general population to draw from, statistical power for validity and reliability is assured. An a priori G*Power test, using effect size d = 0.5 (upper bounds) and actual power = 0.9501287, produced a sample size (n) of 210 participants (G*Power 3.1.9.2 Software, 2015) (Figure 5, p. 173).

Tests covering both range responses (lower-bounds and upper-bounds effect sizes) were run to determine the sample size (n) from the estimated general population used in this study (G*Power, 2015, ver. 3.1.9.2 Software). The rationale for this decision was that by selecting an upper and lower effect size percentage, the most accurate sample population size (n) could be attained, to assure research results were generalizable to the broadest audience.

Because exact numbers of Bachelor program students from the online and traditional college degree pathways were not available, the following data were used in estimating the general populations. A general population of 79,500 (N) is enrolled in the three programs offered, Bachelor, Masters and Doctoral, by the large Western Christian university. The 2016 breakdown of students by pathway was 14,500 traditional and 65,000 online students across all three programs. The following percentages of the total student population were assigned in determining the Bachelor program’s online and traditional college degree pathway general populations (N), as shown below.

                         Students % of Population/Pathway

Online Gen Pop Est.                          Traditional Gen Pop Est.
35,750 Bachelor Students @ 55%               7,975 Bachelor Students @ 55%
26,000 Masters Students @ 40%                5,800 Masters Students @ 40%
 3,250 Doctoral Students @ 5%                  725 Doctoral Students @ 5%
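The arithmetic behind these estimates can be verified with the short Python sketch below (a hypothetical check, not part of the study procedures); note that the Traditional Doctoral figure works out to 725 (14,500 × 5%).

    # Verify the program-level population estimates from the 55% / 40% / 5% split.
    splits = {"Bachelor": 0.55, "Masters": 0.40, "Doctoral": 0.05}
    pathways = {"Online": 65_000, "Traditional": 14_500}

    for pathway, total in pathways.items():
        for program, share in splits.items():
            print(f"{pathway} {program}: {total * share:,.0f} students")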

An a priori power analysis was used to determine the true sample size needed for statistical power, validity and reliability, using the G*Power instrument (G*Power 3.1.9.2 Software, 2015). Using the input parameters of a two-tailed independent samples t-test to determine the statistical difference between two independent means (two groups), the author computed the required sample size given: α (α err prob) = 0.05, power (1-β err prob) = 0.95, effect size d = 0.5 (upper bounds) and allocation ratio N2/N1 = 1. The output parameters were noncentrality parameter δ = 3.6228442, critical t = 1.9714347, df (degrees of freedom) = 208, sample size group 1 = 105, sample size group 2 = 105, total sample size = 210 and actual power = 0.9501287 (Figure 5, p. 173, power plot 2).

The lower-bounds test used input parameters α (α err prob) = 0.20, power (1-β err prob) = 0.80, effect size d = 0.30 (lower bounds) and allocation ratio N2/N1 = 1. The output parameters were noncentrality parameter δ = 2.1319006, critical t = 1.2857988, df (degrees of freedom) = 200, sample size group 1 = 101, sample size group 2 = 101, total sample size = 202 and actual power = 0.8015396 (Figure 4, p. 172, power plot 1). Of the adjusted effect sizes of 0.3 (low) and 0.5 (high), the effect size of 0.5 (upper bounds) was chosen, as it produced the 210 respondents needed for statistical validity and reliability (G*Power 3.1.9.2 Software, 2015) (Figures 4 & 5, pp. 172-173).
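For readers without access to G*Power, the a priori computation can be approximated in Python; the sketch below assumes the statsmodels library (not used in the study itself) and reproduces the per-group sample sizes for both the upper-bounds and lower-bounds runs.

    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Upper-bounds run: d = 0.5, alpha = 0.05, power = 0.95, equal allocation.
    n_upper = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.95,
                                   ratio=1.0, alternative='two-sided')
    # Lower-bounds run: d = 0.3, alpha = 0.20, power = 0.80.
    n_lower = analysis.solve_power(effect_size=0.3, alpha=0.20, power=0.80,
                                   ratio=1.0, alternative='two-sided')

    print(f"Upper bounds: ~{n_upper:.0f} per group")  # ~105, i.e., 210 in total
    print(f"Lower bounds: ~{n_lower:.0f} per group")  # ~101, i.e., 202 in total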

Because this is a study consisting of a sample size (n) of volunteers, sampling only adult students from the Bachelor program at a large Western Christian university, a site authorization application must be submitted to the university prior to study initiation. A request to conduct research of any type needs prior approval by the university’s Institutional Review Board (IRB). After permission to conduct research has been obtained from the IRB network, site authorization can begin. The site authorization application will describe the purpose and scope of the research, the duration of the study, the target population, the impact on operations and resources, data use and the potential benefit to the large Western Christian university.

A sample is a subset of the population being studied. It is representative of a larger population and is used to draw inferences concerning that general population; sampling is a research technique widely used in social science when measuring the entire population is impractical (Ross, 2005). Because this study’s sampling frame consists of Bachelor degree seeking students from a large Western Christian university, it is believed it will represent a population that can be generalized to the larger U.S. college population. Because surveying the respondent population (N) of all U.S. colleges individually would not be possible (a complete list of U.S. colleges cannot be compiled due to cost, time constraints and/or unforeseen elements), a general population size (N) of approximately 43,725 students (7,975 traditional ground campus and 35,750 online) from a large Western Christian university was selected.

Ross’s (2005) guidelines for a convenience sampling frame state, “Representativeness, is where a set of sample data refers specifically to the marker variables selected for analysis” (p. 4). These guidelines were used to assure a high degree of representativeness between the general U.S. population and that of the U.S. college and university populations. The operational definition for this sampling technique is ‘convenience sampling’ because it is applied to all Bachelor students from one large Western Christian university in the U.S., which is representative of the larger U.S. educational population. A convenience sample is one of the main types of non-probability sampling methods used in social science research (Allen & Seaman, 2015). A convenience sample is made up of people who are easy to reach, and its cost is relatively low. The sample selection in this study is supplied by the large Western Christian university’s email survey manager, who will direct students from the Bachelor program to a link to the survey instrument page.

Instrumentation

The survey questionnaire instrument for this causal-comparative study is supplied by Dacre-Pool and Sewell (2007): their CareerEDGE Employability Development Profile (EDP) instrument, which measures workforce readiness skills across five factor elements (Appendix D, pp. 189-192). The EDP was designed specifically for developmental work with students of any higher education institution (Dacre-Pool & Sewell, 2007). This diagnostic tool is a self-report questionnaire that asks students to rate themselves on different aspects of employability, “as defined by the CareerEDGE model” (Dacre-Pool & Sewell, 2007, p. 305). The CareerEDGE Employability Development Profile (EDP) instrument questionnaire consists of 18 questions covering a wide range of employability skills. The focus of the skills questions for this study concerns dependent variable groups Factor 2, Work & Life Experience, and Factor 4, Generic Skills. Examples of how the questions are worded are listed below; the answer rubric consists of a 7-point Likert scale.

Examples:
Factor 2 = SQ-6. I have a lot of work-relevant experience.
Factor 4 = SQ-10. I have good oral communication skills.

The samples used for construct validity were an exploratory factor analysis (EFA) comprised of n = 403 respondents and a confirmatory factor analysis (CFA) comprised of n = 402 respondents (Dacre-Pool, Qualter & Sewell, 2013, p. 307) (Appendix F, pp. 198-199). The factor reliabilities of the two dependent variable groups used in this study were a mean of 0.78 (Factor 2 – Experience Work/Life skills) and a mean of 0.63 (Factor 4 – Generic skills). The final model produced the best fit: root mean square error of approximation RMSEA = .057, where values between .05 and .08 are deemed best fit; normed fit index NFI = .96, where values above .90 are deemed acceptable; and comparative fit index CFI = 0.91, where values above 0.90 are deemed best fit (Dacre-Pool et al., 2013). The use of the survey instrument in this study will contribute directly to answering the research questions through statistical analysis of the scores attained by the two independent variable groups, online and traditional college and university pathways students. The survey questionnaire consists of eighteen dependent variables, identified as the CareerEDGE Employability Development Profile (EDP) survey questions from factor 2, work & life experience skills, and factor 4, generic skills.

Validity

The survey questionnaire instrument for this quantitatively designed study was supplied by permission of Dacre-Pool & Sewell (2007) (Appendix D, pp. 191-194). There are three types of validity measurements: content validity, criterion validity, and, in this case, construct validity. Construct validity, in the case of a complex theory, means it is built from several simpler elements or given conditions. Modern validity theory defines construct validity as the overarching concern of validity research, subsuming all other types of validity evidence (Messick, 1989). In the case of a causal-comparative research study, the author is interested in measuring whether the theoretical concept matches up with the conditions the researcher wants to measure. The main theoretical model that has underpinned the CareerEDGE EDP model is the DOTS model (Law & Watts, 1977), which consists of planned experiences designed to facilitate the development of:

“Decision learning – decision making skills,
Opportunity awareness – knowing work opportunities exist and what their requirements are,
Transition learning – including job searching and self-presenting skills, and
Self-awareness – in terms of interests, abilities, values, etc.” (Watts, 2006, pp. 9-10).

Construct validity simply refers to whether a scale or test measures the intended construct adequately (Brown, 1996, p. 231). Notice that the evidential basis for validity includes both test score interpretation and test score use (Messick, 1988, 1989). There are two basic approaches to demonstrating construct validity: the differential-groups study (used with experimental demonstrations) and the one used by the authors of the CareerEDGE (EDP) instrument to validate their model, the intervention study, wherein a group that is weak in the construct is measured using the test, then taught the construct, and measured again (Brown, 1996). If a statistically significant difference is found between the pretest and posttest, that difference can be said to support the construct validity of the test.

The type of construct validity should not be confused with the statistical tests used to demonstrate it, such as factor analysis and pretest-posttest intervention studies. The tests used by the authors of the CareerEDGE (EDP) instrument to validate their model involved a pretest-posttest design and factor analysis using Statistical Package for Social Science (SPSS™) software (International Business Machines (IBM), 2015; Dacre-Pool, Qualter & Sewell, 2013). The findings suggest that the EDP is multidimensional and maps clearly onto the CareerEDGE model of graduate employability (Dacre-Pool, Qualter & Sewell, 2013).

Factor analysis, using IBM’s Statistical Package for Social Science (SPSS™) software (International Business Machines (IBM), 2015), was used for the validation sample of the two dependent variable groups of interest in this study, producing an overall mean of 0.76 for Factor 2 and an overall mean of 0.57 for Factor 4 (Dacre-Pool, Qualter & Sewell, 2013, p. 309). The pretest-posttest means and standard deviations (SD) of the two factors that are the focus of this study were: for Factor 2, Experience Work/Life Skills, pretest = 8.37 (2.93) and posttest = 9.53 (2.20); for Factor 4, Generic Skills, pretest = 63.42 (6.89) and posttest = 63.95 (8.31).

Reliability

The survey questionnaire instrument for this quantitative study was supplied by permission of Dacre-Pool and Sewell (2007) (Appendix D, pp. 191-194). There is a growing concern that objectivity in research is no longer possible (AQR, 2016). Reliability refers to “the closeness to fact that a scientific test or piece of research measures what it sets out to, or how well it reflects that reliabilities claims” (AQR, 2016, p. 1). The two factors (dependent variable groups) that are the focus of this study are Factor 2, Experience Work/Life Skills, and Factor 4, Generic Skills (Dacre-Pool, Qualter & Sewell, 2013, pp. 307-309) (Appendix F, pp. 196-197).

The quantitative method requires statistical results as proof of reliability, and in statistics, particularly classical test theory, Cronbach’s alpha is that test. It is a (lower bound) estimate of the reliability of a psychometric test (i.e., of mental capabilities and/or behavioral style, etc.). Cronbach’s alpha is a measure of internal consistency, or how closely related a set of items are as a group; it is a measure of scale reliability (Institute of Psychometric Coaching, 2016). Cronbach’s alpha coefficient is also called the internal consistency reliability of the test (Tavakol & Dennick, 2011). Coefficient alpha simply represents the ratio of true score variance (reliable/consistent) to total variance (How2Stats, 2015, Jan 19). If the items in a test are correlated with each other, the value of alpha is increased (Tavakol & Dennick, 2011).

Inspection of the correlation matrix for each sample revealed the presence of coefficients of 0.30 and above (Tabachnick & Fidell, 2007). Unidimensionality implies the presence of only one factor in the data (and is determined with factor analysis); coefficient alpha (consistency) assumes unidimensionality, but it cannot test for it (How2Stats, 2015, Jan 19). Reliability can be viewed as the expected correlation of two tests that measure the same construct, hence its closeness in theory and practical application to validity testing (Institute of Psychometric Coaching, 2016; Leeuw, 2005).
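As a concrete illustration of the coefficient described above, the Python sketch below computes Cronbach’s alpha from first principles using numpy; the response matrix is hypothetical and stands in for one factor’s Likert-scored items.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses: five participants on three 7-point Likert items.
    demo = np.array([[7, 6, 7],
                     [5, 5, 6],
                     [6, 6, 6],
                     [4, 5, 4],
                     [7, 7, 6]])
    print(f"alpha = {cronbach_alpha(demo):.3f}")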

A statistical test for Cronbach’s alpha was run using the upper limits of the test scores (5, 6 & 7), producing the following results. Using the Statistical Package for the Social Science (SPSS) (International Business Machines (IBM), 2016), a test was run to determine if using the top three scores from the answers available on the Likert grading scale would produce a Cronbach’s score in the range of .700 or higher. The measure for the SPSS testing of Cronbach’s alpha was set to Ordinal on the variables page, though many scholars suggest that a scale measurement is also permitted and often used. Testing of the five factors (dependent variable groups) produced results of Cronbach’s alpha = .931 for both Factor 2, Experience Work/Life, and Factor 4, Generic Skills (the focus of this study). The test produced an Inter-Item Correlation Matrix value of 1.000 and an Item-Total Correlation of .726 each, compared to a Cronbach’s Alpha if Item Deleted of .932. Item statistics produced scores of mean = 6.57 and standard deviation = .573 on the five items. Scale statistics produced scores of mean = 32.86, variance = 6.423 and standard deviation = 2.534 on the five factors (Appendix G, pp. 197-198).

In the case of the original CareerEDGE Employability Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007), the reliability of the instrument was determined by a test-retest reliability procedure. The test-retest procedure examines the extent to which scores from one sample are stable over time from one test administration to another (Creswell & Plano Clark, 2011). The instrument was administered to 19 individuals; the same instrument was then administered to the same individuals several weeks later, with the arrangement of the questions changed for the second administration. The scores from Time 1 and Time 2 were used to compute the Pearson correlation coefficient between the two administrations as the measure of the test-retest reliability of the instrument (Creswell & Plano Clark, 2011; Green, 2015). For each EDP factor subscale, the 19 respondents were first tested at Time 1, and the mean (SD), t-values and p-values were recorded; several weeks later, the same 19 respondents were tested again with the order of the questions changed, the mean (SD), t-values and p-values were recorded again, and the two tests were compared. The two tests of reliability produced results for Factor 2, Experience Work/Life, comparable to Factor 4, Generic Skills, with the means (SD), t-values and p-values within accepted limits (Dacre-Pool, Qualter & Sewell, 2013) (Appendix F, pp. 194-195).
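A minimal sketch of the test-retest computation follows, assuming the scipy library and hypothetical factor scores; the reliability estimate is simply Pearson’s r between the Time 1 and Time 2 administrations.

    from scipy.stats import pearsonr

    time1 = [8, 10, 7, 12, 9, 11, 6, 10]  # hypothetical Time 1 factor scores
    time2 = [9, 10, 8, 11, 9, 12, 7, 9]   # the same respondents, weeks later

    r, p_value = pearsonr(time1, time2)
    print(f"test-retest r = {r:.3f}, p = {p_value:.4f}")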

This quantitative, causal-comparative study will add to the current body of research on workforce readiness skills acquired by students from online college pathways and traditional college pathways by examining the differences between the two educational groups’ EDP scores, as determined from the Bachelor program at a large Western Christian university. An invitation to participate in an online survey will be sent to all students in the Bachelor degree program at the large Western Christian university, using its Email Survey Distribution system. The total general population size (N) for the study is approximately 43,725 possible respondents (7,975 traditional and 35,750 online students).

Site Authorization Process

All email survey requests (including initial requests, follow-ups and reminders) that have the necessary and/or appropriate site authorization and IRB approval will be distributed by the email survey distribution manager. The invitation (e-mail) introduction letter will discuss the large Western Christian university’s Institutional Review Board (IRB) approval process and informed consent from participants. Completion of the survey instrument by research respondents will be considered written affirmation of informed consent by the independent variable groups, online college and university pathways students and traditional college and university pathways students, in accordance with the Common Rule (2009, January 15). Additionally, risks and benefits concerning participants and the organization are addressed. The IRB review, sample selection ideology, protection of rights/well-being, maintenance of data security, sample recruitment process, data collection instruments and approaches will be discussed.

Data Management Safety, Storage and Destruction Processes

The demographic and geographic data collected (Appendix E, p. 194) will not be used as a measurement tool in the actual study, but will be used in Chapter 4 (Data Analysis and Results) of the final dissertation in the Comments from Survey Participants section. Per applicable Collaborative Institutional Training Initiative (CITI) regulations (Collaborative Institutional Training Initiative, 2015), study data will be stored on a flash drive and locked in a safe. The survey results and all electronic data collected and analyzed will be secured on a portable external drive, which will be kept in the researcher’s private safe, in accordance with CITI review, for a period of three years. At the end of the three-year period, all paper notes and all electronic data held on computers, websites and flash drives will be destroyed in compliance with CITI published standards.

Data Analysis Procedures

The purpose of this study is to investigate if a difference exists between students from the online college pathways and traditional college pathways of the Bachelor program at a large Western Christian university. Using the students’ CareerEDGE Employability Development Profile (EDP) instrument scores from factor 2 (work & life experience skills) and factor 4 (generic skills), an accurate assessment will be produced (Dacre-Pool & Sewell, 2007). This quantitative, causal-comparative research study will analyze the scores on factor 2 (work & life experience skills) and factor 4 (generic skills) attained by the Bachelor students at a large Western Christian university using the CareerEDGE Employability Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007). The respondent is asked to rate the same employability development profile survey questions for each type of graduate pathway, traditional or online. Following the statistical analysis of Factors 2 and 4, through the scoring of the eighteen employability development profile survey questions using a Likert scale, the results will be compared to research questions RQ1 and RQ2. Additionally, raw data will be analyzed and illustrated, both in writing and through tables, figures, diagrams, charts or graphs (Blakstad, 2015; Brasoveanu, 2012; Boone & Boone, 2012).

This research instrument will use a 7-point Likert scale of: 7) Strongly Agree, 6) Agree, 5) Slightly Agree, 4) Neither Agree nor Disagree, 3) Slightly Disagree, 2) Disagree and 1) Strongly Disagree (Laerd Statistics, 2016). The numerical scale given to the 18 survey questions from factor 2, work & life experience skills, and factor 4, generic skills, was designed to produce scored responses suitable for a causal-comparative research design and a two-independent samples t-test to determine if an association exists (Gay, Mills & Airasian, 2006; International Business Machines (IBM), 2016; Laerd Statistics, 2016; Schenker & Rumrill, 2004; Statistical Package for Social Sciences (SPSS), 2015), as sketched below.
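A minimal sketch of that test, assuming the scipy library and hypothetical factor scores for each pathway group (SPSS, not Python, is the tool named for the actual analysis), is:

    from scipy.stats import ttest_ind

    # Hypothetical Factor 4 means for a handful of respondents in each group.
    online = [5.9, 6.1, 5.5, 6.4, 5.8, 6.0, 5.7]
    traditional = [5.4, 5.9, 5.2, 6.0, 5.6, 5.3, 5.8]

    t_stat, p_value = ttest_ind(online, traditional)  # two-tailed by default
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
    # H0 (no difference) is rejected at alpha = 0.05 only if p_value < 0.05.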

Research Questions and Hypotheses

RQ1:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?
H1o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) dependent variable group work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
H1a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) dependent variable group work & life experience skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
RQ2:    Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway?
H2o:   There is no statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) dependent variable group generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.
H2a:   There is a statistically significant difference in scores on the CareerEDGE Employability Development Profile (EDP) dependent variable group generic skills, of students who graduate with a bachelor’s degree at a large Western Christian university through their online college pathway and their traditional college pathway.

Construct of Factors

The two factors in this study are constructed from Likert-scale items. Although individual Likert responses are ordinal (they have rank order but no true zero point, unlike ratio scales), the factor scores computed from them are treated as interval-level, continuous values, as they can take any number, including decimals (Jamieson, 2004). The measurement level of the two factors (dependent variable groups) is determined by the following formula: Factor 2 (work/life experience skills) has two items (survey questions) and Factor 4 (generic skills) has 16 items (survey questions). Factor 2 = (item 1 + item 2) divided by the number of items in that factor, and Factor 4 = (item 3 + item 4 + … + item 18) divided by the number of items in that factor.
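A sketch of this scoring rule in Python, using a hypothetical response vector of 18 Likert answers for a single respondent, is:

    # Hypothetical 7-point Likert responses to survey questions 1-18.
    responses = [6, 5, 7, 6, 6, 5, 7, 6, 6, 7, 5, 6, 6, 7, 6, 5, 6, 7]

    factor2_items = responses[:2]    # items 1-2: work & life experience skills
    factor4_items = responses[2:18]  # items 3-18: generic skills

    factor2 = sum(factor2_items) / len(factor2_items)  # mean of 2 items
    factor4 = sum(factor4_items) / len(factor4_items)  # mean of 16 items
    print(f"Factor 2 score = {factor2:.2f}, Factor 4 score = {factor4:.2f}")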

Two-independent samples t-test
To answer the research questions, the researcher will use a two-independent samples t-test in the Statistical Package for Social Science (SPSS™) software (International Business Machines (IBM), 2015). Descriptive statistical tools, such as those chosen to determine central tendency, use measurement scales to produce the mean, standard deviation and variance values needed by the researcher to answer this study’s research questions. The t-test is an inferential statistical test that determines whether there is a statistically significant difference between the means of two unrelated groups (Laerd Statistics, 2016). The two-independent-samples t-test is also referred to as a between-groups design (Tabachnick & Fidell, 2007). This is accomplished by testing the null and alternative hypotheses, signified by the assumptions H0: μ1 = μ2 and Ha: μ1 ≠ μ2. With the null and alternative hypotheses defined, the next step is to state the alpha level.

The alpha level, also referred to as the significance level, is α = 0.05 for this study. An a priori alpha level is typically set at either the 0.05 or 0.01 significance level, and some more conservative alpha levels, such as 0.01 and 0.001, are commonly used to evaluate the assumption of normality discussed below (Tabachnick & Fidell, 2007). Next, one calculates the degrees of freedom used to attain the critical value (Appendix H, pp. 192-194). Applying the equation df = (n1 – 1) + (n2 – 1), one determines the degrees of freedom and, from them, the t-distribution critical value. To determine the t-distribution critical value, a t-distribution table is used: one applies the degrees-of-freedom value (left side of the table) to the corresponding significance (alpha) level, α = 0.05 (along the top of the table); the cell at the intersection of the two predetermined values gives the critical value (t-Distribution Table, p. 176), as sketched below.
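The same two values can be computed directly; the sketch below assumes the scipy library and reproduces the degrees of freedom and critical t reported by G*Power for the upper-bounds sample.

    from scipy.stats import t

    n1, n2, alpha = 105, 105, 0.05
    df = (n1 - 1) + (n2 - 1)               # df = 208
    critical_t = t.ppf(1 - alpha / 2, df)  # two-tailed critical value, ~1.9714
    print(f"df = {df}, critical t = {critical_t:.4f}")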

Assumptions
The two-independent-samples t-test is predicated on three major assumptions: independence, normality, and homogeneity of variance (Tabachnick & Fidell, 2007). First, the scores from the items (survey questions) are independent of each other; that is, the scores of one group of participants, say online pathway students, are not systematically related to the scores of the traditional pathway participants. This is commonly referred to as the assumption of independence. Next, the dependent variable group is normally distributed within each of the two populations, the online and traditional pathway independent variable groups. This is commonly referred to as the assumption of normality. Lastly, the variances of the test (dependent) variable in the two populations are equal. This is commonly referred to as the assumption of homogeneity of variance (Tabachnick & Fidell, 2007).

Testing of the Study’s Assumptions
Testing for the assumptions of independence, normality, and homogeneity of variance in this study will be conducted using the Statistical Package for the Social Sciences (SPSS™) software (International Business Machines (IBM), 2015). This is accomplished using SPSS’s Explore command, found by clicking Analyze, then Descriptive Statistics, then Explore. This activates the Explore dialogue box, which presents both the dependent and independent variable groups: the two dependent variable groups, factor 2 (work/life experience skills) and factor 4 (generic skills), along with the two independent variable groups, online and traditional student pathways, are available for testing. One selects an option from the dependent variable group (either factor 2 alone or both factors 2 and 4) and moves (drag-and-drop) it into the Dependent List box to test one or more variables for normality.

Optionally, in this study the researcher will also select both independent variables, online and traditional student pathways, and move them into the Factor List box to test whether the variable groups are normally distributed for each level of the independent variables. Next, select Both in the Display box and click the Statistics button, which loads the Explore: Statistics dialogue box; select the Descriptives box and record a 95% setting in the confidence interval for the mean, then click the Continue button at the bottom of the Explore: Statistics box. Next, select the Plots button, change the option to Factor levels together in the Boxplots section, check the Stem-and-leaf option in the Descriptive section, check the Normality plots with tests option, and then click Continue and the OK button. Because SPSS is a robust tool, it can produce both statistical and graphical tests of normality, such as the Shapiro-Wilk test and the Normal Q-Q plot (graphically illustrated). SPSS also offers tests of skewness and kurtosis before running other tests to determine the validity of the assumptions. Kurtosis describes the flatness or height of the curve of a frequency distribution around its mean and mode: a leptokurtic distribution has a high, narrow concentration about the mode and is more concentrated about the mean than the corresponding normal distribution, while a platykurtic distribution has a wide, rather flat distribution about the mode and is less concentrated about the mean than the corresponding normal distribution (University of Bedfordshire, AC, UK). If the data are not approximately normally distributed and/or group sizes differ greatly, one can run the Mann-Whitney U test, a non-parametric test that does not require the assumption of normality (Laerd Statistics, 2016, Descriptive and Inferential Statistics).
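
For readers outside SPSS, the sketch below performs the same battery of checks in Python, continuing with the hypothetical online and traditional score arrays from earlier: Shapiro-Wilk for normality, skewness and kurtosis, Levene’s test for homogeneity of variance, and the Mann-Whitney U test as the non-parametric fallback.

```python
# Sketch of the assumption checks described above; `online` and `traditional`
# are hypothetical arrays of factor scores for the two pathway groups.
from scipy import stats

def check_assumptions_and_test(online, traditional, alpha=0.05):
    # Shapiro-Wilk test of normality for each group.
    _, p_norm_online = stats.shapiro(online)
    _, p_norm_trad = stats.shapiro(traditional)
    # Skewness and excess kurtosis (positive -> leptokurtic,
    # negative -> platykurtic).
    print(stats.skew(online), stats.kurtosis(online))
    print(stats.skew(traditional), stats.kurtosis(traditional))
    # Levene's test of homogeneity of variance.
    _, p_levene = stats.levene(online, traditional)
    if min(p_norm_online, p_norm_trad) > alpha:
        # Normality tenable: pooled t-test if variances are equal, else Welch.
        return stats.ttest_ind(online, traditional,
                               equal_var=p_levene > alpha)
    # Otherwise fall back to the non-parametric Mann-Whitney U test.
    return stats.mannwhitneyu(online, traditional, alternative="two-sided")
```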

SPSS to Analyze & Summarize Demographics
Demographics describe the structure of a specific or general population through statistical data. The demographics that will be computed for this study include Degree Concentration, Degree Attainment Date (year), Degree Pathway (T or O), Gender (M or F), Age (years old), Internship (Y or N) and Geographic Location (State or County) (Appendix E, p. 163). Because the student sample will be drawn from the Bachelors program at a large Western Christian university, the Degree Concentration column will list the career fields available in the Bachelors program. These programs will be numbered 1 through 10 to allow for a numeric accounting of the students’ career field choices. Additional demographic categories are Degree Attainment Date, Degree Pathway (Traditional or Online), Gender, Age, Internship (Yes or No) and Geographic Location (State or County) (Appendix I, Demographic Directory, p. 167).

Analyzing and summarizing the demographic data will be accomplished using the Statistical Package for the Social Sciences (SPSS™) software (International Business Machines (IBM), 2015). Once all table category titles and variables are recorded in the appropriate columns of the Variable View, this researcher will switch to the Data View and transcribe the statistical data recorded from the descriptive statistics category page (Appendix E, p. 163). Once all statistical data are recorded, this researcher will use the Analyze, Descriptive Statistics, Frequencies option to start the analysis process. The Frequencies dialogue box appears with all the variables listed in the left-hand pane; one clicks the arrow button to move the selected variables for analysis into the Variables pane on the right side of the box, selects the Display frequency tables box, and then clicks the Statistics button. In the Statistics dialogue box, one selects the desired statistics (e.g., mean, median, mode, standard deviation, variance, range), then clicks the Continue button. Charts are selected by clicking the Charts button; because this research uses the Frequencies command, the Histogram option will be chosen, followed by the Continue button.
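
The following is a rough pandas analogue of the Frequencies procedure just described, assuming the demographic data have been exported to a hypothetical file demographics.csv with columns such as gender and age.

```python
# Rough analogue of SPSS's Frequencies command; the file and column names
# (demographics.csv, gender, age) are hypothetical.
import pandas as pd

demo = pd.read_csv("demographics.csv")
print(demo["gender"].value_counts())                     # frequency table
print(demo["age"].agg(["mean", "median", "std", "var", "min", "max"]))
demo["age"].plot(kind="hist")                            # histogram (matplotlib)
```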

SPSS offers two options in the Descriptive Statistics window, Frequencies and Descriptives. The two options deliver essentially the same statistical data and graphical charts, but the Descriptives command does not display a frequency breakdown of each variable or report the median and mode. Additionally, this research will use the Split File command to compare groups based on one variable (say, Gender) by placing it in the ‘groups based on’ box and then re-running the descriptive command on the remaining variables. The output will then show the descriptive statistics for all the variables by Gender, which is most helpful when explaining and displaying the results and conclusions in reports of the study (University of Bedfordshire, AC, UK).
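
A groupby in pandas plays the same role as SPSS’s Split File command; the sketch below, continuing with the hypothetical demo DataFrame, reports descriptive statistics separately for each Gender group.

```python
# Rough analogue of SPSS's Split File command; the factor2/factor4 columns
# are hypothetical, carried over from the earlier sketches.
by_gender = demo.groupby("gender")[["age", "factor2", "factor4"]]
print(by_gender.agg(["mean", "median", "std", "var"]))
```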

Descriptive statistics use data to provide descriptions of the population through numerical calculations, represented in tables, figures, diagrams, charts, and graphs. Inferential statistics make inferences and/or predictions about a population based on a sample of data taken from that population (Laerd Statistics, 2016, Descriptive and Inferential Statistics). The significance level (5% = 0.05), set prior to the testing of the independent variables (online and traditional college and university pathway students’ scores), together with the corresponding 95% confidence level, will aid in determining possible causes of the observed outcome: a lack of needed skills among new college and university graduate employees. For the hypothesis testing of the CareerEDGE Employability Development Profile (EDP) survey instrument scores, the decision rule is as follows: if the obtained significance (p-value) is less than or equal to 0.05, the null hypothesis is rejected; if it is greater than 0.05, the null hypothesis is retained.

To address issues of anonymity, confidentiality, privacy, coercion, and any potential conflicts, identifying information such as the respondent’s name, address, telephone number, race, and organizational affiliations will not be collected. All raw data, survey instrument responses, computer files, and demographic and geographic information will be secured on external drives, kept in a safe, and viewed only by the primary investigator, Edward Files, BSM, MBA. This researcher will invite all students from both online and traditional degree pathways in the Bachelor program at a large Western Christian university in the 2017 school year to participate. This research does not involve the physical evaluation or manipulation of human subjects. It does require human subjects to participate in an online survey administered through the college’s email survey system; therefore, the potential risk associated with the study is low.

Participants in the survey are required to read and sign an informed consent statement (Appendix B, pp. 187-190, this document). The consent form will define the purpose and procedures of the proposed study and address the risks and benefits of the study to both the students and the university. The research method, including the invitation-to-participate email, survey instrument, and follow-up email, must have the approval of the large Western Christian university’s Institutional Review Board (IRB) before this research is conducted. All participants are volunteering for the e-mail survey and “must be capable of informed consent; no surrogates or proxies can be accepted” (Gordon, Levine, Mazure, Rubin, Schaller, & Young, 2011, p. 25).

Respondents will not use their names on the survey instrument, and only those members of the student body presently active in the Bachelor degree-seeking program will be invited to participate in the academic year 2017 survey process at the large Western Christian university. Any additional information collected through the survey instrument (Appendix E, Demographic and Geographic Information, p. 163) will be coded in accordance with the Code of Federal Regulations (Protection of Human Subjects), referred to as the Common Rule (U.S. Department of Health and Human Services, 2015). For all participating departments and agencies, the Common Rule outlines the basic provisions for IRBs, informed consent, and Assurances of Compliance (U.S. Department of Health and Human Services, 2015).

The survey results and all analyses of raw data, both electronic and paper, will be secured in a safe, in accordance with the Association for Qualitative Research (AQR) review, for a period of three years. At the end of the three-year period, the information will be destroyed by wiping the electronic data and burning any paper notes, comments, and analyses. Although this study uses a quantitative methodology and the above requirements are therefore not mandatory, they are associated with the validity and reliability concerns discussed earlier.

Limitations and Delimitations

In any study, there are multiple possible sources of limitations beyond the literature reviews themselves (Baruch & Holton, 2008; Branch, 2007; Podsakoff, MacKenzie & Podsakoff, 2012). While one can reduce the possibility of incorrect information and data from primary research studies and articles by closely checking one’s references, some forms of limitation are less identifiable. Limitations of this study may include a low response rate to the survey questions, non-completion of survey instrument questions, bias by respondents toward online surveys, and method bias, that is, the biasing effects that measuring two or more constructs with the same method may have on estimates of the relationships between them (Podsakoff, MacKenzie & Podsakoff, 2012).

For the sake of this study, a delimitation is defined as a shortcoming of the study that results from the researcher’s decision-making process. One such decision, the use of a research design that incorporates a convenience sampling frame, produces a delimitation for this study in that the population size (N) is restricted to the Bachelor student degree program (Baruch & Holton, 2008). Using a low general population (N) size directly affects the sample size response rate (n) in this survey research study. The minimum requirement, per the large Western Christian university’s general rule of thumb for survey research, is 10 subjects per survey question, or 280 respondents for this study. An a priori power analysis was also conducted to justify the study sample size based on the anticipated effect size (d = 0.5), with a critical t = 1.9714347, degrees of freedom (df) = 208, and actual power = 0.9501287, which produced a required sample size (n) of 210 respondents for statistical validity alone (G*Power, 2015) (Figure 4, p. 204). The survey instrument will be hosted by the large Western Christian university’s email survey system to minimize further delimitations in the data collection and management process. The following section summarizes the preceding sections and describes the organization of the remainder of the study.
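
The a priori power analysis reported above can be reproduced approximately outside G*Power; the sketch below uses statsmodels under the same inputs (d = 0.5, α = 0.05, power = 0.95, two-tailed independent-samples t-test).

```python
# Sketch reproducing the a priori power analysis with statsmodels in place
# of G*Power: d = 0.5, alpha = 0.05, power = 0.95, two-tailed test.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5, alpha=0.05, power=0.95,
    ratio=1.0, alternative="two-sided")
print(round(n_per_group))   # ~105 per group, i.e. 210 respondents in total
```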

Summary

Chapter 3 focuses on the research methodology used in this study. It defines the research design, sample population, theoretical foundations (Becker, 1964; Schultz, 1961; in Walters, 2004, pp. 99-100), research instrument (Dacre-Pool & Sewell, 2007), data analysis, validity and reliability, and the ethical considerations that fall under the review of the large Western Christian university’s Institutional Review Board (IRB). The invitation (e-mail) introduction letter will discuss the large Western Christian university’s Institutional Review Board (IRB) and informed consent approval process. The individual steps used in the collection of the data are covered in the Data Collection Procedures section, and the Data Analysis Procedures section describes the details involved in the computation and analysis of the raw data (Blakstad, 2015; Brasoveanu, 2012; Boone & Boone, 2012). The methodology is quantitative, the research method is causal-comparative, and data collection will be conducted through a questionnaire (survey research design) instrument (Dacre-Pool & Sewell, 2007). The instrument was validated and pronounced reliable through exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) (Dacre-Pool & Sewell, 2007). This researcher has permission to use the Dacre-Pool and Sewell (2007) survey instrument for this study (Appendix D, pp. 190-193).

An invitation to participate in an online survey will be sent to all students from both online and traditional degree pathways in the Bachelor program at the large Western Christian university in 2017. Per Dacre-Pool, Qualter and Sewell (2013), little empirical research has been conducted in relation to graduate employability and/or the tools that measure it; theirs is the study that supplied the CareerEDGE Employability Development Profile (EDP) instrument for this study (Dacre-Pool & Sewell, 2007). Additionally, per Deepa and Seth (2013), more research is needed to explore the current gaps between educational curriculum design and the standards hiring authorities say they need.

Quantitative is the best methodology among those available because it will establish, statistically, the degree to which the researcher can defend findings that answer the hypotheses and thus the research questions, as well as address the problem and purpose statements (Gravetter & Forzano, 2012). This is done by analyzing the test scores from the survey instrument using a two-independent-samples t-test to measure skill differences (Laerd Statistics, 2016; Light & McGee, 2015; Shuttleworth, 2015; Zhao, 2015).

The significance of this study is that it will produce statistical data leading to a more accurate, essential, and deeper understanding of the relationship between education and the application of employability skill sets of today’s graduates. The selected research design and instrument (causal-comparative and survey questionnaire), using the 18 CareerEDGE Employability Development Profile (EDP) skill set questions derived from Factor 2 (work/life experience skills) and Factor 4 (generic skills), are justified because they align with the research questions, hypotheses, variables, problem and purpose statements, and the gap defined in the literature reviews. These factors will be measured against the independent variable groups, online and traditional college pathway students. The research questions align with the problem and purpose statements, methodology, research design, instrumentation, and data collection and analysis approach, which illustrates the need for this study as determined by the 2014 call by multiple U.S. Government departments for further research in numerous areas concerning education, economics, labor, and the communities affected (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 2014, p. 21, para. 1). The instrument was validated and pronounced reliable through exploratory factor analysis (EFA) and confirmatory factor analysis (CFA), and this researcher has permission to use the Dacre-Pool and Sewell (2007) survey instrument for this study (Appendix D, pp. 190-193).

References

Crossman, A. (2014, December 16). Types of sampling designs. AboutEducation.com: Sociology. Retrieved from http://sociology.about.com/od/Research/a/sampling-designs.htm
Accenture. (2013). College graduate Employability survey. US: Accenture.
Adams, S. (2014). The 10 Skills Employers Most Want In 2015 Students. Forbes Business Inc. Leadership Journal. Retrieved from http://www.forbes.com/sites/susanadams/2014/11/12/the-10-skills-employers-most-want-in-2015-students/
Airasian, P. & Gay, L. (2003). Educational Research, Chapter 12. Retrieved from https://faculty.unlv.edu/sloe/Courses/EPY%20703/Lecture%20Slides…/Class10.pdf
Allen, I.L. & Seaman, J. (2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011). Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/
Allen, I.L. & Seaman, J. (2012). Changing Course: Ten Years of Tracking Online Education in the United States. (January 2013). Babson Survey Research Group and Quahog Research Group, LLC. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf
Allen, I.L. & Seaman, J. (2013). Grade Change: Tracking Online Education in the United States (January 2014). Babson Survey Research Group and Quahog Research Group, LLC. Retrieved from http://www.onlinelearningsurvey.com/reports/gradechange.pdf
Allen, I.L. & Seaman, J. (2014). Grade Change: Tracking Online Education in the United States (January 2015). Babson Survey Research Group and Quahog Research Group, LLC. Retrieved from http://www.onlinelearningsurvey.com/reports/gradelevel.pdf
Alpaydın, Y. (2015). Identifying Higher-Education Level Skill Needs in Labor Markets: The Main Tools Usable for Turkey. Educational Sciences: Theory & Practice, 15(4), 945-967. doi:10.12738/estp.2015.4.2542. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=109416069&site=ehost-live&scope=site
Altonji, J. G. & Pierret, C. R. (2001). “Employer Learning and Statistical Discrimination.” Quarterly Journal of Economics, 116(1):313–50.
American Society for Training & Development, (2014). Employers place less value on college pedigrees. (2014). T + D, Foundation, 68(5), 19. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=a2h&AN=95790522&site=ehost-live&scope=site
Amrein-Beardsley, A., Holloway-Libell, J., Cirell, A. M., Hays, A. & Chapman, K. (2015). “Rational” Observational Systems of Educational Accountability and Reform. Practical Assessment, Research & Evaluation, 20(15-17), 1-8 pp. http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ofs&AN=109177085&site=ehost-live&scope=site
APICS, (2011). Understanding a Scaled Score. Retrieved from http://www.apics.org/docs/default-source/certification/scaledscoredocument.pdf?Status=Master
Association for Qualitative Research (AQR), (2016). Objectivity, Reliability and Validity. Retrieved from https://www.aqr.org.uk/glossary/objectivity
Bandura, A. (1995), “Exercise of personal and collective efficacy in changing societies”, in Bandura, A. (Ed.), Self-Efficacy in Changing Societies, Cambridge University Press, Cambridge, pp. 1-45.
Bandura, A., Caprara, G.V., Barbaranelli, C., Gerbi, M. and Pastorelli, C. (2003), “Role of affective self-regulatory efficacy in diverse spheres of psychosocial functioning”, Child Development, Vol. 74 No. 3, pp. 769-782.
Baruch, Y. & Holton, B.C. (2008). Survey response rate levels and trends in organizational research. SAGE Journals. Human Relations, August 2008, Vol. 61, no. 8, pp. 1139-1160. Doi: 10.1177/0018726708094863. Retrieved from http://hum.sagepub.com/content/61/8/1139
Beauducel, A. & Herzberg, P.Y. (2006). On the Performance of Maximum Likelihood Versus Means and Variance Adjusted Weighted Least Squares Estimation in CFA. Structural Equation Modeling, vol. 13(2), pp. 186–203. Retrieved from https://www.researchgate.net/profile/Philipp_Herzberg/publication/243043555_On_the_Performance_of_Maximum_Likelihood_Versus_Means_and_Variance_Adjusted_Weighted_Least_Squares_Estimation_in_CFA/links/5432618a0cf22395f29c02b0.pdf
Becker, S. (1964). Human capital: A theoretical and empirical analysis, with special reference to education. New York: Columbia University Press. Retrieved from http://www.nber.org/chapters/c3730.pdf
Belli, G. (2008). Non-experimental Quantitative Research. Chapter 4, Lapan c04.text V1 – 09/02/2008 – pp. 59-77. Retrieved from https://www.k4health.org/sites/default/files/migrated_toolkit_files/0470181095-1.pdf
Belmont Report, (2009). History, Principles and Application. Retrieved from http://humansubjects.stanford.edu/education/2009_05_Belmont.pdf
Bessolo, T. C. (2011). The implementation of online learning programs: A comparative analysis of public, nonprofit, and for-profit higher education institutions. Available from ProQuest Dissertations & Theses Full Text: The Humanities and Social Sciences Collection. (901453062).
Blakstad, O., (2015). Statistical Tutorial. Explorable Psychology Experiments. Retrieved from https://explorable.com/statistics-tutorial
Blanchard, K. H. (2008). Situational Leadership. Leadership Excellence. 25(5). 19. Retrieved from: http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=31950744&site=ehost-live&scope=site
Boone, H. N. & Boone, D. A., (2012). Analyzing Likert Data. Vol. 50, Number 2, Tools of the Trade (April 2012). Retrieved from http://www.joe.org/joe/2012april/tt2.php
Bondarenko, N. (2015). The Nature of the Current and Anticipated Shortage of Professional Skills and Qualities of Workers in the Russian Labor Market. Russian Education & Society, 57(3), 119-145. doi:10.1080/10609393.2015.1018744. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=108328980&site=eds-live&scope=site
Brasoveanu, A. (2012). Basic Probability Theory: Intro to Bayesian Data Analysis & Cognitive Modeling. University of California, Santa Cruz (UCSC Linguistics). Retrieved from http://people.ucsc.edu/~abrsvn/intro_prob_1.pdf
Britt, D. M. (2015). How to Better Engage Online Students with Online Strategies. College Student Journal, 49(3), 399-404. http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=109506012&site=eds-live&scope=site
Brill, R. T., Gilfoil, D. M., Doll, K., (2014). Exploring Predictability of Instructor Ratings Using a Quantitative Tool for Evaluating Soft Skills among MBA Students. American Journal of Business Education, v7 n3 p175-182 2014. Retrieved from http://eric.ed.gov/?id=EJ1053626
Brown, J. D. (1996). Testing in language programs. Upper Saddle River, NJ: Prentice Hall Regents.
Browne, M. W. & Cudeck, R. (1993). Alternative ways of assessing model fit. Sociological Methods & Research, Vol. 21, pp. 230–258. Retrieved from https://books.google.com/books?hl=en&lr=&id=FvIxxeYDLx4C&oi=fnd&pg=PA136&dq=info:Y5HZP1kjTIsJ:scholar.google.com&ots=_L-zz_-HxQ&sig=UYgNqJUQrQKi8ewUQpRJvX5JnZM#v=onepage&q&f=false
Brungardt, C. (2011). The Intersection Between Soft Skill Development and Leadership Education. Journal of Leadership Education, 10(1), 1-22. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ehh&AN=70294635&site=ehost-live&scope=site
Bryant, J., & Bates, A. (2015). Creating a Constructivist Online Instructional Environment. Tech trends: Linking Research & Practice to Improve Learning, 59(2), 17-22. Doi: 10.1007/s11528-015-0834-1. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=100711554&site=ehost-live&scope=site
Burns, E. C., (2011). The Adult Learner: A Change Agent in Post-Secondary Education. The College of St. Scholastica. Retrieved from http://www.westga.edu/~distance/ojdla/summer142/burns_142.html
Bureau of Labor Statistics. (2012). Occupational employment projections to 2018: Table 1.2. Employment by occupation. Washington, DC: US Department of Labor. Retrieved from http://www.bls.gov/oes/
Cai, Y. (2013). Graduate employability: A conceptual framework for understanding employers’ perceptions. Higher Education, Vol. 65(4), 457-469. Doi: 10.1007/s10734-012-9556-x Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=a2h&AN=86177290&site=ehost-live&scope=site
Cappelli, P. H. (2015). Skill Gaps, Skill Shortages, and Skill Mismatches: Evidence and Arguments for the United States. Industrial & Labor Relations Review. 68(2) pp. 251-290. Doi: 10.1177/0019793914564961
Carnevale, A., P., Smith, N., & Strohl, J., (2010). Help Wanted: Projections of Jobs and Education Requirements through 2018. Georgetown University Center on Education and the Workforce. 122 pp. Retrieved from http://files.eric.ed.gov/fulltext/ED524310.pdf
Carnevale, D. (2006, February 3). Rule change may spark online boom for colleges. Chronicle of Higher Education, 52(22), A1-A36. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=a2h&AN=19635635&site=ehost-live&scope=site
Carnevale, A., P., Gulish, A., & Strohl, J., (2015). College is just the beginning: Employers’ Role in the $1.1 Trillion Postsecondary Education and Training System. Georgetown University Center on Education and the Workforce. McCourt School of Public Policy. 14 pp. Retrieved from https://cew.georgetown.edu/wp-content/uploads/2015/02/Trillion-Dollar-Training-System-.pdf
Cataldi, E. F., Siegel, P., Shepherd, B. & Cooney, (2014, July). Baccalaureate and Beyond: A First Look at the Employability Experiences and Lives of College Students, 4 Years On. National Center for Education Statistics (NCES). Retrieved from https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2014141
Chanco, B. (2015, June 17). Matching education with jobs. Demand and Supply (The Philippine Star). Retrieved from http://www.philstar.com/business/2015/06/17/1466631/matching-education-jobs
Chopra, R. (2013, July 17). Student Debt Swells, Federal Loans Now Top a Trillion. Consumer Financial Protection Bureau (cfpb). Retrieved from http://www.consumerfinance.gov/about-us/newsroom/student-debt-swells-federal-loans-now-top-a-trillion/
Chiu, C., Lonner, W. J., Matsumoto, D., & Ward, C. (2013). Cross-cultural competence: Theory, research, and application. Journal of Cross-Cultural Psychology, Vol. 44, Issue 6, pp. 843-848. Doi: 10.1177/0022102113493716.
Cohen, J.; Cohen, P.; West, S. G.; Aiken, L. S. (2013). Applied multiple regression/correlation analysis for the behavioral sciences. Routledge. Retrieved from https://books.google.com/books?hl=en&lr=&id=gkalyqTMXNEC&oi=fnd&pg=PP1&dq=pearson’s+correlation+coefficient+formula+explained&ots=tQES0-s8fh&sig=x6Sc9WkM08b2HRhxE3At24wwcl4#v=onepage&q=pearson’s%20correlation%20coefficient%20formula%20explained&f=false
Collins, Randal (1971). Functional and Conflict Theories of Educational Stratification. American Sociological Review, Vol. 36, No. 6. (Dec, 1971), pp. 1002-1019. Retrieved from http://www.communicationcache.com/uploads/1/0/8/8/10887248/functional_and_conflict_theories_of_educational_stratification.pdf
Collins, R. (1979). The Credential Society: An Historical Sociology of Education and Stratification. Teachers College Record Volume 82, Number 2, 1980, p. 365-368. Retrieved from http://www.tcrecord.org/library/abstract.asp?contentid=1005
Commission on the Regulation of Postsecondary Distance Education, (2013). Advancing Access through Regulatory Reform: Findings, Principles, and Recommendations for the State Authorization Reciprocity Agreement (SARA). Retrieved from http://www.sheeo.org/sites/default/files/publications/Commission%20on%20Regulation%20of%20Postsecondary%20Distance%20Education%20Draft%20Recommendations%20FINAL%20April%20_0.pdf
Compte, O. and Postlewaite, A. (2004), “Confidence-enhanced performance”, American Economic Review, Vol. 94 No. 5, pp. 1536-1557.
Cornacchione, E. & Daugherty, J. L. (2013). Trends in opportunity costs of U.S. postsecondary education: A national HRD and human capital theory analysis. New Horizons in Adult Education & Human Resource Development. Vol. 25(2). 62-82. Doi: 10.1002/nha.20017.
Collaborative Institutional Training Initiative (CITI), (2015). Collaborative Institutional Training Initiative at the University of Miami. Retrieved from https://www.citiprogram.org/index.cfm?pageID=14&languagePreference=English&region=1
Columbaro, N. L., & Monaghan, C. H., (2009). Employer Perceptions of Online Degrees: A Literature Review. Online Journal of Distance Learning Administration, Volume XII, Number I, Spring 2009. University of West Georgia, Distance Education Center. Retrieved from http://www.westga.edu/~distance/ojdla/spring121/columbaro121.html?utm_source=twitterfeed&utm_medium=twitter
Common Rule, (2009, January 15). Code of Federal Regulations: TITLE 45 PUBLIC WELFARE DEPARTMENTS OF HEALTH AND HUMAN SERVICES, PART 46 PROTECTIONS OF HUMAN SUBJECTS. Retrieved from http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html
Complete College America, (2015). Four-Year MYTH. Retrieved from http://completecollege.org/wp-content/uploads/2014/11/4-Year-Myth.pdf
Constant, L., Stasz, C., Culbertson, S. & Vernez, G., (2014). Building a Sound Technical and Vocational Education and Training System (TVET). (EduTech Magazine, November 13, 2014). RAND Corporation. Retrieved from http://www.rand.org/blog/2014/12/building-a-sound-technical-and-vocational-education.html
Corder, G. W. & Foreman, D. I. (2014). Nonparametric Statistics: A Step-by-Step Approach. Wiley. Retrieved from https://books.google.com/books?hl=en&lr=&id=CIxgAwAAQBAJ&oi=fnd&pg=PP22&dq=Corder+%26+Foreman,+(2014).+Nonparametric+Statistics:+A+Step-by-Step+Approach.+&ots=aOxcCYJ4bc&sig=APfTLFB5SFrukXdIeG7VGu23OZc#v=onepage&q=Corder%20%26%20Foreman%2C%20(2014).%20Nonparametric%20Statistics%3A%20A%20Step-by-Step%20Approach.&f=false
Corley, K. & Gioia, D. (2011). Building theory about theory building: What constitutes a theoretical contribution? Academy of Management Review, 36, 12-32. Retrieved from http://cmsdev.aom.org/uploadedFiles/Publications/AMR/CorleyGioiaBuildingTheory.pdf
Creswell, J., & Plano Clark, V. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: Sage.
Crossman, Ashley (2016, May 13). Understanding Descriptive vs. Inferential Statistics. About Education: Sociology. Retrieved from http://sociology.about.com/od/Statistics/a/Descriptive-inferential-statistics.htm
Dacre-Pool, L., Qualter, P., & Sewell, P. (2013). Exploring the factor structure of the CareerEDGE employability development profile. University of Central Lancashire, Preston, UK, 17 July 2013. Retrieved from http://www.emeraldinsight.com/loi/et
DeGhett, V. J. (2014). Effective use of Pearson’s product-moment correlation coefficient: an additional point. Science Direct, Vol. 98. p 1-2. Doi: 10.1016/j.anbehav.2014.10.006. Retrieved from http://www.sciencedirect.com.library.gcu.edu:2048/science/article/pii/S0003347214003844
Dell Statistics Textbook, (2015, April 8). How to Analyze Data with Low Quality or Small Samples, Nonparametric Statistics. Retrieved from http://documents.software.dell.com/Statistics/Textbook/nonparametric-statistics
Deming, D. J., Goldin, C., & Katz, L. F., (2012). The For-Profit Postsecondary School Sector: Nimble Critters or Agile Predators? Journal of Economic Perspectives. Volume 26, Number 1 (Winter 2012, pp. 139-164). Retrieved from http://capseecenter.org/wp-content/uploads/2012/02/ForProfit_Nimble-Critters_Feb-2012.pdf
Department of Health and Human Services, (2015). Office for Human Research Protections: The Nuremberg code. Retrieved from http://www.hhs.gov/ohrp/archive/nurcode.html
Diggs, S. (2012). Health disparities and health care financing: reconstruction the American health care system. Journal of Health Care Finance, Vol. 38, pp. 76–90.
Dillman, D. A., Smyth, J. D., & Christian, L. M., (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Published by, John Wiley and Sons, Inc. Retrieved from http://books.google.com/books?hl=en&lr=&id=W5I_BAAAQBAJ&oi=fnd&pg=PT15&dq=Internet+questionnaire+surveys+&ots=IkdovVA3jt&sig=-AXUnucv-yoVkYMcRu5U_EJbxyM#v=onepage&q=Internet%20questionnaire%20surveys&f=false
Dissertation Writing, (2016, December 5). Writing Chapter 5: Discussion and Recommendations. Retrieved from http://dissertationwriting.com/thesis-dissertation-conclusion-chapter/
Dostis, M. (2013, USA Today). Degree alone not enough to prepare students for workforce. Retrieved from http://www.usatoday.com/story/news/nation/2013/10/31/more-than-a-college-degree/3324303/
Drake, B., (2012, July). Pew Research Survey: Americans see growing gap between rich and poor. Retrieved from http://www.pewresearch.org/fact-tank/2013/12/05/americans-see-growing-gap-between-rich-and-poor/
Dumbauld, B., (2014). A Brief History of Online Learning [Info-graphic]. Straighter Line Blog. Retrieved from http://www.straighterline.com/blog/brief-history-online-learning-infographic/
Durkheim, E. (1895). The Rules of Sociological Method. Retrieved from http://durkheim.uchicago.edu/Summaries/rules.html
EconEdLink.org, (2010). Figure 2: The US Unemployment Rate since 1990. Copyright: Council for Economic Education. Retrieved from http://www.econedlink.org/lessons/images_lessons/909_em909_figure21.jpg
Edmondson, D. R. (2005). Likert Scales: A History. Retrieved from http://faculty.quinnipiac.edu/charm/CHARM%20proceedings/CHARM%20article%20archive%20pdf%20format/Volume%2012%210005/127%20edmondson.pdf
Emerson, R. W. (2015). Causation and Pearson’s Correlation Coefficient. Journal of Visual Impairment & Blindness, Vol. 36(3), 242-244. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ccm&AN=2013010087&site=eds-live&scope=site
Essary, M. L. (2014). Key External Factors Influencing Successful Distance Education Programs. Academy of Educational Leadership Journal, Vol. 18(3), 121-136. Retrieved from http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ofs&AN=100277020&site=eds-live&scope=site
Facer, K., & Sandford, R. (2010). The next 25 years? Future scenarios and future directions for education and technology. Journal of Computer Assisted Learning, Vol. 26, pp.74-93. doi:10.1111/1365-27292009.00337.
Faiwell, S. (2005). Dreaming of a big paycheck? Know what skills employers look for. Retrieved September 30, 2005, from http://www.youngmoney.com/careers/job_hunt/031201_01
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007a). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175-191. Retrieved from http://www.gpower.hhu.de/fileadmin/redaktion/Fakultaeten/Mathematisch-Naturwissenschaftliche_Fakultaet/Psychologie/AAP/gpower/GPower3-BRM-Paper.pdf
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007b). A short tutorial of G*Power. Tutorials in Quantitative Methods for Psychology 2007, Vol. 3(2), p.51-59. Retrieved from http://www.gpower.hhu.de/fileadmin/redaktion/Fakultaeten/Mathematisch-Naturwissenschaftliche_Fakultaet/Psychologie/AAP/gpower/GPowerShortTutorial.pdf
Flores, G. & Hin, L. (2013). Trends in racial/ethnic disparities in medical and oral health, access to care, and use of services in U.S. children: Has anything changed over the years? International Journal for Equity in Health, Vol. 12, pp. 2–16.
Florida International University, (2005). Skills desired by employers. Retrieved from http://www.fiu.edu/~career/student/fairs/skills.html
Fortune 500, (2014). Fortune 1000 Company List for 2014. Retrieved from http://fortune.com/fortune500/2014/
Fraenkel, J. R. & Wallen, N. E. (2008). Data Definitions. Adapted from the Glossary: How to Design and Evaluate Research in Education. Retrieved from http://www.johnlpryor.com/JP_Digital_Portfolio/EDU_7901_files/EDU%207901%20Data%20Definitions.pdf
Frary, R.B. (1996) Brief Guide to Questionnaire Development. Washington, DC: ERIC Clearinghouse on Assessment and Evaluation. pg. 30. Retrieved from http://medrescon.tripod.com/questionnaire.pdf
Fuscaldo, Donna (2016, April 28). Companies Finding Unique Ways to Address the Skills Gap for College Graduates. Good Call Education News. Retrieved from https://www.goodcall.com/news/companies-finding-unique-ways-to-address-skills-gap-6312
Gatewood, A. & Neff, S. (2011). Understanding APICS Exam Scaled Scoring. Volunteer Leadership Workshop at 2011 APICS International Conference & Expo: Building Pillars of Success. Retrieved from http://www.apics.org/docs/default-source/cbox-general/how-are-apics-certification-exams-scored-vlw-2011.pdf
Gay, L. R. (1987). Educational research: Competencies for analysis and application (3rd ed.). New York: Merrill.
Gay, B. & Weaver, S. (2011). Theory Building and Paradigms: A Primer on the Nuances of Theory Construction. American International Journal of Contemporary Research, Vol. 1 No. 2; September 2011. Retrieved from http://aijcrnet.com/journals/Vol_1_No_2_September_2011/4.pdf
Gay, L. R., Mills, G. E. & Airasian, P. (2006). Educational research: Competencies for analysis and application (8th ed.). Upper Saddle River, NJ: Pearson. Retrieved from https://books.google.com/books?hl=en&lr=&id=qLqIK3HomoEC&oi=fnd&pg=PA323&dq=Gay,+Mills+%26+Airasian+%2B+causalcomparative+method&ots=PmiiUd0zjT&sig=EWeeU0sqeaqPIG7CIdpVJRlM54A#v=onepage&q=Gay%2C%20Mills%20%26%20Airasian&f=false
G*Power 3.1 Manual (2014). G*Power Statistical Power Analysis. Retrieved from http://www.gpower.hhu.de/fileadmin/redaktion/Fakultaeten/Mathematisch-Naturwissenschaftliche_Fakultaet/Psychologie/AAP/gpower/GPowerManual.pdf
G*Power 3.1.9.2 Software, (2015). G*Power Distribution Plot. Retrieved from http://software2012downloads.com/computer-software/gpower-3-1-9-2-download/
Grand Canyon Universities, (2015). Institutional Review Board (IRB) 2014 Handbook. Retrieved from http://www.gcu.edu/Documents/Student-Resources/IRB-Handbook.pdf
Grand Canyon Universities, (2016). CIRT-Basic Research Designs. Retrieved from https://cirt.gcu.edu/research/developmentresources/tutorials/researchdesigns
Graduate Careers Australia [GCA], (2012b). GradStats: Employment and salary outcomes of recent higher education graduates. Melbourne: GCA.
Grant, T. (2015). Partnerships Help Fill Gap in Educational Qualifications. TD: Talent Development, 69(3), 76-77. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=101132499&site=eds-live&scope=site
Gravetter, F. J. & Forzano, L. B. (2012). Research methods for the behavioral sciences (4th ed). Belmont, CA: Wadsworth. Retrieved from http://www.cengagebrain.com/content/gravetter42253_1111342253_01.01_toc.pdf
Green, Lee C. (2015). Determining the Skills Gap: A Study of the Perceptions of Entry-Level Skills of Recent Career and Technology Education Completers. ProQuest 10022241. Retrieved from http://gradworks.umi.com/10/02/10022241.html
Groves, R. M. & Lyberg, L. (2010). Total Survey Error: Past, Present, and Future. Public Opinion Quarterly, Vol. 74, No. 5, 2010, pp. 849–879. Retrieved from http://poq.oxfordjournals.org/content/74/5/849.full.pdf+html
Gordon, J. B., Levine, R. J., Mazure, C. M., Rubin, P. E., Schaller, B. R., & Young, J. L. (2011). Social contexts influence ethical considerations of research. American Journal of Bioethics, 11(5), 24-30. doi:10.1080/15265161.2011.560338. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ccm&AN=2011029707&site=eds-live&scope=site
Hannah, S. B. (1996). The Higher Education Act of 1992. Journal of Higher Education, 67(5), 498-527. Education Research Complete, EBSCOhost (accessed April 24, 2015). Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ehh&AN=9703214919&site=ehost-live&scope=site
Hart Research Associates (2015, Apr 29). College Students’ Views on College Learning and Career Success. Retrieved from https://www.aacu.org/sites/default/files/files/LEAP/2015StudentSurveyReport.pdf
Haskins, R., Holzer, H. & Lerman, R. (2009). “Promoting Economic Mobility by Increasing Postsecondary Education.” Economic Mobility Project, Pew Charitable Trusts, May 2009, pp. 43-44. Retrieved from http://www.schoolcounselor.org/asca/media/asca/home/Ratios10-11.pdf
Haefner, R. (2015). Companies Planning to Hire More Recent College Students This Year and Pay Them Better, According to CareerBuilder Survey. Retrieved from http://www.careerbuilder.com/share/aboutus/pressreleasesdetail.aspx?sd=4%2f23%2f2015&siteid=cbpr&sc_cmp1=cb_pr889_&id=pr889&ed=12%2f31%2f2015
Hentschke, G. C., Oschman, S. and Snell, L. (2002). Education Management Organizations: Growing a For-profit Education Industry with Choice, Competition, and Innovation. Policy Brief 21. EDUCATION MANAGEMENT ORGANIZATIONS. Retrieved from http://reason.org/files/86f373eefe12bf11ff614e1305ff3362.pdf
Hill, P. (2012). Online Educational Delivery Models: A Descriptive View. Educause.edu. Retrieved from https://net.educause.edu/ir/library/pdf/ERM1263.pdf
Hong, P. P., Polanin, J. R., Key, W., & Choi, S. (2014). Development of the perceived employability barrier scale (PEBS): Measuring psychological self-sufficiency. Journal of Community Psychology, 42(6), 689-706. doi:10.1002/jcop.21646. Retrieved from https://lopes.idm.oclc.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=97053256&site=eds-live&scope=site
Howie, Esq., Margaret-Ann (2015). Causal-comparative/quasi-experimental research. Key Elements of a Research Proposal: Quantitative Design. 2010 Curriculum Design and Writing Team. Baltimore County Public Schools. Retrieved from https://www.bcps.org/offices/lis/researchcourse/develop_quantitative.html
How2Stats, (2015, Jan 19). What is Cronbach’s Alpha? – Explained Simply. YouTube videos. Retrieved from https://www.youtube.com/watch?v=PCztXEfNJLM
Husing, J. (2015, Oct 15). Beyond the diploma: ‘Soft skills’ most in demand from employers. Los Angeles Daily News, Higher Education. Retrieved from http://www.dailynews.com/social-affairs/20151015/beyond-the-diploma-soft-skills-most-in-demand-from-employers
Institute of Psychometric Coaching, (2016). Introduction to psychometric tests. Retrieved from http://www.psychometricinstitute.com.au/Psychometric-Guide/Introduction_to_Psychometric_Tests.html
Iliško, D., Skrinda, A., & Mičule, I. (2014). Envisioning the Future: Bachelor’s and Master’s Degree Students’ Perspectives. Journal of Teacher Education for Sustainability, 16(2), 88-102. Doi: 10.2478/jtes-2014-0013. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=101678928&site=ehost-live&scope=site
International Business Machines (IBM), (2015). Statistical Package for Social Sciences (SPSS)™. Retrieved from http://www.01.ibm.com/software/analytics/spss/products/statistics/
Iuliana, P., Dragoș, M. I. & Mitran, P. C. (2014). Identification of Employability Skills – Starting Point for the Curriculum Design Process. Economics, Management & Financial Markets, Vol. 9(1), 237-246 pp. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=95505257&site=eds-live&scope=site
Iyengar, R. V. (2015). MBA: The Soft and Hard Skills That Matter. IUP Journal Of Soft Skills, 9(1), 7-14. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=103322289&site=eds-live&scope=site
Jackson, D. (2014). Factors influencing job attainment in recent Bachelor students: evidence from Australia. Higher Education, 68(1), 135-153. Doi: 10.1007/s10734-013-9696-7. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=96444697&site=eds-live&scope=site
Jamieson, S. (2004). Likert scales: how to (ab)use them. Medical Education, 38, 1212-1218. Retrieved from http://www.theanalysisfactor.com/can-likert-scale-data-ever-be-continuous/
Jonbekova, D. (2015). University Students’ Skills Mismatches in Central Asia: Employers’ Perspectives from Post-Soviet Tajikistan. European Education, 47(2), 169-184. doi:10.1080/10564934.2015.1033315. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=103104798&site=eds-live&scope=site
Katz, S. & Warner, Z. (2017). Scaling & scale scores. Retrieved from https://www.ets.org/Media/Research/pdf/RD_Connections16.pdf
Kenny, D. A. (2012). Measuring model fit. Retrieved from http://davidakenny.net/cm/fit.htm
Kline, R. B. (2009). Principles and practices of structural equation modeling (3rd ed.). New York, NY: Guilford Press. Retrieved from https://books.google.com/books?id=Q61ECgAAQBAJ&lpg=PP1&ots=jDkfXuB7ni&dq=Kline%2C%20R.%20B.%20(2009).%20Principles%20and%20practices%20of%20structural%20equation%20modeling%20(3rd%20ed.).%20New%20York%2C&lr&pg=PR16#v=onepage&q&f=false
Kyllonen, P. C. (2013). Soft Skills for the Workplace. Change, 45(6), 16-23. doi:10.1080/00091383.2013.841516. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=92049516&site=eds-live&scope=site
Laerd Statistics, (2016). Independent Sample T-Test using SPSS. Retrieved from https://statistics.laerd.com/spss-tutorials/dependent-t-test-using-spss-statistics.php
Laerd Statistics, (2016). Descriptive and Inferential Statistics. Retrieved from https://statistics.laerd.com/statistical-guides/descriptive-inferential-statistics.php
Law, W. and Watts, A.G. (1977), Schools, Careers and Community, Church Information Office, London.
Leeuw, Jan De (2005). UNIDIMENSIONAL SCALING. The Encyclopedia of Statistics in Behavioral Science. Published by Wiley: 2005.
Light, A. and McGee, A. (2015). Employer Learning and the “Importance” of Skills. Journal of Human Resources, 50(1), 72-107. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=bth&AN=100840810&site=ehost-live&scope=site
Looney, A. & Constantine, Y., (2014). Brookings Institution Papers on Economic Activity. Retrieved from http://www.brookings.edu/about/projects/bpea/papers/2015/looney-yannelis-student-loan-defaults
Lindsey, N. S., & Rice, M. L. (2015). Interpersonal Skills and Education in the Traditional and Online Classroom Environments. Journal of Interactive Online Learning, 13(3), 126-136. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=103343167&site=eds-live&scope=site
Lumina/Gallup Poll, (2014, Feb 25). What America Needs to Know About Higher Education Redesign. The 2013 Lumina Study of the American Public’s Opinion on Higher Education and U.S. Business Leaders Poll on Higher Education. Retrieved from https://www.luminafoundation.org/files/resources/2013-gallup-lumina-foundation-report.pdf
Lysne, S. J. & Miller, B. G. (2015). Implementing Vision and Change in a Community College Classroom. Journal of College Science Teaching. 44(6). 11-16. Retrieved from: http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=103358617&site=ehost-live&scope=site
Mandak, J. & Tucker, E. (2015, Nov 16). U.S. reaches $95.5M settlement in for-profit education case. Associated Press Release Nov 16, 2015, 10:30PM. Retrieved from http://markets.cbsnews.com/US-reaches-955M-settlement-in-for-profit-education-case/786f19d6580f8285/261910/
Mannapperuma, M. (2015). Protecting Students, Protecting Consumers: A New Federal Regulation of the For-Profit Distance Learning Industry. Journal of Law & Policy. 2015, Vol. 23 Issue 2, p541-590. 50p. http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=109139262&site=ehost-live
Manton, A., Wolf, L. A., Baker, K. M., Carman, M. J., Clark, P. R., Henderson, D., & Zavotsky, K. E. (2014). Ethical Considerations in Human Subjects Research. JEN: Journal of Emergency Nursing, Vol. 40(1), 92-94. Doi: 10.1016/j.jen.2013.11.001. Retrieved from http://www.sciencedirect.com.library.gcu.edu:2048/science/article/pii/S0099176713005096
Mallon, A. J., & Stevens, G. V. G. (2010). Making the 1996 welfare reform work: The promise of a job. National Poverty Center Working Paper Series (#10–03). Retrieved from: http://www.npc.umich.edu/publications/working_papers/
Manyerere, David J. (2016). Social Capital: A Neglected Resource to Create Viable and Sustainable Youth Economic Groups in Urban Tanzania. Journal of Education and Practice, v7 n3 p136-146 2016. Retrieved from http://eric.ed.gov/?q=Human+Capital+as+a+Barrier+to+Employment&pr=on&ft=on&id=EJ1089828
Marsh, H.W., Hau, K.T. and Wen, Z.L. (2004). In search of golden rules: comment on hypotheses testing approaches to setting cutoff values for fit indexes and dangers in overgeneralizing Hu & Bentler’s (1999) findings. Structural Equation Modeling, Vol. 11, No. 3, pp. 320-341.
McKinney, K. (2015). The More Things Change, The More They Stay The Same. International Journal for the Scholarship of Teaching & Learning, 9(1), 1-6. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ehh&AN=100881001&site=eds-live&scope=site
McLeod, S. (2011). Likert Scale. Retrieved from http://www.simplypsychology.org/likert-scale.html
Merriam-Webster Dictionary, (2015). Quality. Retrieved from http://www.merriam-webster.com/dictionary/quality
Moran, Kristen; Bodenhorn, Nancy (2015). Elementary School Counselors’ Collaboration with Community Mental Health Providers. Journal of School Counseling, v13 n4 2015. Retrieved from http://eric.ed.gov/?q=Physical+%26+Mental+Health+Barriers+to+Employment&pr=on&ft=on&id=EJ1062935
Mulig, L. (2015). The High Cost of Graduate School Loans: Lessons in Cost Benefit Analysis, Budgeting and Payback Periods. Academy Of Accounting & Financial Studies Journal, 19(1), 20-24. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=108525831&site=ehost-live&scope=site
National Association of Colleges and Employers (NACE), (2015). Job Outlook: College Hiring to Increase 9.6 Percent (Figures 1 & 2). Retrieved from https://www.naceweb.org/s04152015/job-outlook-spring-update-hiring-plans.aspx
National Association of Colleges and Employers (NACE), (2014, Nov 12). Job Outlook: The Candidate Skills/Qualities Employers Want, the Influence of Attributes. Retrieved from http://www.naceweb.org/s11122014/job-outlook-skills-qualities-employers-want.aspx
National Center for Education Statistics (NCES), (2017). Fast Facts: What are the new Back to School statistics for 2016? Institute of Education Sciences. Retrieved from http://nces.ed.gov/fastfacts/display.asp?id=372
National Center for Education Statistics (NCES), (2014). Table 105.20, Enrollment in educational institutions, by level and control of institution, enrollment level, and attendance status and sex of student: Selected years fall 1990 through fall 2013. Retrieved from http://nces.ed.gov/programs/digest/d13/tables/dt13_105.20.asp
National Center for Education Statistics (NCES), (2015, May 7). Educational Statistics. Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2015011
National Statistical Service, (2015). Sample Size Calculator. Retrieved from http://www.nss.gov.au/nss/home.NSF/pages/Sample+size+calculator
National Statistics Office of Philippines, (2015, October). Philippines Unemployment Rate. Retrieved from http://www.tradingeconomics.com/philippines/unEmployability-rate
Nye, A. (2015). Building an online academic learning community among undergraduate students. Distance Education, 2015. Vol. 36, No. 1, pp. 115–128, http://dx.doi.org/10.1080/01587919.2015.1019969
Office for Human Research Protections (OHRP), (1993). PUBLIC LAW 103-43; JUNE 10, 1993. Retrieved from http://www.hhs.gov/ohrp/policy/publiclaw103-43.htm.html
Or-Bach, Rachel (2014). Use of Personal Response Systems in Higher Education–A Case Study. International Journal of Higher Education, v3 n3 p134-141 2014. Retrieved from http://eric.ed.gov/?q=Personal+Balance+as+a+Barrier+to+Employment&pr=on&ft=on&pg=2&id=EJ1067560
Ornstein, M. (2013). A companion to survey research. London: SAGE Publications Ltd. Doi: http://dx.doi.org.library.gcu.edu:2048/10.4135/9781473913943
Penrose, E. T. (1995). The theory of the growth of the firm. New York: Oxford University Press. Retrieved from https://books.google.com/books?hl=en&lr=&id=aigWHVhP5tsC&oi=fnd&pg=PR22&dq=Penrose,+E.+T.+(1995).+The+theory+of+the+growth+of+the+firm+(3rd+ed.).+New+York:+Oxford&ots=AVzdO_sixv&sig=4aTFu_VvCNEKYqDexn44FVaETFk#v=onepage&q&f=false
Persaud, C. (2013). Great Recession Timeline. Retrieved from http://www.bankrate.com/finance/economics/great-recession-timeline.aspx
Po, Y., Jianru, G., & Yinan, J. (2015). Analysis of Employability Quality of Chinese Vocational and Technical College Students. Chinese Education & Society, 48(1), 1-22. DOI: 10.1080/10611932.2014.994940. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=trh&AN=102748332&site=ehost-live&scope=site
Podsakoff, P. M., MacKenzie, S. B. & Podsakoff, N. P., (2012). Sources of method bias in social science research and recommendations on how to control it. Annual Review of Psychology, Vol. 63, pp. 539-569. Retrieved from http://isites.harvard.edu/fs/docs/icb.topic1392661.files/PodsakoffARP2012.pdf
Purcell, K., Elias, P., Atfield, G., Behle, H., Ellison, R., & Luchinskaya, D. (2013). Transitions into Employability, further study and other outcomes. Warwick: Warwick Institute of Employability Research.
Quinton, M. C. (2014). Self-employment as a solution for attitudinal barriers: A case study. Work, 48(1), 127-130. doi:10.3233/WOR-141861. Retrieved from https://lopes.idm.oclc.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=96353845&site=eds-live&scope=site
Raj, S., & Al-Alawneh, M., (2010). A Perspective on Online Degrees vs. Face-to-Face in the Academic Field. The Pennsylvania State University. Retrieved from http://linc.mit.edu/linc2010/proceedings/session16Raj.pdf
Research Methods Knowledge Base, (2016). Likert Scaling. Retrieved from http://www.socialresearchmethods.net/kb/scallik.php
Riley, W. J. (2012). Health disparities: gaps in access, quality, and affordability in medical care. Transactions of the American Clinical and Climatological Association, Vol. 123, pp.167–174.
Robles, M. M. (2012). Executive Perceptions of the Top 10 Soft Skills Needed in 2016 Workplace. Business Communication Quarterly, 75(4), 453-465. Doi: 10.1177/1080569912460400. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=83329495&site=eds-live&scope=site
Ross, K. N. (2005). Sample design for educational survey research. International Institute for Educational Planning/UNESCO, 1, 1-89. Retrieved from http://www.unesco.org/iiep/PDF/TR_Mods/Qu_Mod3.pdf
Rothwell, A., & Arnold, J. (2007). Self-perceived employability: Development and validation of a scale. Personnel Review, 36(1), 23-41.
Săveanu, S. M., & Buhaş, R. (2015). I've Just Graduated. Do You Want to Be My Employer? Skills Mismatches for Tertiary Students. Annals of the University of Oradea, Economic Science Series, 24, 63-64. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=103190329&site=ehost-live&scope=site
Schenck, J., & Cruickshank, J. (2015). Evolving Kolb: Experiential Education in the Age of Neuroscience. Journal of Experiential Education, 38(1), 73-95. doi:10.1177/1053825914547153. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ehh&AN=100948154&site=ehost-live&scope=site
Schenker, J. D., & Rumrill, P. D., Jr. (2004). Causal-comparative research designs. Journal of Vocational Rehabilitation, 21(3), 117-121. Retrieved from https://lopes.idm.oclc.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=16012317&site=eds-live&scope=site
Schonlau, M., Fricker, R. D., & Elliott, M. N. (2002). Conducting Research Surveys via E-mail and the Web. RAND Corporation. ISBN 0-8330-3110-4. Retrieved from http://www.rand.org/pubs/monograph_reports/MR1480.html
Schultz, T. W. (1961). Investment in human capital. American Economic Review, 51, 1–17.
Sewell, P., & Dacre Pool, L. (2010). Moving from conceptual ambiguity to operational clarity: Employability, enterprise and entrepreneurship in higher education. Education & Training, 52(1), 89-94.
Shachar, M. (2008). Meta-Analysis: The preferred method of choice for the assessment of distance learning quality factors. International Review of Research in Open & Distance Learning, 9(3), 1-15. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ehh&AN=34905548&site=eds-live&scope=site
Sharma, G., & Sharma, P. (2010). Importance of Soft skills development in 21st century Curriculum. International Journal of Education & Allied Sciences, 2(2), 39-44. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=67146342&site=eds-live&scope=site
Shea, P. (2015). A JOLT of New Energy for the Scholarship of Online Teaching and Learning. Online Learning, 19(3), 7-12. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=103359761&site=ehost-live&scope=site
Shi, L., Lebrun-Harris, L., Daly, C., Sharma, R., Sripipatana, A., Hayashi, A. S., & Ngo-Metzger, Q. (2013). Reducing disparities in access to primary care and patient satisfaction with care. Journal of Health Care for the Poor and Underserved, 24, 56–66.
Shuttleworth, M. (2015). Quantitative Research Design. Explorable.com. Retrieved from https://explorable.com/quantitative-research-design
Sibolski, E. H. (2012). What is an Accrediting Agency Supposed to Do? Institutional Quality and Improvement vs. Regulatory Compliance. Retrieved from http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=ofs&AN=76423012&site=eds-live&scope=site
Sidhu, P., & Calderon, V. J. (2014). Many Business Leaders Doubt U.S. Colleges Prepare Students. Gallup Poll Briefing, 1. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=bth&AN=94998760&site=eds-live&scope=site
Society for Human Resource Management (SHRM). (2015). SHRM Customized Research Services. Retrieved from http://shrm.org/research/customizedresearch/pages/default.aspx
Soulé, H., & Warrick, T. (2015). Defining 21st Century Readiness for All Students: What We Know and How to Get There. Psychology of Aesthetics, Creativity & the Arts, 9(2), 178-186. doi:10.1037/aca0000017. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=102826725&site=ehost-live&scope=site
Staklis, S., & Skomsvold, P. (2014). New College Graduates at Work: Employment among 1992-93, 1999-2000, and 2007-08 Bachelor's Degree Recipients 1 Year after Graduation. Stats in Brief. NCES 2014-003. National Center for Education Statistics. Retrieved from http://files.eric.ed.gov/fulltext/ED544772.pdf
Stark, E., Poppler, P. P., & Murnane, J. (2011). Looking for Evidence of Human Capital (Or the Lack Thereof) in College/University Degrees Held by Managerial Level Employees. Journal of Behavioral & Applied Management, 13(1), 60-80. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=a9h&AN=76550296&site=eds-live&scope=site
Straumsheim, C. (2014). Identifying the Online Student. Retrieved from https://www.insidehighered.com/news/2014/06/03/us-releases-data-distance-education-enrollments
Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Needham Heights, MA: Allyn & Bacon.
Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach's alpha. International Journal of Medical Education, 2, 53-55. ISSN 2042-6372. doi:10.5116/ijme.4dfb.8dfd
Trochim, W. M. K. (2006). The Research Methods Knowledge Base (2nd ed.). Web Center for Social Research Methods. Retrieved from http://socialresearchmethods.net/kb/index.php
The Executive Office of the President. (2014, January). Increasing College Opportunity for Low-Income Students: Promising Models and a Call to Action. Retrieved from http://www.whitehouse.gov/sites/default/files/docs/white_house_report_on_increasing_college_opportunity_for_low-income_students_1-16-2014_final.pdf
The White House. (2014). Ready to Work: Job-Driven Training and American Opportunity. Retrieved from https://www.whitehouse.gov/sites/default/files/skills_report_072014_2.pdf
The White House. (2015a). Education: Knowledge and Skills for the Jobs of the Future. Retrieved from http://www.whitehouse.gov/issues/education/higher-education
The White House. (2015b). Obama Education Programs. Retrieved from http://www.whitehouse.gov/issues/education
University of Bedfordshire, UK. (2016). Descriptive Statistics. Retrieved from https://www.beds.ac.uk/howtoapply/departments/psychology/labs/spss/Descriptive_Statistics
U.S. Department of Education. (2010). Blueprint for Reform of the Elementary and Secondary Education Act. Retrieved from http://www2.ed.gov/policy/elsec/leg/blueprint/blueprint.pdf
U.S. Department of Education. (2010). ESEA Flexibility. Retrieved from http://www2.ed.gov/policy/elsec/guid/esea-flexibility/index.html
U.S. Department of Education. (2012, June 7). ESEA Flexibility Update. Retrieved from http://www2.ed.gov/policy/eseaflex/approved-requests/flexrequest.doc
U.S. Department of Education. (2014a). Higher Education Initiative Background. Retrieved from http://www.whitehouse.gov/issues/education/higher-education
U.S. Department of Education. (2014b). ConnectED Initiative. Retrieved from http://www.whitehouse.gov/issues/education/k-12/connected
U.S. Department of Health and Human Services (HHS). (2015). Code of Federal Regulations, 45 CFR 46, Subparts A, B, C, & D, Protection of Human Subjects (2009) (the Common Rule). Retrieved from http://www.hhs.gov/ohrp/humansubjects/commonrule/
U.S. Department of Justice. (2015, November 16). For-Profit College Company to Pay $95.5 Million to Settle Claims of Illegal Recruiting, Consumer Fraud and Other Violations. Retrieved from http://www.justice.gov/opa/pr/profit-college-company-pay-955-million-settle-claims-illegal-recruiting-consumer-fraud-and
U.S. Department of Labor, Department of Commerce, Department of Education and Department of Health & Human Services. (2014, July 22). What Works in Job Training: A Synthesis of the Evidence. Retrieved from http://www.dol.gov/asp/evaluation/jdt/jdt.pdf
U.S. Government Accountability Office. (2012). Employment for People with Disabilities: Little Is Known about the Effectiveness of Fragmented and Overlapping Programs. Retrieved from http://files.eric.ed.gov/fulltext/ED533494.pdf
University of California at Los Angeles. (2015). G*Power Data Analysis Examples: Power analysis for two-group independent sample t-test. Institute for Digital Research and Education. Retrieved from http://www.ats.ucla.edu/stat/gpower/indepsamps.htm
University of Limerick. (2005). Cooperative education & careers division. Retrieved from http://www.ul.ie/careers/careers/stugrad/employerslookfor.shtml#grads
Walser, T. M. (2014). Quasi-Experiments in Schools: The Case for Historical Cohort Control Groups. Practical Assessment, Research & Evaluation, 19(6). ISSN 1531-7714. Retrieved from http://pareonline.net/getvn.asp?v=19&n=6
Watts, A. G. (2006). Career Development Learning and Employability. York: The Higher Education Academy.
Weaver, P., & Kulesza, M. (2014). Critical Skills for New Accounting Hires: What's Missing from Traditional College Education? Academy of Business Research Journal, 434. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=edb&AN=100447488&site=eds-live&scope=site
Weng, W. (2015). Eight Skills in Future Work. Education, 135(4), 419-422. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=103352881&site=ehost-live&scope=site
Wiegand, H. (1968). Review of Kish, L., Survey Sampling (John Wiley & Sons, New York/London, 1965; ix + 643 pp., 31 figures, 56 tables). Biometrical Journal, 10, 88–89. doi:10.1002/bimj.19680100122
Williams, T. (2016, May 2). Report Reveals Big Gaps Between the Degrees Students Are Earning and In-Demand Jobs. Good Call Education News. Retrieved from https://www.goodcall.com/news/report-reveals-big-gaps-degrees-students-earning-demand-jobs-06415
Wolfowitz, J. (1942). Additive partition functions and a class of statistical hypotheses. The Annals of Mathematical Statistics, 13(3), 247-279. doi:10.1214/aoms/1177731566. Retrieved from http://projecteuclid.org/euclid.aoms/1177731566
Yin, A. C., & Volkwein, J. F. (2010). Basic skills assessment. New Directions for Institutional Research, 2010, 65-77. doi:10.1002/ir.331. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com.library.gcu.edu:2048/login.aspx?direct=true&db=a9h&AN=48847222&site=eds-live&scope=site
Youhang, W., & Dongmao, W. (2015). Characteristics and Factors of the Grassroots Employability of College Students. Chinese Education & Society, 48(1), 23-41. doi:10.1080/10611932.2014.994943. Retrieved from http://library.gcu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=trh&AN=102748333&site=ehost-live&scope=site
Zhao, Y. (2015). A World at Risk: An Imperative for a Paradigm Shift to Cultivate 21st Century Learners. Society, 52(2), 129-135. doi:10.1007/s12115-015-9872-8. Retrieved from http://link.springer.com/journal/12115/52/2/page/1#page-2