CAREEREDGE EMPLOYABILITY: 
EXAMINING IF A DIFFERENCE EXISTS BETWEEN ONLINE AND TRADITIONAL COLLEGE STUDENTS

Dr. Edward J. Files
Approved: 10/05/2017

                       Chapter 1: Introduction to the Study
Introduction
This quantitative, causal-comparative study investigates whether a difference 
exists in the scores on factor 2 (work & life experience skills) and factor 
4 (generic skills) of the CareerEDGE Employability Development Profile (EDP) survey 
instrument between online college pathways students and traditional college pathways 
students from the Bachelor program at a large Western Christian university. Per 
Dacre-Pool, Qualter and Sewell (2013), “There has been little empirical 
research conducted in relation to graduate employability and diagnostic tools 
available in this area are very limited” (p. 303). 

A study from the U.S. Government Accountability Office (2012) and another by Hong, 
Polanin, Key and Choi (2014) suggest that there is a need for further research in 
employability development assessment. The significance of this study is that it will 
produce statistical data leading to a more accurate and deeper understanding of the 
relationship between education and the application of employability skill sets of 
2016 graduates. Per a combined study report from the U.S. Departments of Labor, 
Commerce, Education and Health & Human Services (2014, July 22), there is a need to 
“expand and improve access to labor market, occupational, and skills data and 
continue basic research on labor markets and employment” (pp. 21-22). 

The importance of this study is that it will produce statistical evidence to determine 
whether a difference exists in the employability skills of online college pathways 
students versus traditional college pathways students. If a difference exists, the 
study will help to define and explore its implications. Moreover, the two major 
educational delivery pathways, online and traditional college degree modes, are 
often investigated from only one perspective, such as that of online college pathways 
students (Allen & Seaman, 2011, 2012, 2013, 2014). 

This study will present evidence gathered from current literature describing 
both online college pathways student perspectives (Allen & Seaman, 2015; Essary, 2014) 
and traditional college pathways student perspectives (Cataldi, Siegel, Shepherd & 
Cooney, 2014; NCES, 2014; U.S. Departments of Labor, Commerce, Education and Health & 
Human Services, 2014). The study uses the CareerEDGE Employability 
Development Profile (EDP) survey instrument (Dacre-Pool & Sewell, 2007) to guide the 
research questions exploring the difference in employability skills between these two 
educational groups. Per the government’s (2014) report, “More evidence is needed to fill 
gaps in knowledge, improve job training programs, inform practitioners about adopting 
promising strategies, and expand proven models to address the needs of specific groups 
of workers, industries, communities and institutions” (U.S. Departments of Labor, 
Commerce, Education and Health & Human Services, 2014, pp. 21-22).

Chapter one of this study covers the introduction to the study, the background of the 
study, the problem statement, the purpose of the study, the research questions and 
hypotheses, the significance of the study, and the advancing scientific knowledge 
section. The rationale for the methodology, the nature of the research design, 
definitions of terms used in the study, the assumptions, limitations and delimitations 
of the study, and the summary and organization of the remainder of the study follow. 

Background of the Study
This non-experimental, quantitative, causal-comparative research study 
will explore the differences in employability skills of students from online college 
pathways and traditional college pathways, as determined from the Bachelor students’ 
perspective in the academic year 2016 at a large Western Christian university. Per 
numerous national and international studies (Allen & Seaman, 2014; Chanco, 2015; 
Youhang & Dongmao, 2015), employers report a disconnect between the educational 
skills taught and the employability skills they say they need for a modern workforce 
(Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 
2014; U.S. Department of Education, 2014a; White House, 2015a).  

A recent study by Chanco (2015) suggests that college graduates’ employability skills 
in their field of study do not match those needed by executive hiring authorities. 
Per the qualitative articles and literature explored, a difference exists between the 
educational skill sets being taught by colleges and the employability skills employers 
say they need (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 
2015; Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 
2015a). The literature broadly agrees that a mismatch exists between the skills taught 
by colleges and those needed by employers, but no studies distinguish between the two 
educational groups. 

Past and current studies combine, or lump together, both educational pathways 
when determining whether there is a mismatch in skill sets. This study will 
statistically answer whether a difference exists between online college pathways 
students and traditional college pathways students. The importance to the field of 
study is significant: determining whether a single curriculum or two different 
curriculums are needed to bring best practices back to education can only be done 
once all variables have been examined separately. 

Problem Statement
It is not known if there is a difference in scores on the CareerEDGE Employability 
Development Profile (EDP) of students from the online college pathway and the 
traditional college pathway from the Bachelor program at a large Western Christian 
university. The question of a mismatch between the employability skills of online 
college pathways students and traditional college pathways students has not been 
investigated by comparing the two groups separately (Allen & Seaman, 2011, 2012, 
2013, 2014, 2015). The issue is that past and current studies, as well as articles 
and literature reviews, either lump together the educational pathways leading to a 
bachelor’s degree or investigate only one of them (Jonbekova, 2015; Po, Jianru & 
Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; U.S. Department of 
Education, 2014a; White House, 2015a). This study will determine if a difference 
exists between the educational pathways, thus allowing practitioners from education 
and business to determine where changes can be made to produce optimal benefit to 
both institutions. 

The affected population, approximately 21.596 million students in 2016 (21.266 
million in 2015 plus approximately 330,000 added in 2016; NCES, 2017), adds an 
economic perspective: a national student loan debt crisis of $1.3 trillion 
(Chopra, 2013, Consumer Financial Protection Bureau). Per a combined study report 
from the U.S. Departments of Labor, Commerce, Education and Health & Human 
Services (2014, July 22), there is a need to “expand and improve access to labor 
market, occupational, and skills data and continue basic research on labor markets 
and employment” (pp. 21-22).

The two educational delivery pathways, online college degrees and traditional 
college degrees, are often investigated from only one of the two perspectives, 
and in most cases comparisons between them are limited to a single perspective 
(Allen & Seaman, 2012, 2013, 2014, 2015; Brungardt, 2011; Deepa & Seth, 2013; 
Robles, 2012; Sharma & Sharma, 2010; Weaver & Kulesza, 2014).

This study will investigate both online college pathways and traditional college 
pathways, using a causal-comparative research design, to determine whether a 
statistical difference exists between the two. If one exists, educational 
institutions will be able to update current curriculums to focus on areas where 
employability skills development is lacking. Additionally, employers will be able 
to use the findings to better partner with educational institutions in developing 
internships and/or school-work programs that focus on skills development in these 
areas. 

An example of institutions using skills evaluation to develop partnerships is the 
Lindsey and Rice (2015) study using a test known as the Situational Test for Emotional 
Management (STEM). Its findings showed that adding one or more online courses to the 
traditional college pathways curriculum increased student scores, and that students 
benefit from the “time, training, experience, and practice of interpersonal skills in 
an online environment” (p. 126). Statistical evidence such as that developed in the 
Lindsey and Rice (2015) study demonstrates how valuable the findings of this study, 
using CareerEDGE Employability Development Profile (EDP) instrument scores, could be. 
The CareerEDGE EDP instrument was specifically designed and developed, and is in 
current use in Canada, to place students with partnered institutions and employers 
through internships and/or school-work programs (Dacre-Pool, Qualter & Sewell, 2013).  

Purpose of the Study  
The purpose of this study is to investigate if a difference exists in scores on CareerEDGE 
Employability Development Profile (EDP) factor 2, work & life experience skills, and factor 4, 
generic skills, of students from the online college pathway and the traditional college pathway 
from the Bachelor program at a large Western Christian university. This quantitative, 
causal-comparative research study uses a general target population (N) totaling 
43,725 students, of which 7,975 are traditional college pathways students and 35,750 are online 
college pathways students from the Bachelor program at a large Western Christian university. 
The independent variable group consists of online college pathways students and traditional 
college pathways students; the dependent variable group consists of the two dependent variables, 
factors 2 and 4, that support the CareerEDGE Employability Development Profile (EDP) instrument 
skill set questions (Dacre-Pool & Sewell, 2007; Dacre-Pool, Qualter & Sewell, 2013).    

This study will add to the body of knowledge called for in the U.S. Departments of Labor, Commerce, 
Education and Health & Human Services (2014, July 22) report, which calls for further research in 
several areas aimed at employability skills development. It will suggest solutions to the 
problem statement: it is not known if there is a difference in scores on the CareerEDGE 
Employability Development Profile (EDP) of students from online college pathways and traditional 
college pathways in the Bachelor program at a large Western Christian university. Solutions to 
the problem and contributions to the field will be in areas such as adjusting curriculum design 
(Hart Research Associates, 2015), verifying that learning outcomes match the employability skills 
needed (Britt, 2015), and ensuring that interpersonal skills development matches the 
employability skills needed by 2017 employers (Lindsey & Rice, 2015). 

Research Question(s) and Hypotheses 
This quantitative, causal-comparative research study will add to the current body of 
research by investigating whether a statistically significant difference exists between online 
college pathways students and traditional college pathways students from the Bachelor program at 
a large Western Christian university, using their CareerEDGE Employability Development Profile 
(EDP) instrument scores. The instrument used to frame the research questions for this survey 
questionnaire study is supplied by Dacre-Pool and Sewell (2007). The permission letter to use 
the instrument and the questionnaire (survey questions) are found in Appendix D (pp. 192-194). 
To ensure reliability, a test-retest process was used in the development of the CareerEDGE 
Employability Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007; 
Dacre-Pool, Qualter & Sewell, 2013).  

RQ1: Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) 
factor 2, work & life experience skills, of students who graduate with a bachelor’s degree at a 
large Western Christian university through their online college pathway and their traditional 
college pathway?

H1o: There is no statistically significant difference in scores on the CareerEDGE Employability 
Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with a 
bachelor’s degree at a large Western Christian university through their online college pathway 
and their traditional college pathway.

H1a: There is a statistically significant difference in scores on the CareerEDGE Employability 
Development Profile (EDP) factor 2, work & life experience skills, of students who graduate with 
a bachelor’s degree at a large Western Christian university through their online college pathway 
and their traditional college pathway.

RQ2: Is there a difference in scores on the CareerEDGE Employability Development Profile (EDP) 
factor 4, generic skills, of students who graduate with a bachelor’s degree at a large Western 
Christian university through their online college pathway and their traditional college pathway?

H2o: There is no statistically significant difference in scores on the CareerEDGE Employability 
Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s 
degree at a large Western Christian university through their online college pathway and their 
traditional college pathway.

H2a: There is a statistically significant difference in scores on the CareerEDGE Employability 
Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s 
degree at a large Western Christian university through their online college pathway and their 
traditional college pathway. 

The independent variable groups measured are online college pathways students and traditional 
college pathways students. The dependent variable group consists of factor 2 (work & life 
experience skills) and factor 4 (generic skills) from the CareerEDGE Employability Development 
Profile (EDP) survey questionnaire (see Appendix D, p. 195). These variables align with the 
research questions, hypotheses, and theoretical foundations of this study (Becker, 1964; 
Collins, 1979; Durkheim, 1895; Schultz, 1961, in Walters, 2004; U.S. Departments of Labor, 
Commerce, Education and Health & Human Services, 2014, July 22).   

It is not known if there is a difference in scores on the CareerEDGE Employability Development 
Profile (EDP) of students from the online college pathway and the traditional college pathway 
from the Bachelor program at a large Western Christian university. The two research questions 
relate directly to the two educational delivery methods available to all degree seekers: online 
college pathways and traditional college pathways. The hypotheses are worded to answer the 
central question of the study: does a statistically significant difference exist between online 
college pathways students and traditional college pathways students from the Bachelor program 
at a large Western Christian university? The data collected from the survey questionnaire will 
be analyzed after completion of the 18 survey questions associated with factor 2, work & life 
experience skills (2 questions), and factor 4, generic skills (16 questions), by both online and 
traditional college pathways students from the Bachelor degree-seeking student populations under 
investigation for the educational year 2016.   
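The excerpt does not name the specific statistical test to be applied. For a two-group causal-comparative design, a common analysis is an independent-samples comparison of mean factor scores; the sketch below uses only the Python standard library, with hypothetical Likert-style scores and a normal approximation in place of the t distribution a full analysis would use:

```python
from math import sqrt
from statistics import mean, variance, NormalDist

def welch_t(group_a, group_b):
    """Welch's t statistic and a normal-approximation two-tailed p-value
    for comparing two independent group means."""
    n1, n2 = len(group_a), len(group_b)
    se = sqrt(variance(group_a) / n1 + variance(group_b) / n2)
    t = (mean(group_a) - mean(group_b)) / se
    # For moderate-to-large samples t is close to standard normal; a real
    # analysis would use the t distribution with Welch degrees of freedom.
    p = 2 * (1 - NormalDist().cdf(abs(t)))
    return t, p

# Hypothetical 1-5 factor scores for the two pathway groups
online = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]
traditional = [3, 3, 4, 2, 3, 3, 4, 3, 2, 3]
t_stat, p_val = welch_t(online, traditional)  # small p suggests a difference
```

A small p-value would lead to rejecting the null hypothesis of no difference between the pathway groups on the factor being scored.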

Advancing Scientific Knowledge 
Per literature reviews dating back to 2002 (Allen & Seaman, 2002, 2003) and continuing through 
current literature (U.S. Department of Education, 2014a; White House, 2015a), the new population 
of students will most likely graduate with the wrong workforce readiness skills for the jobs they 
seek (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015). This study will advance 
scientific knowledge by adding empirical statistical data to the literature, answering whether a 
difference exists between online college pathways students and traditional college pathways 
students from the Bachelor program of a large Western Christian university in the academic 
year 2016.  

Moreover, this quantitative study will narrow the sample size from the 805 (n) respondents in the 
original study by Dacre-Pool and Sewell (2007) to 210 (n) respondents, a sample size determined 
through statistical power analysis. This sampling population covers both traditional college 
pathways student respondents and online college pathways student respondents, making this the 
newest and most inclusive study measuring workforce readiness skills with the CareerEDGE 
Employability Development Profile (EDP) survey instrument (Dacre-Pool & Sewell, 2007).   

The literature broadly agrees that a mismatch exists between the skills taught by colleges and 
those needed by employers (Allen & Seaman, 2011, 2012, 2013, 2014; Deming, Goldin & Katz, 2012), 
but no studies statistically distinguish between the two educational groups, online and 
traditional college degree students (Brungardt, 2011; Sidhu & Calderon, 2014; U.S. Department of 
Education, 2014a; White House, 2015a). 

Most past and current studies either address the topic from one perspective or lump both 
educational pathways together when determining whether there is a mismatch in skill sets 
(Deepa & Seth, 2013; Kyllonen, 2013; Mannapperuma, 2015). This study will statistically answer 
whether a difference exists between the skill sets of online college pathways students and 
traditional college pathways students, and will offer contrasting views and perspectives 
(Amrein-Beardsley, Holloway-Libell, Cirell, Hays & Chapman, 2015; Britt, 2015; Cappelli, 2015; 
Soulé & Warrick, 2015). The importance to the field of study is significant, as a single 
curriculum presenting best practices can only be developed once all variables have been 
examined separately (Lindsey & Rice, 2015).  

This study will add to the body of knowledge requested in the U.S. Departments of Labor, Commerce, 
Education and Health & Human Services (2014, July 22) report, which calls for further research in 
several areas aimed at workforce readiness skills development. Per national and international 
studies, employers report a disconnect between the educational workforce readiness skills taught 
and the job-ready skills they say they need (Allen & Seaman, 2014, 2015; Chanco, 2015; Jonbekova, 
2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; 
Youhang & Dongmao, 2015). 

This study will statistically identify whether the need exists for further investigation into 
academic areas such as curriculum design (Sharma & Sharma, 2010), authentic learning outcomes 
(Britt, 2015) and interpersonal skills development (Lindsey & Rice, 2015). Mastering these 
skill areas is bedrock to any successful doctorate in education with concentrations in 
organizational and leadership development, as is being sought by this learner.  

The theoretical foundation supporting this study is Human Capital Theory (HCT) (Becker, 1964; 
Schultz, 1961, in Walters, 2004). The two major theories associated with HCT are the 
Functionalist theory model of education (Durkheim, 1895) and the Credentialist theory model 
of education (Collins, 1971, 1979). 

These two major sub-theories of Human Capital Theory (HCT) will be discussed in detail as they 
directly relate to the theoretical foundation of this study by demonstrating operational models 
(Collins, 1971, 1979; Durkheim, 1895). These theories will explain possibilities for the 
disconnect between those skills being taught and the skills executive hiring authorities say they 
need from their entry-level workforce. 

Significance of the Study
The significance of this study is that it will produce statistical data leading to a more 
accurate and deeper understanding of the relationship between education and the application of 
employability skill sets of 2017 graduates. Per a combined study report from the U.S. 
Departments of Labor, Commerce, Education and Health & Human Services (2014, July 22), there is 
a need to “expand and improve access to labor market, occupational, and skills data and continue 
basic research on labor markets and employment” (pp. 21-22).

Additionally, this study will inform the business perspective (Light & McGee, 2015) and community 
environments (Sidhu & Calderon, 2014) where skills gaps were identified by the U.S. Departments of 
Labor, Commerce, Education and Health & Human Services (2014, July 22) report. Per the federal 
government’s (2014) report, “More evidence is needed to fill gaps in knowledge, improve job training 
programs, inform practitioners about adopting promising strategies, and expand proven models to 
address the needs of specific groups of workers, industries, communities and institutions” (p. 21).

The need for this study is justified by the fact that college students, both domestic and 
international, are not graduating with the skill sets they need for employability as 
entry-level workers in 2016 organizations (Cai, 2013; Iuliana, Dragoș & Mitran, 2014; Maurer, 
2015; Sibolski, 2012). Additionally, the two research questions and hypotheses address each 
individual student pathway in the Bachelor degree program at a large Western Christian 
university. This research study will answer the problem statement: it is not known if there is 
a difference in scores on the CareerEDGE Employability Development Profile (EDP) of students 
from online college pathways and traditional college pathways in the Bachelor program at a 
large Western Christian university. Once it is determined whether, and to what degree, a 
difference exists between the two educational pathways, practical applications can be applied 
to the problem areas. 

The potential practical applications of the findings of this study will directly affect the body 
of knowledge concerning educational administration and curriculum development (Iuliana, Dragoș & 
Mitran, 2014; Sibolski, 2012), as well as business workforce readiness skills development 
(Cai, 2013; Maurer, 2015) and government initiatives to increase knowledge and operational 
applications (U.S. Departments of Labor, Commerce, Education and Health & Human Services, 
2014; Departments of Labor and Education, 2015, April 16). 

Rationale for Methodology
This quantitative, causal-comparative research study will add to the current body of 
research on the workforce readiness skills acquired by students from online college pathways 
and traditional college pathways by examining the differences between the two educational 
groups. This will be accomplished by examining results from the Bachelor program at a large 
Western Christian university, as determined by their CareerEDGE Employability Development 
Profile (EDP) instrument scores. 

This quantitative methodology, using the survey-designed research method (Brungardt, 2011), is 
the best methodology compared to others available, such as qualitative or mixed designs 
(Brasoveanu, 2012; Gay & Weaver, 2011; Gravetter & Forzano, 2012). The reason is that it will 
answer, statistically and to a defensible degree, the research questions and hypotheses, as 
well as address the problem statement: it is not known if there is a difference in scores on 
the CareerEDGE Employability Development Profile (EDP) of students from online college pathways 
and traditional college pathways in the Bachelor program at a large Western Christian university.

Brungardt’s (2011) study of the emerging academic discipline of leadership studies suggests 
that the purpose of the survey research method is “to gather large amounts of data from groups of 
people demographically and geographically dispersed to maximize sample populations” (p. 5). Fray 
(1996) suggests that the first step in methodology selection should concern what information the 
investigator wishes to know, and then what research design method is needed to acquire that data 
in the fewest number of questions. Additionally, Fray (1996) suggests that the survey 
questionnaire method avoids the qualitative responses of open-ended questions, which lend 
themselves to respondent bias.  

The rationale for selecting a quantitative methodology and a causal-comparative research method 
comes directly from the low response rate encountered in Dacre-Pool and Sewell’s (2007) study 
and from this study’s requirement for a numerical accounting of its data. Additionally, the 
causal-comparative, survey-designed research method aligns with Dacre-Pool and Sewell’s (2007) 
original study (as a comparison value across the eight years of recession between studies) and 
increases the probability of a larger response rate by increasing the general target population 
(N) to 43,725 respondents. This will produce findings that can be generalized to a larger 
population in the fields of labor, commerce, education, and health and human services (U.S. 
Departments of Labor, Commerce, Education and Health & Human Services, 2014; Departments of 
Labor and Education, 2015, April 16).

Nature of the Research Design for the Study
This study uses a causal-comparative research design to investigate relationships between two or 
more groups that differ on a variable of interest and to compare them on other variable(s) without 
manipulating said variables (Airasian & Gay, 2003). Basically, causal-comparative designs use two 
groups with the intent of understanding the causes, or in some instances the effects, of the two 
groups being different (Grand Canyon University, 2016, CIRT-Basic Research Designs). Research 
data collection will be conducted through a questionnaire (survey-designed method) instrument 
(Dacre-Pool & Sewell, 2007). This researcher has permission from the authors to use the CareerEDGE 
Employability Development Profile (EDP) (2007) survey instrument for this study (Appendix D, 
pp. 182-183). 

Because this study consists of volunteers, sampling only adult students from the Bachelor 
program at a large Western Christian university, a site authorization application will be 
submitted to the university’s Institutional Review Board (IRB) network prior to study initiation. 
The application will describe the purpose and scope of the research, the duration of the study, 
the target population, the impact on operations and resources, data use, and the potential 
benefit to the university and participants. 

The actual email survey instrument will be initiated by the large Western Christian university 
through its Email Survey Distribution system. All email survey requests (including initial 
requests, follow-ups and reminders) that have the necessary and/or appropriate site authorization 
and IRB approval will be distributed by the email survey distribution manager. 

An invitation to participate in an online survey will be sent to all Bachelor program students 
from the online student pathway and the traditional student pathway for the academic year 
2016. The invitation (email) introduction letter will discuss the materials and instrument used 
in this study, describing the purpose and significance of the study to the student population 
and university. Additionally, the risks and benefits of participation will be addressed, as 
well as the sample selection ideology, protection of rights and well-being, maintenance of 
data security, the sample recruitment process, and the data collection instruments and 
approaches. A follow-up letter will be sent via email five days after the letter of 
introduction to encourage full participation by the student population of the large 
Christian university. 

The survey instrument will be hosted by the university, and a link to the website will be 
provided to complete the survey. The participants’ informed consent form (Appendix B, 
pp. 185-188) will be included as required by the university’s Institutional Review Board (IRB). 
The survey will consist of 18 questions from the CareerEDGE Employability Development Profile 
(EDP) instrument, from factor 2 (work & life experience skills) and factor 4 (generic skills), 
for measurement of the two independent variable groups: traditional student pathways and online 
student pathways from the Bachelor program at a large Western Christian university. 

The general target population size (N) for the study is 43,725 respondents, from which a sample 
size (n) of 210 respondents was determined through a statistical power analysis using the 
G*Power instrument (Faul, Erdfelder, Lang & Buchner, 2007a; see Figure 4, G*Power Distribution 
Plot, p. 195). This study will explore the difference between two populations, online college 
pathways students and traditional college pathways students, using the scores derived from 
CareerEDGE Employability Development Profile (EDP) factor 2 (work & life experience skills) 
and factor 4 (generic skills) (Dacre-Pool & Sewell, 2007). 
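G*Power performs this a priori sample-size calculation with the noncentral t distribution. A simpler normal-approximation sketch of the same per-group computation, using only the Python standard library, is shown below; the effect size, alpha, and power values are conventional defaults for illustration, not figures taken from this study:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-tailed, two-sample comparison
    of means (normal approximation to the power calculation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-tailed critical value
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (Cohen's d = 0.5) at alpha = .05 and power = .80
# needs roughly 63 respondents per group under this approximation.
print(n_per_group(0.5))  # -> 63
```

G*Power’s exact t-based figures run slightly higher than this approximation; the gap shrinks as the sample grows.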

Definition of Terms
Traditional Education: Traditional education is defined as classes taken through 
traditional face-to-face contact with instructors in a set location, also referred to as a 
brick-and-mortar school (Raj & Al-Alawneh, 2010).

Online Learning: With the advent of online educational delivery systems starting in 1994 
(Hill, 2012), online education delivers classes 24 hours a day, 7 days a week through 
the college or university’s portal system (intranet), which is connected to students through 
the Internet.  

Workforce Readiness Skills: Skills or skill sets are defined as the set of both 
soft and hard abilities that the college graduate has acquired during matriculation in a specific 
career field or through personal or work experience (Jonbekova, 2015; Po, Jianru & Yinan, 2015; 
Săveanu & Buhaş, 2015). 

Soft Skills: Soft skills (interpersonal, emotional, cognitive, spatial skills, etc.) describe 
what the learner acquires through education and interaction with instructors, fellow students, 
family and the community (Yin & Volkwein, 2010).  

Hard Skills: Hard skills are defined as job-related skills, those specific to the career domain 
of the degree field studied (Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; 
White House, 2015a). 

Independent Variable Group: The independent variable groups being measured are Online college 
pathways students and Traditional college pathways students (Hill, 2012; Raj & Al-Alawneh, 2010).

Dependent Variable Group: The dependent variables of interest are the two factors measured by 
the 18 (EDP) survey questions scored by members of the Bachelor program at a large Western 
Christian university. These variables align with the research questions, hypotheses, and 
theoretical foundations of this study (Becker, 1964; Collins, 1979; Durkheim, 1895; Schultz, 
1961, in Walters, 2004). The two CareerEDGE Employability Development Profile (EDP) factors 
measured by this study are factor 2, work & life experience skills, and factor 4, generic 
skills (Dacre-Pool & Sewell, 2007). 

Assumptions, Limitations, Delimitations
The tool is self-report and can only measure the students’ own perceptions of their employability, 
and while there is research to suggest that many self-perceptions are associated with actual 
behavior, scrutiny should be employed when interpreting instrument results (e.g., Bandura, 1995; 
Bandura et al., 2003; Compte & Postlewaite, 2004). The rationale for this assumption is that, 
if the tool is used to evaluate interventions or in further experimental studies, caution 
should be taken when interpreting the results.

In any study, there are multiple possible sources of limitations, beyond those of the 
literature reviews themselves (Baruch & Holton, 2008; Branch, 2007; Podsakoff, MacKenzie & 
Podsakoff, 2012). While one can deflect the possibility of incorrect information and data from 
primary research studies and articles by closely checking one’s references, some forms of 
limitation are less identifiable. Limitations of this study may concern low response rates to 
the survey questions, non-completion of survey instrument questions, bias by respondents toward 
online surveys, and methodology bias (the biasing effects that measuring two or more constructs 
with the same methodology may have on estimates of the relationships between them) 
(Podsakoff, MacKenzie & Podsakoff, 2012).

For the sake of this study, a delimitation is defined as a shortcoming of this study due to 
the researcher’s decision-making process. One such decision, the use of a research design that 
incorporates a convenience sampling frame, produces a delimitation for this study: the population 
size (N) is restricted to just the bachelor degree program (Baruch & Holton, 2008). Using a low 
general population size (N) directly affects the sample response rate (n) in this survey research 
study. The minimum requirement, per the large Western Christian university’s general rule of 
thumb on survey research, is 10 subjects per survey question, or 280 respondents for this study. 

For this study, an a priori power analysis was conducted to justify the study sample size based 
on the anticipated effect size (d = 0.03), critical t = 1.2857988, df = 200 and actual 
power = 0.8015396, which indicated that a sample (n) of 210 respondents was needed for 
statistical validity alone (G*Power, 2015) (Figure 4, p. 204). The survey instrument will be 
hosted by the large Western Christian university’s email survey system to minimize further 
delimitations, as discussed in the Data Collection and Management section.

Summary and Organization of the Remainder of the Study
Chapter 1 has provided an overview of the problem, the background of the problem, the purpose 
of the study, the rationale for the methodology, research questions, hypotheses, significance 
of the study, advancing scientific knowledge, definition of terms, assumptions, limitations and 
delimitations, and the nature of the research design of the study (Hart Research Associates, 
2015; Nye, 2015). 

Chapter 2 presents a review of the current literature on the workforce readiness skills of students 
from both traditional college pathways and online college pathways, how to assess the workforce 
readiness skills, and the effect the workforce readiness skills taught have on education today 
(Bryant & Bates, 2015; Lysne & Miller, 2015).  

Chapter 3 focuses on the research methodology used in this study. It defines the research design, 
general and sample respondent sizes, the setting, the research instrument, data analysis, validity 
and reliability, and ethical considerations that should be addressed, and closes with a summary 
(Cornacchione & Daugherty, 2013; Jonbekova, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015).  

Chapter 4 details the actual raw study data collected and the real-time data analysis procedures 
completed. It also discusses any new limitations discovered during the collection of raw data and 
the procedures used to address said limitations. This section consists of the results of the 
research, the analysis of the results, and the author’s sub-conclusions. The data will be 
illustrated in tables, figures, diagrams, charts and/or graphs, arranged so that specific groups 
of data correspond to the tables, figures, diagrams, charts or graphs used in the analysis of 
the results. This section will be logical, cumulative and as simple as possible. 
Sub-conclusions and findings will always be clear about which facts and/or published literature 
they are based on (Podsakoff, MacKenzie & Podsakoff, 2012).  

Chapter 5 concludes the study by listing the discussion, conclusions, and any recommendations for 
future research, change or both based on the results of this research study.

The timeline for completion of this study is as follows:
1.	Finish Chapter 1 draft by August 28, 2016, submission to DC Network.
2.	Finish Chapter 2 draft by September 18, 2016, submission to DC Network.
3.	Finish Chapter 3 draft by December 7, 2016, submission to DC Network.
4.	Finish Proposal drafts (Chapters 1, 2 and 3, plus revisions) by January 4, 2017, 
submission to DC Network. 
5.	Submit request to AQR and IRB for approval and request date for Defense of Proposal 
by March 16, 2017.
6.	Receive Approval from IRB to complete dissertation study by March 26, 2017. 

                                          Chapter 2: Literature Review
										  
Introduction to the Chapter and Background to the Problem
This non-experimental, quantitative, causal-comparative research study will add to the 
current body of research investigating the workforce readiness skills acquired by students 
from online college pathways and traditional college pathways by examining the differences 
between the two educational groups. The examination will be conducted through an investigation 
of the Bachelor program at a large Western Christian university, where the CareerEDGE 
Employability Development Profile (EDP) instrument will be administered and scored 
(Dacre-Pool & Sewell, 2007). 
This is in line with the recent call from the U.S. Department of Labor, Department of Commerce, 
Department of Education and Department of Health and Human Services (2014, July 22) report on 
what works in job training. In addition, per the study that supplied the CareerEDGE Employability 
Development Profile (EDP) survey instrument questionnaire for this study, “There has been little 
empirical research conducted in relation to graduate employability…” (Dacre-Pool, Qualter & 
Sewell, 2013, p. 303).  

This quantitative study will expand the general population size from the 805 (N) respondents 
originally selected to 43,725 (N) respondents by including all members of the Bachelor program at 
a large Western Christian university. The increase in general population size (N) will support a 
sample size (n) of approximately 210 respondents needed for G*Power statistical validity (Faul, 
Erdfelder, Lang & Buchner, 2007a, 2007b). The literature review will address the following 
sections to produce knowledge of the topic and will describe how these sections apply to the 
current literature through empirical primary and secondary resource investigation. 

These sections of the literature review cover the introduction and background of the study. 
The next section is the theoretical foundations, which covers the conceptual framework of Human 
Capital Theory (HCT) and the two sub-theories directly affecting education: Functionalist 
theory and Credentialist theory (Becker, 1964; Collins, 1971, 1979; Durkheim, 1895; 
Schultz, 1961, in Walters, 2004, pp. 99-100). Additionally, the instrument’s theoretical 
design model, that of survey questionnaires (Dillman, Smyth & Christian, 2014; Ornstein, 2013), 
is discussed as it applies to this study.  

Survey instrument design is based in Probability Theory (specifically, Frequency Probability) 
(Brasoveanu, 2012), where (n) equals the sample size, which is not finite, as it is dependent 
on a robust general population size (N), and the frequency probability equals the sample space 
(the set of all possible outcomes) (Brasoveanu, 2012; Penrose, 1995). The next section is an 
expanded view (185+ primary and secondary resources) and clarification of the literature 
reviewed, the variable groups, and the perspectives of the actors directly involved with the 
problem statement of this study. 
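The frequency-probability idea behind the instrument design can be made concrete with a small simulation. The sketch below is purely illustrative and assumes a hypothetical five-point Likert-style survey item: the probability of an outcome is estimated by its relative frequency across repeated draws from the sample space, and the estimate stabilizes near the true value (1/5 here) as the number of draws grows.

```python
import random

random.seed(42)  # fixed seed so the simulation is reproducible

# Hypothetical sample space: five Likert-style responses to one survey item.
sample_space = [1, 2, 3, 4, 5]

def relative_frequency(outcome, n):
    """Estimate P(outcome) as its relative frequency across n uniform draws."""
    draws = [random.choice(sample_space) for _ in range(n)]
    return draws.count(outcome) / n

# With many draws, the relative frequency settles near the true probability 0.2.
estimate = relative_frequency(3, 100_000)
```

This is the frequentist justification for survey sampling: with a sufficiently large sample (n) drawn from the population (N), observed response frequencies approximate the underlying population proportions.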

The rationale for the methodology is discussed, and the dependent and independent variable 
groups are described, as well as how leadership studies affect both soft and hard skills 
development (Brungardt, 2011). An a priori power analysis was run to determine the sample 
size appropriate for statistical validity; that number was 210 (n) total respondents (Faul, 
Erdfelder, Lang & Buchner, 2007a, 2007b; UCLA, 2015, Institute for Digital Research and 
Education) (Figure 5, p. 173). A section on instrumentation is next, followed by definitions 
of variables such as workforce readiness skills, soft skills and hard skills (Sidhu & 
Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a).  

Following these sections is the core of the topic discussion, where perspectives on 
workforce readiness skills are discussed from numerous viewpoints by prominent authors in 
labor, commerce, education and health and human services (U.S. Department of Labor, Department 
of Commerce, Department of Education and Department of Health & Human Services, 2014, July 22). 
These sections also cover the global and government perspectives on workforce readiness skills 
development (Chanco, 2015; National Statistics Office of Philippines, 2015; U.S. Departments of 
Labor, Commerce, Education and Health & Human Services, 2014, July 22). 

This section discusses three pathways toward education: for-profit, non-profit and state degree 
paths (Deming, Goldin & Katz, 2012). Additionally, traditional and online learning are defined and 
discussed from multiple perspectives (Lindsey & Rice, 2015). The next sections cover workforce 
readiness skills from the employers’, educational institutions’ and students’ perspectives 
(Deepa & Seth, 2013; Kyllonen, 2013; Mannapperuma, 2015). The final section of the literature 
review will discuss contrasting views and perspectives on the topic from several seminal authors 
(Amrein-Beardsley, Holloway-Libell, Cirell, Hays & Chapman, 2015; Britt, 2015; Cappelli, 2015; 
Soulé & Warrick, 2015). Chapter 2 will conclude with a summary highlighting the important 
points discussed.

The literature reviews were surveyed using the following resources: 
1.	Grand Canyon University Fleming Library, 
2.	Education Resources Information Center (ERIC), 
3.	Sage Publications Library, 
4.	American Library Directory, 
5.	Georgetown University Center on Education and the Workforce, 
6.	National Center for Educational Statistics (NCES), 
7.	ProQuest Dissertations & Theses, 
8.	Library of Congress, 
9.	EBSCOhost, 
10.	Research Gate, 
11.	Free Management Library, 
12.	Center for Institutional Evaluation, Research and Planning (CIERP) and 
13.	Economic and Social Research Council (ESRC).

Background to the Problem 
This study will further extend past research by investigating the workforce readiness 
skills acquired by students from online college pathways and traditional college pathways, 
examining the differences between the two educational groups’ scores on CareerEDGE Employability 
Development Profile (EDP) factor 2, work & life experience skills, and factor 4, generic skills, 
as determined from the Bachelor program at a large Western Christian university. Whether the 
workforce readiness skills being taught match those that 21st-century employers say they need 
from entry-level employees cannot be determined by lumping the two different educational 
pathways together. The two pathways must be investigated separately to determine if a difference 
exists in educational skill sets, and what may cause said differences, before curricula can be 
designed to produce better-skilled graduates. 

Per a combined study report from the U.S. Departments of Labor, Commerce, Education and Health & 
Human Services, on July 22, 2014, there is a need to “expand and improve access to labor market, 
occupational, and skills data and continue basic research on labor markets and employment” 
(pp. 21-22). The call for further research concerning workforce readiness skills is exacerbated by 
the affected population of 21.266 million U.S. students who will be attending colleges and 
universities in 2015 (NCES, 2014, Table 1, 1990 through fall 2023). Per the literature reviews 
dating back to 2002 (Allen & Seaman, 2002, 2003), this new population of students will most likely 
graduate with the wrong workforce readiness skills for the jobs they seek (Jonbekova, 2015; 
Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; Youhang & Dongmao, 2015; 
U.S. Department of Education, 2014a; White House, 2015a). Moreover, those figures are projected to 
increase by approximately 330,000 students each following year through 2023 (NCES, 2014, Table 1, 
1990 through fall 2023) (Table 1, p. 174). 

Per Allen and Seaman (2002, 2003) and international studies as recent as 2015 (Chanco, 2015; 
Săveanu & Buhaş, 2015), employers are complaining that institutional education has manifested 
itself in a disconnect between the educational workforce readiness skills taught and the 
job-ready skills employers say they need (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & 
Buhaş, 2015; Youhang & Dongmao, 2015). The issue of online college pathways and traditional 
college pathways students not having the workforce readiness skills employers want from their 
entry-level workforce is not solely endemic to the U.S. business environment (Chanco, 2015; 
Jianru & Yinan, 2015). 

Recent studies drawn from diverse demographic and geographic global literature reviews strongly 
suggest that the issue of college students and their workforce readiness skills in their field of 
study, not matching those needed by executive hiring authorities, should be measured on a pandemic 
scale (Chanco, 2015; Jianru & Yinan, 2015). 

Studies from such diverse geographic locations as China, Romania, Tajikistan, Philippines 
and European Union, to name but a few, produce facts that show they are facing the same 
issues with hiring qualified college students as their U.S. counterparts (Jonbekova, 2015; 
Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015). This supports the 
problem statement of this study: It is not known if there is a difference in scores on the 
CareerEDGE Employability Development Profile (EDP) of students from online college pathways 
and traditional college pathways. The significance of this study is that it will produce 
statistical data that will lead to a more accurate, essential and deeper understanding of 
the relationship between education and the application of employability skill sets of new 
graduates.

Theoretical Foundations and/or Conceptual Framework
The conceptual framework for this non-experimental, quantitative, causal-comparative 
research study is based in Human Capital Theory (HCT) (Becker, 1964; Schultz, 1961, in Walters, 
2004, pp. 99-100). This section will postulate theoretical foundations and theories associated 
with academic training directly relating to the development of workforce readiness skills 
(Becker, 1964; Collins, 1979; Durkheim, 1895; Schultz, 1961, in Walters, 2004, pp. 99-100). 
This theoretical foundation supports the answering of the research questions and hypotheses, 
as well as the problem statement derived from the background to the problem section, where the 
literature reviews illustrate two pronounced but differing theories in education relating to 
the study topic. 

Instrumentation Theoretical Model:
The model upon which the study instrument is built is that of a questionnaire, normally regarded 
as a survey design (Dillman, Smyth & Christian, 2014; Ornstein, 2013). The theory supporting 
this design is based on Probability Theory, specifically that of Frequency Probability 
(Brasoveanu, 2012), where (n) equals the sample size, which is not finite, as it is 
dependent on a robust general population size (N), and the frequency probability equals the 
sample space (the set of all possible outcomes) (Blakstad, 2015; Brasoveanu, 2012; Penrose, 1959). 

Theoretical Foundation Used:
The theoretical foundation used to support this study is Human Capital Theory (HCT) 
(Becker, 1964; Schultz, 1961, in Walters, 2004), and the two major theories associated with 
Human Capital Theory (HCT) are the Functionalist theory model of education (Durkheim, 1895) 
and the Credentialist theory model of education (Collins, 1971, 1979). 
 
Human Capital Theory (HCT):
What is HCT? Human Capital Theory of education (Becker, 1964; Schultz, 1961, in Walters, 
2004, pp. 99-100) is an economic variant of the later work, Technical Functional Theory 
(Collins, 1971). The foundation of Human Capital Theory suggests that increases in education 
are directly related to increased demand for skilled labor. Hence, college students will 
continue to seek increases in educational degree paths until the opportunity costs of 
procuring additional education surpass the benefits it provides (Cornacchione & 
Daugherty, 2013; Walters, 2004). 

How does it fit this study? HCT is directly aligned to this study, as it is bedrock to pedagogy; 
the teacher/student paradigm is prevalent in both online college pathways and traditional 
(ground campus) college pathways, and in workforce readiness skills attainment for college 
students. Attached to this theory are operationalized theory models associated with perspectives 
from educators and seminal authors in the field(s) of education, economics and social theory 
(Emile Durkheim, 1858-1917). 

Theories Supporting Theoretical Foundation:
Two major theories (sub-theories) of Human Capital Theory (HCT) will be discussed in detail as 
they directly relate to the theoretical foundation of this study by demonstrating operational 
models (Collins, 1979; Durkheim, 1895) that could explain possibilities for the disconnect 
between the skills being taught and the skills executive hiring authorities say they need 
from their entry-level workforce.  

Functionalist Theory Operational Model:
What is Functionalism? One of the oldest theories most closely associated with HCT in education 
is that of Functionalism (Durkheim, 1895). Functionalist theory in education posits “That it 
is the role of education to transmit core values and social control through the attributes that 
support the political and economic systems that fuel education” (Durkheim, 1892, in Walters, 
2004, p. 99). Per functionalist theory, technology and economic innovations raise the skill 
levels that are required to perform jobs of the future (Walters, 2004). 

How does a functionalist theory fit this study? 
The research topic for this study explores the differences in workforce readiness skills obtained 
by students from online college pathways and traditional college pathways, as determined from the 
bachelor program students at a large Western Christian university. Functionalist theory 
(Durkheim, 1892) is directly aligned with the economic and technological innovation needed to 
raise the workforce readiness skill levels required to perform jobs in 2016 and beyond. 
Additionally, this study is a quantitative analysis of the workforce readiness skills being 
taught by colleges and universities to online and traditional college students, as determined 
by their CareerEDGE Employability Development Profile (EDP) instrument scores.

Credentialist Theory Operational Model:
What is the Credentialist theory model? Per the Credentialist theory model, learning in 2016’s 
educational institutions is more about accepted standards of social norms than about 
contributory learning and cognitive skill sets (Collins, 1979). Collins’ (1979) 
Credentialist theory model posits that employers use credentials (degrees, certifications, 
etc.) as a contributory factor to elevate educated workers to more lucrative pay and career 
positions. 

How the Credentialist theory model fits this study: Currently, employability is judged on several 
factors, one of which is what credentials (degrees, certifications, etc.) an entry-level 
college graduate has attained in their chosen field of study. The credentials offered by 
entry-level graduate employees will determine the position they receive and the starting salary 
of that position (Collins, 1979). Comparing Credentialist theory and Functionalist theory is 
important for connecting theory and application. 

Connecting Theory and Application:
The subject topic of this study, whether a difference exists in skill sets between students of 
online college pathways and traditional college pathways, is bedrock to these two theories. 
Functionalist theory (Durkheim, 1895) supports education’s operational model for teaching: 
transmitting core values and social control through the attributes that support the political 
and economic systems. Credentialist theory supports business’s operational model of hiring, 
where credentials (degrees, certifications, etc.) must be supported by the workforce readiness 
skills that employers demand from graduate entry-level employees. 

This study will explore, through statistical analysis of the two pertinent factors, factor 2 and 
factor 4, the 18 CareerEDGE Employability Development Profile (EDP) skills questions 
(Dacre-Pool & Sewell, 2007) that are associated with both higher educational credentialing 
(Collins, 1979) and the teaching of economic and technological functionalism (Durkheim, 1895). The 
variables associated with this study are directly related to the workforce readiness skills 
development that entry-level college students need to attain employability upon graduation. 
The results of this study will add to the body of knowledge as justified by the U.S. Departments 
of Education, Labor, Commerce, and Health and Human Services, (2014, July 22) report, calling for 
further research studies in workforce readiness skills development. 

Variable Association to Theoretical Foundations
The two (EDP) factor variables (dependent variable group) being measured are: 
Factor 2 - Experience Work/Life
Factor 4 - Generic Skills (Dacre-Pool & Sewell, 2007).

The rationale for the usage of each theory in this study directly relates to the quantitative 
methodology and causal-comparative research design used to further address, statistically, 
the gap within the literature related directly to theory versus operationalized 
practice challenges. This study will add to the body of knowledge of Human Capital 
Theory and both the Functionalist and Credentialist theories by determining if there is 
a difference in the workforce readiness skills of online and traditional college students. 

Moreover, it will identify which specific skill sets each group possesses, which could 
directly affect administration and curriculum design in education, as well as the 
employability possibilities of future hires. The scores derived from analysis of the survey 
instrument will aid in answering the research questions and hypotheses, as well as produce 
knowledge relevant to the areas of labor, commerce, education and health and human services, 
as called for by the U.S. Departments of Labor, Commerce, Education and Health and Human 
Services (2014, July 22) report.   

Review of the Literature
This non-experimental, quantitative methodology, causal-comparative designed study will 
add to the current body of research on the investigation into workforce readiness skills 
acquired by students from online college pathways and traditional college pathways, by 
examining the differences between the two educational groups, as determined from the 
bachelor program at a large Western Christian university, from their CareerEDGE 
Employability Development Profile (EDP) instrument scores. 

Employers are vigorously complaining that both online distance learning and face-to-face 
contact models of education have manifested themselves in a disconnect between the educational 
skills taught and the job-ready skills employers need (Sidhu & Calderon, 2014; U.S. 
Department of Education, 2014a; White House, 2015a).  

Because of the increased popularity of online classes, the total affected population from 
both online and traditional educational pathways is 21.266 million students who attended 
college in the academic year 2015 alone (NCES, 2014, Fast Facts). Whether the workforce 
readiness skills being taught match those that 21st-century employers say they need from 
entry-level employees cannot be determined by lumping the two different educational pathways 
together. The two pathways must be investigated separately to determine if a difference exists 
in educational skill sets, and what may cause said differences, before curricula can be 
designed to produce better-skilled graduates. 

The significance of this study is that it will produce statistical data that will lead to a more 
accurate, essential and deeper understanding of the relationship between education and application 
of employability skill sets of future graduates.

This study will examine the gap in knowledge from literature reviews that show a disconnect 
between the academic skills colleges and universities are teaching and the skills today’s 
organizations say they need from their entry-level workforce (Bessolo, 2011; Brungardt, 2011; 
Cai, 2013; Lindsey & Rice, 2015; Robles, 2012; Soulé & Warrick, 2015). This gap in knowledge 
is supported through the studies of Sidhu and Calderon, (2014) where they found that, “more 
than one-third (39%) of business leaders are not confident that U.S. college students are 
graduating with the skills and competencies that their businesses need” (p. 1). The literature 
review further supports this disconnect, as three quarters (75%) of chief academic officers 
from educational institutions consider their pedagogical offering as highly valued 
(Allen & Seaman, 2013).

Per recent studies drawn from global references, this disconnect between skills taught and the 
skills needed by organizations’ entry-level workforce should be measured on a pandemic scale 
(Chanco, 2015; Săveanu & Buhaş, 2015). Studies from China, Romania, Tajikistan, Philippines and 
the European Union produce results that show they are facing the same issues with hiring talent 
as their U.S. counterparts (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & 
Buhaş, 2015; Youhang & Dongmao, 2015).  

Additionally, this problem is further exacerbated by the fact that 21.266 million U.S. students 
entered colleges and universities in 2015 (NCES, 2014, Fast Facts), and those figures are 
projected to increase by approximately 330,000 students each following year through 2023 
(NCES, 2014, Table 1, p. 174). This disconnect between what college chief academic officers 
believe is a credible academic offering and what organizations, government and global studies 
show demands further empirical study (Chanco, 2015; Jonbekova, 2015; Po, 
Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; Youhang & Dongmao, 2015; 
U.S. Department of Education, 2014a; White House, 2015a).   

Background to the Problem:
Numerous studies (Brungardt, 2011; Deepa & Seth, 2013; Lindsey & Rice, 2015; Robles, 2012; 
Weaver & Kulesza, 2014) concerning workforce readiness skills have been published in the 8 years 
since Dacre-Pool and Sewell’s (2007) study. However, most of these studies are approached from 
one of five perspectives. 

They are (a) that of the employer (Iyengar, 2015; Kelly, 2015; Kyng, Tickle & Wood, 2013; 
Zhao, 2015), (b) that of the educational institutions (Iuliana, Dragoș & Mitran, 
2014; Soulé & Warrick, 2015), (c) that of the student (Hart Research Associates, 
2015; Iliško, Skrinda & Mičule, 2014), (d) that of government (U.S. Departments of 
Labor, Commerce, Education and Health & Human Services, 2014, July 22) and (e) that 
of the global perspective (Chanco, 2015; Youhang & Dongmao, 2015). 

The two major delivery paths, online college pathways and traditional college pathways, are 
often investigated from only one of these perspectives, and comparison of their differences 
with each other is minimal in most cases (Allen & Seaman, 2012, 2013, 2014). This study will 
investigate both online college pathways and traditional college pathways from all five 
perspectives. 

Rationale for Methodology:
This quantitative methodology, causal-comparative research study, will add to the current body of 
research on the investigation into workforce readiness skills acquired by students from online 
college pathways and traditional college pathways, by examining the differences between the two 
educational groups, as determined from the Bachelor degree program at a large Western Christian 
university, using their CareerEDGE Employability Development Profile (EDP) instrument scores. This 
quantitative methodology is best (Brasoveanu, 2012; Gay & Weaver, 2011) because it will answer 
the research questions and hypotheses, as well as the problem statement: It is not known if 
there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of 
students from online college pathways and traditional college pathways. 

This is the best methodology compared to others available, such as qualitative or mixed 
methodology, because it will answer, statistically, and to a degree the researcher can defend, 
the findings that address the research questions and hypotheses (Gravetter & Forzano, 2012). 
By examining the acquired workforce readiness skills of both online and traditional college 
pathways students’ raw scores from the (EDP) survey instrument, a definitive answer to the 
question of whether a difference exists will be attained. This will be accomplished through 
statistical analysis using a two-tailed independent samples t-test (Blakstad, 2015; 
Light & McGee, 2015).  
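The planned two-tailed independent samples t-test can be sketched in a few lines. The code below is a minimal, illustrative example using hypothetical factor scores, not the study's actual data; it computes the pooled-variance t statistic and its degrees of freedom, which would then be compared against the critical t value at the chosen alpha level.

```python
from math import sqrt
from statistics import mean, variance

def independent_t(group_a, group_b):
    """Pooled-variance independent-samples t statistic and degrees of
    freedom, assuming roughly equal variances in the two groups."""
    na, nb = len(group_a), len(group_b)
    pooled = ((na - 1) * variance(group_a)
              + (nb - 1) * variance(group_b)) / (na + nb - 2)
    t = (mean(group_a) - mean(group_b)) / sqrt(pooled * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical EDP factor scores for the two independent variable groups.
online = [1, 2, 3, 4, 5]
traditional = [3, 4, 5, 6, 7]
t_stat, df = independent_t(online, traditional)  # t = -2.0, df = 8
```

In a two-tailed test, a |t| exceeding the critical value for df degrees of freedom (about 2.306 at alpha = 0.05 for df = 8) would indicate a statistically significant difference between the group means; here |t| = 2.0 would not.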

Dependent Variable Groups:
Factor 2 - Experience Work/Life Skills
Factor 4 - Generic Skills

Independent Variable Groups:
The independent variable groups being measured are online college and university pathways 
students and traditional college and university pathways students. These variables are 
aligned to the research questions and hypotheses, as well as the theoretical foundations, thus 
aligning with the problem statement: It is not known if there is a difference in scores on 
the CareerEDGE Employability Development Profile (EDP) of students from online college 
pathways and traditional college pathways.  

Theoretical Foundations Association with Study

Instrumentation Theoretical Model:
The model upon which the study instrument is built is that of a questionnaire, normally regarded 
as a survey research method design (Dillman, Smyth & Christian, 2014; Ornstein, 2013). The theory 
supporting this design is based on Probability Theory, specifically that of Frequency Probability 
(Brasoveanu, 2012), where (n) equals the sample population, which is not finite, as it is 
dependent on a robust population size (N), and the frequency probability equals the sample space 
(the set of all possible outcomes) (Blakstad, 2015; Brasoveanu, 2012; Penrose, 1995). 

Human Capital Theory (HCT):
What is HCT? Human Capital Theory of education (Becker, 1964; Schultz, 1961, in Walters, 2004, 
pp. 99-100) is an economic variant of the Technical Functional Theory (Collins, 1971). The 
foundation of Human Capital Theory suggests that increases in education are directly related to 
increased demand for skilled labor. Hence, college students will continue to seek increases in 
educational degree paths until the opportunity costs (Cornacchione & Daugherty, 2013) of 
procuring additional education surpass the benefits it provides (Walters, 2004). 

How does HCT fit this study? HCT is directly aligned to this study, as it is bedrock to education; 
the teacher/student paradigm is prevalent for both online and traditional ground campus education 
and workforce readiness skills attainment for college students. Attached to this theory are 
operationalized theory models associated with perspectives from educators and seminal authors in the 
field(s) of education, economics and social theory (Emile Durkheim, 1858-1917). Additionally, 
Bessolo (2011) suggests that there is “a growing recognition of, and value placed on human capital 
particularly the impact on higher education” (p. 2).

Theories Supporting Theoretical Foundation:
Two major sub-theories of Human Capital Theory (HCT) will be discussed in detail as they directly 
relate to the theoretical foundation of this study by demonstrating operational models (Durkheim, 
1895; Collins, 1979) that could explain possibilities for the disconnect between those skills being 
taught and the skills executive hiring authorities say they need from their entry-level workforce.  

Functionalist Theory Operational Model:
What is Functionalism? One of the oldest theories most closely associated with HCT in education is 
that of Functionalism (Durkheim, 1895). A functionalist theory in education posits “that it is 
the role of education to transmit core values and social control through the attributes that support 
the political and economic systems that fuel education” (Durkheim, 1895, pp. 60-81). Per the 
functionalist theory, “economic and technological innovation generally raises the skill levels 
required to perform jobs” (Walters, 2004, p. 99). 

How does a functionalist theory fit this study? The research topic for this study explores the 
differences in workforce readiness skills obtained by online and traditional college pathways 
students from the Bachelor program at a large Western Christian university, as determined by 
their CareerEDGE Employability Development Profile (EDP) instrument scores. A functionalist 
theory is directly aligned with the economic and technological innovation needed to raise the 
workforce readiness skill levels required to perform jobs, and this study is a quantitative 
analysis of those workforce readiness skills being taught by colleges and universities.

Credentialist Theory Operational Models:
What is a Credentialist theory model? Per Credentialist theory models, learning in today’s 
educational institutions is more about accepted standards of social norms than about 
contributory learning and cognitive skill sets (Collins, 1979). Collins’s (1979) Credentialist 
theory model posits that employers use credentials (degrees, certifications, etc.) as a 
contributory factor to elevate educated workers to better jobs, and these higher educated 
workers are enjoying more lucrative jobs.

How does a Credentialist theory model fit this study? Currently, employability is judged on 
several factors, one of those being what credentials (degrees, certifications, etc.) an 
entry-level workforce college graduate has attained in their chosen field of study. The 
credentials offered by entry-level graduate employees will determine the position they receive 
and the starting salary of that position (Collins, 1979). 

Education/Employer Collaborating and Credentialist Theory
Grant (2015) presents her own example of a solution to the skills gap through a Learning 
Blueprint, which is a set of five program requirements that are based on Credentialist 
Theory (Collins, 1979). The program guidelines require that employers and educational 
institutions collaborate (form partnerships) to enhance employee credentialing and training, 
as well as to increase workforce readiness skills development (Grant, 2015). The idea behind 
her partial solution to the problem of workforce readiness skills being taught by colleges 
and universities not matching what hiring authorities say they need is partnerships 
between business and educational institutions. 

The two-way communication would “allow present employees to earn college credits for the 
experiential theory and application of their work-product” (Grant, 2015, p. 76). Additionally, 
educational institutes will conduct an academic evaluation that would determine college credit 
equivalencies for said work-product. Educational institutions benefit from both the new influx 
of students getting college equivalencies and the input from the partnering businesses concerning 
what workforce readiness skills they need in the future (Grant, 2015). Businesses benefit by 
having employees gain academic credentialing, as well as by having direct input to the educational 
institute with which they are collaborating. 

Leadership Studies and Soft Skills Development:
Brungardt (2011) used literature reviews from nine major studies to assess soft skills (also 
referred to as “Teamwork Skills” in their study) of students of business schools compared to 
leadership education figures from two other groups. The study emphasizes the need for “leadership 
training in all courses or degree paths as an aid to better soft skills development” (Brungardt, 
2011, p. 14). Three hypotheses were tested: those students with no leadership training (Ho1); 
those with two years’ college and/or a certificate in leadership (Ho2), i.e. those with 9 
credit hours or less of leadership training; and third, those students with both business skills 
education (‘hard skills’) and leadership classes (Ho3), i.e. those with 12 credit hours of 
leadership classes or more. 

Their findings suggest that no significant difference was recorded for 
the first two hypotheses, Ho1 and Ho2, but hypothesis Ho3 showed a “significant difference in soft 
skills or teamwork skills development of those with leadership studies training” (Brungardt, 
2011, p. 13). Brungardt’s (2011) study emphasizes that “leadership courses, such as Organizational 
Leadership and/or Leadership Development, added to any degree path, will contribute to the 
development of more effective soft skills development overall” (p. 14). 

Sample Population Defined
The sample population size (n) of 210 respondents (G*Power, 2015, Figure 5, p. 203) is drawn 
from a general population size (N) of 43,725 possible respondents from the Bachelor program 
online and traditional college pathways students at a large Western Christian university, as 
determined by their CareerEDGE Employability Development Profile (EDP) instrument scores. 

A true effect size of d = 0.5, an alpha level of α = 0.05, Power (1-β err prob) = 0.95, Allocation 
ratio N2/N1 = 1, Critical t = 1.9714347, and Actual power = 0.9501287 were needed for the 
statistical power analysis for validity, producing a sample size of 210 (n) respondents (Faul, 
Erdfelder, Lang & Buchner, 2007a, 2007b). 

This G*Power analysis is needed because failure to correctly reject the 
test hypotheses “can have severe consequences as Type I and Type II Errors are possible” 
(Faul, Erdfelder, Lang & Buchner, 2007b, p. 51). G*Power tests the probability of correctly 
rejecting the test hypotheses when the alternative hypotheses are true (Faul, Erdfelder, 
Lang & Buchner, 2007a, 2007b).  
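
The a priori computation described above can be reproduced outside of G*Power. As a hedged sketch (using the statsmodels library, which this study does not use, rather than G*Power itself), the same parameters recover the reported sample size:

```python
# Approximate the G*Power a priori power analysis: per-group sample size for a
# two-tailed independent samples t-test with d = 0.5, alpha = 0.05, power = 0.95.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # Cohen's d
    alpha=0.05,               # two-tailed significance level
    power=0.95,               # 1 - beta
    ratio=1.0,                # allocation ratio N2/N1 = 1
    alternative="two-sided",
)

# Two equal groups of ~105 yield the total sample of 210 respondents
total_n = 2 * round(n_per_group)
print(round(n_per_group), total_n)
```

This matches the study's figures: with 210 total respondents, the t-test has 208 degrees of freedom, for which the two-tailed critical t at α = 0.05 is 1.9714.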

Instrumentation:
The (EDP) survey questionnaire instrument for this quantitative study was supplied by 
permission of Dacre-Pool and Sewell (2007) (see Appendix D, p. 188). The survey instrument 
was previously certified for validity and reliability through testing using Exploratory Factor 
Analysis and Confirmatory Factor Analysis (Dacre-Pool & Sewell, 2007; Dacre-Pool, Qualter & 
Sewell, 2013). Additionally, in the case of the original CareerEDGE Employability Development 
Profile (EDP) instrument (Dacre-Pool & Sewell, 2007), the reliability of the instrument was 
determined by a test-retest reliability procedure. 

The test-retest reliability procedure examined the extent to which scores from one sample are 
stable over time from one test administration to another (Creswell & Plano Clark, 2011). The 
instrument was administered to 19 individuals. The same instrument was then administered to 
the same individuals several weeks later. The arrangement of the questions changed during the 
second administration. The scores from Time 1 and Time 2 were used to compute the Pearson 
correlation coefficient as a measure of the test-retest reliability of the instrument. 
The Pearson correlation coefficient is represented as rTime1∙Time2 (Creswell & Plano 
Clark, 2011; Green, 2015). 

EDP Factor Subscale (N = 19)
                                Time 1           Time 2
                                Mean (SD)        Mean (SD)        t-test & p-value

Career Development Learning     24.58 (4.51)     29.47 (3.47)     t = 4.483, p = 0.000
Work/Life Experience             8.37 (2.93)      9.53 (2.20)     t = 2.226, p = 0.039
Degree Subject Knowledge        25.32 (4.02)     28.37 (2.01)     t = 3.522, p = 0.002
Generic Skills                  16.89 (2.40)     17.12 (2.26)     t = 0.482, p = 0.635
Emotional Intelligence (EI)     63.42 (6.89)     63.95 (8.31)     t = 0.446, p = 0.661
  & Self-Management

(Dacre-Pool, Qualter & Sewell, 2013, p. 309) (Appendix F, pp. 198-199).
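
As a minimal sketch of the test-retest computation described above (using SciPy, which the cited studies do not name, and illustrative scores rather than the actual Time 1 and Time 2 data), the Pearson coefficient can be computed as:

```python
# Test-retest reliability: Pearson correlation between Time 1 and Time 2
# administrations of the same instrument. Scores below are illustrative only.
from scipy import stats

time1 = [24, 26, 22, 28, 25, 23, 27, 24]  # hypothetical first-administration scores
time2 = [25, 27, 23, 29, 24, 24, 28, 25]  # same respondents, several weeks later

r, p = stats.pearsonr(time1, time2)
# A coefficient close to 1 indicates scores are stable across administrations
print(f"r = {r:.3f}")
```

A coefficient near 1 indicates that respondents' scores are stable from one administration to the next, which is the sense of reliability the procedure measures.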

The use of the survey instrument in this study will contribute directly to answering the 
research questions and hypotheses through analysis of the instrument scores of the independent 
variable groups. The survey questionnaire consists of 18 (EDP) skills questions (the dependent 
variable groups) concerning the independent variable groups, online and traditional 
college pathways students. This study will explore the difference between two populations 
through the scores derived from the (EDP) skills instrument (online students [O] = x1, and 
traditional students [T] = x2) for the Bachelor program students at a large Western Christian 
university, as determined by their CareerEDGE Employability Development Profile (EDP) 
instrument scores.  

Workforce Readiness Skills Definitions:
Skills or skill sets are defined as belonging to either soft or hard skills, a “developed aptitude 
or ability” (Merriam-Webster Dictionary, 2015, p. 1), that the college graduate has acquired 
during matriculation in a specific career field and/or from life experience. The term workforce 
is used to pinpoint a domain relevant to employability, employers and/or employees. 

Soft Skills Defined: Soft skills are interpersonal qualities, also known as people skills, 
and personal attributes that one possesses (Robles, 2012). Hard Skills Defined: Hard skills 
are the technical expertise and knowledge needed for a job (Robles, 2012). Hard skills are 
defined as job related skills, those specific to the career domain of the degree field studied 
(Sidhu & Calderon, 2014; U.S. Department of Education, 2014a; White House, 2015a). 

For-Profit, Non-Profit and State Educational Pathways:
Per the U.S. Department of Commerce (2014, July 22) report, educational student debt has reached 
an epidemic level of $1.3 trillion. This report and others posit that For-Profit, 
Non-Profit and State educational pathways equally share in the blame. Per Deming, Goldin & 
Katz (2012), for-profit institutions are U.S. higher education’s fastest growing sector, with 
enrollments increasing from “0.2 percent to 9.1 percent from 1970 through 2009” (Deming, et al., 
2015, p. 1), and “for-profits leave students with far larger student loan debt burdens” 
(Deming, et al., 2015, p. 4). Additionally, for-profit students have higher unemployment rates 
and lower earnings in comparison to students from Community colleges, State colleges and private 
Non-Profit institutions (Deming, et al., 2015).  

Moreover, “for-profit students incur far greater default rates due to lower starting salaries than 
these institutions suggest they will earn with their degree paths” (Deming, et al., 2015, 
pp. 4-5; U.S. Department of Justice, 2015, p. 1). Additionally, “for-profit students 
self-report lower satisfaction with courses and are less likely to consider their education 
and loans worth the price-tag” (Deming, et al., 2015, pp. 4-5). The for-profit sector 
disproportionately serves what The Executive Office of the President (2014, January) 
refers to as “low-income students” (pp. 2-47), those in the categories of “older students, 
women, African Americans, and Hispanics” (p. 8). 

A Complete College America (2014) report suggests that students are not graduating within 4-year 
degree paths because for-profits, non-profits and state colleges and universities are “adding 
additional classes that extend graduation up to an additional 2 to 3 years in some cases” (p. 9). 
One statistic shows that 60% of bachelor’s degree recipients from for-profits, non-profits and 
state colleges and universities change colleges at least once during matriculation, and nearly 
half of those transfer students lose most or all their earned credits from the institution they 
leave because of “broken transfer policies” (p. 9). These facts account for “$600 Million 
dollars lost each year if only two courses per transfer student fail to transfer all credits 
earned” (p. 9). Moreover, the policy is determined at “each individual college, not at the 
Federal level, as to how many credits will transfer” (Complete College America, 2014, pp. 8-11).

Of the top 25 for-profits, non-profits and state colleges and universities that have produced the 
most college student debt, “the top 13 account for $109 Billion or almost 10% of all federal 
student loans” (Looney & Constantine, 2014, Brookings Institution Papers on Economic Activity, 
p. 2). The top 13 colleges consist of nine for-profit schools, #1 being the University of 
Phoenix-Phoenix campus ($35 Billion) and #11 Grand Canyon University ($5.9 Billion); 
only four non-profit or state public institutions are included in that number. Those institutions 
are #3 Nova Southeastern University ($8.7 Billion), #8 New York University ($6.3 Billion), #12 
Liberty University ($5.7 Billion) and #13 University of Southern California ($5.3 Billion) 
(Looney & Constantine, 2014, Brookings Institution Papers on Economic Activity). 
(See Figure 6, this study, p. 199, for the complete list of the top 25.)  

Additionally, per the Brookings Institution paper, “U.S. student loan debt has quadrupled in the 
past 12 years to the $1.3 Trillion now owed, with most of the debt being held by non-traditional 
borrowers attending for-profit and nonselective institutions” (Brookings Institution Papers on 
Economic Activity, p. 2; Looney & Constantine, 2014). Per the report: “In contrast, most 
borrowers at four-year public and private non-profit institutions have relatively low rates of 
default, solid earnings, and steady employability rates” (p. 2). 

Traditional Education:
Traditional education is defined as those classes taken through traditional face-to-face 
contact with instructors and students in a set location, also referred to as “brick and mortar 
schools” (Raj & Al-Alawneh, 2010, p. 5). While technical and vocational institutions are 
showing increases in enrollment, traditional college pathway students are showing a steady 
decline in enrollment, which may be due to the greater increase in online education 
(Burns, 2011, p. 1). While this decline has several contributing factors, the number one 
overriding reason is high tuition cost, resulting in large student debt and long-term payments, 
which are often tied to high interest rates on said loans (Burns, 2011). 

One of the reasons for this increase in student debt is that non-profit institutions often 
change the status of students from in-school-deferment status (if the student takes any break 
longer than two weeks) to a forbearance status, which allows the lending institution to raise 
the interest rate from the original 2.5% to as high as 9.75%, such as Wells Fargo Bank, N.A., 
for example (Burns, 2011). Per Weaver and Kulesza (2014), a study of accounting 
students in business schools (traditional college pathway) shows that “a persistent gap exists 
between what is taught and what skills employers expect in students” (p. 34). 

Per the authors, “A series of studies show that increasingly employers desire soft skills 
such as critical thinking, problem solving, and communication skills in addition to essential 
accounting education” (p. 34). This study will add to the body of knowledge concerning these soft 
skills through the scoring of the (EDP) skills survey questionnaire (the dependent variable 
groups), which will determine if differences exist between online and traditional 
college pathways students. 

Online Education:
Online educational delivery systems started in 1994 (Hill, 2012). Online education delivers 
classes 24 hours per day, 7 days a week through the college or university’s portal system 
(Intranet) connected to students through the Internet (Hill, 2012). The Allen and Seaman (2014) 
report tracking online education at for-profits, non-profits and state colleges and 
universities in the U.S. (an annual report since 2002) shows evidence that online learning is 
critical to those institutions’ long-term strategy, and the numbers validate their findings: 
“2002 student online enrollment was 48.8% but by 2015 that number had mushroomed to a 
staggering 70.8%” (p. 4). 

Whether online offerings are comparable to face-to-face traditional offerings has been a 
concern for numerous years when obtaining data for their reports (Allen & Seaman, 2014). 
Past studies found that “chief academic officers rated the learning outcomes for online 
education ‘as good as or better’ than those for face-to-face instruction until 2013” (p. 5). 
The newest report for 2015 (using 2013 and 2014 data) now suggests that trend is reversing. 

While 77.0% of academic leaders in 2012 believed their offering was as good as or better than 
face-to-face instruction, 2013 and 2014 data show that figure has dropped to 74.1% for two years 
running. The Allen and Seaman (2014) study uses data acquired each year from the U.S. Department of 
Education, National Center for Education Statistics, Integrated Postsecondary Education Data 
System (IPEDS), which collects data from the institutions eligible for Title IV financial aid.

Mannapperuma’s (2015) study of the online for-profit educational industry shows “little regulation 
and numerous accusations that it places profits before the interests of its students” (p. 541). 
Mannapperuma (2015) addresses the distance learning industry from the legal and regulatory 
perspective. The author found that the “proposed regional interstate compacts promise to standardize 
oversight of the for-profit distance learning industry; but it fails to include states that 
regulate the industry the least and thus fails to protect students who are most likely to need 
protection” (p. 541). In the years “2007-2008 nearly 4.3 Million undergraduate students (20% of 
all undergrad students), took at least one distance (online) educational course” 
(Mannapperuma, 2015, p. 543). 

Deming, Goldin & Katz’s (2012) study of for-profit educational institutions (referred to as 
“Chains” in their study because of the number of online students they enroll; “no more than 
33% of students from one U.S. State is allowed” (p. 3), though this is not enforced) found 
that “for-profit institutions leave students with far greater student loan debt burdens” 
(p. 4). Lindsey and Rice’s (2015) study of the interpersonal abilities (soft skills) of online 
students versus traditional students, evaluating their Emotional Intelligence (EI) using the 
Situational Test of Emotional Management (STEM), showed surprising results. Their findings 
showed that “students who completed at least one online course scored significantly higher on 
the test than students from a traditional college pathway (face-to-face instruction only)” 
(p. 134). The study did not differentiate between for-profit, non-profit or State educational 
pathways; “only student business majors and minors were surveyed with a sample population of 
865 respondents” (p. 127). 

Economic Perspective of Student Educational Debt        
With student loan debt reaching $1.3 trillion (Carnevale, Strohl & Gulish, 2015; Complete 
College America, 2014; NCES, 2015, Education Statistics May 7), there is a Federal Government 
(2014) request for further research in the areas of “academic training, labor market outcomes, 
and economics (student educational debt)” (U.S. Departments of Labor, Commerce, Education and 
Health & Human Services, 2014, p. 21).  This literature review would not be complete without 
briefly identifying some history and legal requirements (or lack thereof), that have led to the 
current situation.  

Mannapperuma (2015) identifies the legal issues that provided the factual support for the author’s 
conclusions that for-profit Educational Management Organizations (EMO’s) need greater regulation 
through “tying federal Title IV funds to the Interstate Compact System” (p. 589). There are few 
protections available to students attending for-profit distance learning institutions, which 
are causing the epidemic in student debt by continually raising the cost of tuition and fees 
in the absence of regulation of this educational sector at the State and Federal levels.  

The regulation of education in this country is built on the same fragmented approach that 
the Department of Education has used since 2010, entitled the “Program Integrity Rules” 
(Mannapperuma, 2015, p. 544). The issue is that one of these rules dealing with enforcement, 
“the Federal Online State Authorization Rule (FOSAR), is not actively enforced by the U.S. 
Department of Education, in any State” (Mannapperuma, 2015, p. 545). The (FOSAR) rule states “that 
higher education institutions offering distance-learning courses must obtain that State’s 
authorization through an accreditation process to do business within the State’s borders or risk 
losing federal funds under Title IV” (Mannapperuma, 2015, p. 545).   

The federal Title IV program requires institutions to disclose information related to 
several areas relevant to consumer protection, “including institutional information and 
characteristics, student financial aid information, health and safety programs, student outcomes, 
athletic programs, and student loan information” (Commission on the Regulation of Postsecondary 
Distance Education, 2013, p. 24). The U.S. Department of Education decides eligibility for 
Title IV participation and assigns a financial responsibility composite score to the State 
educational institution or its Education Management Organization (EMO) that must be between 
1.0 – 1.5 (Commission on the Regulation of Postsecondary Distance Education, 2013). Even though 
the rule is not enforced, if the institution’s eligibility score is less 
than that specified, the “institution can continue to receive Title IV funds for up to three 
years before losing eligibility” (Commission on the Regulation of Postsecondary Distance 
Education, 2013, p. 5).  

Per an article by Hentschke, Oschman and Snell (2002), “Education Management Organizations 
(EMOs) are for-profit firms that provide whole-school operation services to public school 
agencies” (p. 1). Despite objections within the education profession, “EMO’s have grown 
exponentially in the last two decades, reaching a total of approximately 36 companies 
(estimated in 2002) operating in more than 24 States (estimates in 2002) and affecting some 
368 institutions of higher learning (estimates in 2002)” (Hentschke et al., 2002, pp. 1-16). 
This practice is referred to as “privatization of public schools” (p. 15) and since the article 
was published in 2002, the numbers in both the U.S. and abroad now have expanded to the point 
that an accurate accounting is not available as these EMO’s now include elementary schools, 
charter-schools and public-school districts (Hentschke et al., 2002).  

A stunning U.S. Department of Justice (2015, November 16) decision was reached in a landmark 
global settlement with Education Management Corp. (EDMC), the second-largest for-profit 
education management company (EMO) in the U.S. (U.S. Department of Justice, 2015, November 16). 
The 8-year-old case (begun in 2007) reached a settlement with the Department of Justice 
and Pennsylvania’s Attorney General’s office when EDMC agreed to pay $95.5 million to settle 
claims it “illegally paid recruiters and exaggerated the career-placement abilities of its 
schools” (p. 1) located in the U.S. and Canada. In addition, the Education Management Corp. 
(EDMC), which runs 110 schools in 32 states and Canada, will forgive an additional $102.8 million 
in students’ loans it made to 80,000 former students, per U.S. Attorney General Loretta Lynch 
(Mandak & Tucker, 2015, Associated Press 2:49 PM EST). 

Global Perspective on Workforce Readiness Skills
The issue of online and traditional college students not having the needed skill sets employers 
want from recent students is not solely endemic to the U.S. business environment. Recent 
studies drawn from global references suggest the issue of college students’ skill sets 
in their field of study not matching those needed by organizations should be measured on a 
pandemic scale. Studies from China, Romania, Tajikistan, the Philippines and the European Union 
produce facts that show they are facing the same issues with hiring talent as their U.S. 
counterparts (Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & 
Buhaş, 2015; Youhang & Dongmao, 2015).  Chanco’s (2015) study, based on a National 
Statistics Office of the Philippines (2015, October) report, investigated the correlation 
between college students’ skill sets not matching the requirements for jobs and the 
unemployment rate for the Philippines, and “found that 22.2% percent of the unemployed were 
college students with Master level degrees or higher” (p. 1). 

Jackson’s (2014) study of factors influencing job attainment in recent Bachelor degree students 
in Australia “…illustrate[s] the true significance of workforce readiness skills development or 
the lack thereof, on the global economic labor markets” (Jackson, 2014, p. 136). Using a survey 
completed each year by the Australian Graduate Survey (AGS) in 2011 and 2012, with sample 
populations (n) of 28,246 (2011) and 28,009 (2012), Jackson found that Bachelor degree students 
noted a 9% decline in graduate full-time employability since 2008 (Graduate Careers Australia, 
2012b). This decline is attributed to two economic factors: “…the global financial crisis and 
economic stagnation that is ongoing in the UK and U.S. labor markets” (Jackson, 2014, p. 136). 
Studies by Accenture (2013) and Purcell, Elias, Atfield, Behle, Ellison & Luchinskaya (2013) 
in Jackson (2014) suggest, “…that the decline in graduate full-time employability and lower 
starting salaries can be blamed on the economic stagnation in the UK and the U.S.” (p. 136).  

Bondarenko’s (2015) study of the shortage of professional skills and qualities of workers in the 
Russian labor market found that the qualification structure of the employees is not well balanced. 
Per Bondarenko (2015), “The problem of lack of balance is linked to, (1) the deficiency of workers’ 
qualifications (workers have qualifications that are lower than what is required by the employers), 
and (2) personnel’s over-qualification (qualifications that are higher than what is required)” 
(Bondarenko, 2015, p. 120). This deficiency of worker’s qualifications (workforce readiness skills) 
was attributed to multiple factors with the highest deficiency rating of 75%, attributed to workers 
not able to “solve unforeseen job problems and accomplished tasks on their own” 
(Bondarenko, 2015, p. 133).  

Another study completed in post-Soviet Tajikistan by Jonbekova (2015), University of Cambridge, 
examines employers’ perspectives of university students’ skills and preparation for employability 
(Jonbekova, 2015). This thematic analysis of employers and secondary data, “points to a 2-year 
decline in the quality of higher education in Tajikistan, thus widening the gap between learned 
workforce readiness skills and those required by employers” (p. 169). The findings of Jonbekova’s 
(2015) study produce an answer to the question of workforce readiness skills being taught not 
aligning with employers’ needs, and make a valid argument for a global solution. Per Jonbekova 
(2015), “Employers’ perspectives suggest that the reform of the education sector without the 
creation of more decent job opportunities will likely exacerbate the current skills mismatch in 
Tajikistan” (p. 169).

Not only are recently formed nations having an issue with workforce readiness skills not matching 
what employers need, but older civilizations, such as Turkey, are feeling the effect of the problem. 
Per a study by Alpaydin (2015), “There are various findings indicating that the mismatch between 
qualification (degrees) and skill is significantly high in Turkey” (p. 945). Turkish officials and 
employers are calling for further studies in three areas; “labor forecasting, skills need relating 
to said forecast and participation and contributions of parties in the educational process” 
(Alpaydin, 2015, p. 945).

Government Perspective of Workforce Readiness Skills
Per a combined study report from the U.S. Departments of Labor, Commerce, Education and 
Health & Human Services, on July 22, 2014, there is a need to “expand and improve 
access to labor market, occupational, and skills data and continue basic research on labor 
markets and employment” (pp. 21-22). Per the Federal Government’s (2014) report, “More 
evidence is needed to fill gaps in knowledge, improve job training programs, inform 
practitioners about adopting promising strategies, and expand proven models to address the 
needs of specific groups of workers, industries, communities and institutions” 
(U.S. Departments of Labor, Commerce, Education and Health 
& Human Services, 2014, p. 21).

The need for this study is justified by the fact that college students, both domestic and 
international, may not be graduating with the skill sets they need for employability as an 
entry-level worker in today’s organizations. This supports the purpose of this study, which 
is to investigate the differences between workforce readiness skills acquired by students 
from online college pathways and traditional college pathways, from the Bachelor program at 
a large Western Christian university, as determined by their CareerEDGE Employability 
Development Profile (EDP) instrument scores. This research study will investigate this 
disconnect and apparent gap in knowledge to define, to a measured degree, the problem statement: 
It is not known if there is a difference in scores on the CareerEDGE Employability Development 
Profile (EDP) of students from online college pathways and traditional college pathways. 

The Executive Office of the President (2014, January 7) has outlined four (4) initiatives 
addressing barriers that low-income students face in entering college, which they suggest will 
increase college attendance and graduation for that class of student.

The first initiative listed is “I. Connecting more low-income students to colleges where they 
can succeed and encouraging completion once they arrive on campus.” (Executive Office of the 
President, 2014, January, p. 4). The language of the report suggests that students are to blame 
for failing to graduate because they choose colleges whose students are smarter and better prepared 
than those coming from low-income families. Quoting the study, “many low-income students 
choose a college that does not match their academic ability” (Executive Office of the President, 
2014, January, p. 4). 

The second initiative is “II. Increasing the pool of students preparing for college” 
(Executive Office of the President, 2014, January, p. 6). The report wants this pool of students 
to begin preparing at the 8th-grade level. Quoting the study, “we also need to reach students 
earlier to increase the pool of low-income students ready for college” (pp. 6-7). 

The third initiative is “III. Reducing inequalities in college advising and test preparation.” 
(Executive Office of the President, 2014, January, pp. 7-8). The report states that counselors 
from schools that serve low-income families see “1,000 students per counselor versus 470 
students per counselor nationally” (Haskins, Holzer & Lerman, 2009, Economic Mobility 
Project, pp. 43-44). 

The fourth initiative is “IV. Seeking breakthroughs in remedial education.” (Executive Office of 
the President, 2014, January, pp. 8-9). The reasoning is that students from low-income families 
are going to “enter college underprepared to succeed, and remediation needs at 
four-year institutions are greatest for low-income students” (Executive Office of the President, 
2014, January, pp. 8-9).

The White House has released numerous reports, initiatives and studies after President Obama sent 
Congress a Blueprint for Reform of the Elementary and Secondary Education Act (ESEA) 
(U.S. Department of Education, 2010, March 13), which Congress has not acted on to date. 
The ESEA blueprint was to address issues created by the No Child Left Behind Act of 2001. 
When Congress failed to act on the bill, the administration moved forward by providing 
States flexibility within the law – “as authorized by provisions in the law itself - to 
pursue comprehensive plans to improve educational outcomes for all students, close achievement 
gaps, and improve the quality of teaching” (The White House, 2014, Ready to Work, p. 1). 

To date, 43 States and the District of Columbia have received ESEA Flexibility (U.S. Department of 
Education, 2010). ESEA Flexibility has as its goal, “State and local innovation aimed at increasing 
the quality of instruction and improving student academic achievement” (ESEA Flexibility, 2012, 
pp. 1-3), and has numerous sections on the requirements for State Educational Agencies (SEAs) and 
Local Educational Agencies (LEAs) (U.S. Department of Education, 2012, June 7, pp. 4-35). 

A U.S. Department of Labor, Department of Commerce, Department of Education and Department of 
Health & Human Services, (2014, July 22) government-wide report, under the guidance of Vice 
President Joseph Biden, calls for further study to “determine what information is lacking and 
identify future research and evaluation that can be undertaken to ensure the Federal programs 
invest in effective practices” (p. 1). Some of the findings suggest that for adults, 
“a post-secondary educational degree related to jobs in demand, is the most important determinant 
in earnings and incomes” (p. 1). The closer training is related to actual job or occupation 
real-world requirements, the better the training outcomes. Employer and industry 
engagement strategies will help improve alignment of training to employers’ needs. 

Employers’ Perspective of Workforce Readiness Skills
Carnevale, Gulish and Strohl’s (2015) study breaks down the employers’ role in the $1.3 trillion 
spent annually by educational institutions and employers on formal and informal postsecondary 
education and training. Of that amount, “educational institutions spend $407 billion and employers 
spend $177 billion, up from $140 billion in 1994” (p. 3). “Employer spending on education and 
training increased by 26% since 1994 (or 1.238 percent increase each year)” (p. 3). By comparison, 
college and university 
spending rose by “82% in the same period” (p. 3). Additionally, “employers spend 58% of training 
dollars on Bachelor’s degree-holders, ages 25 to 54, which typically complements a traditional 
college education” (p. 5) and accounts for the core workforce. Annualizing these figures suggests 
that employer spending on training grew by approximately $1.762 billion per year, but no breakdown 
was found of what type of training (hard or soft skills development) this amount accounts for in 
total training dollars. 
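The annualized figures above can be checked with a short calculation (a sketch; the 21-year window, 1994-2015, is inferred from the report dates rather than stated in the source):

```python
# Check of the annualized spending figures cited above. The 21-year
# window (1994-2015) is an inferred assumption, not stated in the source.
employer_1994 = 140.0   # $ billions (Carnevale, Gulish & Strohl, 2015, p. 3)
employer_2015 = 177.0   # $ billions
years = 2015 - 1994     # 21 years

growth_pct = (employer_2015 - employer_1994) / employer_1994 * 100
per_year_pct = growth_pct / years
per_year_dollars = (employer_2015 - employer_1994) / years

print(round(growth_pct, 1))        # 26.4 -> the report's "26%"
print(round(per_year_pct, 3))      # 1.259 -> near the cited 1.238% per year
print(round(per_year_dollars, 3))  # 1.762 -> the $1.762 billion per year
```

The cited 1.238% per year divides the rounded 26% by 21 years; the unrounded figure comes out slightly higher.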

Sidhu and Calderon’s (2014) study results show that, “More than one-third of business leaders 
(39%) are not confident that U.S. college students are graduating with the skills and competencies 
that their businesses need” (p. 1). A separate 2013 Lumina/Gallup poll report, published 
February 25, 2014, “finds that 96% of chief academic officers at higher education institutions say 
their institution is ‘very or somewhat’ effective at preparing students for the world of work” 
(p. 1). The logical assumption from these facts is that educational institutions believe they are 
equipping the 21.266 million students entering college in 2015 with the correct workforce readiness 
skills needed for employability, but nearly 40% of all employers say that their offering is 
not adequate (NCES, 2015, Education Statistics, May 7). 


“Seventy-one percent (71%) of business leaders who participated in the Lumina/Gallup poll suggest 
that they would hire someone without a post-secondary degree or credentials over an individual with 
a degree, when prioritizing skills and knowledge in the hiring decision process” (Lumina/Gallup Poll, 
2014, p. 23). The conclusions drawn from the survey by the authors suggest that higher educational 
institutions need to reexamine their degree and credentialing programs and bring them in line with 
the skill sets that businesses need most. Most business leaders “(71%) say they currently do 
not partner with any higher educational institution, with only 29% stating that they do have a 
partnership in place” (p. 25). When business leaders were asked what talent, skills and knowledge 
higher educational institutions should develop, the most popular answer was “internships and 
practical on-the-job experience for students at 14%” (Lumina/Gallup Poll, 2014, p. 30).  

The issue, per literature review of the Lumina/Gallup report, seems to be that employers most want 
students to have practical internships and/or on-the-job experience, yet 71% do not attempt to 
rectify the situation. Additionally, the report lists the “second most sought after skill sets as 
communication skills/English speaking and writing skills at 12%” (p. 30). These are the skills that 
employers suggest are necessary and must be taught by current higher educational institutions to 
effect change in the current labor hiring market. In the Lumina/Gallup poll (2014) the employers 
were asked, “What is your business currently doing to help increase the proportion of Americans to 
attain postsecondary degrees, certificates, or credentials?” (p. 31). The responses covered 
10 items (see Lumina/Gallup poll, p. 31), and the response proportions ranged from “2% to 6% as to 
what employers were doing now, but the largest response was 58% of employers were doing nothing” 
(p. 31). “Only 1 in 10 employers provide tuition reimbursement, scholarships or 
internships/mentoring/training or certification opportunities for employees” (p. 31). 

Parasuraman and Prasad’s (2015) extensive study of the acquisition of corporate employability 
skills found convincing evidence to support the conclusions of Sidhu and Calderon (2014) 
and the Lumina/Gallup Poll (2014). The three studies suggest that employers and higher 
educational institutions must start collaborating and/or forming partnerships to ensure that 
newly graduating students have the experience and workforce readiness skills needed by today’s 
employers. Grant (2015) not only agrees with Parasuraman and Prasad (2015), Sidhu and 
Calderon (2014) and the Lumina/Gallup Poll (2014), but also presents her own example, a Learning 
Blueprint, which is a set of five program requirements based on Credentialist Theory 
(Collins, 1979). The program requirement guidelines rest on the premise that employers 
and educational institutions collaborate (form partnerships) to enhance employee credentialing 
and training, as well as increasing workforce readiness skills development (Grant, 2015). 

Using the facts previously stated, one could extrapolate that of the 21.266 million students who 
entered college in 2015, the roughly 40% whom employers view as unprepared equates to approximately 
8.51 million students not able to secure employability because they lack the needed workforce 
readiness skills that they are now required to pay for. This is in line with the statistics from 
the U.S. Department of Labor, Department of Commerce, Department of Education and Department of 
Health & Human Services (2014, July 22) report calling for further research in each 
department’s area of interest.
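The extrapolation above reduces to a single multiplication (a sketch; the 40% employer share is rounded from the 39% reported by Sidhu and Calderon, 2014):

```python
# Extrapolation sketch: 2015 college entrants times the share of
# employers (~40%, rounded from 39%) reporting a skills gap.
entering_2015 = 21.266e6   # NCES (2015) enrollment figure cited in the text
employer_share = 0.40      # rounded from Sidhu & Calderon's (2014) 39%

affected = entering_2015 * employer_share
print(round(affected / 1e6, 2))   # 8.51 million, matching the text
```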

Education’s Perspective of Workforce Readiness Skills 
Zhao’s (2015) study, A World at Risk: An Imperative for a Paradigm Shift to Cultivate 21st Century 
Learners, argues that the most popular education reforms in the U.S. have focused on fixing past 
mistakes, and that new strategies are simply “doing the wrong thing more right” (p. 1). The author 
suggests that to address the ever-changing global and technological environments of business, 
“a new educational paradigm must be initiated” (p. 2). One of the imperatives for a paradigm shift 
comes from Lindsey and Rice (2015), who collected data from 865 students; the data were analyzed 
using the Situational Test of Emotional Management (STEM), with findings validated using Item 
Response Theory and Latent Class Analysis, per Matthias von Davier (2014, in Lindsey & Rice, 
2015). Their findings suggest that students taking at least one online class scored significantly 
higher on the test and benefitted from the “time, training, experience and practice of 
interpersonal skills (soft skills) development” (p. 127). 

Iuliana, Dragoș Mihai and Mitran (2014) discuss employability skills as a starting point for 
the redesign of educational curriculums. The authors believe that university and college 
management, by developing curriculums that concentrate on transversal skills (soft skills), 
could exert pressure on the representatives of the labor market (hiring authorities), “in the 
context of the policies and strategies targeted per social needs” (pp. 237-238). Their study 
analyzed eight national and international studies, dating from 1998 to 2012, to identify the 
soft skills that employers believe to be most important to their firms. 

Iuliana et al. (2014) suggest a list of 15 soft skills that employers felt were most important 
for newly graduated economics students, though these skills would aid all educational fields as 
well. Of the 15 soft skills listed, some split certain disciplines into two categories, such as 
“TC 14 – ability to propose effective solutions; TC 15 – ability to generate effective decisions” 
(pp. 240-241). This study’s research questions cover nearly all of the Iuliana et al. (2014) list, 
as well as some additional workforce readiness skills and demographic questions not covered by 
Iuliana et al. (2014).

Shea (2015), Editor-in-Chief of the Journal of Online Teaching and Learning (JOLT), in the June 1, 
2015 Online Learning Journal (OLJ) edition, examined several papers concerning collaborative online 
learning environments, such as a Community of Inquiry-based (CoI) instructional framework 
(Hayes, Smith & Shea, in Shea, 2015) and Transformational theory in curriculum design. 
The study suggests that a productive conversation should be forthcoming among educational faculty 
and staff concerning the commonalities and distinctions between the two models, particularly in 
the context of online learning. 

The issue of educational institutions not preparing students with the needed soft skills 
development has a history in multiple global literature review sources. Sharma and Sharma’s 
(2010) study examined engineering students in India and determined that despite being in one 
of the more sought-after professions, “these students are struggling with the problem of 
centralization and archaic examination systems at India’s educational institutions, which is 
detrimental to student learning” (p. 39). Sharma and Sharma (2010) suggest that curriculum 
and methodologies need to be restructured at the institutional level by standardizing the 
training content across India, so the needs of employers will be met regardless of where the 
student comes from. The authors suggest that the design of new curriculums must include soft 
skills such as, “communication skills, interpersonal skills, group dynamics skills, teamwork 
skills, body language skills, business etiquette skills, selling skills, presentation skills 
and confidence building skills” (p. 41).  

Essary’s (2014) study of how Athens State University, a small Alabama college, can gain 
competitive advantage shows that identifying external factors in education, such as 
“changing student demographics and students’ demand for flexibility, can increase 
enrollment, increase revenues, reduce cost and help small colleges and universities to 
remain competitive” (p. 134). Athens State University (ASU) administrators, staff and 
faculty were included in the qualitative case study, which used interview questions to 
determine what areas in traditional and distance learning courses will secure competitive 
advantage at ASU. The findings suggest that “online distance learning courses are 
increasing due to non-traditional students need for flexibility, as non-traditional 
students (those 25 and older with either full or part-time jobs or family commitments) 
made up the majority (85.4%) of their Spring 2009 student body, up from 69.61% of their 
Fall 2009 enrollment figures” (p. 131). 

Weng’s (2015) article, Eight Skills in Future Work, posits that there are eight future job 
skills that students will need to be effective in attaining employability in any industry 
or specific field of study. The eight skill sets are designed around the technologies and 
global interactions needed by a 21st Century workforce. These skills, per the author, will 
define cross-cultural competences to “(a) function effectively within a new cultural 
context and/or (b) interact effectively with people from different cultural backgrounds” 
(Wilson, Ward & Fischer, as cited in Chiu, Lonner, Matsumoto & Ward, 2013, p. 844). 

The list of eight skills starts with technologies, which will be indispensable and have a 
dramatic influence on human life, per the author.  The next two are computational thinking 
and new-media literacy, where math and social science skills are acquired. These are 
followed by “sense-making, developing three intelligences (SI, EI and CQ), design mindset, 
novel and adaptive thinking, and management of cognitive load” (p. 421).  

Students’ Perspective of Workforce Readiness Skills 
Per Brill, Gilfoil and Doll’s (2014) study, “minimal work has been done to develop and validate 
the tools that are needed to assess soft skills” (p. 175). Their study examined forty (40) 
graduate MBA students from nine (9) courses using the McCann Soft Skills Assessment Tool 
(MSSAT) (McCann.edu, 2015) to evaluate the students in six (6) soft skills areas: “leadership, 
teamwork, critical thinking, logical reasoning, communication, and holistic thinking” 
(Brill et al., 2014, p. 175). 

The written test examined students’ abilities in the above skill areas, which were verified 
against their instructors’ rating scores of the same student sample population. The results 
showed significant correlations between leadership and communication in both the test scores 
and the instructors’ rating scores, but empirical validation did not exist for the remaining 
four (4) skill areas. This means that students believe they possess the teamwork, critical 
thinking, logical reasoning and holistic thinking skills needed for employability, but their 
instructors do not believe this is an accurate appraisal. 

A study by Burns (2011) that centers on the adult learner posits that an “estimated 76 
million workers from the baby boomer’s generation will retire by the end of 2010” (p. 2). 
Per Reeves (2005, in Burns, 2011), these retirements will shrink the pool of competent 
workers in the 35-44 age group by 19%, while increasing workers in the 45-54 age group by 
21%. The findings suggest that the more education an individual has, the more likely they 
are to be employed, and these findings are substantiated by the U.S. Department of Labor 
(2010) report. 

A study by Mitchell, Pritchett and Skinner (2013) of MBA students suggests that the 
integration of soft skills in the curriculum, particularly skills in communication 
(both written and oral), ethics, diversity, and leadership, was statistically significant 
to this population. Additionally, a study by Iyengar (2015) investigated MBA degree holders 
and the soft and hard skills that matter to employers. The author suggests that MBA holders 
should have hard skills pertinent to their chosen area of specialization, such as “creativity, 
quantitative analytical, as-well-as strategic skills and competencies to manage innovation and 
policy” (p. 10). Moreover, soft skills such as problem solving, communication skills (both 
written and oral), and leadership abilities to inspire, guide, steer, and manage teams to work 
towards common goals and lead by example are expected by today’s organizations. The author 
suggests that requirements change as markets change, and educational institutions need their 
curriculums to change to better reflect current trends and market demands. 

In an article by Kyllonen (2013), a discussion panel of recent college students and two 
employers, hosted by CNN’s Christiane Amanpour, “suggest that testing for cognitive 
skills were not important to employers as much as they are to education” (p. 18). The results 
indicated that an “increased awareness of non-cognitive skills, such as those associated with 
human-capital theory (Collins, 1979) are appearing more often in economics literature” (p. 18).   

Iliško, Skrinda and Mičule (2014) investigated Latvian education as a ‘future-facing activity’ 
(Facer & Sandford, 2010, p. 74). The authors suggest that students play a significant role 
within the interconnected framework of cultural, economic, political and ecological dimensions. 
This interaction guides students when deciding what career path they will study, and it should 
engage educators in the process of developing more nuanced and alternative trajectories of 
preferable future scenarios by “defining responsibilities and consequences in one’s personal 
actions” (p. 91). 

Mulig’s (2015) study of the high cost of graduate school loans presents some troubling figures 
concerning whether a graduate degree is worth the increasing cost of attainment. The author 
suggests that these cost increases continue, even as enrollment soars (which should lower cost 
to students), due to a “controversial practice called differential tuition” (p. 21), where 
educational institutions charge higher tuition for courses that are more popular. 

Contrasting Views: 
Per a study by Soulé and Warrick (2015), 21st Century learning must encompass core knowledge 
instruction and essential skills (soft skills) for success in today’s labor markets. The core 
knowledge and essential skills, “known collectively as the 4Cs: critical thinking and 
problem-solving, communication, collaboration, creativity and innovation” (p. 181), need to be 
incorporated into any educational framework for the future. The authors suggest that a 21st 
Century educational framework must have these disciplines “working together, not in 
isolation, which supports the teaching and learning of 21st century skill outcomes” (p. 183). 

Additionally, Britt’s (2015) study of online education posits a contrasting view of how 
educational institutions can better serve students in effectively engaging with 21st Century 
workforce readiness skills. The author suggests that educational institutions should “require 
the creativity and imagination of the instructor to redesign the learning experience and adapt 
it to the online platform” (p. 399). The article by Amrein-Beardsley, Holloway-Libell, Cirell, 
Hays and Chapman (2015) agrees with Britt (2015), even though their discussion panel’s focus 
was that of “teacher evaluation of rational rule-based teaching as promoting teacher expertise” 
(p. 3). 

Amrein-Beardsley et al. (2015) suggest that “current models of teacher evaluation do not fairly 
evaluate teacher behaviors that increase instructional flexibility, creativity, and risk-taking” 
(p. 3). Teacher and student testing are the two main standardized and quantifiable tools used to 
measure and evaluate teachers on “instructional design, pedagogy, educational outcomes, student 
learning and achievement” (p. 1). The issue with these observational rubrics is that the teacher 
is measured on whether students’ test scores improve each time the students are tested, “but 
no thought is given to whether the student actually cares what score they receive is included in 
the evaluation” (p. 2).  

The second method is observation of teachers in practice, where issues are also found, as teacher 
qualities and practices are measured using tangible, measurable domains (e.g., preparation, 
organization, classroom and time management) (Amrein-Beardsley et al., 2015). The major issue with 
student and teacher testing is that the rubrics do not take into consideration that teaching is a 
much more complex social practice and “not one lending itself to reductionism” 
(Amrein-Beardsley et al., 2015, p. 3). 

An article by Cappelli (2015) concerning the workforce readiness skills of students not matching 
what employers say they need produces a startlingly contrasting view of the situation. Cappelli 
(2015) posits that there is no such skills gap, skills shortage, or skills mismatch of any type; 
rather, the issue is “student over-education” (p. 251). Cappelli (2015) suggests that students, 
especially in K-12 public education, due to policy decisions, are not graduating with the basic 
skills they should have. He further suggests that the second complaint concerns job-related skills 
associated with engineering and information technology (IT) specialists, and refers to this as a 
“skills shortage” (p. 252).

The final concern, which the author states is more common outside the United States, is that at 
any given time the supply of skills and the demand for skills could be disharmonious in one 
direction or the other: “oversupply or undersupply” (p. 252).  Cappelli (2015) explains that this 
situation could occur with respect to “either labor markets or educational credentialing” (p. 252) 
and should be referred to as “skills mismatches” (p. 253). Cappelli’s (2015) disagreement with the 
studies completed by other authors (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & 
Buhaş, 2015; Youhang & Dongmao, 2015; Sidhu & Calderon, 2014; U.S. Department of Education, 
2014a; White House, 2015a) does not stop at workforce readiness skills alone, “but disputes the 
predictions of labor shortages occurring by 2010 as well” (p. 256). 

Cappelli (2015) references several seminal authors, such as Carnevale (2006), who predicted that a 
labor shortage was coming. Cappelli (2015) dismisses these assertions as simply a misreading of the 
facts, suggesting that only the rate of increase in the labor force was expected to slow due to 
baby boomers retiring. The author references the Society for Human Resource Management (SHRM, 2003) 
report that large numbers of employers in the early 2000s were preparing for a “labor shortage 
predicted to occur by 2010” (p. 256). The author dismisses the facts presented by the SHRM 
organization (the organization that tracks labor market demographics as a part of its basic 
work product) as “projections that never came true” (p. 256). 

Cappelli (2015) seems unable to discern the difference (or lack thereof) between personnel labor 
shortages and workforce readiness skills shortages, placing the two into different categories 
without realizing that the two are synonymous in today’s labor market. The major issue with 
Cappelli’s (2015) article is that the author never discusses the workforce readiness skills 
employers say they need, nor does he distinguish between soft and hard skill sets. The author 
does present an exhaustive historical background to bolster his assumptions concerning the issues 
of workforce readiness skills not matching what employers say they need in a 21st Century economy.  

Summary
This non-experimental, quantitative, causal-comparative research study is best suited because 
the survey questions are analyzed through statistical procedures that produce numerical values, 
yielding findings supported through statistical testing of all pertinent variables 
(Blakstad, 2015; Dillman, Smyth, & Christian, 2014; Ornstein, 2013). 
Per the literature reviews dating from 2002 to the present day (Allen & Seaman, 2002, 
2003-2011, 2012, 2013, 2014, 2015), this new population of students will most likely graduate 
with the wrong workforce readiness skills for the jobs they seek. This position is further 
supported by more current local, state, federal and global studies (Jonbekova, 2015; 
Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; Sidhu & Calderon, 2014; 
U.S. Department of Education, 2014a; White House, 2015a; Youhang & Dongmao, 2015).

Numerous studies concerning workforce readiness skills have been published in the 3 years since 
the Dacre-Pool, Qualter and Sewell (2013) study (Deepa & Seth, 2013; Lindsey & Rice, 2015; 
Weaver & Kulesza, 2014). However, most of these studies are approached from only one of five 
perspectives (Hart Research Associates, 2015; Iliško, Skrinda & Mičule, 2014; Iuliana, Dragoș 
& Mitran, 2014; Iyengar, 2015; Kelly, 2015; Kyng, Tickle & Wood, 2013; Po, Jianru & 
Yinan, 2015; Săveanu & Buhaş, 2015; Soulé & Warrick, 2015; U.S. Departments of Labor, 
Commerce, Education and Health & Human Services, 2014; Youhang & 
Dongmao, 2015; Zhao, 2015). 

The call for further research concerning workforce readiness skills is exacerbated by the affected 
population of approximately 21.266 million U.S. students who attended colleges and universities 
in 2015 (NCES, 2014, Table 1, 1990 through fall 2013). Moreover, those figures are projected to 
increase by approximately 330,000 students in each following year (NCES, 2014, Table 1, 
1990 through fall 2013, p. 174). This study will further extend past research through the 
investigation into employability skills of online college pathways students and traditional 
college pathways students, to determine if a difference is found between the two educational 
groups. This study will add to the body of knowledge called for in the U.S. Departments of Labor, 
Commerce, Education and Health & Human Services, (2014, July 22) report; which illustrates a 
need to “expand and improve access to labor market, occupational, and skills data and continue 
basic research on labor markets and employment” (pp. 21-22). 

Recent studies drawn from diverse demographic and geographic global literature reviews strongly 
suggest that the issue of college students’ employability skills in their field of study not 
matching those needed by executive hiring authorities should be measured on a global scale 
(Chanco, 2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015). Studies from such diverse geographic 
locations as China, Romania, Tajikistan, the Philippines and the European Union, to name but a few, 
produce facts showing that they face the same issues with hiring qualified college graduates as 
their U.S. counterparts (Jonbekova, 2015; Po, Jianru & Yinan, 2015; Săveanu & Buhaş, 2015; 
Youhang & Dongmao, 2015). This supports the problem statement of this study: It is not known if 
there is a difference in scores on the CareerEDGE Employability Development Profile (EDP) of 
students from online college pathways and traditional college pathways. 

Theoretical Foundations’ Association with the Study
The model upon which the study instrument is built is that of a questionnaire, normally regarded as 
a survey research method design (Dillman, Smyth, & Christian, 2014; Ornstein, 2013). The theory 
supporting this design is based on probability theory, specifically frequency probability 
(Brasoveanu, 2012), where (n) denotes the sample drawn from the population; the sample is not fixed 
in advance, as it depends on a robust population size (N), and frequency probability is defined 
over the sample space (the set of all possible outcomes) (Blakstad, 2015; Brasoveanu, 2012; 
Penrose, 1995).
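The frequency-probability idea behind the survey design can be illustrated with a brief simulation (a sketch; the coin-flip sample space is an illustrative assumption, not part of the study): over many independent trials, the relative frequency of an outcome approaches its theoretical probability.

```python
import random

random.seed(2)  # fixed seed so the illustration is reproducible

# Sample space of a fair coin: {heads, tails}, each with p = 0.5.
# The relative frequency of heads over many trials approaches 0.5.
trials = 100_000
heads = sum(random.random() < 0.5 for _ in range(trials))
freq = heads / trials
print(round(freq, 2))  # close to 0.5
```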

Human Capital Theory (HCT):
What is HCT? Human Capital Theory of education (Becker, 1964; Schultz, 1961; in Walters, 2004, 
pp. 99-100) is an economic variant of Technical Functional Theory (Collins, 1971). The 
foundation of Human Capital Theory suggests that increases in education are directly related to 
increased demand for skilled labor.

Theories Supporting Theoretical Foundation:
Two major theories (sub-theories) of Human Capital Theory (HCT) are discussed in detail, as they 
directly relate to the theoretical foundation of this study by demonstrating operational models 
(Durkheim, 1895; Collins, 1979). These theories explain possible causes of the disconnect between 
the skills being taught and the skills executive hiring authorities say they need from their 
entry-level workforce.  

Functionalist Theory Operational Model:
What is Functionalism? Functionalist theory in education posits “that it is the role of 
education to transmit core values and social control through the attributes that support the 
political and economic systems that fuel education” (Durkheim, 1982).

Credentialist Theory Operational Models:
What is a Credentialist theory model? Collins’s (1979) Credentialist theory model posits that 
employers use credentials (degrees, certifications, etc.) as a contributory factor in elevating 
educated workers to better jobs, and that these more highly educated workers enjoy more lucrative 
jobs.

Sample Population
The general population size is 43,725 (N) respondents from the Bachelor program at a large 
Western Christian university. The true effect sample size for validity was 210 (n) respondents, 
determined using the G*Power 3.1.9.2 instrument (Faul, Erdfelder, Lang & Buchner, 2007a, 
2007b). G*Power test the probability of correctly rejecting the test hypotheses when the 
alternative hypotheses is true (Faul, Erdfelder, Lang & Buchner, 2007a, 2007b). This study 
will explore the difference between two populations, online college pathways students [O] = x1, 
and traditional college pathways students [T] = x2, by examining the CareerEDGE Employability 
Development Profile (EDP) survey instruments test scores of Bachelor program students at a 
large Western Christian university (Laerd Statistics, 2015). 
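The G*Power sample-size determination can be approximated in code. This is a minimal sketch under assumed inputs (a two-tailed, two-sample t-test with medium effect size d = 0.5, α = .05, and power = .95; these parameter values are assumptions not stated in this section). With those inputs, the normal approximation lands within a respondent or two per group of the reported total of 210.

```python
from math import ceil
from statistics import NormalDist  # stdlib inverse-normal CDF (Python 3.8+)

def n_per_group(d, alpha=0.05, power=0.95):
    """Normal-approximation sample size per group for a two-sided,
    two-sample t-test with standardized effect size d (Cohen's d)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for a two-tailed test
    z_beta = z(power)            # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

n = n_per_group(0.5)   # assumed medium effect size
print(n, 2 * n)        # 104 208 -> close to G*Power's exact t-based 210 total
```

G*Power's exact computation uses the noncentral t distribution rather than the normal approximation, which is why it returns a slightly larger per-group figure (105, for a total of 210).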

Research Variables Defined
Dependent Variable Group
The two dependent variable groups being investigated, and the two groups directly analyzed in 
this study, are Factor 2, Experience Work/Life Skills, and Factor 4, Generic Skills 
(Dacre-Pool & Sewell, 2007) (Appendix F, pp. 186–187). The 18 CareerEDGE Employability 
Development Profile (EDP) survey instrument skills questions are listed in Appendix D 
(continued, pp. 182–184).  

Independent Variable Group
The independent variable groups being measured are online college and university pathways 
students and traditional college and university pathways students. These variables are 
aligned to the research questions, hypotheses, and theoretical foundations, thus aligning 
with the Problem Statement: It is not known if there is a difference in scores on the 
CareerEDGE Employability Development Profile (EDP) of students from online college pathways 
and traditional college pathways. The study need was determined to be of great importance 
after careful analysis of more than 185 studies and numerous article reviews dealing with the 
topic of employability and workforce readiness skills, which supports the argument for a 
current need to add to the body of knowledge associated with the study topic. After analysis 
of the literature concerning which employability skills are considered most important by 
employers (Carnevale, Gulish & Strohl, 2015), education (Lindsey & Rice, 2015; Zhao, 
2015), students (Brill, Gilfoil & Doll, 2014), government (U.S. Departments of Labor, 
Commerce, Education and Health & Human Services, 2014) and the global perspective 
(Jonbekova, 2015; Săveanu & Buhaş, 2015), the following research questions needed to 
be answered. 

RQ1:    Is there a difference in scores on the CareerEDGE Employability 
Development Profile (EDP) factor 2, work & life experience skills, of students who 
graduate with a bachelor’s degree at a large Western Christian university through their 
online college pathway and their traditional college pathway? 

H1o:    There is no statistically significant difference in scores on the 
CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience skills, 
of students who graduate with a bachelor’s degree at a large Western Christian university 
through their online college pathway and their traditional college pathway.

H1a:    There is a statistically significant difference in scores on the 
CareerEDGE Employability Development Profile (EDP) factor 2, work & life experience 
skills, of students who graduate with a bachelor’s degree at a large Western Christian 
university through their online college pathway and their traditional college pathway.

RQ2:    Is there a difference in scores on the CareerEDGE Employability 
Development Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s 
degree at a large Western Christian university through their online college pathway and their 
traditional college pathway?

H2o:    There is no statistically significant difference in scores on the 
CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who 
graduate with a bachelor’s degree at a large Western Christian university through their 
online college pathway and their traditional college pathway.

H2a:    There is a statistically significant difference in scores on the 
CareerEDGE Employability Development Profile (EDP) factor 4, generic skills, of students who 
graduate with a bachelor’s degree at a large Western Christian university through their online 
college pathway and their traditional college pathway. 

The conclusions drawn from the Lumina/Gallup Poll (2014) survey suggest that higher 
educational institutions need to reexamine their degree and credentialing programs and bring 
them in line with the skill sets that businesses most need. When business leaders were asked 
what talent, skills and knowledge higher educational institutions should develop, the most 
popular answer was "internships and practical on-the-job experience for students at 14%" 
(Lumina/Gallup Poll, 2014, p. 30). 

The issue seems to be a lack of communication and action between educational institutions and 
hiring authorities, "as a majority (71%) of business leaders say they currently do not 
collaborate or partner with any higher educational institution, with only 29% stating that 
they do have a partnership in place" (Lumina/Gallup Poll, 2014, p. 25). The following section, 
Methodology, will illustrate and describe in greater detail the research design, quantitative 
methodology used, variables investigated, data collection processes, data analysis processes, 
research questions, survey questions, as well as the hypotheses used to answer the problem 
statement of this study.

                                    Chapter 3: Methodology
									
Introduction:   
This is a quantitative, causal-comparative research study to investigate if a difference exists 
between online college pathways students’ skill sets and traditional college pathways students’ 
skill sets. Chapter 3 focuses on the research methodology used in this study. It defines the 
research design, sample population, theoretical foundations (Becker, 1964; Schultz, 1961; 
in Walters, 2004, pp. 99-100), research instrument (Dacre-Pool & Sewell, 2007), data 
analysis, validity and reliability, and any ethical considerations that should be reviewed by 
the large Western Christian university’s Institutional Review Board (IRB) network. The 
invitation (e-mail) introduction letter will discuss the large Western Christian university’s 
Institutional Review Board (IRB) and Informed Consent approval process. The individual steps 
used in the collection of the data are covered in the Data Collection Procedures section, and 
the Data Analysis Procedures section describes the details involved in computation and analysis 
of the raw data (Blakstad, 2015; Brasoveanu, 2012; Boone & Boone, 2012). 

Sidhu and Calderon’s (2014) study results show that, “More than one-third of business leaders 
(39%) are not confident that U.S. college students are graduating with the skills and competencies 
that their businesses need” (p. 1).  The need for this study is illustrated by the general 
population affected by this problem. Approximately 21.266 million students attended American 
colleges and universities in the academic year 2015 (NCES, 2014). The purpose of this 
study is to investigate if a significant difference exists between students from online college 
pathways and traditional college pathways, using the Bachelor program at a large Western Christian 
university, from their CareerEDGE Employability Development Profile (EDP) instrument scores 
(Pool & Sewell, 2007).  The significance of this study is that it will produce statistical data 
that will lead to a more accurate, essential and deeper understanding of the relationship between 
education and application of employability skill sets of current graduates. Per a combined study 
report from the U.S. Departments of Labor, Commerce, Education and Health & Human Services, 
on July 22, 2014, there is a need to “expand and improve access to labor market, occupational, and 
skills data and continue basic research on labor markets and employment” (pp. 22-24). 

The expectation (i.e., the prospect for acquisition of new knowledge from this study) is to 
determine whether a statistically significant difference in skill sets exists between online 
and traditional college pathways students in the Bachelor program. The remainder of this 
chapter will discuss the statement of the current problem under investigation and will list 
the research questions, hypotheses, variables and instrumentation that will be used to gain 
empirical supporting data. 

Statement of the Problem
It is not known if there is a difference in scores on the CareerEDGE Employability Development 
Profile (EDP) of students from the online college pathway and the traditional college pathway 
from the Bachelor program at a large Western Christian university. The pathways described are 
the major modes of delivery of the curriculum to the approximately 43,725 (N) general population 
of students from the Bachelor program of the large Western Christian university in 2017. The two 
different pathways must be investigated together to determine if there is an existing difference 
in educational skill sets being taught. It is important to determine the extent of the difference 
before curricula can be designed to produce better skilled graduates.  

The issue of online and traditional college students not having the employability skills 
executive hiring authorities want from recent graduates is not endemic solely to the U.S. 
business environment (Sidhu & Calderon, 2014).  Recent studies drawn from global references 
suggest that the mismatch between college students’ employability skills in their field of 
study and those needed by employers should be measured on a global scale (Chanco, 2015; 
National Statistics Office of Philippines, 2015).  A sampling of literature reviews (studies) 
from China, Romania, Tajikistan, the Philippines and the European Union shows that these 
countries are facing the same issues with hiring talent as their U.S. counterparts (Chanco, 
2015; Jonbekova, 2015; Po, Jianru & Yinan, 2015; National Statistics Office of Philippines, 
2015; Săveanu & Buhaş, 2015; Youhang & Dongmao, 2015).  

Research Question(s) and Hypotheses: 
This quantitative, causal-comparative research study will add to the current body of research by 
investigating whether a statistically significant difference exists between online college 
pathways students and traditional college pathways students from the Bachelor program at a large 
Western Christian university, using their CareerEDGE Employability Development Profile (EDP) 
instrument scores. The instrument used to determine the research questions for this survey 
questionnaire study is supplied by Dacre-Pool and Sewell (2007).  The permission letter to use 
the instrument and the questionnaire (survey questions) are found in Appendix D (pp. 192–194). 

The research questions align directly with the problem statement: It is not known if there is a 
difference in scores on the CareerEDGE Employability Development Profile (EDP) of students from 
online college pathways and traditional college pathways. The research questions also align with 
the purpose statement of this study: The purpose of this study is to investigate if a statistically 
significant difference exists between students from online college pathways and traditional college 
pathways from the Bachelor program at a large Western Christian university, using their CareerEDGE 
Employability Development Profile (EDP) instrument scores. 

The research questions will be analyzed after completion of the survey questionnaire by online 
college pathway students and traditional college pathway students separately. Each student will 
answer 18 survey questions, and the two dependent variable groups, factor 2, work & life 
experience skills, and factor 4, generic skills, will be evaluated for this study. The Work & 
Life Experience Skills dependent variable group consists of two survey questions and the Generic 
Skills dependent variable group consists of sixteen survey questions. 

All 18 survey questions will be answered by the volunteer sample population of this study, but the 
two dependent variable groups, Factor 2 and Factor 4, will be the focus of the hypotheses testing 
and analysis for this study. The decision to use only two of the five factors for analysis comes 
from the gap in knowledge from literature reviews that show a disconnect between the academic 
skills colleges and universities are teaching (Cai, 2013; Iuliana, Dragoș & Mitran, 2014; 
Maurer, 2015) and the skills current organizations say they need from their entry-level workforce 
(Bessolo, 2011; Brungardt, 2011; Cai, 2013; Lindsey & Rice, 2015; Robles, 2012; Soulé & 
Warrick, 2015). 

Research Questions:
RQ1: Is there a difference in scores on the CareerEDGE Employability Development 
Profile (EDP) factor 2, work & life experience skills, of students who graduate with a 
bachelor’s degree at a large Western Christian university through their online college pathway and 
their traditional college pathway?

H1o: There is no statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) factor 2, work & life experience skills, of students who 
graduate with a bachelor’s degree at a large Western Christian university through their online 
college pathway and their traditional college pathway.

H1a: There is a statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) factor 2, work & life experience skills, of students 
who graduate with a bachelor’s degree at a large Western Christian university through their online
college pathway and their traditional college pathway.

RQ2: Is there a difference in scores on the CareerEDGE Employability Development 
Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a 
large Western Christian university through their online college pathway and their traditional 
college pathway?

H2o: There is no statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a 
bachelor’s degree at a large Western Christian university through their online college pathway and 
their traditional college pathway.   

H2a: There is a statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) factor 4, generic skills, of students who graduate with a 
bachelor’s degree at a large Western Christian university through their online college pathway and 
their traditional college pathway. 

Operationalized Variables: 
The independent variable indicates if a participant is an online college and/or university pathway 
student or traditional college and/or university pathway student from the Bachelor program at a 
large Western Christian university. These variables align to the purpose statement, problem 
statement, research questions, hypotheses and theoretical foundations of this study. The two 
dependent variable groups, factors 2 and 4, associated with the CareerEDGE Employability 
Development Profile (EDP) instrument (Dacre-Pool & Sewell, 2007) are operationalized to 
produce a scaled score index. A scaled score is a total raw score that has been converted onto 
a consistent and standardized scale. "Scaled scores reflect the difficulty of the questions 
when reporting student results. Scale scores are meant to help with the interpretation of 
test results" (Katz & Warner, 2017).

Scaled Score Index:
Factors Under Investigation            Items         Range        Scaled Score
Factor 2 - Experience, Work & Life     6 – 7         2 – 14       165 – 153
Factor 4 - Generic Skills              10 – 25       16 – 112     139 – 43
Total Score                            1 – 26        26 – 196     __________
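The raw-to-scaled conversion described above can be sketched as a simple linear mapping. This is an illustrative assumption only: the actual CareerEDGE conversion is not reproduced in this document, and the endpoint values in the example are hypothetical.

```python
def scale_score(raw, raw_min, raw_max, scaled_min, scaled_max):
    """Linearly map a raw total onto a standardized reporting scale."""
    fraction = (raw - raw_min) / (raw_max - raw_min)  # position within the raw range
    return round(scaled_min + fraction * (scaled_max - scaled_min))

# Hypothetical example: a Factor 2 raw total of 8 on its 2-14 raw range,
# mapped onto an illustrative 40-160 reporting scale
print(scale_score(8, 2, 14, 40, 160))  # midpoint of the raw range -> 100
```

Real scaled-score tables are usually nonlinear (they adjust for item difficulty), so a production conversion would use a lookup table rather than this linear sketch.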

Research Methodology 
A quantitative methodology was selected for this study because a statistical analysis of the 
differences in the dependent variable groups, Factor 2 and Factor 4, are needed. An association 
with the independent variable groups, online and traditional college and university pathways 
in the Bachelor degree program at a large Western Christian university, was needed to determine 
if a difference exists and the possible cause(s) for it. The two different pathways must be 
investigated separately to determine if a statistically significant difference exists in 
educational skill sets being taught and those 21st Century employers say they need. 

A quantitative methodology is used when one needs to determine an answer relating to the 
quantity or amount of something. This is the best methodology compared to others available, such 
as qualitative or mixed methodology, because it will determine if a statistically significant 
difference exists between the independent variable groups, online and traditional college or 
university pathways students. Fray (1996) suggested that the first step in methodology selection 
should concern what information the investigator wishes to know, and then determine what research 
design method is needed to acquire that data in the fewest number of questions. Additionally, 
Fray (1996) suggested that the survey questionnaire method avoids the qualitative responses of 
open-ended questions, which lend themselves to respondent bias. A further rationale for its use 
in this study is that it will statistically determine whether the researcher can defend results 
that answer the research questions using applied social research methods (Trochim, 2006).    

Research Design:
In selecting any research design, seven factors are considered: a) selection of the problem, b) 
selection of participants (sample size), c) selection of instrumentation, d) selection of study 
research design, e) selection of procedure(s), f) selection of data-analysis tool(s) and g) 
interpretation of the findings or results (Gay, 1987). This causal-comparative design is often 
referred to as an ‘ex post facto’ study design because the effect and the alleged cause have 
already occurred and must be studied in retrospect (Gay, Mills & Airasian, 2006). This is the 
most common research design in educational research since it describes conditions that already 
exist (Gay et al., 2006). The basic approach starts with investigation, which involves the 
dependent variables (the EDP instrument survey Factors 2 and 4) and one (or more) independent 
variable groups, online and traditional college degree pathways (Gay et al., 2006). 

This causal-comparative study design is the best choice for this study, as it will use pre-existing 
groups to investigate differences between or among those groups (Schenker and Rumrill, 2004). 
Additionally, the variables often examined in causal-comparative research studies “cannot 
(should not) be manipulated for practical or ethical reasons” (Schenker and Rumrill, 2004, p. 117). 
This is most important in cause-effect experimental studies, but it is equally important to a 
causal-comparative, non-experimental study such as this one, so that future research of an 
experimental type can simply build on the current research design, data collection applications, 
and data analysis applications using the two-tailed independent samples t-test.
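The two-tailed independent samples t-test named above can be sketched as follows. The group scores here are randomly generated and purely hypothetical (the means, standard deviations, and group sizes are illustrative, not study data); only the test procedure reflects the design described.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical EDP factor scores for the two pathway groups (105 per group,
# matching the a priori sample size; distributions are made up for illustration)
online = rng.normal(loc=75.0, scale=10.0, size=105)
traditional = rng.normal(loc=73.0, scale=10.0, size=105)

# Two-tailed independent samples t-test (two-sided is scipy's default)
t_stat, p_value = stats.ttest_ind(online, traditional)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# The null hypothesis is rejected at alpha = 0.05 only when p_value < 0.05
```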

The quantitative methodology, causal-comparative study design (Gay, Mills & Airasian, 2006; 
Schenker & Rumrill, 2004) is the best design for this study compared to others available because 
it will determine statistically if a significant difference exists between the two groups of 
individuals (independent variables group) affected by the degree of workforce readiness skills 
(dependent variable group). The two groups affected are traditional college and university 
pathways students and online college and university pathways students. 

The dependent variable groups are the two factors of the CareerEDGE Employability Development 
Profile (EDP) instrument (Dacre-Pool, Qualter & Sewell, 2013). The 18 survey instrument 
skills questions are those that “employers state is not being taught to the degree that 
employers need by today’s educational institutions” (NACE, 2014, p. 325). Causal-comparative 
research designs focus on determining if a cause-effect (or effect-cause) exists between groups 
of individuals - the independent variables - and a second factor or set of factors - the 
dependent variables. This is the guiding rationale and purpose for the use of this study 
research design (Schenker and Rumrill, 2004). 

The closest alternative study design to causal-comparative in quantitative research is the 
correlational research design (Cohen, Cohen, West & Aiken, 2013). Correlational research 
attempts to determine the relationship of two or more variables (Gay, 1987). Additional study 
designs often used after a non-experimental, causal-comparative study produces results are 
Experimental and Quasi-Experimental research designs (Gay, Mills & Airasian, 2006). With 
non-experimental research designs, there is no expected relationship assumed between variable 
groups. The reasoning for using hypotheses with this non-experimental, causal-comparative 
research design is for greater clarity in the study results section, chapter 4 of the final 
dissertation, and to make future research conducted using either Experimental or 
Quasi-Experimental research designs more insightful (Cohen, Cohen, West & Aiken, 2013; 
Schenker & Rumrill, 2004).   

Population and Sample Selection
The total general population for this study is approximately 43,725 (N) students from the 
bachelor program using both the online college degree pathway and the traditional college 
degree pathway in 2017, at a large Western Christian university. With this large a general 
population to draw from, statistical power for validity and reliability is assured. An a priori 
G*Power test, using effect size d = 0.5 (upper bounds), produced a sample size (n) of 210 
participants with actual power = 0.9501287 (G*Power 3.1.9.2 Software, 2015) (Figure 5, p. 173). 

G*Power Sample Size Calculations (upper bounds) screenshot:
Tests covering both response ranges (lower bounds and upper bounds effect sizes) were run to 
determine the sample size (n) from the estimated general population used in this study 
(G*Power, 2015, ver. 3.1.9.2 Software). The rationale for this decision was that by selecting 
upper and lower effect size percentages, the most accurate sample population size (n) could be 
attained, assuring research results were generalizable to the broadest audience.  

Because exact numbers of the Bachelor program students from online and traditional college 
degree pathways were not available, the following data was used in estimating the general 
populations. A general population of 79,500 (N) is enrolled in the three programs offered, 
Bachelor, Masters and Doctoral, by the large Western Christian university. The 2016 breakdown 
of students was 14,500 Traditional & 65,000 Online students across all three programs 
determined by pathway. The following percentages of the student total population were assigned 
in determining only the Bachelor programs online and traditional college degree pathways 
general populations (N).  

Students % of Population/Pathway:
Online Gen Pop Est.                    Traditional Gen Pop Est.
35,750 Bachelor Students @ 55%         7,975 Bachelor Students @ 55%
26,000 Masters Students @ 40%          5,800 Masters Students @ 40%
3,250 Doctoral Students @ 5%           720 Doctoral Students @ 5%
 
An a priori power analysis was used to determine the true sample size needed for statistical 
power for validity and reliability using the G*Power instrument (G*Power 3.1.9.2 Software, 2015). 
Using input parameters of a two-tailed independent samples t-test to determine the statistical 
difference between two independent means (two groups), the author used an a priori power 
analysis to compute the required sample size, given: α (err prob) = 0.05, power 
(1-β err prob) = 0.95, effect size d = 0.5 (upper bounds) and allocation ratio N2/N1 = 1. The 
output parameters were noncentrality parameter δ = 3.6228442, critical t = 1.9714347, df 
(degrees of freedom) = 208, sample size group 1 = 105, sample size group 2 = 105, total sample 
size = 210 and actual power = 0.9501287 (Figure 5, p. 173, power plot 2). 

The lower bound test used input parameters α (err prob) = 0.20, power (1-β err prob) = 0.80, 
effect size d = 0.30 (lower bounds) and allocation ratio N2/N1 = 1. The output parameters were 
noncentrality parameter δ = 2.1319006, critical t = 1.2857988, df (degrees of freedom) = 200, 
sample size group 1 = 101, sample size group 2 = 101, total sample size = 202 and actual 
power = 0.8015396 (Figure 4, p. 172, power plot 1). Of the adjusted effect sizes of 0.3 (low) 
and 0.5 (high), effect size 0.5 (upper bounds) was chosen, as it produced the 210 respondents 
needed for statistical validity and reliability (G*Power 3.1.9.2 Software, 2015) 
(Figures 4 & 5, pp. 172-173). 
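The G*Power figures above can be reproduced programmatically. The sketch below searches for the smallest per-group n whose achieved power (computed from the noncentral t distribution, the same approach G*Power uses for independent-samples t-tests) reaches the target:

```python
from scipy import stats

def required_n_per_group(d, alpha, power):
    """Smallest per-group n for a two-tailed independent-samples t-test."""
    n = 2
    while True:
        df = 2 * n - 2
        delta = d * (n / 2) ** 0.5                # noncentrality parameter
        t_crit = stats.t.ppf(1 - alpha / 2, df)   # two-tailed critical t
        achieved = (1 - stats.nct.cdf(t_crit, df, delta)
                    + stats.nct.cdf(-t_crit, df, delta))
        if achieved >= power:
            return n, achieved
        n += 1

# Upper bounds: d = 0.5, alpha = 0.05, power = 0.95
n_hi, p_hi = required_n_per_group(d=0.5, alpha=0.05, power=0.95)
print(n_hi, round(p_hi, 7))   # 105 per group (210 total), power ~ 0.9501287
# Lower bounds: d = 0.3, alpha = 0.20, power = 0.80
n_lo, p_lo = required_n_per_group(d=0.3, alpha=0.20, power=0.80)
print(n_lo, round(p_lo, 7))   # 101 per group (202 total), power ~ 0.8015396
```

The noncentrality parameters this yields (δ = 0.5·√(105/2) ≈ 3.6228 and δ = 0.3·√(101/2) ≈ 2.1319) match the G*Power output parameters quoted above.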

Because this is a study consisting of a sample size (n) of volunteers, sampling only adult 
students from the Bachelor program at a large Western Christian university, a site authorization 
application must be submitted to the university prior to study initiation. A request to conduct 
research of any type needs prior approval by the university’s Institutional Review Board (IRB). 
After permission to conduct research has been granted by the IRB network, site authorization can 
begin. The site authorization application will describe the purpose and scope of the research, 
duration of the study, target population, impact on operations and resources, data use and 
potential benefit to the large Western Christian university.  

A sample is a subset of the population being studied.  It is representative of a larger 
population and is used to draw inferences concerning that general population; sampling is a 
research technique widely used in social science to avoid having to measure the entire 
population (Ross, 2005). Because this study’s sampling frame consists of Bachelor degree seeking 
students from a large Western Christian university, it is believed it will represent a population 
that can be generalized to the larger U.S. college population. Because it would not be possible 
to survey the entire U.S. college respondent population (N) individually (a complete list of 
U.S. colleges cannot be compiled due to cost, time restraints and/or unforeseen elements), a 
general population size (N) of approximately 43,725 (7,975 traditional ground campus and 35,750 
online students) from a large Western Christian university was selected. 

Ross’s (2005) guidelines for a convenience sampling frame state, “Representativeness, is where 
a set of sample data refers specifically to the marker variables selected for analysis” (p. 4). 
These guidelines were used to assure a high degree of representativeness between the general 
U.S. population and that of the U.S. college and university populations. The operational 
definition for this sampling technique is ‘convenience sampling’ because it is applied to all 
Bachelor students from one large Western Christian university in the U.S., which is representative 
of the larger U.S. educational population. A convenience sample is one of the main types of 
non-probability sampling methods used in social science research (Allen & Seaman, 2015). 
A convenience sample is made up of people who are easy to reach and the cost is relatively low. 
The sample selection in this study is supplied by the large Western Christian university’s email 
survey manager. The email survey manager will direct students from the Bachelor program of the 
large Western Christian university to use a link to the survey instrument page.  

Instrumentation:
The survey questionnaire instrument for this causal-comparative study is supplied by Dacre-Pool 
and Sewell (2007) using their CareerEDGE Employability Development Profile (EDP) instrument, which 
measures workforce readiness skills from five factor elements (Appendix D, pp. 189-192). The EDP 
was designed specifically for developmental work with students of any higher education institution 
(Dacre-Pool & Sewell, 2007). This diagnostic tool is a self-report questionnaire that asks 
students to rate themselves on different aspects of employability, “as defined by the CareerEDGE 
model” (Dacre-Pool & Sewell, 2007, p. 305). The CareerEDGE Employability Development Profile 
(EDP) instrument questionnaire consists of 18 questions covering a wide range of employability 
skills. The focus of the skills questions for this study concerns dependent variable groups 
Factor 2, Work & Life Experience and Factor 4, Generic skills.

Examples of how the questions are worded are listed below and the answer rubric consists of a 
7-point Likert scale.

Examples:
Factor 2 = SQ-6. I have a lot of work-relevant experience.
Factor 4 = SQ-10. I have good oral communication skills.
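Scoring the instrument amounts to summing the 7-point Likert responses within each factor. The sketch below uses the item counts given in this chapter (two items for Factor 2, sixteen for Factor 4); the item IDs (SQ-6/SQ-7 and SQ-10 through SQ-25) and the response values are assumed for illustration.

```python
# Hypothetical 7-point Likert responses (1 = strongly disagree ... 7 = strongly agree).
# Item IDs are assumed: Factor 2 = SQ-6 and SQ-7 (2 items, raw range 2-14);
# Factor 4 = SQ-10 through SQ-25 (16 items, raw range 16-112).
responses = {"SQ-6": 5, "SQ-7": 4}
responses.update({f"SQ-{i}": 5 for i in range(10, 26)})

factor2_raw = responses["SQ-6"] + responses["SQ-7"]                 # Factor 2 raw total
factor4_raw = sum(responses[f"SQ-{i}"] for i in range(10, 26))      # Factor 4 raw total
print(factor2_raw, factor4_raw)  # 9 80
```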

The samples used for construct validity were an exploratory factor analysis (EFA) comprised of 
(n = 403) respondents and a confirmatory factor analysis (CFA) comprised of (n = 402) respondents 
(Dacre-Pool, Qualter & Sewell, 2013, p. 307) (Appendix F, pp. 198-199). The factor 
reliabilities of the two dependent variable groups used in this study were mean 0.78 (Factor 2 - 
Experience Work/Life skills) and mean 0.63 (Factor 4 - Generic skills). The final model 
produced the best fit: root mean square error of approximation (RMSEA) = .057, where values 
between .05 and .08 are deemed best fit; normed fit index (NFI) = .96, where values above 
.90 are deemed acceptable; and comparative fit index (CFI) = 0.91, where values above 
0.90 are deemed best fit (Dacre-Pool et al., 2013). The use of the survey instrument in 
this study will contribute directly to answering the research questions through statistical 
analysis of the scores attained by the two independent variable groups, online and traditional 
college and university pathways students. The survey questionnaire consists of eighteen 
dependent variables, identified as the CareerEDGE Employability Development Profile (EDP) survey 
questions from factor 2, work & life experience skills and factor 4, generic skills. 

Validity:
The survey questionnaire instrument for this quantitative designed study was supplied by 
permission of Dacre-Pool & Sewell (2007) (Appendix D, pp. 191-194). There are three 
types of validity measurements: content validity, criterion validity, and, in this case, 
construct validity. Construct validity in the case of a complex theory means it is built 
from several simpler elements or given conditions. Modern validity theory defines construct 
validity as the overarching concern of validity research, subsuming all other types of 
validity evidence (Law & Watts, 1977). In the case of a causal-comparative research 
study the author is interested in measuring whether the theoretical concept matches up with 
the conditions the researcher wants to measure. The main theoretical model that has 
underpinned the CareerEDGE EDP model is the DOTS model (Law & Watts, 1977), which 
consists of planned experiences designed to facilitate the development of: “Decision 
learning – decision making skills, Opportunity awareness – knowing work opportunities exist 
and what their requirements are, Transition learning – including job searching and 
self-presenting skills, and Self-awareness – in terms of interests, abilities, values, 
etc.” (Watts, 2006, pp. 9-10). 

Construct validity simply refers to whether a scale or test measures the intended construct 
adequately (Brown, 1996, p. 231). Notice that the evidential basis for validity includes both 
test score interpretation and test score use (Messick, 1988, 1989). There are two basic types 
of construct validity, differential-groups study (used with experimental demonstrations) and 
the type used by the authors of the CareerEDGE (EDP) instrument to validate their model, 
Intervention study, wherein a group that is weak in the construct is measured using the test, 
then taught the construct, and measured again (Brown, 1996). If a statistically significant 
difference is found between the pretest and posttest, that difference can be said to support 
the construct validity of the test. 
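An intervention study of the kind described compares the same group before and after instruction, which calls for a paired (related-samples) t-test. A minimal sketch on hypothetical pretest/posttest scores (not study data):

```python
from scipy import stats

# Hypothetical pretest and posttest scores for one group of ten respondents
# measured before and after being taught the construct
pretest  = [8, 7, 9, 6, 8, 7, 9, 8, 6, 7]
posttest = [10, 9, 10, 8, 9, 9, 11, 10, 8, 9]

t_stat, p_value = stats.ttest_rel(posttest, pretest)  # paired-samples t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# A significant gain (p < 0.05) would support construct validity per Brown (1996)
```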

Construct validity types should not be confused with construct validity tests; the tests used 
for the two types include factor analysis and pretest-posttest intervention studies. The test 
used by the authors of the CareerEDGE (EDP) 
instrument to validate their model involved pretest-posttest and factor analysis using 
Statistical Package for Social Science (SPSS ™) software (International Business Machines 
(IBM), 2015) (Dacre-Pool, Qualter & Sewell, 2013). The findings suggest that the EDP is 
multidimensional and maps clearly onto the CareerEDGE model of graduate employability 
(Dacre-Pool, Qualter & Sewell, 2013).

Factor analysis, using IBM’s Statistical Package for Social Science (SPSS ™) software 
(International Business Machines (IBM), 2015), was used for the validation sample of the 
two dependent variable groups of interest in this study, producing an overall mean of 0.76 
(Factor 2) and an overall mean of 0.57 (Factor 4) (Dacre-Pool, Qualter & Sewell, 2013, 
p. 309). The pretest-posttest of the two factors that are the focus of this study produced 
a mean and standard deviation (SD) for Factor 2, Work/Life Experience Skills, of 
pretest = 8.37 (2.93) and posttest = 9.53 (2.20). The mean and standard deviation (SD) for 
Factor 4, Generic Skills, were pretest = 63.42 (6.89) and posttest = 63.95 (8.31). 

Reliability:
The survey questionnaire instrument for this quantitative study was supplied by permission of 
Dacre-Pool and Sewell (2007) (Appendix D, pp. 191-194). There is a growing concern that 
objectivity in research is no longer possible (AQR, 2016). Reliability, in this context, 
refers to “the closeness to fact that a scientific test or piece of research measures 
what it sets out to, or how well it reflects that reliabilities claims” (AQR, 2016, p. 1). 
The two factors (dependent variable groups) that are the focus of this study are Factor 2, 
Work/Life Experience Skills, and Factor 4, Generic Skills (Dacre-Pool, Qualter & Sewell, 
2013, pp. 307-309) (Appendix F, pp. 196-197). 

The quantitative method requires statistical results as proof of reliability, and in statistics, 
particularly classical test theory, Cronbach's alpha is that test. It is a (lower bound) 
estimate of the reliability of a psychometric test (i.e., one measuring mental capabilities, 
behavioral style, etc.). Cronbach's alpha is a measure of internal consistency, or how closely 
related a set of items are as a group; it is a measure of scale reliability (Institute of 
Psychometric Coaching, 2016). Cronbach's alpha coefficient is also referred to as the internal 
consistency reliability of the test (Tavakol & Dennick, 2011). Coefficient alpha simply 
represents a ratio of true score variance (reliable/consistent) to total variance 
(How2Stats, 2015, Jan 19). If the items in a test are correlated with each other, the value 
of alpha increases (Tavakol & Dennick, 2011). 
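The coefficient alpha ratio described above can be illustrated with a short Python sketch (an illustrative computation only; the actual analysis in this study is conducted in SPSS, and the response data below are hypothetical):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix:
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point Likert responses: 5 respondents x 4 items
scores = [[7, 6, 7, 6],
          [5, 5, 6, 5],
          [6, 6, 6, 7],
          [4, 5, 4, 4],
          [7, 7, 6, 7]]
print(round(cronbach_alpha(scores), 3))   # → 0.922
```

As the surrounding text notes, higher inter-item correlation drives the summed-scale variance up relative to the item variances, which raises alpha.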

Inspection of the correlation matrix for each sample revealed the presence of coefficients of 0.30 
and above (Tabachnick & Fidell, 2007). Unidimensionality implies the presence of only one factor 
in the data (and is determined with factor analysis); coefficient alpha (consistency) assumes 
unidimensionality, but it cannot test for it (How2Stats, 2015, Jan 19). Reliability can be viewed 
as the expected correlation of two tests that measure the same construct, hence its closeness in 
theory and practical application to validity testing (Institute of Psychometric Coaching, 2016; 
Leeuw, 2005). 

A statistical test for Cronbach’s alpha was run using the upper limits of the test scores 
(5, 6 & 7), producing the following results. Using the Statistical Package for the Social 
Science (SPSS) (International Business Machines (IBM), 2016), a test was run to determine if 
using the top three scores from the answers available on the Likert grading scale would 
produce a Cronbach’s alpha in the range of .700 or higher. The measure for the SPSS 
Cronbach’s alpha testing was set to Ordinal on the variables page, though many scholars suggest 
that a scale measurement is also permitted and often used. Testing of the five factors 
(dependent variable groups) produced results of Cronbach’s alpha = .931 for both Factor 2, 
Work/Life Experience Skills, and Factor 4, Generic Skills (the focus of this study). The test 
produced an Inter-Item Correlation Matrix of 1.000 and an Item-Total Correlation of .726 each, 
compared to a Cronbach’s alpha if item deleted of .932. Item statistics produced scores of 
Mean = 6.57 and Standard Deviation = .573 on the five items. Scale statistics produced scores 
of Mean = 32.86, Variance = 6.423 and Standard Deviation = 2.534 on the five factors 
(Appendix G, pp. 197-198). 

In the case of the original CareerEDGE Employability Development Profile (EDP) instrument 
(Dacre-Pool & Sewell, 2007), the reliability of the instrument was determined by a 
test-retest reliability procedure. The test-retest reliability procedure examined the extent 
to which scores from one sample are stable over time from one test administration to another 
(Creswell & Plano Clark, 2011). The instrument was administered to 19 individuals. The same 
instrument was then administered to the same individuals several weeks later.  The arrangement 
of the questions changed during the second administration. The scores from Time 1 and Time 2 
were used to compute the Pearson correlation coefficient as a measure of the test-retest 
reliability of the instrument. 

The Pearson correlation coefficient is represented as Time1∙Time2 =??? (Creswell & 
Plano Clark, 2011; Green, 2015). The EDP factor subscale for the test-retest procedure 
consisted of 19 respondents. These respondents were first tested at Time 1, and the mean (SD), 
t-values and p-values were recorded. Several weeks later the same 19 respondents were 
tested again with the order of the questions changed. For this second test, Time 2, the mean 
(SD), t-values and p-values were recorded again and the two tests were compared. The two 
tests of reliability produced results of Factor 2, Work/Life Experience Skills, being comparable 
to Factor 4, Generic Skills, with the means (SD), t-values and p-values within accepted 
limits (Dacre-Pool, Qualter & Sewell, 2013) (Appendix F, pp. 194-195). 
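As a sketch of the test-retest computation described above, the Pearson correlation between the two administrations can be obtained as follows (illustrative Python only; the Time 1 and Time 2 scores shown are hypothetical, not the published EDP data):

```python
import numpy as np

# Hypothetical Time 1 / Time 2 scores for the same six respondents
time1 = np.array([8.0, 6.5, 7.0, 9.0, 5.5, 8.5])
time2 = np.array([8.5, 6.0, 7.5, 9.0, 6.0, 8.0])

# Pearson r between the two administrations estimates test-retest reliability
r = np.corrcoef(time1, time2)[0, 1]
print(round(r, 3))
```

A value of r near 1.0 indicates that scores were stable from one administration to the next.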

Data Collection and Management:
This quantitative, causal-comparative study will add to the current body of research on 
workforce readiness skills acquired by students from online college pathways and traditional 
college pathways by examining the differences between the two educational groups’ EDP scores, 
as determined from the Bachelor program at a large Western Christian university. 
An invitation to participate in an online survey will be sent to all students in the bachelor 
degree program at the large Western Christian university, using its Email Survey Distribution 
system. The total general population size (N) for the study is approximately 43,725 possible 
respondents (7,975 traditional and 35,750 online students). 

Site Authorization Process: 
All email survey requests (including initial requests, follow-ups and reminders) that have 
the necessary and/or appropriate site authorization and IRB approval will be distributed by 
the email survey distribution manager. The invitation (e-mail) introduction letter will 
discuss the large Western Christian university’s Institutional Review Board (IRB) approval 
process and informed consent from participants. Completion of the survey instrument by 
research respondents will be considered written affirmation of informed consent by the 
independent variable groups, online college and university pathways students and traditional 
college and university pathways students, in accordance with the Common Rule (2009, January 15). 
Additionally, risks and benefits concerning participants and the organization are addressed. 
An IRB review, sample selection ideology, protection of rights/well-being, maintenance of 
data security, the sample recruitment process, and data collection instruments and approaches 
will be discussed. 

Step-by-Step and Timeline:
1.	Prepare and submit email survey requests (including initial requests, follow-ups and 
reminders) and any other needed documentation, to the IRB committee and the manager of the 
email survey system. Timeline: 7 to 10 work days.

2.	The invitation (e-mail) introduction letter is sent through the university’s email 
survey system. Timeline: 7 to 10 work days.

3.	A follow-up-letter will be sent via e-mail five to seven days after the letter of 
introduction is sent, encouraging full participation. Timeline: 5 to 7 work days. 

4.	The test period begins: the survey instrument is live on the server for two weeks. 
Collect raw data from the manager of the email survey system department in SPSS .sav format 
(SPSS™, International Business Machines (IBM), 2015). Timeline: 7 to 14 work days depending 
on volume of material. 

5.	The raw data are tested using a two-independent-samples t-test to produce the statistical 
values needed to reject or fail to reject the null hypotheses, thus answering the two research 
questions associated with the focus of the study, Factor 2 and Factor 4 of the CareerEDGE 
Employability Development Profile (EDP) survey instrument. Timeline: 7 to 21 work days. 

Data Management Safety, Storage and Destruction Processes:
The demographic and geographic data collected (Appendix E, p. 194) will not be used as a 
measurement tool in the actual study but will be used in Chapter 4 (Data Analysis and Results) 
of the final dissertation in the Comments from Survey Participants section. Per applicable 
Collaborative Institutional Training Initiative (CITI) regulations (Collaborative 
Institutional Training Initiative, 2015), study data will be stored on a flash drive and 
locked in a safe. The survey results and all electronic data collected and analyzed will 
be secured on a portable external drive, which will be kept in the researcher’s private safe 
in accordance with CITI review for a period of three years. At the end of the three-year 
period, all paper notes and all electronic data held on computers, websites and flash drives 
will be destroyed in compliance with CITI published standards.

Data Analysis Procedures:
The purpose of this study is to investigate if a difference exists between students from online 
college pathways and traditional college pathways in the Bachelor program at a large Western 
Christian university. Using the students’ CareerEDGE Employability Development Profile (EDP) 
instrument scores from factor 2 (work & life experience skills) and factor 4 (generic skills), 
an accurate assessment will be produced (Dacre-Pool & Sewell, 2007). This quantitative, 
causal-comparative research study will analyze the scores on factor 2 (work & life experience 
skills) and factor 4 (generic skills) of Bachelor students at a large Western Christian 
university using the CareerEDGE Employability Development Profile (EDP) instrument 
(Dacre-Pool & Sewell, 2007). 

The respondent is asked to rate the same employability development profile survey questions for 
each type of graduate pathway, traditional or online. Following the statistical analysis of 
Factors 2 and 4, through the scoring of the eighteen employability development profile survey 
questions using a Likert scale, the results will be compared to research questions RQ1 and 
RQ2. Additionally, raw data will be analyzed and illustrated, both in writing and through tables, 
figures, diagrams, charts or graphs (Blakstad, 2015; Brasoveanu, 2012; Boone & Boone, 2012).

This research instrument will use a 7-point Likert scale of: 7) Strongly Agree, 6) Agree, 
5) Slightly agree, 4) Neither Agree nor Disagree, 3) Slightly disagree, 2) Disagree and 1) 
Strongly Disagree (Laerd Statistics, 2016). The numerical scale given to the 18 survey questions 
from factor 2, work & life experience skills, and factor 4, generic skills, was designed to 
facilitate scored responses for use with a causal-comparative research design and a 
two-independent-samples t-test to determine if an association exists (Gay, Mills & 
Airasian, 2006; International Business Machines (IBM), 2016; Laerd Statistics, 2016; 
Schenker & Rumrill, 2004; Statistical Package for Social Sciences (SPSS), 2015).

RQ1: Is there a difference in scores on the CareerEDGE Employability Development Profile 
(EDP) factor 2, work & life experience skills, of students who graduate with a bachelor’s 
degree at a large Western Christian university through their online college pathway and their 
traditional college pathway?

H1o: There is no statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) dependent variable group work & life experience skills, 
of students who graduate with a bachelor’s degree at a large Western Christian university through 
their online college pathway and their traditional college pathway.

H1a: There is a statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) dependent variable group work & life experience skills, 
of students who graduate with a bachelor’s degree at a large Western Christian university through 
their online college pathway and their traditional college pathway.

RQ2: Is there a difference in scores on the CareerEDGE Employability Development 
Profile (EDP) factor 4, generic skills, of students who graduate with a bachelor’s degree at a 
large Western Christian university through their online college pathway and their traditional 
college pathway?

H2o: There is no statistically significant difference in scores on the CareerEDGE 
Employability Development Profile (EDP) dependent variable group generic skills, of students who 
graduate with a bachelor’s degree at a large Western Christian university through their online 
college pathway and their traditional college pathway.


H2a: There is a statistically significant difference in scores on the CareerEDGE Employability 
Development Profile (EDP) dependent variable group generic skills, of students who graduate 
with a bachelor’s degree at a large Western Christian university through their online college 
pathway and their traditional college pathway. 

Construct of Factors:
Construction of the two factors in this study is based on the variables under investigation being 
ordinal, as measured with a Likert scale; however, because the summed numerical values are 
without a true zero point (unlike ratio scales), they are treated as interval and continuous, 
since they can take any value, including decimals (Jamieson, 2004). The score for each of the 
two factors (dependent variable groups) is computed as follows: Factor 2 (work/life experience 
skills) has two items (survey questions) and Factor 4 (generic skills) has 16 items (survey 
questions). Factor 2 = (item 1 + item 2) divided by the number of items in that factor, and 
Factor 4 = (item 3 + item 4 + … + item 18) divided by the number of items in that factor. 
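The averaging formula above can be expressed as a short Python sketch (illustrative only; the item ordering and the sample responses are assumptions for demonstration):

```python
def factor_scores(responses):
    """Average the EDP items into the two factor scores, following the
    formula above. `responses` is a list of 18 Likert ratings; the item
    ordering (items 1-2 then 3-18) is an assumption for illustration."""
    factor2_items = responses[0:2]     # items 1-2: work/life experience skills
    factor4_items = responses[2:18]    # items 3-18: generic skills
    factor2 = sum(factor2_items) / len(factor2_items)
    factor4 = sum(factor4_items) / len(factor4_items)
    return factor2, factor4

# Hypothetical respondent: items 1-2 rated 6 and 7, items 3-18 all rated 5
f2, f4 = factor_scores([6, 7] + [5] * 16)
print(f2, f4)   # → 6.5 5.0
```

Dividing each sum by its item count keeps both factor scores on the same 1-7 scale as the individual Likert items.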

Two-independent samples t-test:
To answer the research questions, the researcher will use a two-independent samples t-test using 
the Statistical Package for Social Science (SPSS ™) software (International Business Machines 
(IBM), 2015). Descriptive statistical test tools, such as those chosen to determine central 
tendency, use measurement scales to produce the mean, standard deviation and variance values 
needed by the researcher to answer this study’s research questions. The t-test is an 
inferential statistical test that determines whether there is a statistically significant 
difference between the means in two unrelated groups (Laerd Statistics, 2016). The 
two-independent-samples t-test is also referred to as a between-groups design (Tabachnick & 
Fidell, 2007). This is accomplished by testing the null and alternative hypotheses, signified 
by H0: μ1 = μ2 and Ha: μ1 ≠ μ2. With the null and alternative hypotheses defined, the next 
step is to state the alpha level.

The alpha level, also referred to as the significance level, is α = 0.05 for this study. 
An a priori alpha level is typically set at either the 0.05 or the 0.01 significance level, 
and more conservative alpha levels, such as 0.01 and 0.001, are commonly used to evaluate the 
assumption of normality discussed below (Tabachnick & Fidell, 2007). Next, one calculates 
the degrees of freedom used to attain the Critical Value 
(Appendix H, pp. 192-194). Applying the following equation, one determines both the degrees of 
freedom and the t-distribution critical value, df = (n1 – 1) + (n2 – 1). To determine the 
t-distribution critical value a t-distribution table is used. One applies the degrees of 
freedom numeric value (left side of table) to the corresponding significance (alpha) level, 
α = 0.05 (along the top of table). The corresponding numeric value of the two predetermined 
values results in the critical value (t-Distribution Table, p. 176). 
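The degrees-of-freedom and critical-value steps described above, together with the t-test itself, can be sketched in Python (an illustrative parallel to the SPSS procedure; the group scores shown are hypothetical):

```python
import numpy as np
from scipy import stats

# Hypothetical factor scores for the two independent groups
online = np.array([5.1, 6.0, 5.5, 6.2, 4.8, 5.9, 6.1, 5.4])
traditional = np.array([5.0, 5.8, 5.6, 6.0, 5.2, 5.7, 5.9, 5.5])

alpha = 0.05
df = (len(online) - 1) + (len(traditional) - 1)    # df = (n1 - 1) + (n2 - 1)
t_crit = stats.t.ppf(1 - alpha / 2, df)            # two-tailed critical value

# Two-independent-samples t-test comparing the group means
t_stat, p_value = stats.ttest_ind(online, traditional)
decision = "reject H0" if p_value <= alpha else "fail to reject H0"
print(df, round(t_crit, 3), decision)   # → 14 2.145 fail to reject H0
```

Here `stats.t.ppf` plays the role of the t-distribution table: the degrees of freedom and the alpha level together determine the critical value against which the obtained t statistic is compared.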

Assumptions:
Assumptions underlying the two-independent-samples t-test are predicated on three major 
assumptions, that of independence, normality and homogeneity of variance (Tabachnick & 
Fidell, 2007). The scores from the items (survey questions) are independent of each other; 
that is, scores of one group of participants, say online pathway students, are not systematically 
related to scores of the traditional pathway student participants. This is commonly referred 
to as the assumption of independence. Next, the dependent variable group is normally 
distributed within each of the two populations, online and traditional pathway students, 
independent variable groups. This is commonly referred to as the assumption of normality. 
Lastly, the variances of the test (dependent) variable in the two populations are equal. This 
is commonly referred to as the assumption of homogeneity of variance (Tabachnick & 
Fidell, 2007). 

Testing of the Study’s Assumptions:
Testing for the assumptions of independence, normality and variance in this study will be 
conducted using Statistical Package for Social Science (SPSS ™) software (International Business 
Machines (IBM), 2015). This is accomplished using SPSS’s Explore command, found by clicking 
Analyze, Descriptive Statistics and Explore. This activates the Explore dialogue box, which 
presents both the dependent and independent variable groups. The two dependent variable groups, 
factor 2 (work/life experience skills) and factor 4 (generic skills), along with the two 
independent variable groups, online and traditional student pathways, are available for testing. 
One selects an option from the dependent variable group (either factor 2 or both factors 2 and 4), 
then moves (drag-and-drop) it or them into the Dependent List box (if testing only one or more 
variables for normality). 

Optionally, in this study the researcher will also select both independent variables, online and 
traditional student pathways, and move them into the Factor List box to test whether the variable 
groups are normally distributed for each level of the independent variables. Next, select Both 
in the Display box and click the Statistics button, which loads the Explore: Statistics dialogue 
box; then select the Descriptives box and record a 95% setting in the confidence interval for 
the mean. Then click the Continue button at the bottom of the Explore: Statistics box. Next, 
select the Plots button, change the option to Factor levels together in the Boxplots section, 
check the Stem-and-leaf option in the Descriptive section, check the Normality plots with tests 
option, and then click Continue and click the OK button. Additionally, because SPSS is a robust 
tool, it can be used to produce both statistical and graphical tests of normality, such as the 
Shapiro-Wilk test and the Normal Q-Q plot (graphically illustrated). SPSS also offers testing 
for skewness and kurtosis before running other tests to determine the validity of assumptions. 

Kurtosis describes the peakedness or flatness of the curve of a frequency distribution around 
its mean and mode. A distribution is either leptokurtic (having a high, narrow concentration 
about the mode, more concentrated about the mean than the corresponding normal distribution) 
or platykurtic (having a wide, rather flat distribution about the mode, less concentrated about 
the mean than the corresponding normal distribution) (University of Bedfordshire, AC, UK). 
If the data are not approximately normally distributed and/or group sizes differ greatly, one 
can run the Mann-Whitney U test, a non-parametric test that does not require the assumption of 
normality (Laerd Statistics, 2016, Descriptive and Inferential Statistics). 
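The decision rule described above, checking normality and falling back to the Mann-Whitney U test, can be sketched in Python (illustrative only; the actual tests in this study are run in SPSS, and the example data are hypothetical):

```python
import numpy as np
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    """Run Shapiro-Wilk on each group; if either departs from normality,
    fall back to the non-parametric Mann-Whitney U test, otherwise use the
    two-independent-samples t-test. A sketch of the decision rule above."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    normal_a = stats.shapiro(a).pvalue > alpha
    normal_b = stats.shapiro(b).pvalue > alpha
    if normal_a and normal_b:
        return "t-test", stats.ttest_ind(a, b).pvalue
    return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue

# Hypothetical example: an extreme outlier makes the first group non-normal
name, p = compare_groups(list(range(1, 11)) + [1000], list(range(1, 12)))
print(name)   # → Mann-Whitney U
```

The Mann-Whitney U test compares ranks rather than means, which is why it tolerates the skew and outliers that violate the t-test's normality assumption.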

SPSS to Analyze & Summarize Demographics:
Demographics describe the structure of a specific or general population through statistical 
data. The demographics that will be computed for this study include Degree Concentration, 
Degree Attainment Date (year), Degree Pathway (T or O), Gender (M or F), Age (years old), 
Internship (Y or N) and Geographic Location (State or County) (Appendix E, p. 163; Appendix I, 
Demographic Directory, p. 167). Because the student sample will be drawn from the bachelor’s 
program at a large Western Christian university, the Degree Concentration column will list the 
career fields available in the Bachelors program. These programs will be numbered #1 through 
#10 to allow for a numeric accounting of the students’ career field choice.

Analyzing and summarizing the demographic data will be accomplished using Statistical Package 
for Social Science (SPSS ™) software (International Business Machines (IBM), 2015). Once all 
table category titles and variables are recorded in the appropriate columns of the Variable 
View, this researcher will switch to the Data View and transcribe the statistical data 
recorded from the descriptive statistics category page (Appendix E, p. 163). Once all 
statistical data are recorded, this researcher will use the Analyze, Descriptive Statistics, 
Frequencies option to start the analysis process. The Frequencies dialogue box will appear 
with all the variables listed in the left-hand pane; clicking the arrow button moves the 
selected variables for analysis into the Variables pane on the right side of the box. Select 
the ‘Display frequency tables’ box and then click the Statistics button. When the Statistics 
dialogue box appears, click on the desired statistics to perform (e.g. mean, median, mode, 
standard deviation, variance, range, etc.), then click the Continue button. Select which 
chart(s) to display by clicking the Charts button. Because this research is using the 
Frequencies command, the Histogram option will be chosen, followed by the Continue button. 

SPSS offers two options in the Descriptive Statistics window, Frequencies and Descriptives. 
The two options deliver basically the same statistical data and graphical charts, but the 
Descriptives command does not display a frequency breakdown of each variable or show median 
and mode statistics. Additionally, this research will use the Split File command to compare 
groups based on one variable (say, Gender) by placing it in the ‘groups based on’ box. Next, 
the Descriptives command is run again, excluding the Gender variable itself. The output will 
then show the descriptive statistics for all the variables by Gender. This is most helpful 
when explaining and displaying the results and conclusions in reports of the study 
(University of Bedfordshire, AC, UK).
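The frequency and split-by-group summaries described above have direct parallels outside SPSS; as an illustrative sketch in Python with pandas (the demographic records shown are hypothetical):

```python
import pandas as pd

# Hypothetical demographic records mirroring the categories above
df = pd.DataFrame({
    "Pathway": ["O", "T", "O", "O", "T", "O"],   # Degree Pathway (T or O)
    "Gender":  ["F", "M", "F", "M", "F", "M"],
    "Age":     [24, 22, 31, 27, 23, 29],
})

# Frequency breakdown of a categorical variable (like SPSS Frequencies)
print(df["Pathway"].value_counts())

# Descriptive statistics split by group (like SPSS Split File by Gender)
print(df.groupby("Gender")["Age"].agg(["mean", "std", "min", "max"]))
```

The `groupby` call plays the same role as Split File: the same descriptive statistics are reported separately for each level of the grouping variable.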

Descriptive statistics use data to provide descriptions of the population, using numerical 
calculations represented through tables, figures, diagrams, charts and graphs. Inferential 
statistics make inferences and/or predictions about a population based on a sample of data 
taken from the population in question (Laerd Statistics, 2016, Descriptive and Inferential 
Statistics). The significance level (5% = 0.05), set prior to the test of the independent 
variables (online and traditional college and university pathway students’ scores), will 
determine how closely the correlation between the data (95% Confidence Level) will aid in 
determining possible causes of the observed outcome: a lack of needed skills among new college 
and university graduate employees. For hypothesis testing of the CareerEDGE Employability 
Development Profile (EDP) survey instrument scores, the significance level will be set at 0.05: 
if the obtained p-value is less than or equal to 0.05, the null hypothesis is rejected; if the 
p-value is greater than 0.05, the researcher fails to reject the null hypothesis.  

Ethical Considerations:
To address issues of anonymity, confidentiality, privacy, coercion, and any potential conflicts, 
identifying information such as the name of the respondent, address, telephone number, race, and 
organizational affiliations will not be collected. All raw data, survey instrument responses, 
computer files, and demographic and geographic information will be secured on external drives, 
kept in a safe, and viewed only by the primary investigator, Edward Files, BSM, MBA. This 
researcher will invite all students from both online and traditional degree pathways in the 
Bachelor program at a large Western Christian university in the school year 2017 to participate. 
This research does not involve the physical evaluation or manipulation of human subjects. It 
does require human subjects to participate in an online survey administered through the 
college’s email survey system; therefore, the potential risk associated with the study is low.

Participants in the survey are required to read and sign an informed consent statement 
(Appendix B, pp. 187-190, this document). The consent form will define for them the purpose 
and procedures of the proposed study, as well as address the risks and benefits of the study 
to both the students and the university. The research method, including the invitation-to-
participate email, survey instrument, and follow-up email, must have the approval of the large 
Western Christian university’s Institutional Review Board (IRB) to conduct this research. All 
participants are volunteering for the e-mail survey and “must be capable of informed consent; 
no surrogates or proxies can be accepted” (Gordon, Levine, Mazure, Rubin, Schaller, & 
Young, 2011, p. 25).

Respondents will not use their name on the survey instrument, as only those members of the student 
body presently active in the bachelor’s degree seeking program will be invited to participate in 
the academic year 2017 survey process at the large Western Christian university. Any additional 
information (Appendix E, Demographic and Geographic Information, p. 163) garnered through 
the survey instrument will be coded in accordance with the Code of Federal Regulations (Protection 
of Human Subjects), referred to as the Common Rule (U.S. Department of Health and Human Services, 
2015). For all participating departments and agencies, the Common Rule outlines the basic provisions 
for IRBs, informed consent, and Assurances of Compliance (U.S. Department of Health and Human 
Services, 2015). 

The survey results and all analysis of raw data and results, both electronic and paper, will be 
secured in a safe in accordance with the Association for Qualitative Research (AQR) review for a 
period of three years. At the end of the three-year period, the information will be destroyed 
through wiping of electronic data and burning of any paper notes, comments, analysis, etc. As 
this study uses a quantitative methodology, the above requirements are not mandatory but are 
associated with both the validity and reliability concerns discussed earlier.  

Limitations and Delimitations:
In any study, there are possibilities of multiple sources for limitations, beyond that of the 
literature reviews themselves (Baruch & Holton, 2008; Branch, 2007; Podsakoff, MacKenzie 
& Podsakoff, 2012). While one can deflect the possibility of incorrect information and data 
from primary research studies and articles by closely checking one’s references, some forms of 
limitations are less identifiable. Limitations of this study may concern factors of low response 
rate in answering the survey questions, non-completion of survey instrument questions, bias by 
respondents toward online surveys, and methodology bias (the biasing effects that measuring two 
or more constructs with the same methodology may have on estimates of the relationships between 
them) (Podsakoff, MacKenzie & Podsakoff, 2012). 

For the sake of this study, a delimitation is defined as a shortcoming of this study resulting 
from the researcher’s decision-making process. One such decision, using a research design that 
incorporates a convenience sampling frame, produces a delimitation for this study: the population 
size (N) is restricted to just the Bachelor student degree program (Baruch & Holton, 2008). 
Using a low general population (N) size directly affects the sample size response rate (n) in 
this survey research study. The minimum number of respondents required, per the large Western 
Christian university’s general rule of thumb on survey research, is 10 subjects per survey 
question, or 280 respondents for this study. 

For this study, an a priori power analysis was conducted to justify the study sample size based 
on the anticipated effect size (d = 0.5), with critical t = 1.9714347, degrees of freedom (df) 
= 208 and actual power = 0.9501287, which showed that a sample size of (n) = 210 respondents 
was needed for statistical validity alone (G*Power, 2015) (Figure 4, p. 204). 
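The a priori power analysis above can be reproduced approximately in Python with statsmodels (a sketch only; G*Power was the tool actually used for this study):

```python
import math
from statsmodels.stats.power import TTestIndPower

# A priori power analysis for a two-independent-samples t-test, using the
# inputs reported above: d = 0.5, alpha = 0.05, desired power = 0.95
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.95)
n_per_group = math.ceil(n_per_group)   # round up to whole respondents
total_n = 2 * n_per_group
deg_free = total_n - 2                 # df = (n1 - 1) + (n2 - 1)
print(n_per_group, total_n, deg_free)  # → 105 210 208
```

The result (105 per group, 210 total, df = 208) agrees with the G*Power values reported in the paragraph above.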
The survey instrument will be hosted by the large Western Christian university’s email survey 
system to minimize further delimitations in the Data Collection and Management sections. The 
following section summarizes the preceding sections and describes the organization of the 
remainder of the study. 

Summary 
Chapter 3 focuses on the research methodology used in this study. It defines the research design, 
sample population, the theoretical foundations (Becker, 1964; Schultz, 1961; in Walters, 2004, 
pp. 99-100), the research instrument (Dacre-Pool & Sewell, 2007), data analysis, validity 
and reliability, and any ethical considerations that fall under the purview of the large 
Western Christian university’s Institutional Review Board (IRB). The invitation (e-mail) 
introduction letter will discuss the large Western Christian university’s Institutional Review 
Board (IRB) and informed consent approval process. 

The individual steps used in the collection of the data are covered in the Data Collection 
Procedures section, and the Data Analysis Procedures section describes the details involved in 
the computation and analysis of the raw data (Blakstad, 2015; Brasoveanu, 2012; Boone & 
Boone, 2012). The methodology is quantitative, the research method is causal-comparative, 
and data collection will be conducted through a questionnaire (research survey design method) 
instrument (Dacre-Pool & Sewell, 2007). The instrument was validated and pronounced 
reliable through Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) 
(Dacre-Pool & Sewell, 2007). This researcher has permission to use the Dacre-Pool and 
Sewell (2007) survey instrument for this study (Appendix D, pp. 190-193). 

An invitation to participate in an online survey will be sent to all students from both online 
and traditional degree pathways in 2017 and covers the Bachelor program at the large Western 
Christian university. Per Dacre-Pool, Qualter and Sewell (2013), little empirical research has 
been conducted in relation to graduate employability and/or the tools that measure it. 
This is the study that supplied the CareerEDGE Employability Development Profile (EDP) 
instrument for this study (Dacre-Pool & Sewell, 2007). Additionally, per Deepa and 
Seth (2013), more research is needed to explore the current gaps between educational curriculum 
design and the standards hiring authorities say they need. 

A quantitative methodology is the best choice among those available because it will establish, 
statistically, the degree to which the researcher can defend findings that answer the hypotheses 
and, thus, the research questions, as well as address the problem and purpose statements 
(Gravetter & Forzano, 2012). The test scores from the survey instrument will be analyzed 
using a two-independent-samples t-test to measure skill differences between the groups 
(Laerd Statistics, 2016; Light & McGee, 2015; Shuttleworth, 2015; Zhao, 2015). 
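The planned two-independent-samples t-test can be sketched as follows. The scores are simulated stand-ins for the EDP survey scores; the group sizes, means, and standard deviations are illustrative assumptions, not study data.

```python
# Hedged sketch of a two-independent-samples t-test; simulated scores only.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
online = rng.normal(loc=4.0, scale=0.6, size=120)       # online pathway group
traditional = rng.normal(loc=3.7, scale=0.6, size=110)  # traditional pathway group

# Welch's variant (equal_var=False) avoids assuming equal group variances,
# a common precaution when group sizes differ.
t_stat, p_value = ttest_ind(online, traditional, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

A p-value below the chosen alpha level (e.g., .05) would indicate a statistically significant difference in mean scores between the two pathway groups.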

The significance of this study is that it will produce statistical data leading to a more 
accurate and deeper understanding of the relationship between education and the application 
of employability skill sets of today's graduates. The selected research design and instrument 
(causal-comparative and survey questionnaire), using the 18 CareerEDGE Employability Development 
Profile (EDP) skill-set questions derived from Factor 2 (work/life experience skills) and Factor 
4 (generic skills), are justified because they align with the research questions, hypotheses, 
variables, problem and purpose statements, and the gap defined in the literature review. 
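Scoring the 18 EDP items by factor can be illustrated with a brief sketch. The item-to-factor split below (first nine items to Factor 2, last nine to Factor 4) is a hypothetical placeholder, not the published EDP scoring key, and the responses are simulated.

```python
# Illustrative subscale scoring: mean of each factor's items per respondent.
# The item-to-factor mapping here is an assumed placeholder, NOT the EDP key.
import numpy as np

rng = np.random.default_rng(1)
responses = rng.integers(1, 6, size=(5, 18))  # 5 respondents x 18 Likert items (1-5)

factor2_items = list(range(0, 9))    # assumed: first 9 items -> Factor 2
factor4_items = list(range(9, 18))   # assumed: last 9 items -> Factor 4

factor2_scores = responses[:, factor2_items].mean(axis=1)
factor4_scores = responses[:, factor4_items].mean(axis=1)
print(factor2_scores.shape, factor4_scores.shape)
```

Each respondent thus receives one score per factor, and these per-factor scores are what the planned t-test compares across the two pathway groups.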

These factors will be measured across the independent variable groups: online and traditional 
college pathway students. The research questions align with the problem and purpose statements, 
methodology, research design, instrumentation, and data collection and analysis approach, which 
illustrates the need for this study as determined by the 2014 multi-department U.S. Government 
call for further research in numerous areas concerning education, economics, and labor, as well 
as the communities affected (U.S. Departments of Labor, Commerce, Education and Health & Human 
Services, 2014, p. 21, para. 1).