Perspect Health Inf Manag. 2010 Summer; 7(Summer): 1a.
Published online 2010 September 1.
PMCID: PMC2921299

Development of an Instrument to Measure Students' Perceptions of Information Technology Fluency Skills: Establishing Content Validity

Marcia Sharp, MBA, RHIA, Assistant Professor, College of Allied Health Sciences

Introduction

This article describes the process undertaken to develop and validate a tool to measure students' perceptions of their information technology (IT) fluency skills. Why is this important? There is a growing concern that students today are not prepared to live, learn, and work in a technology-rich society.1–3 Despite their widespread use of the Internet, today's college students do not have the necessary IT fluency skills.4–6 Studies that assess students' IT fluency skills show gaps between the perception and reality of these skills.7,8 These studies use a variety of assessments and instruments to evaluate students' IT fluency skills; however, no tools have been developed specifically to assess allied health students' perceptions of their IT fluency skills.

The purpose of this study is to establish the content validity of an instrument to measure students' perceptions of their IT fluency skills using a rigorous judgment-quantification process. The IT fluency instrument developed and validated herein will be used for future studies comparing allied health students' perceptions of their IT fluency skills with their actual IT fluency skills.

Review of the Literature

The assessment of students' perceptions of IT fluency skills derives from the National Research Council's 1999 IT fluency report. The report challenges the use of the term computer literacy, which implies merely having a particular skill or basic knowledge, whereas fluency involves deep understanding and critical thinking skills along with the ability to adapt to changes in technology.9 The Computer Science and Telecommunications Board of the National Research Council (NRC) devised the concept of fluency with information technology (FITness) to describe an individual's ability to handle information technology. While computer literacy is defined with a focus on computer skills, specifically the ability to use a few computer applications, FITness requires that people understand information technology well enough to apply it productively in work situations and in their daily lives, to recognize when information technology may assist or hinder the achievement of goals, and to adapt to changes in and the advancement of information technology.10,11

FITness requires three kinds of knowledge: contemporary skills, foundational concepts, and intellectual capabilities.12 Contemporary skills, the ability to use today's computer applications, enable an individual to apply information technology immediately.13,14 Contemporary skills are an essential component of job readiness. Foundational concepts explain the how and why of information technology. Foundational concepts are defined as the ability to understand the basic principles of computers, networks, and information systems.15,16 Intellectual capabilities are the higher-level thinking skills needed to apply information technology in complex and sustained situations. For instance, the ability to identify where errors exist in a database and solve such problems requires more than just the ability to enter data into a database. Also, the ability to understand the changing technology industry allows an intellectually capable individual to investigate alternatives to antiquated products and processes. Intellectual capabilities empower people to manipulate the medium to their advantage and to handle unintended and unexpected problems.17,18 Because foundational concepts, intellectual capabilities, and contemporary skills are essential to the IT fluency concept, they serve as the three constructs from which this tool was developed. Although many assessment instruments exist to measure students' IT fluency skills, no studies have been undertaken in the field of allied health, more specifically health information management.

Methodology

Overview

Content validity is an essential step in the development of new empirical measuring devices because it represents a beginning mechanism for linking abstract concepts with observable and measurable indicators.19 Lynn (1986) describes content validation as a two-step process beginning with the developmental stage and ending with the judgment-quantification process.20 Stage one of the process, the development stage, requires a comprehensive review of the literature to identify content for the instrument and establish relevant domains. In this study, the literature review identified approximately 30 to 40 articles on the subject of information technology fluency. After the literature was reviewed and the items were constructed, the entire instrument was developed with instructions and scoring guidelines.

The second stage, judgment-quantification, occurs when a panel of experts, working independently, evaluates the instrument and rates items of relevance according to the content domain.21 In addition, item content and clarity, as well as overall instrument comprehensiveness, are evaluated in this stage. Berk (1990) suggests that expert panel members evaluate how representative the items are of the content domain.22 As part of this process, expert panel members should be asked to provide revisions for items that are not consistent with conceptual definitions.23 Clarity of items is another element for content experts to evaluate.24 Finally, the instrument should be evaluated, as a whole, for overall comprehensiveness. As Grant and Davis (1997) note, “This step is necessary because an instrument may have acceptable interrater agreement, but still not cover the content domain.”25

When measuring content validity, it is necessary to use a quantitative measure, the content validity index (CVI).26–28 The CVI is calculated by tallying the results of the expert reviewers: the degree to which the panelists agree on an item's relevance determines whether the item is retained as relevant. Relevance is rated on a four-point Likert-type scale: irrelevant items are scored 1, somewhat relevant items are scored 2, quite relevant items are scored 3, and highly relevant items are scored 4. Only items scored 3 or 4 are counted as relevant and thus enter the calculation of the CVI.
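
As a concrete illustration, the item-level CVI can be computed in a few lines of Python; this is a minimal sketch, and the ratings shown are hypothetical, not data from this study:

```python
# Minimal sketch of the item-level CVI described above (hypothetical data).
# Ratings use the 1-4 relevance scale; scores of 3 and 4 count as relevant.

def item_cvi(ratings):
    """Proportion of experts rating the item 3 (quite relevant)
    or 4 (highly relevant)."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

# One item rated by a hypothetical seven-member panel:
print(round(item_cvi([4, 3, 4, 2, 4, 3, 3]), 2))  # 0.86
```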

Instrument

This research required the drafting of a Perceptual IT Fluency Skills Student Survey for use with allied health students. Before information was gathered, local Institutional Review Board (IRB) approval was obtained from the University of Memphis and the University of Tennessee Health Science Center. During the summer and fall of 2009, information on IT fluency and establishing content validity was gathered. Instruments and information from various sources were reviewed, and draft survey items were created. The draft survey included measures of students' perceptions of their IT fluency skills based on their contemporary skills, foundational concepts, and intellectual capabilities. The contemporary skills section was composed of eight multiple-choice questions related to the student's ability to set up a computer, use a word processor to create a document, use technology to find information, and create a spreadsheet. The foundational concepts portion contained six multiple-choice questions that focused on the student's knowledge of computer operations, networks, and e-mail. The intellectual capabilities section included five multiple-choice questions to elicit the student's ability to manage computer problems, adapt to new technology, and communicate concepts.

Sample

A panel of experts was used to validate the draft Perceptual IT Fluency Skills Student Survey, following the content validation process described by Lynn (1986).29 The panel, which included allied health educators and health information managers, was asked to validate the instrument's content as a measure of allied health students' perceptions of their IT fluency skills, based on the NRC definition of IT fluency. Panel members were selected for their knowledge of information technology, their profession within the health information field, and their having at least five years of experience monitoring and assessing students' IT fluency skills. The members came from education and private healthcare entities and were chosen because of their involvement in developing programs for teaching information technology skills to allied health students.

According to Lynn (1986), no more than 10 panel members should be used.30 This panel consisted of seven members: three educators and four healthcare professionals. Educators held the rank of assistant professor or above, and the healthcare professionals were a director of clinical information systems, a director of health information management, a manager of veteran services, and a senior systems analyst. All panel members contacted agreed to evaluate the instrument and provide feedback. All feedback from the experts was received within two months of initial contact.

Data Collection

A cover letter explaining the purpose of the instrument; literature defining IT fluency concepts such as contemporary skills, foundational concepts, and intellectual capabilities; and instructions on how to complete the rating form were e-mailed to the panel of experts in November 2009. The researcher made a follow-up phone call to the experts to verbally explain the process and to ensure understanding of the process. The panel was asked to judge the items for clarity, relevance, and item content using a 1-to-4 scale as described above. The members were asked to provide suggestions for any revisions or changes needed. The Content Validity Setup designed by Lynn (1986) was used as a model for this task.31

After all correspondence was received regarding content validity for each item, a focus group was held to evaluate the instrument for overall comprehensiveness. Six of the seven panel members participated in the focus group. The objective of the focus group was to reach consensus on the overall comprehensiveness of the instrument, that is, to determine whether the experts felt the instrument measured what it was intended to measure.

Findings

The proportion of agreement sufficient to establish content validity was explored in the literature. A CVI of 0.70 represents average agreement; 0.80, adequate agreement; and 0.90, good agreement.32,33 According to Lynn (1986), when there are six or more judges, the CVI should be no lower than 0.78 for an item to be judged acceptable.34 A CVI of 1.00 indicates 100 percent agreement between raters. A CVI was calculated for each item (see Table 1) and for the overall instrument.
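
To make Lynn's 0.78 criterion concrete for a seven-member panel like this study's, a short sketch of the possible item-level values (simple arithmetic, not the study's own code):

```python
# With seven raters, an item's CVI can take only the values k/7, where k
# is the number of experts rating the item 3 or 4 (relevant).
# Lynn's 0.78 cutoff therefore requires at least six of the seven raters.
for k in range(7, 3, -1):
    cvi = k / 7
    verdict = "retain" if cvi >= 0.78 else "delete"
    print(f"{k}/7 = {cvi:.2f} -> {verdict}")
# 7/7 = 1.00 -> retain
# 6/7 = 0.86 -> retain
# 5/7 = 0.71 -> delete
# 4/7 = 0.57 -> delete
```

This arithmetic is why, with seven raters, any item rated relevant by fewer than six experts falls below the cutoff.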

Table 1
Content Validity Index (CVI) of Survey Items

Results from the panel of experts yielded an overall content validity index of 0.87. Six items had a CVI below 0.78 and were deleted from the instrument. Two experts suggested minor revisions regarding the clarity or wording of the items, and those revisions were incorporated into the instrument. One expert suggested that the word “connected” be defined in question 5 under contemporary skills (this question was subsequently deleted from the survey draft because its CVI was below 0.78). Another expert suggested that the word “system” be changed to “application” in question 7 of the contemporary skills category (also subsequently deleted). Once all items had been evaluated and all changes were made, the revised instrument was sent to the panel of experts to evaluate the overall instrument.

The focus group discussed the instrument for overall comprehensiveness. None of the experts suggested additional content or changes at this time. The CVI for the revised instrument (which can be found in Appendix A) was 1.00. Based on the CVI for each item as well as that for the overall instrument, it is believed that the instrument contains questions relevant to students' perceptions of their IT fluency skills.

Conclusion

Content validity is a critical step in the selection and administration of an instrument. The two-step method used in this study, consisting of a developmental stage and a judgment-quantification stage, required a comprehensive literature review, item creation, and agreement from a specified number of experts on the validity of the items and of the entire instrument. Seven experts were asked to identify omitted areas and to suggest areas for improvement, and their revisions were made. The process used to determine content validity lent consistency, rigor, and structure to the instrument's development. High CVI scores were generated for the items judged relevant to the content domain as well as for the overall instrument. The results support the content validity of this instrument as a tool for measuring students' perceived IT fluency skills.

Appendix

Perceptual IT Fluency Skills Student Survey: Final Draft for Expert Panel Evaluation of Content Relevance

Statement | Content Relevance (please circle or highlight your choice)
Contemporary Skills: the ability to use today's computer applications, enabling individuals to apply information technology immediately. Participants will answer these questions as no knowledge, some knowledge, average knowledge, or expert knowledge.
  • 1. When it comes to using basic operating system features, I consider myself to have _____.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 2. When it comes to using a software program to create a text document, I consider myself to have _____.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 3. When it comes to using a graphics or art package to create illustrations, slides, or image-based expression of ideas, I consider myself to have _____.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 4. When it comes to using a spreadsheet to model simple processes or financial tables, I consider myself to have _____.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 5. When it comes to using instructional materials to learn how to use a new application or feature, I consider myself to have _____.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
Foundational Concepts: basic principles and ideas of computers, networks, and information. Participants will answer these questions as strongly disagree, disagree, neutral, agree, or strongly agree.
  • 1. I can explain how a computer operates.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 2. I can identify a computer hardware problem.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 3. I can identify a computer software problem.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 4. I can define computer storage.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
Intellectual Capabilities: the ability to apply information technology in complex and sustained situations … fosters more abstract thinking about information and its manipulation. Participants will answer these questions as strongly disagree, disagree, neutral, agree, or strongly agree.
  • 1. If something went wrong with my computer or a computer I was using, I would likely:
  • Ignore the problem
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Troubleshoot the problem myself
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Find a way to work around the problem
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Use online support and/or knowledge bases to solve the problem
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Use printed reference manuals to identify and solve the problem
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Ask a friend or family member for help
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 2. I can easily learn new software applications.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 3. I feel comfortable and confident when using new technologies.
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • 4. When I want to use a new function or feature in a software application, I would likely:
  • Use the application help screens
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Read the user manual
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Call a help desk attendee
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Ask a friend or family member
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Access online resources
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant
  • Figure it out myself
  • Suggestions for changes:
  • 1 = not relevant
  • 2 = somewhat relevant
  • 3 = quite relevant but needs minor revisions
  • 4 = highly relevant

Notes

Dougherty J., Kock N., Sandas C., Aiken R. “Teaching the Use of Complex IT in Specific Domains: Developing, Assessing and Refining a Curriculum Development Framework.” Education and Information Technologies. 2002;7(2):137–54.
Educational Testing Service. Succeeding in the 21st Century: What Higher Education Must Do to Address the Gap in Information and Communication Technology Proficiencies. 2003. Available at http://www.ets.org/ictliteracy/succeeding1.html (accessed October 12, 2009).
Salaway G., Caruso J. B. “Students and Information Technology in Higher Education.” EDUCAUSE Center for Applied Research. 2007;6:1–124.
Hilberg, S. “Fluency with Information and Communication Technology: Assessing Undergraduate Students.” Doctoral dissertation, Wilmington College, 2007.
Katz I. “Beyond Technical Competence: Literacy in Information and Communication Technology.” Educational Technology. 2005;45(6):44–47.
Resnick M. “Rethinking Learning in the Digital Age.” In: Kirkman G., editor. The Global Information Technology Report 2001–2002: Readiness for the Networked World. New York: Oxford University Press; 2002.
McEuen S. F. “How Fluent with Information Technology Are Our Students?” Educause Quarterly. 2001;4:8–17. Available at http://www.educause.edu/ir/library/pdf/eqm0140.pdf (accessed October 18, 2009).
Stone J., Madigan E. “Inconsistencies and Disconnects.” Communications of the ACM. 2007;50(4):76–79.
National Research Council Computer Science and Telecommunications Board . Being Fluent with Information Technology. Washington, DC: National Academy Press; 1999.
Lin H. “Fluency with Information Technology.” Government Information Quarterly. 2000;17(1):69–76.
National Research Council Computer Science and Telecommunications Board. Being Fluent with Information Technology.
National Research Council Computer Science and Telecommunications Board. Being Fluent with Information Technology.
Dougherty, J., N. Kock, C. Sandas, and R. Aiken. “Teaching the Use of Complex IT in Specific Domains: Developing, Assessing and Refining a Curriculum Development Framework.”
National Research Council Computer Science and Telecommunications Board. Being Fluent with Information Technology.
Dougherty, J., N. Kock, C. Sandas, and R. Aiken. “Teaching the Use of Complex IT in Specific Domains: Developing, Assessing and Refining a Curriculum Development Framework.”
National Research Council Computer Science and Telecommunications Board. Being Fluent with Information Technology.
Dougherty, J., N. Kock, C. Sandas, and R. Aiken. “Teaching the Use of Complex IT in Specific Domains: Developing, Assessing and Refining a Curriculum Development Framework.”
National Research Council Computer Science and Telecommunications Board. Being Fluent with Information Technology.
Wynd C. A., Schmidt B. A., Schaefer M. A. “Two Quantitative Approaches for Estimating Content Validity.” Western Journal of Nursing Research. 2003;25(5):508. [PubMed]
Lynn M. “Determination and Quantification of Content Validity.” Nursing Research. 1986;35:382–85. [PubMed]
Lynn M. “Determination and Quantification of Content Validity.” Nursing Research. 1986;35:382–85. [PubMed]
Berk R. “Importance of Expert Judgment in Content-Related Validity Evidence.” Western Journal of Nursing Research. 1990;12:659–71. [PubMed]
Lynn, M. “Determination and Quantification of Content Validity.” [PubMed]
DeVellis R. F. Scale Development: Theory and Applications. Newbury Park, CA: Sage; 1991.
Grant J. S., Davis L. “Selection and Use of Content Experts for Instrument Development.” Research in Nursing & Health. 1997;20:271. [PubMed]
Anders R. L., Tomai J. S., Clute R. M., Olson T. “Development of a Scientifically Valid Coordinated Care Path.” Journal of Nursing Administration. 1997;27:45–52. [PubMed]
Summers S. “Establishing the Reliability and Validity of a New Instrument: Pilot Testing.” Journal of Post Anesthesia Nursing. 1993;8:124–27. [PubMed]
Wynd, C. A., B. A. Schmidt, and M. A. Schaefer. “Two Quantitative Approaches for Estimating Content Validity.” [PubMed]
Lynn, M. “Determination and Quantification of Content Validity.” [PubMed]
Lynn, M. “Determination and Quantification of Content Validity.” [PubMed]
Lynn, M. “Determination and Quantification of Content Validity.” [PubMed]
Lynn, M. “Determination and Quantification of Content Validity.” [PubMed]
Wynd, C. A., B. A. Schmidt, and M. A. Schaefer. “Two Quantitative Approaches for Estimating Content Validity.” [PubMed]
Lynn, M. “Determination and Quantification of Content Validity.” [PubMed]
