
Efficacy of an ergonomics intervention for remote college students



BACKGROUND: The COVID-19 pandemic caused a rapid adoption of online education, requiring university students to complete their schoolwork remotely. There is a gap in the evidence-based literature regarding these novel home workstations and interventions that help students understand ergonomics and adjust their workstations.


OBJECTIVE: We aimed to determine whether a remote ergonomics intervention would encourage students to improve their workstations and increase their knowledge of ergonomics.


METHODS: Participants completed an ergonomics quiz, a workstation evaluation, and an activity time log, and submitted photographs of their workstation. Participants were randomly assigned to three groups: a control group of 26 participants and first and second intervention groups of 25 participants each. Both intervention groups received information sheets regarding proper workstations; the second intervention group was also required to participate in an ergonomics workshop. Six weeks after receiving the interventions, all three groups completed the materials once again. Eight participants from the control group, 12 from the information intervention group, and 14 from the participatory intervention group completed the study.


RESULTS: One-way ANOVA tests between the three groups showed no significant difference in ergonomic knowledge or in changes made to workstations. However, the remote participatory ergonomics group significantly increased its level of knowledge about ergonomics.


CONCLUSION: The ergonomics intervention did not lead any one group to change its workstation more than the others within six weeks. Future studies on this topic should be conducted over a longer period and with more participants to allow more opportunity for behavior and workstation changes.

1 Introduction


In mid-March 2020, the COVID-19 pandemic began shutting down schools in the United States. University students were sent home, and all coursework pivoted to online formats. An urban university in the Northeast followed suit, closing all facilities and requiring coursework to be completed online. Beginning in the fall of 2020, this urban university adopted a Learn from Anywhere (LfA) model, a hybrid approach giving students the autonomy to choose between returning in person and continuing with virtual or asynchronous instruction. However, students who chose to engage remotely or asynchronously were not provided any education or information on how to set up an ergonomically correct workstation.

Research suggests that there are common issues among at-home computer workstations [1]. Davis and colleagues conducted a study to analyze the home offices of faculty members at the University of Cincinnati [1]. Their research found that many workstations were not set up appropriately [1]. This is the first in a quickly growing and needed body of literature regarding the impact of the COVID-19 pandemic on the setup of home workstations.

Setting up a home workstation should follow ergonomic principles to help prevent students from developing computer-related musculoskeletal pain, discomfort, or injury [1]. A previous study conducted in 2002 evaluated the frequency of computer-related musculoskeletal discomfort among university students [2]. Hamilton and colleagues found that 80.6% of the participants experienced pain and discomfort related to computer use. The prevalence of computer-related musculoskeletal pain and discomfort among non-remote learners suggests remote learners may also experience these issues, especially as computer use increased during the COVID-19 pandemic. The prevalence results from Hamilton and colleagues’ study were consistent with a study conducted by Katz et al. in 2000 identifying upper extremity musculoskeletal disorders among college students [12].

Because students were conducting their work remotely, research on their workstation setups had to be done using telehealth technology. Research supports the ability to evaluate a workstation remotely using telehealth technology [4–6]. In these studies, researchers utilized workstation checklists and photographs to evaluate workstations [1, 4, 5, 7]. Telehealth includes remote synchronous technology to conduct live assessments of an individual’s workstation. Professional ergonomists and occupational therapists were able to correctly identify all issues with participants’ workstations, although they occasionally identified ergonomic issues that were not present [4]. Evaluating a workstation using telehealth technology is considered reliable and accurate [4, 5, 7].

This research aims to answer the question: will remote learners who receive an ergonomic intervention via telehealth technology make adjustments to their workstations? We developed a study to analyze the potential impact of an ergonomic intervention on the setup of students’ home workstations. We investigated whether education regarding ergonomics helped to minimize computer-related musculoskeletal discomfort, pain, or injury. We hypothesized that educating students about ergonomics and offering recommendations would encourage them to make changes to their workstations and take periodic breaks, resulting in the prevention of discomfort, pain, or injury.

2 Method

2.1 Participants



Participants were all undergraduate or graduate students at an urban university located in the Northeast. The study took place in the spring of 2021. Participants submitted the initial set of materials during the first week of February and the post-intervention set of materials in mid-March. Participants were recruited using a digital flyer that was distributed by organizations and clubs, posted on residence hall bulletin boards, and advertised on social media. To participate, students were required to be current undergraduate or graduate students at this specific university, to complete at least 80% of their work remotely, and to use technology such as Zoom. Seventy-seven students enrolled in the study, were assigned ID numbers, and were randomly split into three groups: the control group, the first intervention group, and the second intervention group. All participants were asked to submit photographs of their workstation as well as complete a quiz, an evaluation, and an activity time log. Forty-six of the original 77 participants returned the initial materials, and 35 participants returned all of the necessary materials. Participants were not offered any incentive to participate. Table 1 illustrates the demographic makeup of each group.

Table 1

Demographic information

Characteristics | Control (n = 8) | Information intervention (n = 12) | Participatory intervention (n = 14)
Primary location | | |
  Family home | 2 | 0 | 0
  Off-campus apartment | 4 | 5 | 9
  On-campus apartment | 1 | 2 | 3
Year in school | | |
  Undergraduate first or second year | 1 | 1 | 1
  Undergraduate third year | 1 | 5 | 3
  Undergraduate fourth year | 4 | 6 | 10
  Graduate student | 2 | 0 | 1
Type of notes | | |
Hours spent on computer per day | | |
  4–6 hours | 3 | 4 | 3
  6–8 hours | 4 | 4 | 6
  8–10 hours | 0 | 1 | 2
  10+ hours | 1 | 1 | 1

2.2 Instruments


An ergonomics quiz was used to understand participants’ knowledge of ergonomics and proper workstation setups. This quiz contained 10 questions and asked about various aspects of a workstation (see Table 3). Participants completed the quiz prior to receiving the intervention in order to establish a baseline assessment of their ergonomics knowledge. All questions included on the quiz were developed using the information sheets that were later provided to intervention groups. This was the first instance of using this specific quiz, though quizzes have been used in previous studies to establish pre-knowledge of ergonomics and the change in participants’ knowledge of ergonomics after an intervention [3, 8].

Table 3

Ergonomics quiz questions

Category | Number of questions | Sample question | Sample answer
Monitor | 4 | “Your computer screen should be . . .” | Perpendicular to the window
Chair | 2 | “In your chair, you should . . .” | Sit so that your back is against the back of the chair and slightly reclining
Breaks | 1 | “When should you take a break from your computer?” | Every 20 minutes
External devices | 1 | “Which external input devices are people encouraged to use?” | Both an external mouse AND keyboard
Arm position | 2 | “Wrists should be . . .” | Kept straight and hovering above keyboard

The Workstation Evaluation Questionnaire used in this study is a simplified version of the Army-Work Area Evaluation Checklist adapted by Karen Jacobs from “Creating the Ideal Computer Workstation: A Step-by-Step Guide” [9]. The version used in this study included 65 questions about characteristics of an individual’s home workstation. The subgroups included work area, desk evaluation, chair evaluation, footrest evaluation, monitor evaluation, keyboard evaluation, lighting and glare, demographic characteristics, and mouse, trackball, and other input devices. Participants completed the questionnaire both before and after the intervention was given.

Participants completed an activity time log over the course of a week before the intervention and six weeks after the intervention. Participants outlined the activities they engaged in hourly. In this activity time log, participants indicated where they completed their work and their level of discomfort or pain. The majority of participants completed the activity time log digitally, with some participants manually filling it out and digitally submitting it. The time log was open ended with space to indicate activities, pain, and location of conducting work for each hour over the course of a week. Participants were asked to report pain location and duration. Severity of musculoskeletal pain and discomfort was not measured in this study.

2.3 Interventions


2.3.1 Group 1 – Information intervention group

After providing all of the necessary materials, the first intervention group was sent two information sheets regarding ergonomics. The two information sheets were the Ergonomic Mousepad [10] and the Home Office Ergonomic Tips [11] published by the American Occupational Therapy Association. Participants were encouraged to review the information on the sheets at their convenience. No measures were used to track review of the documents.

2.3.2 Group 2 – Participatory intervention group

The second intervention group received the same information sheets as the first group and also engaged in a participatory ergonomics workshop. Between two and five participants attended each workshop. The workshops lasted approximately 30 minutes and included a short lecture, practice questions, and problem-solving exercises. The researcher explained the elements of an ergonomically correct workstation before asking the participants to identify common issues and possible solutions. The intention was to give participants the skills to troubleshoot their own workstations and come up with potential solutions.

2.4 Procedure


Researchers obtained Institutional Review Board (IRB) approval for the study. The principal investigator (PI) randomly assigned each participant a code number. The PI created three randomly assigned groups: the control group with 26 participants, the first intervention group with 25 participants, and the second intervention group with 25 participants. The researcher then distributed the materials and asked participants to refer to themselves only by their code numbers so as to maintain anonymity. Approximately one week later, the researcher hosted five ergonomic workshops for the second intervention group. Participants were asked to attend one of the five workshops. Attendance was taken at these workshops, and a total of 15 participants attended the participatory workshops. Ten members of each intervention group did not complete their initial materials and were therefore deemed ineligible to receive the intervention. The researcher distributed the information sheets to the remaining 15 members of the first intervention group and the 15 members of the second intervention group. Approximately six weeks after offering the intervention, the same evaluation materials were distributed to the participants in order to capture any changes participants made to their workstation and rest schedule and any changes in computer-related musculoskeletal discomfort or pain.

3 Data analysis

Data analysis was conducted using Microsoft Excel. The ergonomics quiz was scored out of 10 points to gauge ergonomic knowledge. Participants’ workstations were given a score out of 58 based on how ergonomically correct the workstation was according to the workstation evaluation. Two-tailed paired t-tests were used to examine differences between pre- and post-intervention scores on the ergonomics quiz and workstation evaluation within each group. One-way analysis of variance (ANOVA) tests were run to determine any baseline differences in quiz scores and workstation setups between the three groups. Post-intervention ergonomics quiz scores, changes in ergonomics scores, and changes in workstation setup were also compared using one-way ANOVA tests; no post-hoc analyses were required. The photographs participants submitted were analyzed using the workstation evaluation as a checklist to determine whether the changes made to workstations were improvements or unproductive changes. The researchers manually counted the pain durations recorded on activity time logs.
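
The analysis described above (two-tailed paired t-tests within groups, one-way ANOVA between groups) was run in Excel; the same tests can be sketched in Python with SciPy as a cross-check. The scores below are illustrative placeholders, not the study’s data.

```python
from scipy import stats

# Illustrative quiz scores for one group (NOT the study's data):
# each participant's pre- and post-intervention score out of 10.
pre = [4, 3, 5, 4, 6, 3, 5, 4]
post = [5, 4, 6, 6, 7, 4, 6, 5]

# Within-group change over time: two-tailed paired t-test,
# as used for the pre- vs. post-intervention comparisons.
t_stat, p_paired = stats.ttest_rel(pre, post)

# Between-group comparison at a single time point: one-way ANOVA
# across the control, information, and participatory groups.
control = [4, 5, 3, 4, 6]
information = [5, 4, 6, 5, 5]
participatory = [6, 5, 7, 6, 5]
f_stat, p_anova = stats.f_oneway(control, information, participatory)

print(f"paired t-test: t = {t_stat:.3f}, p = {p_paired:.3f}")
print(f"one-way ANOVA: F = {f_stat:.3f}, p = {p_anova:.3f}")
```

The paired test treats each participant as their own control, which is why it is the appropriate choice for pre/post scores, while the ANOVA compares independent groups at one point in time.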


4 Results

The results show that at baseline there was no significant difference between groups in scores on the workstation evaluation (p = 0.35) or the ergonomics quiz (p = 0.43). Analysis of the post-intervention scores likewise showed no significant difference between groups on either the evaluation (p = 0.39) or the quiz (p = 0.10).

The mean score on the ergonomics quiz before the intervention ranged from 3.5 to 4.36 out of 10 across the three groups, with the control group scoring the lowest and the second intervention group scoring the highest (see Table 4). The scores of the control group and the first intervention group did not change significantly (p = 0.083 and p = 0.095, respectively). The second intervention group, which received the participatory intervention, significantly improved its scores from pre-intervention to post-intervention (p = 0.005).

Table 4

Statistical results (mean and standard deviation)

| Pre-intervention score | Post-intervention score | Change | Paired t-test (p-value)
Ergonomics quiz
  Control group (n = 8) | 3.5 (1.69) | 4.88 (0.99) | 1.38 (1.92) | 0.083
  Information intervention (n = 12) | 3.83 (1.59) | 5 (1.81) | 1.17 (2.21) | 0.095
  Participatory intervention (n = 14) | 4.36 (1.39) | 6.07 (1.27) | 1.71 (1.94) | 0.005
Workstation evaluation
  Control group (n = 8) | 32.63 (7.33) | 33.88 (4.29) | 9.38 | 0.401
  Information intervention (n = 12) | 28.08 (6.50) | 30.17 (6.49) | 11 | 0.101
  Participatory intervention (n = 14) | 29.86 (6.60) | 30.21 (7.50) | 10 | 0.823
Pain (hours)
  Control group (n = 8) | 5.25 | 7 | 2.8 | 0.735
  Information intervention (n = 12) | 15.33 | 15.3 | –2.33 | 0.336
  Participatory intervention (n = 14) | 7 | 6.71 | –0.63 | 0.557

For the ergonomics quiz, Change is the mean change in score (SD); for the workstation evaluation, Change is the mean number of changes made to the workstation.

Mean baseline scores on the workstation evaluation ranged from 28.08 to 32.63 out of a possible 58 points. The control group, first intervention group, and second intervention group all made changes to their workstations (M = 9.38, M = 11, and M = 10 changes, respectively). However, none of the three groups changed their workstations in a statistically significant way (see Table 4). The average number of positive changes that improved participants’ workstations was 1.25, 1.917, and 0.357 for the control, first intervention, and second intervention groups, respectively. The largest issues were that most participants lacked a footrest and an external mouse. The most common changes participants made were to their chairs.

One-way between-groups ANOVA tests were conducted to determine whether participation in an intervention group significantly affected the number of improvements made to a workstation or ergonomic knowledge levels. There was no significant difference in the changes made to individuals’ workstations between the control, first intervention, and second intervention groups (p = 0.72). There was also no significant difference in ergonomic knowledge levels between groups (p = 0.79). Table 2 contains the complete ANOVA results.

Table 2

ANOVA results

Ergonomics quiz

Duration of pain did not significantly change from pre-intervention to post-intervention (see Table 4). A one-way between-groups ANOVA did not indicate a statistically significant difference in pain changes between the participants in the three groups who reported discomfort or pain (p = 0.22).


5 Discussion

The results of our study suggest that receiving an ergonomic intervention does not improve students’ ergonomic knowledge or the changes made to their workstations more than receiving no intervention. Neither the information sheets nor the participatory workshop prompted students to make significant improvements to their workstations. The analysis of the ergonomics quiz, however, supported the impact of a participatory intervention on knowledge of ergonomics. All but one participant in the participatory intervention group improved or maintained their score on the ergonomics quiz. Despite this significant improvement, the participatory intervention group’s scores were not notably better than the mean scores of the other two groups. Scores on the ergonomics quiz averaged around 3.9 out of 10 points at the pre-intervention stage and 5.3 out of 10 points at the post-intervention stage. The low initial scores indicate a potential lack of knowledge about ergonomics and proper workstation setup. These low scores are consistent with Nahar and Sayed’s study, which found that 74% of student participants were unaware of proper ergonomic practices [6]. College students’ inexperience with ergonomics is concerning given the substantial levels of computer-related musculoskeletal pain and discomfort that have been previously recorded [2, 6, 12].

Participants in the information intervention group received the information sheets through email and were told to review them at their convenience. This request did not include a time frame or details on how often to review the information, as we intended to leave that decision to the individual participants. No measures were used to evaluate whether, or how often, participants reviewed these materials. Therefore, we do not know whether participants actively engaged with the written materials, which may account for the lack of statistically significant changes in ergonomic knowledge and workstation setup.

The workstation evaluation included 58 scored questions specific to an individual’s workstation. We completed the evaluation with answers that represented an ergonomically correct workstation. Each participant’s workstation was given a score out of 58 based on how many of their answers matched those of the model workstation. This process was repeated with the post-intervention evaluations. When comparing individuals’ pre- and post-intervention evaluations, it appeared as though most participants made numerous changes to their workstations, with the means for each group falling between 9 and 11 changes. However, the number of actual improvements was far smaller. Improvements were defined as changes that increased the participant’s score, meaning the change made their workstation more similar to the ergonomically correct one. The photographs were used to visually compare the participants’ workstation setups at the pre-intervention and post-intervention stages and to identify improvements and unproductive changes. The mean number of improvements for the three groups fell between 0.3 and 2. This discrepancy indicates that participants filled out the evaluation differently the two times they completed it for reasons other than actively attempting to improve their workstation. One explanation is that participants did not use a tape measure when answering questions and estimated the measurements differently before and after the intervention. Some participants may have entirely changed the location of their workstation, as displayed in Fig. 1a and Fig. 1b. One participant added another workstation in order to rotate in the use of a standing desk, as shown in Fig. 2. Additionally, participants made changes that were not beneficial, such as the participant in Fig. 3a switching to a chair without lumbar support in Fig. 3b. Twenty-two participants completed the evaluation faster after receiving the intervention; this increased speed is another potential reason for the difference between the number of changes and the number of improvements.
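
The distinction between “changes” and “improvements” can be made concrete with a small sketch. The item names and answers below are hypothetical stand-ins, not the actual 58-item instrument: a change is any answer that differs between the two evaluations, while an improvement is a change that moves an answer toward the ergonomically correct model response.

```python
# Hypothetical model answers for a few evaluation items (the real
# instrument scored 58 items; these names are invented for illustration).
MODEL = {
    "monitor_at_eye_level": "yes",
    "feet_flat_or_footrest": "yes",
    "chair_has_lumbar_support": "yes",
    "screen_perpendicular_to_window": "yes",
}

def score(responses: dict) -> int:
    """Score out of len(MODEL): one point per answer matching the model."""
    return sum(responses.get(item) == answer for item, answer in MODEL.items())

pre = {"monitor_at_eye_level": "no", "feet_flat_or_footrest": "yes",
       "chair_has_lumbar_support": "yes", "screen_perpendicular_to_window": "no"}
post = {"monitor_at_eye_level": "yes", "feet_flat_or_footrest": "yes",
        "chair_has_lumbar_support": "no", "screen_perpendicular_to_window": "yes"}

# Total changes: any item answered differently pre vs. post.
changes = sum(pre[item] != post[item] for item in MODEL)
# Net improvement: how much closer the workstation moved to the model.
improvement = score(post) - score(pre)

print(f"{changes} changes, net improvement of {improvement}")  # 3 changes, net improvement of 1
```

As in the study’s data, a participant can register many changes while the net improvement stays small, because some changes (here, losing lumbar support) move the answer away from the model.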

Fig. 1a

Participant at initial workstation. Example of a participant completely changing their workstation location from a family home to an apartment, resulting in many changes including a lumbar supportive chair, armrests, and laptop height, while making potentially unproductive changes such as facing a window.

Fig. 1b

Participant at final workstation. Example of a participant completely changing their workstation location from a family home to an apartment, resulting in many changes including a lumbar supportive chair, armrests, and laptop height, while making potentially unproductive changes such as facing a window.

Fig. 2

A participant used home objects to create a standing desk in her rotation of workstations.

Fig. 3a

Chair with lumbar support. An example of a participant who made an improvement by raising her laptop and an ineffective change by selecting a different chair without lumbar support.

Fig. 3b

Chair without lumbar support. An example of a participant who made an improvement by raising her laptop and an ineffective change by selecting a different chair without lumbar support.


Six questions of the workstation evaluation addressed the use of a footrest. A footrest is a helpful component of a workstation as it impacts the angle of the knee [6]. The first question asked “If your feet do not rest completely on the floor when the chair is properly adjusted, is a footrest used?” The answer choices included “Yes”, “No”, and “I do not have a foot rest”. Only two participants responded that they used a footrest. Thirty-one of the thirty-five respondents stated they did not have a footrest. It is unclear if these participants do not use a footrest because their feet comfortably reach the ground or because they do not own one when one should be used. Additional options such as “My feet comfortably rest on the floor” should have been included in this question.

The second set of materials was requested during a particularly stressful time for students. The university where this study was conducted cancelled the week-long spring break that is usually scheduled and replaced it with multiple, sporadic “wellness days”. A previous study conducted during the COVID-19 pandemic indicates that students’ mental health issues, specifically levels of depression and anxiety, significantly increased during remote learning [13]. This study concluded that the levels of depression and anxiety students experienced returned to baseline during their break [13]. These findings suggest that the lack of a continuous break from school could put increased stress on the students where this study was conducted, as they were not provided the much-needed break from schoolwork to recover their mental health. Participants in this study were asked to submit materials in mid-to-late March, soon after the time when spring break would have normally occurred. This element of increased stress potentially contributed to the high attrition rate and the increased speed with which participants filled out the evaluation.

Levels of discomfort and pain were reported using the activity time log. On the time log, participants were asked to record when and where they experienced pain, including what physical environment they were in at the time. Fifteen participants did not record any discomfort or pain, either because they did not experience pain or because they did not understand the instructions. The other 20 participants indicated an average of 7.8 hours of pain. Common areas of pain included the lower back, upper back, neck, shoulders, and hip regions. These common areas of pain are consistent with previous research on the topic: Nahar and Sayed found neck pain and lower back pain to be the most common areas of musculoskeletal dysfunction [6]. Participants were not asked to identify the magnitude of their pain and therefore reported only the location and duration. Of the participants who reported pain, the information intervention group had the highest mean pain duration post-intervention. Both intervention groups reported less pain on average than before receiving the intervention. Previous studies found that the prevalence of musculoskeletal discomfort among university students was significant [2, 6, 12]. Katz et al. determined there was a statistically significant relationship between hours spent on a computer and musculoskeletal disorders [12]. Hamilton and colleagues attempted to replicate these findings, but concluded there was not a causal relationship between the two variables [2].


5.1 Limitations

A significant limitation of the study was participation, including attrition and small group sizes. Recruiting participants was a major challenge because of the COVID-19 pandemic. Information was distributed through residence halls, clubs, and social media posts, which may not have garnered attention or interest in the same way in-person recruitment could have. There were high attrition rates at the beginning of the study, with approximately 10 participants from each group not returning materials or dropping out of the study. An additional 10 participants left the study during the six-week period between the intervention and the final data collection. There are many potential reasons for the high attrition rate. Students may have lacked interest in the topic or time to participate. The university where this study was conducted cancelled its spring recess, which may have led students to believe they did not have time to participate amid the mounting pressure of schoolwork. The number of participants was low to begin with, which the high attrition rate exacerbated. The study offered no incentive for participation.

The six-week period between the intervention and the post-intervention evaluation may not have been enough time for participants to make adequate changes to their home workstations. Potential barriers to workstation changes include financial situations, time constraints, spatial restrictions, and motivation levels. The COVID-19 vaccine rollout became more efficient during this six-week period, and the university announced a return to in-person instruction for the Fall 2021 semester. This announcement may have been a deciding factor for students not to make changes, as they then knew their home workstation would not be the location of their future classes.

Providing participants with information sheets cannot justifiably be considered a full intervention, since engagement with the materials was not tracked. Therefore, data from this group may lead to inaccurate conclusions. Relying on a type of intervention that may not have been effective hindered the study significantly.

5.2 Future research

Future research should take a longer longitudinal approach. A longer timeline and a larger number of participants could significantly impact the results. Furthermore, using different or more objective evaluation measures, such as additional photographs or an occupational therapist or professional ergonomist evaluating individual workstations, may give future researchers a better understanding of at-home workstation setups. A similar study with a more robust intervention, such as an ergonomic assessment with recommendations specific to an individual’s workstation, is needed to properly assess the efficacy of ergonomic interventions for remote college students. Our findings indicate that a dedicated measure of musculoskeletal pain and discomfort in future studies would allow easier interpretation of changes in participants’ discomfort or pain levels.


6 Conclusion

As work-from-home becomes a universal experience and the “new normal,” it is of utmost importance to educate students on ergonomics to help prevent musculoskeletal pain and discomfort. Research in this area is forthcoming as the scientific community continues to address relevant and critical topics. Students learned a significant amount about ergonomics through the participatory ergonomics workshop offered in this study. However, the six-week period between the intervention and follow-up data collection was not long enough for impactful changes to be made. Visual assessment of workstations showed that some changes were made using homemade items. These results show promise that a more in-depth intervention could have a meaningful impact on university students’ home workstations.

Conflict of interest

None to report.



References

[1] Davis KG, Kotowski SE, Daniel D, Gerding T, Naylor J, Syck M. The home office: ergonomic lessons from the “new normal”. Ergonomics in Design: The Quarterly of Human Factors Applications. 2020;28.

[2] Hamilton AG, Jacobs K, Orsmond G. The prevalence of computer-related musculoskeletal complaints in female college students. Work. 2005;24(4):387–94.

[3] Jacobs K, Johnson P, Dennerlein J, Peterson D, Kaufman J, Gold J, Williams S, Richmond N, Karban S, Firn E, Ansong E, Hudak S, Tung K, Hall V, Pencina K, Pencina M. University students’ notebook computer use. Applied Ergonomics. 2009;40(3):404–9.

[4] Baker NA, Jacobs K. The feasibility and accuracy of using a remote method to assess computer workstations. Human Factors. 2014;56(4):784–8.

[5] Jacobs K, Blanchard B, Baker N. Telehealth and ergonomics: A pilot study. Technology & Health Care. 2012;20(5):445–58.

[6] Nahar S, Sayed A. Prevalence of musculoskeletal dysfunction in computer science students and analysis of workstation characteristics – an explorative study. International Journal of Advanced Research in Computer Science. 2018;9(2):21–7.

[7] Ritchie CLW, Miller LL, Antle DM. A case study detailing key considerations for implementing a telehealth approach to office ergonomics. Work. 2017;54(4):469–73.

[8] Robertson MM, Amick BC 3rd, Hupert N, Pellerin-Dionne M, Cha E, Katz JN. Effects of a participatory ergonomics intervention computer workshop for university students: a pilot intervention to prevent disability in tomorrow’s workers. Work. 2002;18(3):305–14.

[9] Department of Defense Ergonomics Working Group. Creating the Ideal Computer Workstation: A Step-by-Step Guide. 2000. pp. 33.

[10] Jacobs K. Ergonomics for your Notebook Computer Workstation Mousepad. Department of Occupational Therapy, Boston University; 2008.

[11] Dorsey J. Home Office Ergonomic Tips. American Occupational Therapy Association; 2020. pp. 1–2.

[12] Katz JN, Amick BC, Carroll BB, Hollis C, Fossel AH, Coley CM. Prevalence of upper extremity musculoskeletal disorders in college students. The American Journal of Medicine. 2000;109:586–8.

[13] Huckins JF, daSilva AW, Wang W, Hedlund E, Rogers C, Nepal SK, Wu J, Obuchi M, Murphy EI, Meyer ML, Wagner DD, Holtzheimer PE, Campbell AT. Mental health and behavior of college students during the early phases of the COVID-19 pandemic: longitudinal smartphone and ecological momentary assessment study. Journal of Medical Internet Research. 2020;22(6).