Research Article

A Novel Framework for Analysis of Lower Limb Movements: Integration of Augmented Reality and Sensor-based Systems

Edward Davis1, Riki H. Patel1*, B. Sue Graves3, Vivek Sawhney2, Manish Gupta2, Abhijit S. Pandya1,2

1Department of CEECS, Florida Atlantic University, Boca Raton, FL, USA

2Sports and Orthopedic Center, Coral Springs, Florida

3Department of Exercise Science and Health Promotion, Florida Atlantic University, Boca Raton, FL, USA

*Corresponding author: Patel RH, Department of CEECS, Florida Atlantic University, Boca Raton, FL, USA

Received Date: 07 June, 2022

Accepted Date: 20 June, 2022

Published Date: 27 June, 2022

Citation: Davis E, Patel RH, Graves BS, Sawhney V, Gupta M, Pandya AS (2022) A Novel Framework for Analysis of Lower Limb Movements: Integration of Augmented Reality and Sensor-based Systems. Sports Injr Med 6: 183. DOI: https://doi.org/10.29011/2576-9596.100183

Abstract

In this paper, an augmented reality device was coupled with motion sensor units to function as a system of cooperative technologies for usage within exercise science and neurorehabilitation. Specifically, in a subfield of exercise science called biomechanics, the assessment and analysis of movements are critical to the evaluation and prescription of improvements for physical function in both daily and sport-specific activities. Furthermore, the systematic combination of these technologies provided potential end-users with a modality in which to perform exercise, along with correlated feedback based upon the end-user's exercise performance. Data collection specific to biomechanics can provide both the end-user and their evaluators with critical feedback that can be used to modify movement efficiency, improve exercise capacity, and evaluate exercise performance. By coordinating both technologies and completing movement-based experiments, the systems were successfully integrated.

Keywords: Magic Leap; AR/VR; Lower Limb Movements; Wearable Sensors

Introduction

AR is a relatively new technology that bears the potential to be used in motor learning and skill acquisition specific to exercise by allowing the individual to perform exercise safely, as they can still see their environment. In short, the primary benefit of AR is that virtual objects can be integrated and overlaid for the end-user to observe in the real world while completing movement-based tasks. Although some activity-based video games (AVGs), like Pokémon GO, involve AR, they are less immersive when compared to fully immersive devices like the Microsoft HoloLens 2 or the Magic Leap 1 (ML1). Meanwhile, devices like three-dimensional motion capture (3-D MOCAP), the gold standard of motion assessment, are not practically affordable for everyday usage by the general population. From an observational standpoint, AR devices take up minimal space on the end-user's body during activity; thus, when participating in activity, other supporting technologies can be worn for a full quantitative assessment.

Despite their projected usage within the field of exercise science, AR devices do not record movement-based data for biomechanical evaluation, because that is not their intended use. To account for this deficiency and achieve the desired function, wearable sensors capable of assessing motion about a joint can be worn to record and analyze data while the end-user performs movement. Thus, multiple technologies with differing primary functions can be integrated to achieve a complex objective and, in turn, create a new host modality by which individuals can perform, monitor, and track exercise performance, biomechanical improvements in motor pattern development, and sport-specific testing and assessment.

As AR devices are a reality-based technology that allows for the integration of virtual objects into the real-world environment [1-3], they are primarily utilized for medical, military, and educational purposes, specifically, the skill-based training of personnel to handle field-specific situations and simulations properly [2-4]. Because AR is a relatively new technology, exercise science and neurorehabilitation form a potential field of application in which limited research has been performed. Since exercise is involved in both rehabilitation and regular physical activity, tracking movement patterns for analysis is critical to achieving performance-related improvements. Concurrently, activity-based AR applications provide a modality in which an individual can take part while emphasizing exercise participation. However, despite providing the modality by which exercise would be performed, there is currently no means of evaluating movement patterns internal to the ML1 device. Thus, an external system composed of multiple sensor devices must be integrated to function cooperatively with the AR device, creating an integrated system with which the end-user can perform movement and observe post-participation feedback based on their executed movements. Therefore, the purpose of this study was to integrate the AR device produced by Magic Leap, the Magic Leap One (ML1), with three WitMotion sensors to track physical movement during participation in an AR-based activity-based video game.

This study aimed to integrate multiple technologies into a system that can support a modality in which end-users can perform exercise. In pursuit of this aim, the researchers set specific goals and timeline objectives. The most important was to ensure that the three-dimensional axes innate to the WitMotion sensor technology recorded data matching the three planes of actual motion, and to determine which axis matched each respective plane. Once this was accomplished, the sensors were coordinated for data acquisition using the WitMotion software. Finally, a simulated experiment was performed using an in-house designed application to observe the integration of the ML1 device and the sensor technology. Thus, the primary objective was the integration of the ML1 device and the correlated WitMotion sensor technology, while providing a framework for the analysis of lower limb function.

This paper is organized as follows. In section 1, a simplified introduction presents the technologies used and their intended purpose within the fields related to exercise science. In section 2, the full detail-oriented background for all pertinent information is provided; in this section, the critical information is given to create a practical and theoretical connection, or bridge, between engineering and exercise science using concepts present in biomechanics, exercise science, and computer engineering. Next, in section 3, the complete methodologies enacted are listed in phase-oriented order. Then, in section 4, key findings and related concepts are discussed, and connections are made between the findings in the results section and the practical applicability of these findings within the fields related to exercise science. In section 5, the conclusion and the intended future direction for the current pipeline of research are established for the planning and development of a potential application and other steps toward ensuring that AR technology fills a niche in the fields related to exercise science.

Background

Activities like exercise, and jobs requiring skilled expertise and learning proficiency, require individuals to undertake intensive training and practice to promote and maintain skill acquisition and the retention of sport-specific or job-specific skills. Specifically, skill-based learning involving neurological components and correlated adaptations is known in the sports sciences as motor learning [1]. By definition, motor learning is the repetitive process by which the neurological system dynamically adapts to the applied workloads and stressors experienced and managed by the physical structures that play a role in joint motion and human movement [2]. For example, athletes, and anyone who partakes in a planned exercise regimen, aim to learn the required movement patterns via multiple repetitions over several sessions to ensure the intended movements are completed precisely and without error. The concept of motor learning also applies to a wide variety of career specialties, including mechanics, surgeons, phlebotomists, and tactical operators, whereby each career path requires extensive learning and repetitive practice of skill-based activities [3]. Thus, motor learning is applicable in various specialized fields and, for the focus of this paper, in exercise science, biomechanics, and even physical therapy.

Applicability of motor pattern development also extends into physical therapy and other healthcare fields involving the assessment or evaluation of human movements. For example, the ability to walk, balance on two feet, or perform a single-joint or multi-joint movement with minimized pain or compensatory motions are common end goals for clinicians when seeing an injured patient population [4]. There are inherent similarities between the goals and objectives of exercise training and physical therapy; however, the primary difference is that in exercise training, the focus is on the creation and guidance of a proper movement pattern, whereas in physical therapy, the focus is on the restoration of a proper or functional movement pattern and the elimination of compensatory motions [5]. Therefore, targeted approaches can be designed uniquely for each field of specialization.

Recent advances in technology, specifically the development of wearable augmented reality (AR) devices, allow for the creation and subsequent implementation of new modalities in exercise testing and assessment. Each modality can be attuned and specified to the needs of a specific sport or individual by creating an application encompassing all correlated approaches and methodologies pertinent to the progression and development of the individual through a planned exercise regimen guided by AR technology. Wearable AR devices allow the end-user to partake in physical activity while completing a set of virtually overlaid objectives as part of an activity-based video game (AVG). Although AR devices do not currently allow for the direct assessment of exercise-based values, other wearable sensor-based technologies can accompany the primary device to evaluate exercise-specific values. Therefore, technologies can be strategically combined and integrated to obtain the necessary data and information to assess and progress the end-user.

The primary motivation behind this paper is to create a foundation for this technology and its applicability within the fields related to the exercise and rehabilitative sciences. The end goal for this pipeline of research is to design, develop, and test a working application for AR devices that can be used as an exercise-based testing modality with variable testing protocols and parameters, functioning as simulated and specialized exercise training assessments. Therefore, it is critical to explain, in detail, the theoretical and practical applications of the utility that this technology can provide to end-users and specialists within these fields.

Related works and technologies

WitMotion is not the only device that can assess motion and movement. However, this device was selected due to its ready availability for purchase on the Amazon marketplace and its cost-effectiveness. There are other notably similar technologies outside the reality-based technologies market involved in kinetic movement assessment and motion analysis during activity-based tasks; notable examples include Kinovea, Trazer, and 3-D motion capture systems [6].

Kinovea is a free-to-use, publicly sourced, computer-based movement analysis software that allows the end-user to analyze collected kinematic and kinetic data from an uploaded video file [7-10]. However, Kinovea requires technical knowledge in kinesiology and biomechanics to assess the collected information accurately [11]. Specifically, before an analysis is completed, the end-user must place markers to the best of their ability based on the anatomical structures and locations of critical biological landmarks [12]. Concurrently, Kinovea requires the end-user to manually prepare and analyze the data for sport-specific movements [12-15]. Using Kinovea, the end-user can perform exercise and obtain a visual assessment with a correlated feedback response that provides information pertaining to their exercise training session or performed movement. In the most basic sense, Kinovea functions as a 2-D marker-based video analysis software tool and is thus dependent upon how many cameras or capture technologies the end-user has deployed for data collection.

Trazer is an expensive 2-D markerless motion analysis system marketed for clinical use. The components of the Trazer system are as follows: a Microsoft Kinect camera, a CPU unit, a TV monitor that provides a virtual representation of the end-user moving through motion in real-time, and proprietary software that assesses time-to-completion and task-based objectives data during an activity-based assessment [16]. Thus, it is very similar to any AVG released on the Xbox console series between 2010 and 2013, when the Kinect device reached peak popularity [17]. When using the Trazer system, the end-user has a mirrored virtual avatar that follows their performed movements. This iteration of 2-D motion capture can assess most simple movements that have specific phases of motion occurring in the frontal plane, since the camera is always in front of the end-user, and the end-user's torso is always facing the device [17]. Due to Trazer's 2-D markerless evaluation and its similarity to other markerless motion capture systems, if the system's view of the end-user's joints becomes obscured or blocked, Trazer may have issues identifying joint position and location, thus hampering the assessment of complex movements or any activity that involves movement in planes of motion other than the frontal plane [18]. Trazer and other 2-D motion analysis software, like Nintendo's Wii Fit, bear a strategic similarity in that the end-user is provided with a form of visual biofeedback using a virtual on-screen avatar that represents the end-user's movements in real-time [19]. However, Trazer's intended use is scientific in nature, while the intended use of other 2-D motion analysis software, like the Wii Fit or Microsoft Kinect, is commercial consumer use [19].

Three-dimensional motion capture (3-D MOCAP) is considered the gold standard within motion analysis [20]. This technology has seen significant usage in the movie industry for creating near life-like replications of characters, animals, and creatures in movies and television shows. However, laboratory- and professional-grade 3-D MOCAP analysis systems are the most expensive, yet most accurate, option available for motion analysis [21]. Due to the cost, in the sport setting, these devices appear to be used primarily by professional sport organizations, within laboratory settings, and by top-tier athletes at specialized training facilities. Concurrently, these systems require either a specialized suit containing multiple marker-based units or multiple wearable tracking devices to derive data for movement analysis [Xsens 3-D MOCAP]. Thus, these devices, albeit the most accurate, are the most impractical option for the general public due to pricing alone [22]. Therefore, comparatively speaking, due to the limitations of the other available and existing technologies, AR appears to be the most cost-effective option that, with supporting technologies, can be made into an all-inclusive system for motion and movement analyses.

Magic Leap AR device

Three hardware units comprise the ML1 device: the goggles (Lightwear), the CPU (Lightpack), and the remote control [7]. Figure 1 shows pictographic images of the hardware components. The device is powered on using the buttons on the Lightpack and remote control [7]. Once powered on, the end-user is brought to the home screen [7]. The device can be turned off at any time by selecting the battery icon and subsequently selecting the power-off option [7]. If the end-user selects an application, with either the remote control or the touch pad, they are brought to the title screen of the application [7]. If the end-user wishes to return to the home screen, they can hold the home button [7]. End-users also have two separate options for closing an application: a standard method provided by Magic Leap or one implemented by the developer [7]. The ML1 device operates on the Lumin OS, uses the Lumin Runtime for APIs and Lumin SDKs, and applications for it can be designed using either the Unity or Unreal Engine game engines [7].


Figure 1: The ML1 device contains the Lightwear (goggles), Lightpack (CPU), and remote control.

Sensor technology

An attitude and heading reference system (AHRS) is a sensor that contains a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer, which together make up an internal component of the AHRS device called an inertial measurement unit (IMU) [23]. The AHRS IMU sensor allows for three-dimensional assessment and analysis of an object's movement in real-time [23]. Specifically, linear and angular acceleration, velocity, and position are assessable measurements while wearing an AHRS IMU sensor [23]. WitMotion is a company that has designed an AHRS IMU sensor with Bluetooth capability for use within various fields, including both engineering and the movement sciences [26]. When paired with multiple other AHRS IMU devices, this sensor can assess movement about a joint and its biomechanical interaction with other joints during exercise activity [24].
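To make this arithmetic concrete, the following minimal Python sketch estimates linear velocity from accelerometer samples via trapezoidal integration. The function name is illustrative, and the sketch assumes gravity-compensated input; a real AHRS fuses the gyroscope and magnetometer internally to limit drift.

  import numpy as np

  def velocity_from_acceleration(accel_ms2, dt):
      """Estimate linear velocity (m/s) from acceleration samples (m/s^2)
      via cumulative trapezoidal integration. Assumes gravity-compensated
      input; real AHRS firmware fuses all three sensors to limit drift."""
      v = np.zeros_like(accel_ms2)
      v[1:] = np.cumsum((accel_ms2[1:] + accel_ms2[:-1]) * 0.5 * dt)
      return v

  # Example: 1 s of samples at 100 Hz under a constant 0.5 m/s^2 acceleration.
  print(velocity_from_acceleration(np.full(100, 0.5), dt=0.01)[-1])  # ~0.495 m/s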

As the other pertinent technology for this study, the WitMotion sensors seen in Figure 2 collect and provide data pertaining to the movements performed. A free-to-use monitoring application made by WitMotion should be downloaded in advance so the end-user can receive the collected data specific to the movements performed from the worn accelerometers. Once the accelerometers are placed, the end-user opens the application and connects the devices to their mobile device via Bluetooth. Following a short set-up after a device selection screen, the end-user can then perform movement, and the accelerometers track the acceleration and velocity of their body in motion. Upon completion of the physical activity, the data can be transferred to a CSV file for export and analysis. However, it should be noted that each sensor device requires its own Android device to capture data through the downloaded WitMotion software. That is, if recordings from multiple sensors are intended, an equal number of Android devices is required, along with an individual operator for each device to manually capture the recorded data.


Figure 2: The WitMotion sensor and all items needed to complete the wearable function.
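As an illustration of the CSV export step described above, the sketch below loads one joint's exported recording with pandas. The column names and file name are hypothetical, since the exact headers depend on the WitMotion app's export format.

  import pandas as pd

  # Hypothetical column names; adjust to match the headers the WitMotion
  # app actually writes to its exported CSV file.
  COLS = ["time_s", "wx_deg_s", "wy_deg_s", "wz_deg_s"]

  def load_sensor_csv(path):
      """Load one joint's exported recording and index it by time."""
      return pd.read_csv(path)[COLS].set_index("time_s")

  # Tiny synthetic file standing in for a real export, so the sketch runs.
  pd.DataFrame({"time_s": [0.00, 0.01], "wx_deg_s": [0.1, 0.2],
                "wy_deg_s": [0.0, 0.1], "wz_deg_s": [1.5, 1.7]}).to_csv(
      "ankle_walking.csv", index=False)
  print(load_sensor_csv("ankle_walking.csv").describe())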

In [25], concern existed over the accuracy of data collected from a similar sensor-based movement system. Since motion occurs about a joint [25], the researchers believe that the best placement for accuracy of recorded measures from sensor-based systems is at, or as close as possible to, the joint itself. Concurrently, all measures are recommended to be taken on the right side of the body for uniformity throughout the measurement [26]. Upon purchase of the WitMotion sensors, it was found that no wearable textiles were supplied with the devices. Thus, all accelerometers used in data collection for this paper were outfitted with a customizable elastic band and a buckle to complete the wearable component of the sensor technology.

Previous research in the current literature suggests that sensors on the lower limbs are commonly placed at the center of a limb segment; for example, when assessing knee mechanics, sensors are placed both above and below the knee, as close to the middle of the upper and lower leg segments as possible while avoiding uneven surfaces [27]. Considering that the lower limbs and their respective joints function as a system during human movement, and that motion occurs about each joint, positioning the sensor devices as close as possible to the joint where motion occurs is critically important for the accuracy of collected kinetic and kinematic data [28]. Thus, using this information and easily identifiable landmarks on the human body [29], Figure 3 depicts the placements, described in detail below, selected to prevent slippage or sliding during movement while maintaining proximity to the joint being assessed. Although a five-sensor system is preferred, due to current limitations in assessing multiple devices concurrently, a three-sensor system assessing the ankle, knee, and hip is acceptable, pending future production of an updated application that allows multiple sensors to record simultaneously and provide output on a single user interface page for ease of use by the operator.


Figure 3: WitMotion device recommended placement locations.

As seen in Figure 3, the three locations selected are listed below from proximal to distal and are subsequently described in the next paragraph:

1) Hip: Lateralmost placement in line with the umbilicus.

2) Femoral Head: A double-buckle tension placement allows the end-user or researcher to adjust the placement over the femoral head.

3) Knee: Placing the device directly over the centralmost lateral aspect of the knee may produce unwanted shifting during movement; thus, the closest practical placement is one inch above the lateral femoral epicondyle.

Each placement location was selected for its proximity to an easily identifiable bony landmark. Specifically, the first location was selected because the umbilicus, or belly button, is an easily identifiable location by which the WitMotion sensor can be wrapped and clipped around an individual's waistline, close to the superior border of the iliac crest [26]. Of all sites listed, the hip will be the most difficult at which to maintain sensor placement; because the sensor devices cannot be placed directly over the hip with a single strap, a two-strap buckled unit, as shown in Figure 3, was designed in the hope of maintaining placement over the femoral head during movement. The knee, being an uneven surface, may induce sliding or slippage of the sensor [27]. Therefore, to minimize slippage and provide an approximate definition of the sensor's location, it was placed one inch above the lateral femoral epicondyle of the right knee. The ankle sensor was placed one inch above the lateral malleolus for similar reasoning. Finally, the foot sensor was placed on the superior aspect of the foot so that the device itself is protected during exercise.

Methodology

The methodology used in this study involved the integration of an AR device and multiple motion sensors to create a system that provides feedback on movements performed. This process included using a pre-existing movement-based application installed on the AR device. The items required to complete this study were as follows: the ML1 device with the "items pick up" application installed, three motion sensors, the WitMotion software, three Android mobile devices to capture the sensor data through the WitMotion software, a foldable chair, an ankle weight, access to a set of stairs, a metronome application, a goniometer, and a laptop to compile all captured data. This study's methodology is split into three subsections: an exploratory phase, an experimental phase, and an assessment phase.

Exploratory phase

During this phase, the goal was to familiarize the researchers with each technology while obtaining a practical understanding of the intended usage behind each technology and the collected sensor data. Since the sensor technology would be the only technology that outputs collected data, the focus was primarily on understanding how it can function synchronously with the ML1 AR device. The researchers practiced using the sensor technology during movements to familiarize themselves with the output data. By practicing exercises while observing the recorded data in real-time, three hypotheses were constructed regarding the functionality of the sensor technology. The three hypotheses are listed below:

  1. Each sensor device had a static three-dimensional coordinate system, meaning the axis assigned to each plane of motion changed depending upon the sensor's placement and orientation on the human body.
  2. Each axis of the sensors' three-dimensional coordinate system matched a specific plane of motion.
  3. When engaged in exercise, the intended axis for either a single-joint or multi-joint movement was easily identifiable as the axis that produced the most significant velocity.

Thus, it was hypothesized that, following identification of the intended axes, the sensor technology would output informative data regarding what occurs at each joint while the end-user engaged in a single task.

Experimental phase

During this experiment, several movement-based tasks were completed. The intended purpose of these experiments was to perform an exercise and comprehensively assess the graphical data collected from the sensor-specific software. Thus, it was imperative to select both single-joint and multi-joint movements that could be easily completed without disrupting the ability of the sensor technology to capture and record accurate data. The selected movement-based tasks were walking, pivoting, and sit-to-stand, with varying conditions. The final task was an assessment completed during participation in the "items pick up" application on the Magic Leap ML1 device while wearing the sensor technology. Concurrently, as specified in Figure 3 on sensor placement, the location of each sensor on the body aligned with a specific joint, and data were recorded on the movement that occurred about that joint during a specified exercise task. Due to a limitation within the sensor technology's software, which is revisited in more detail in the discussion of results, three observers were required to capture and record all data from each joint during each task. With the WitMotion software, each movement-based task ended with the process of capturing and compiling the recorded data; therefore, the data from each task's final stage were captured and compiled for analysis.
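Because each joint's recording is captured on a separate Android device, compiling the data amounts to aligning three independent time series. The following is a minimal Python sketch of that compilation step, using synthetic stand-ins for the per-joint exports and an assumed 10 ms alignment tolerance.

  import numpy as np
  import pandas as pd

  # Synthetic stand-ins for the three per-joint exports, each captured on
  # a separate Android device by its own observer (file I/O omitted).
  def fake_recording(seed):
      rng = np.random.default_rng(seed)
      return pd.DataFrame({
          "time_s": np.arange(0, 1, 0.01) + rng.normal(0, 0.002, 100),
          "wz_deg_s": rng.normal(0, 1, 100)}).sort_values("time_s")

  joints = {"ankle": fake_recording(0), "knee": fake_recording(1), "hip": fake_recording(2)}

  # Align the independent recordings on the nearest timestamp; the 10 ms
  # tolerance is an assumption to tune against the real sampling rate.
  merged = None
  for joint, df in joints.items():
      df = df.rename(columns={"wz_deg_s": f"{joint}_wz_deg_s"})
      merged = df if merged is None else pd.merge_asof(
          merged, df, on="time_s", tolerance=0.01, direction="nearest")
  print(merged.head())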

Walking task

This task was meant to emulate what an individual would be doing while participating in any movement-based AR application. For example, the "items pick up" application involved the individual walking short distances to interact with digitally produced objects within the real-world environment. Concurrently, walking is a necessary movement involved in daily life and activity, non-specific to age. Thus, for this task, the individual started in an upright standing position with both feet comfortably shoulder-width apart in a double-leg support stance. When ready, the individual led with their right leg and proceeded through the phases of motion specific to the walking movement. After completing two full steps, the individual stopped.

Pivot task

Pivoting is an important technique used for changing direction in both regular physical activity and sport performance. This rotational task required the individual to stand in an athletic position and rotate at the ankle, channeling a full-body rotation that allowed the individual to change direction. Once the change-of-direction movement was completed, the individual finished in a split-stance position.

Sit-to-stand task

The sit-to-stand task is a movement used in physical therapy and in the evaluation of the elderly, as referenced in the Background section on exercise theory and practical applications. During this task, the individual began in a seated position, then stood up into an upright position. The individual finished this task when standing in the upright position.

Magic Leap and sensor technology integrated assessment

The final task consisted of engaging in the "items pick up" application downloaded on the Magic Leap ML1 AR device while wearing the sensor technology used in the previous tasks. During this task, the individual, wearing both sets of technologies, partook in a video game construct in which they walked and interacted with the objects that appeared within their field of view. The individual started in an upright standing position; when the game began, the individual reached out and pinched their hand to form a recognized gesture that the technology identified, causing the selected item to disappear. The individual was requested to interact with three objects to complete the task.

Results and analysis

As mentioned previously, three hypotheses were formulated, which culminated in one final post-integration hypothesis. Thus, four hypotheses were generated and subsequently tested; each was accepted or rejected as detailed in this section. The first hypothesis stated that each sensor device was believed to contain a static three-dimensional coordinate plane system, meaning the intended axes for the expected planes of motion changed dependent upon sensor orientation and placement. Specifically, the x-, y-, and z-coordinate planes were expected to correlate with the frontal, sagittal, and transverse planes of motion, respectively, according to a traditional anatomical configuration. However, after data collection, it was evident that this was not the case. As shown in Figure 4, the highest peaks in velocity for the walking task, which occurred primarily in the sagittal plane, were observed on the z-coordinate axis. Similarly, as shown in Figure 5, movement during the pivot task occurred primarily in the transverse plane, and the highest peaks in velocity were observed on the y-coordinate axis. Since the z-coordinate axis described movement in the sagittal plane, and the y-coordinate axis described movement in the transverse plane, it can be concluded that the x-coordinate must describe movement in the frontal plane. Therefore, as the expected and determined outcomes for the identified axes and their respective correlated planes of motion differed, the first hypothesis was accepted.

The second hypothesis stated that, due to the static nature of the three-dimensional axes innate to the motion sensor device, each axis's collected velocity data would define movement in a specific plane of motion. This resulted in the need to identify each axis and the plane of motion it graphically represented. The identification process was simplified by selecting specific movement-based tasks in which an individual generated momentum through various phases of motion in one primary direction. By definition, velocity is a vector quantity that assesses the speed of movement in meters per second (m/s) in a specific direction. Interpreting the magnitude of the velocity graphs allowed for the identification of each axis's correlated plane of motion by performing movements that occur primarily in one direction, or in this case, one plane of motion. Thus, the direction and magnitude of the velocity vector following performance of a movement identify which of the three axes controls which plane of motion. Once two were identified, simple deductive reasoning elucidated the plane represented by the final axis. Using the velocity data from the walking task and the pivot task, the x-, y-, and z-coordinate axes were determined to represent movement occurring in the frontal, transverse, and sagittal planes, respectively. This was confirmed in several other movement-based tasks. Therefore, the second hypothesis was accepted.
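This deduction reduces, for a single-plane movement, to selecting the axis with the largest peak angular velocity. A minimal Python sketch of the identification step follows, using the axis-to-plane mapping determined in this study and invented angular-velocity values.

  import numpy as np

  # Mapping determined in this study: walking isolated z (sagittal), the
  # pivot isolated y (transverse), and x follows by elimination (frontal).
  AXIS_TO_PLANE = {"x": "frontal", "y": "transverse", "z": "sagittal"}

  def dominant_axis(wx, wy, wz):
      """Return the sensor axis with the largest peak angular velocity for
      a movement performed primarily in one plane of motion."""
      peaks = {"x": np.max(np.abs(wx)), "y": np.max(np.abs(wy)), "z": np.max(np.abs(wz))}
      return max(peaks, key=peaks.get)

  # Synthetic walking-like trace: sagittal-plane motion dominates the z-axis.
  axis = dominant_axis(np.array([0.1, -0.2]), np.array([0.3, 0.1]), np.array([1.8, -2.1]))
  print(axis, AXIS_TO_PLANE[axis])  # -> z sagittal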

The third hypothesis stated that, once an individual completed an exercise performance, the collected graphical data from the WitMotion sensor technology would assist in identifying the intended axis in both single-joint and multi-joint movements based on the axis that produced the most significant velocity. Thus, several other movement-based tasks were selected to assist in accepting or rejecting this hypothesis. As specified between the first two hypotheses, the identification process was simple for movements occurring about the ankle through one plane of motion, like the walking task and the pivot task.

Each movement that proved difficult had a specific reason why it was more troublesome to assess. Specifically, the sit-to-stand movement had a pre-test setup that positioned the user's legs under themselves to ensure postural support; however, this also minimized movement occurring at the ankle and knee. Thus, the hip sensor data was found to show the transition from the seated position to the standing position, since the majority of motion occurred at the hip during this movement. The third hypothesis was therefore conditionally accepted, as the intended axes were identified in the majority of the selected movement-based tasks using the velocity graphs collected from the WitMotion software. However, it is evident that further research is warranted on motion technology and the effects of muscular contraction during exercise performance on the accuracy of collected data.

The final integrated hypothesis stated that, after the exercise performance, the graphical data would provide information on what occurred during each movement, meaning that by looking at the graphs, information could be obtained and analyzed to assess for abnormalities in the movement pattern. For example, looking at Figure 4 for the ankle during the walking task, the phases of the movement can be seen in the graph. Since the individual led with their right foot, the right-foot heel-strike phase occurs around 74 seconds and proceeds through to the toe-off phase around 74.5 seconds. From there, the largest peaks in the velocity graph indicate the swing phases of the right leg during the walking movement task. Thus, if the individual had a motor pattern issue resulting from an injury or mechanical complications, it could be identified using the sensor technology and the graphical outputs. Therefore, the final integrated hypothesis was accepted.
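One way to operationalize this reading of the graphs is simple peak detection on the velocity trace. The sketch below uses SciPy's find_peaks on a synthetic stand-in for the exported ankle velocity, with the time window borrowed from Figure 4 and the peak-height threshold an arbitrary assumption.

  import numpy as np
  from scipy.signal import find_peaks

  # Synthetic stand-in for the exported ankle z-axis velocity; in practice
  # this trace would come from the WitMotion CSV described earlier.
  t = np.linspace(73.5, 76.5, 300)
  vz = np.sin(2 * np.pi * (t - 73.5)) * np.exp(-0.2 * (t - 73.5))

  # The largest velocity peaks correspond to the swing phases of the leg;
  # the height threshold is an assumption to tune against real recordings.
  idx, _ = find_peaks(np.abs(vz), height=0.5)
  print("candidate swing-phase times (s):", np.round(t[idx], 2))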

It should also be noted that within the final integration task, participation in the pre-designed in-house "items pick up" app required the individual to walk in order to interact with the fruit objects. Thus, in comparison, the graphs collected at the ankle for the walking task and the integrated task are very similar. However, more noise can be seen in the integrated task. It is believed that this noise is a result of the individual's focus being not on performing the movement, but instead on interacting with the fruit. Therefore, by using the AR-based application as a modality to host exercise assessment, more data may be collected on how the individual performs a given movement when not directly thinking about it.
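The additional noise in the integrated task can also be quantified rather than judged visually. Below is a rough sketch that scores a trace by the RMS of the residual left after moving-average smoothing; the data are synthetic and the window size is an assumption.

  import numpy as np

  def noise_rms(signal, window=9):
      """Crude noise score: RMS of the residual after subtracting a
      moving-average smooth of the trace."""
      smooth = np.convolve(signal, np.ones(window) / window, mode="same")
      return float(np.sqrt(np.mean((signal - smooth) ** 2)))

  # Synthetic comparison mirroring the observation above: the integrated
  # task resembles walking plus extra high-frequency noise.
  rng = np.random.default_rng(0)
  walking = np.sin(np.linspace(0, 6 * np.pi, 400))
  integrated = walking + rng.normal(0, 0.15, 400)
  print(noise_rms(walking), noise_rms(integrated))  # the second is larger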

Walking task


Figure 4: Respective joint velocities during the walking task at the ankle, knee, and hip.

During the walking task, the individual followed the protocol specified in the walking task subsection of section 3 while the wearable sensor technology recorded motion data throughout the movement-based task. Figure 4 shows the respective joint velocities during the walking task at the ankle, knee, and hip. The static z-coordinate is responsible for recording the z-axis for the human body, while the static x- and y-coordinates are responsible for recording the y-axis and x-axis for the human body, respectively. In these terms, the x-axis is where forward-to-backward movement occurs, the z-axis covers up-and-down movement, and the y-axis covers side-to-side movement. Looking at each graph in reference to this movement, the x-axis is responsible for movement within the frontal plane. Concurrently, in the knee velocity graph, the x-coordinate shows the knee moving along the y-axis while moving through the pendulum-like movement phases, including any phases that produce even the most minimal shifts upward and downward while completing the movement. Furthermore, when movement is not occurring in a specific plane yet frequent small oscillations are still visible, this is representative of the individual's passive capacity to maintain balance while performing the movement-based task.

Pivot task


Figure 5: Respective joint velocities during the pivot task at the ankle, knee, and hip.

During the pivot task, the individual followed the protocol specified in the pivot task subsection of section 3 while the wearable sensor technology recorded motion data throughout the movement-based task. As specified in the summary of the walking task, the static z-coordinate is responsible for recording the z-axis for the human body, while the static x- and y-coordinates are responsible for recording the y-axis and x-axis for the human body, respectively. Figure 5 shows the respective joint velocities during the pivot task at the ankle, knee, and hip. As seen in the ankle and knee velocity graphs, the y-coordinate shows shifting along the x-axis. Since this is a rotational movement, the x-axis can show the movements occurring in the transverse plane. Concurrently, there is more movement at the hip due to the end position finishing in a lunge-like split stance.

Sit-to-stand task


Figure 6: Respective joint velocities during the sit-to-stand task at the ankle, knee, and hip.

During the sit-to-stand task, the individual followed the protocol specified in the sit-to-stand task subsection of section 3 while the wearable sensor technology recorded motion data throughout the movement-based task. Figure 6 shows the respective joint velocities during the sit-to-stand task at the ankle, knee, and hip. The static z-coordinate is responsible for recording the z-axis for the human body, while the static x- and y-coordinates are responsible for recording the y-axis and x-axis for the human body, respectively. The ankle and knee do not show significant movement, as they remain mostly stationary throughout the movement based on the individual's initial set-up prior to starting. The hip is less stable in this movement than the ankle and knee joints; based on the range of movements a ball-and-socket joint can perform, small movements can occur on all axes depending on the stabilization capacity of the supporting musculature. Thus, movements in the hip velocity graph that appear on the x-axis, or y-coordinate, represent potential imbalances or a lack of stability during movement.
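If off-axis motion at the hip is read as a stability proxy, it can be summarized numerically. The sketch below, under the assumption that the primary motion lies on a single axis, reports the share of angular-velocity variance carried by the other two axes; the example values are invented for illustration.

  import numpy as np

  def off_axis_ratio(wx, wy, wz, primary="z"):
      """Share of total angular-velocity variance carried by the two
      non-primary axes; a rough proxy for the imbalance/stability reading
      suggested for the hip during sit-to-stand."""
      var = {"x": np.var(wx), "y": np.var(wy), "z": np.var(wz)}
      total = sum(var.values())
      return (total - var[primary]) / total if total else 0.0

  # Invented example: most variance on z, a little on x and y.
  print(off_axis_ratio(np.array([0.05, -0.04, 0.06]),
                       np.array([0.02, 0.01, -0.02]),
                       np.array([0.80, -0.90, 1.00])))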

Integration task


Figure 7: Joint velocities while participating in the items pick up app at the ankle, knee, and hip.

During the integration task, the individual followed the protocol specified in the integrated assessment subsection of section 3 while the wearable sensor technology recorded motion data throughout the movement-based task. Figure 7 shows the joint velocities while participating in the "items pick up" app at the ankle, knee, and hip. The static z-coordinate is responsible for recording the z-axis for the human body, while the static x- and y-coordinates are responsible for recording the y-axis and x-axis for the human body, respectively. This task is biomechanically similar to the walking task; however, the difference is specific to the application used. Specifically, the "items pick up" app does not have uniform formatting in reference to the placement of the virtual objects. Concurrently, the contextual interference effect is purported to be active during this assessment, as the individual focuses more on the objectives present in the AR-based application than on completion of the walking movement embedded within the full integration task. Therefore, due to the usage of the AR device and the correlated AVG-based application, the evaluator received more accurate information on the motor control capacity of the individual, specific to their vestibular capacity.

Conclusion

The aim of this study was to successfully integrate three WitMotion sensor devices with the AR device produced by Magic Leap (ML1). Following the collection of all necessary materials, several movement-based tasks were selected and completed while wearing only the WitMotion devices. After these tasks were completed, a final integrated task was accomplished while wearing both the ML1 and the sensor technology. Biomechanical information on the user's velocity was captured for each movement-based task to discover whether the sensor technology contained static or dynamic 3-D axes. This same biomechanical information was shown capable of not only identifying an axis and the intended plane of motion it represents, but also describing what mechanically occurred during a given movement in a phase-like relationship. Finally, following completion of the integrated task, analyses were performed to either accept or refute the hypotheses. All hypotheses were accepted. To conclude, sensor technology and AR devices each have a finite and focused functionality; however, the two technologies can work cooperatively to expand their applicability to new fields. With the addition of other technologies and the creation of new software applications, the possibilities continue to expand as this new technology finds its niche in a variety of areas and fields of expertise.

References

  1. Yue S (2020) Human motion tracking and positioning for augmented reality. Journal of Real-Time Image Processing 18: 357-368.
  2. Xiong J, Tan G, Zhan T, Wu ST (2020) Breaking the field-of-view limit in augmented reality with a scanning waveguide display. OSA Continuum 3: 2730-2740.
  3. Safi M, Chung J, Pradhan P (2019) Review of Augmented Reality in aerospace industry. Aircraft Engineering and Aerospace Technology 91: 1187-1194.
  4. Mercedes SR (2019) Development of mixed reality applications using the Magic Leap One device. Thesis: 1-74.
  5. Maartje H, Scott H, Wiley B, Tristin H, Joan J, et al. (2021) Training capabilities assessment in support of Enhanced Military Training: Comparing head-mounted displays. Lecture Notes in Networks and Systems 275: 11-18.
  6. Palmarini R, Erkoyuncu J, Roy R, Torabmostaedi H (2018) A systematic review of augmented reality applications in maintenance. Robotics and Computer-Integrated Manufacturing 49: 215-228.
  7. Magic Leap 1 Overview. Magic Leap Developer 2019.
  8. Lee GY, Hong JY, Hwang SH, Moon S, Kang H, et al. (2018) Metasurface eyepiece for augmented reality. Nature Communications 9: 4562.
  9. Lareyre F, Chaudhuri A, Adam C, Carrier M, Mialhe C, et al. (2021) Applications of head-mounted displays and smart glasses in vascular surgery. Annals of Vascular Surgery 75: 497-512.
  10. Teng CC, Redfearn B, Nuttall C, Jarvis S, Carr J, et al. (2019) Mixed reality patients monitoring application for Critical Care Nurses. Proceedings of the third International Conference on Medical and Health Informatics 49-53.
  11. Chow J, Feng H, Amor R, Wunsche BC (2013) Music education using augmented reality with a head mounted display. Proceedings of the Fourteenth Australasian User Interface Conference 139: 73-79.
  12. Kawai J, Mitsuhara H, Shishibori M (2016) Game-based evacuation drill using augmented reality and head-mounted display. Interactive Technology and Smart Education 13: 186-201.
  13. Rahman R, Wood ME, Qian L, Price LC, Johnson AA, et al. (2019) Head-mounted display use in surgery: A systematic review. Surgical Innovation 27: 88-100.
  14. Lee K (2012) Augmented reality in education and training. TechTrends 56: 13-21.
  15. Duncan‐Vaidya EA, Stevenson EL (2020) The effectiveness of an augmented reality head‐mounted display in learning skull anatomy at a community college. Anatomical Sciences Education 14: 221-231.
  16. Yao K, Huang S (2021) Simulation Technology and analysis of military simulation training. Journal of Physics: Conference Series 1746: 012020.
  17. Hsieh MC, Lee JJ (2018) Preliminary study of VR and AR applications in medical and Healthcare Education. J Nurs Health Stud 3: 1-5.
  18. Loukas C (2016) Surgical simulation training systems: Box trainers, virtual reality and augmented reality simulators. International Journal of Advanced Robotics and Automation 1: 1-9.
  19. Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, et al. (2021) A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: An extension to different kinds of surgery. Expert Rev Med Devices 18: 47-62.
  20. Chen H, Hou L, Zhang K, Moon S (2021) Development of BIM, IoT and AR/VR technologies for fire safety and upskilling. Automation in Construction 125: 103631.
  21. Flavián C, Ibáñez-Sánchez S, Orús C (2019) The impact of virtual, augmented and mixed reality technologies on the customer experience. Journal of Business Research 100: 547-560.
  22. Tseng JL (2021) Intelligent Augmented Reality System based on speech recognition. International Journal of Circuits, Systems and Signal Processing 15: 178-186.
  23. Constant N, Cay G, Ravichandran V, Diouf R, Akbar U (2021) Data Analytics for wearable IOT-based telemedicine. Wearable Sensors 357-378.
  24. Osborn L, Iskarous M, Thakor NV (2020) Sensing and control for prosthetic hands in clinical and research applications. Wearable Robotics 445-468.
  25. Liu H (2020) Rail Transit Collaborative Robot Systems. Robot Systems for Rail Transit Applications 89-141.
  26. WitMotion Bluetooth BLE 5.0 9-Axis Low-Consumption Sensor. WitMotion 9-Axis Sensor.

© by the Authors & Gavin Publishers. This is an Open Access journal article published under the Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0). With this license, readers can share, distribute, and download, even commercially, as long as the original source is properly cited.

