WO2019018371A1 - Methods, systems, and non-transitory computer readable media for assessing lower extremity movement quality - Google Patents


Info

Publication number: WO2019018371A1
Authority: WIPO (PCT)
Prior art keywords: movement, task, user, extracting, phases
Application number: PCT/US2018/042451
Other languages: French (fr)
Inventors: Barnett Samuel FRANK, Darin Anthony PADUA, Michael Joseph MAZZOLENI, Brian Patrick Mann
Original assignees: The University Of North Carolina At Chapel Hill Office Of Technology Commercialization; Duke University
Application filed by The University Of North Carolina At Chapel Hill Office Of Technology Commercialization and Duke University
Priority to US16/632,054, published as US20200147451A1
Publication of WO2019018371A1

Classifications

    • A61B 5/4082: Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
    • A61B 5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/112: Gait analysis
    • A61B 5/1123: Discriminating type of movement, e.g. walking or running
    • A61B 5/1455: Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters
    • A61B 5/7275: Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A63B 24/0062: Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • G06F 18/24: Classification techniques
    • G06N 20/10: Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G06V 40/23: Recognition of whole body movements, e.g. for sport training
    • G16H 50/30: ICT specially adapted for calculating health indices; for individual health risk assessment
    • A61B 2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 5/6804: Garments; Clothes

Definitions

  • This specification relates generally to assessing lower extremity movement quality, e.g., by applying machine learning algorithms to movement data from user sensors to classify movement patterns that are associated with lower extremity injury risk.
  • Clinical assessment tools can provide a simple and efficient method for identifying problems in the musculoskeletal system that may result in athletic injury. Sports medicine clinicians can often use the results of such assessment tools to develop effective intervention programs for injury prevention and rehabilitation (Padua et al., 2009, 2011, 2015). Many individuals are at risk of developing anterior cruciate ligament (ACL) or other serious lower extremity injuries, and those who experience such injuries often struggle to return to their normal active lifestyle and suffer from pain and disability (Noyes et al., 1983a,b; Brophy et al., 2012; Moksnes et al., 2013).
  • ACL: anterior cruciate ligament
  • a system for assessing lower extremity movement quality includes one or more user sensors, at least one processor, and a movement evaluator implemented using the at least one processor.
  • the movement evaluator is configured, by virtue of appropriate programming, for receiving movement data from the one or more user sensors during a user's performance of a movement task; extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk; and classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features.
  • the subject matter described in this specification may be implemented in hardware, software, firmware, or combinations of hardware, software and/or firmware.
  • the subject matter described in this specification may be implemented using a non-transitory computer-readable medium storing computer executable instructions that when executed by one or more processors of a computer cause the computer to perform operations.
  • Computer-readable media suitable for implementing the subject matter described in this specification include non-transitory computer-readable media, such as disk memory devices, chip memory devices, programmable logic devices, random access memory (RAM), read only memory (ROM), optical read/write memory, cache memory, magnetic read/write memory, flash memory, and application-specific integrated circuits.
  • a computer-readable medium that implements the subject matter described in this specification may be located on a single device or computing platform or may be distributed across multiple devices or computing platforms.
  • Figure 1 illustrates an example of a participant performing a jump-landing task;
  • Figure 2 illustrates visual examples of excellent and poor lower extremity movement patterns;
  • Figures 3A-B illustrate examples of the raw data collected from the accelerometer and force plates during a jump-landing trial;
  • Figures 4A-B show ROC curves derived from the accelerometer-based SVM and force plate-based SVM;
  • Figures 5A-B illustrate example systems for assessing lower extremity movement quality;
  • Figures 6A-C illustrate example movement data from a countermovement jump, a medicine ball throw, and an agility test;
  • Figures 7A-B illustrate example movement data from a chest-mounted tri-axial accelerometer and from a chest-mounted tri-axial gyroscope during treadmill running;
  • Figure 8 is a flowchart of an example method for assessing lower extremity movement quality.
  • This specification describes methods and systems for assessing lower extremity movement quality, e.g., by applying machine learning algorithms to movement data from user sensors to classify a movement pattern that is associated with lower extremity injury risk.
  • this specification describes a study performed to develop an accurate method to classify lower extremity movement patterns during a jump-landing task using a support vector machine (SVM).
  • SVM: support vector machine
  • the methods and systems for assessing lower extremity movement quality may be used in various other appropriate settings than those described in the study and may be implemented using any appropriate technology.
  • Machine learning techniques have been applied to a wide assortment of biomechanical studies including gait recognition (Jonic et al., 1999; Begg and Kamruzzaman, 2005; Clermont et al., 2017), runner classification (Maurer et al., 2012; Kobsar et al., 2014), patient ailments and treatments (Silver et al., 2006; Deluzio and Astephen, 2007; Muniz et al., 2010; Labbe et al., 2011), and fall detection (Maki, 1997; Lutrek and Kalua, 2009; Phinyomarksz et al., 2015).
  • Such classification methods provide an objective and analytical alternative to methods that have previously relied on subjective criteria.
  • a commonly used machine learning algorithm is the support vector machine (SVM), which constructs a hyperplane to maximize the margin of separation between a classified data set (Boser et al., 1992; Cortes and Vapnik, 1995).
  • Begg and Kamruzzaman (2005) used an SVM to analyze how kinematic and kinetic data extracted from the gait of young and elderly adults could be used to classify gait patterns.
  • Fukuchi et al. (2011) used an SVM to detect age-related changes in running kinematics.
  • Chan et al. (2010) used an SVM to classify ankle sprains.
  • Each participant performed three trials of a jump-landing task from a 30 cm platform placed at a distance of one-half their body height from a landing target consisting of two force plates located in front of the jump platform. The participants were instructed to jump forward so that both feet left the platform simultaneously, to land in the middle of the force plates with one foot on each force plate, and to jump for maximal height immediately after landing.
  • Figure 1 illustrates an example of a participant performing the jump-landing task by: (a) jumping down from the platform; (b) landing on the force plates; (c) immediately jumping vertically as high as possible.
  • Figure 2 illustrates visual examples of excellent and poor lower extremity movement patterns.
  • a participant demonstrates excellent lower extremity movement patterns by: (a) exhibiting no medial knee displacement in the frontal plane at maximum knee flexion; (b) exhibiting a "soft" landing with large displacement of the trunk, hips, and knees in the sagittal plane.
  • a participant demonstrates poor lower extremity movement patterns by: (c) exhibiting medial knee displacement in the frontal plane at maximum knee flexion; (d) exhibiting a "stiff" landing with very little displacement of the trunk, hips, and knees in the sagittal plane.
  • the force plates collected tri-axial ground reaction force and moment data for each foot, but only the vertical ground reaction force (VGRF) data was used in this study.
  • the inertial measurement unit contained a tri-axial accelerometer, tri-axial gyroscope, and tri-axial magnetometer, but only the data collected from the accelerometer axis aligned with the vertical orientation of the participant was used in this study.
  • the force plate data was sampled at 1000 Hz and the accelerometer data was sampled at 200 Hz.
  • a zero-phase second-order high-pass Butterworth filter with a cutoff frequency of 0.2 Hz was applied to the accelerometer data to remove the DC offset. All aspects of postprocessing, data analysis, and statistical analysis were carried out using MATLAB software (MATLAB R2016a; MathWorks, Inc.; Natick, MA).
  • Figures 3A-B illustrate examples of the raw data collected from the (Figure 3A) accelerometer and (Figure 3B) force plates during a jump-landing trial.
  • the shaded regions on the plots represent the stance phase.
  • labels are provided to distinguish between the three main portions of the jump-landing task.
  • Figure 3B the left force plate data is represented by the solid line and the right force plate data is represented by the dashed line.
  • the features used in the data analysis were calculated from the stance phase of the jump-landing task, as this is a critical portion of the test for classifying lower extremity movement patterns (Padua et al., 2009, 2011, 2015).
  • the stance phase occurred when the participant landed on the force plates following the horizontal jump from the platform and began transitioning into the subsequent vertical jump.
  • the stance phase was formally defined to begin when the first foot landed on the ground and to end when the last foot left the ground.
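  A minimal sketch of this stance-phase definition applied to force-plate data; the 10 N contact threshold is an assumed value for illustration, not one given in the text.

```python
import numpy as np

def stance_phase(vgrf_left, vgrf_right, fs=1000.0, threshold=10.0):
    """Return (start, end) times in seconds of the stance phase.

    The phase begins when the first foot contacts the ground and ends
    when the last foot leaves it; a foot is considered on the ground
    while its vertical ground reaction force exceeds the threshold.
    """
    on_ground = (vgrf_left > threshold) | (vgrf_right > threshold)
    idx = np.flatnonzero(on_ground)
    return idx[0] / fs, idx[-1] / fs

# Synthetic example: left foot lands at 0.10 s, right foot at 0.12 s,
# and both feet are off the ground by 0.50 s.
t = np.arange(0, 1.0, 1 / 1000.0)
left = np.where((t >= 0.10) & (t < 0.48), 800.0, 0.0)
right = np.where((t >= 0.12) & (t < 0.50), 750.0, 0.0)
t_start, t_end = stance_phase(left, right)
```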
  • Four features were calculated from the stance phase of the accelerometer data: ground contact time (TIME), pseudo-impulse during the first half (IMP1), pseudo-impulse during the second half (IMP2), and peak acceleration (PEAK).
  • the pseudo-impulse variables were defined as the area under the curve of the acceleration profile over the desired time interval (with a baseline value of -1 g).
  • the stance phase and pseudo-impulse calculated from the accelerometer data are depicted in Fig. 3A.
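  The four accelerometer features can be sketched as follows. The rectangle-rule integration and the midpoint split of the stance phase are assumptions for illustration; the text does not specify the exact numerical procedure.

```python
import numpy as np

def jump_landing_features(accel_g, fs=200.0):
    """TIME, IMP1, IMP2, and PEAK from the stance-phase portion of a
    vertical accelerometer trace (units of g).

    The pseudo-impulses are the area under the acceleration curve
    relative to a -1 g baseline, split at the stance-phase midpoint.
    """
    dt = 1.0 / fs
    time = len(accel_g) * dt                        # TIME: contact time (s)
    mid = len(accel_g) // 2
    imp1 = float(np.sum(accel_g[:mid] + 1.0) * dt)  # IMP1: first half
    imp2 = float(np.sum(accel_g[mid:] + 1.0) * dt)  # IMP2: second half
    peak = float(np.max(accel_g))                   # PEAK: peak accel (g)
    return time, imp1, imp2, peak

# Example: a constant 2 g reading over a 1 s stance phase.
time, imp1, imp2, peak = jump_landing_features(np.full(200, 2.0))
```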
  • PCA is a statistical procedure that creates a set of orthogonal principal components (PC's) to describe the original set of data (Joliffe, 2002). Prior to implementing the PCA, all variable distributions were normalized to have a mean of 0 and a standard deviation of 1.
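  The standardization and component-retention steps can be illustrated with scikit-learn on a synthetic feature matrix, keeping only components whose eigenvalue exceeds 1.0 as in the retention criterion the study describes:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Illustrative feature matrix: rows are trials, columns are features.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))

# Normalize every feature to mean 0 and standard deviation 1, fit the
# PCA, and retain only components whose eigenvalue exceeds 1.0.
X_std = StandardScaler().fit_transform(X)
pca = PCA().fit(X_std)
eigenvalues = pca.explained_variance_
keep = eigenvalues > 1.0
scores = pca.transform(X_std)[:, keep]
explained = eigenvalues[keep].sum() / eigenvalues.sum()
```

  The retained `scores` matrix is the reduced-dimensionality feature set that would then be passed to the classifier.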
  • An SVM is a supervised machine learning algorithm that constructs a hyperplane to maximize the margin of separation between a classified data set (Boser et al., 1992; Cortes and Vapnik, 1995).
  • the variable distributions contained in the feature vectors were normalized to have a mean of 0 and a standard deviation of 1 to avoid artificially exaggerating the importance of certain features, and 10-fold cross- validation was applied to the data sets to help prevent overfitting during the training process.
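  These training safeguards can be sketched with scikit-learn. The data here is synthetic; putting the scaler inside the pipeline ensures each cross-validation fold is normalized using only its own training folds.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the study's feature vectors: two classes of
# movement pattern with shifted feature means.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)),   # e.g. "excellent" patterns
               rng.normal(1.5, 1.0, (50, 4))])  # e.g. "poor" patterns
y = np.array([0] * 50 + [1] * 50)

# Standardize features, train an SVM, and estimate accuracy with
# 10-fold cross-validation to help guard against overfitting.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
fold_accuracies = cross_val_score(clf, X, y, cv=10)
```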
  • Receiver operating characteristic (ROC) curves were generated using the SVM scores from both the individual trials and the averaged trials, and the area under the curve (AUC) was calculated for both cases (individual and averaged trials) to facilitate comparisons.
  • An optimal cutoff point was selected on the averaged ROC curves to determine overall classification accuracy (ACC), true positive rate (TPR), and false positive rate (FPR). 95% confidence intervals (CI's) were calculated for each AUC.
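  The ROC construction and cutoff selection can be sketched as below. The text does not name the cutoff criterion, so Youden's J statistic (maximizing TPR minus FPR) is used here as a common assumption.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Hypothetical SVM decision scores for 50 negative and 50 positive trials.
rng = np.random.default_rng(1)
y_true = np.array([0] * 50 + [1] * 50)
svm_scores = np.concatenate([rng.normal(-1, 1, 50), rng.normal(1, 1, 50)])

fpr, tpr, thresholds = roc_curve(y_true, svm_scores)
roc_auc = auc(fpr, tpr)

# Pick the cutoff maximizing Youden's J = TPR - FPR, then compute the
# overall classification accuracy at that cutoff.
best = int(np.argmax(tpr - fpr))
cutoff, best_tpr, best_fpr = thresholds[best], tpr[best], fpr[best]
acc = float(((svm_scores >= cutoff) == y_true).mean())
```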
  • the feature vector variables can be sorted into tiers related to their effect size.
  • the TIME and IMP variables have very large effect sizes
  • the PEAK variables have moderate effect sizes (with the exception of PEAKMaxSlope, which has a very small effect size)
  • the ⁇ variables have small effect sizes.
  • the PCA was applied to the force plate feature vectors, and 5 PC's were retained following an analysis of the system's eigenvalues (all PC's retained had a corresponding eigenvalue λ > 1.0). These 5 PC's explained 94.4% of the variance in the force plate data.
  • the list of PC's associated with the force plate data and their corresponding eigenvalues and variable loadings following varimax rotation can be seen in Table 2. Cross-loadings are not shown.
  • A statistical comparison between the feature vectors containing the normalized PC's of the two subject groups can be seen in Table 3.
  • PC1 is associated with the TIME and IMP variables and has a very large effect size
  • PC2,3 are associated with the PEAK variables and have moderate effect sizes
  • PC4,5 are associated with the ⁇ variables and have small effect sizes.
  • Figures 4A-B show ROC curves derived from the (Figure 4A) accelerometer-based SVM and (Figure 4B) force plate-based SVM.
  • the black circles represent the optimal cutoff points on the ROC curves. Note that the force plate-based SVM has two optimal cutoff points.
  • TPR: true positive rate
  • FPR: false positive rate
  • ACC: overall classification accuracy
  • Both the force plate-based SVM and the accelerometer-based SVM were able to successfully classify lower extremity movement patterns with a high level of accuracy. Therefore, both methods have the potential to provide value as objective and autonomous clinical assessment tools.
  • the force plate data was collected from two sensors as opposed to the accelerometer data being collected from one, and the force plate data was sampled at a higher frequency than the accelerometer data.
  • the accuracy of the accelerometer-based SVM was still very close to the accuracy achieved by the force plate-based SVM. This is impressive, especially when considering the orders of magnitude price difference between force plates and accelerometers. Additionally, a force plate-based assessment method would typically be limited to applications in a laboratory setting, while a wearable accelerometer-based assessment method could be implemented at a much wider scale in a field testing environment.
  • the force plate-based SVM has two optimal cutoff points on its ROC curve, as shown in Fig. 4B, which indicates that a judgment call would have to be made if this algorithm were implemented in practice.
  • Machine learning algorithms have been applied to a large variety of biomechanical studies in recent years. However, to date, we believe that this is the first study that has applied machine learning algorithms to classify movement patterns during a jump-landing task that are associated with lower extremity injury risk. We also believe that the systems and methods employed in our study could be adapted in future studies to classify different movement patterns related to other injury risks.
  • Figures 5A-B illustrate example systems for assessing lower extremity movement quality.
  • Figure 5A shows a first example system 500 including one or more user sensors 502 that may be worn by a user 504 or trained on the user 504 or otherwise situated to capture motion of the user 504.
  • the system 500 includes a user device 506 and a movement evaluator 508 implemented on the user device 506.
  • the user device 506 can be any appropriate computing device and typically includes at least one processor, memory storing instructions for the processor, and a display.
  • the user device 506 may include a user input device and a wired or wireless communication system.
  • the user device 506 may be a laptop, tablet, or mobile phone.
  • the user sensors 502 or the user device 506 or both can be housed in a wearable fitness monitor.
  • the user sensors 502 may be embedded in a shoe or ankle bracelet or other user garment, and the user device 506 may be a smart watch.
  • some of the user sensors 502 are not wearable, e.g., where the user sensors 502 include a force plate situated on the ground for the user 504 to land on.
  • the movement evaluator 508 is programmed for receiving movement data from the user sensors 502 during the user's performance of a movement task and extracting one or more movement features from the movement data. Each movement feature characterizes a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk.
  • the movement evaluator 508 is programmed for classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features.
  • classifying the movement pattern can include supplying the one or more movement features to a machine learning classifier trained using pre-classified training data for the movement task.
  • the machine learning classifier can include a support vector machine (SVM) configured to construct a hyperplane to maximize, based on the pre-classified training data, a margin of separation between the risk categories.
  • Extracting the one or more movement features can include performing a principal component analysis (PCA) to reduce a dimensionality of a feature set extracted from the movement data.
  • PCA: principal component analysis
  • the movement evaluator 508 can be programmed to display, on a display of the user device 506, an indicator for the classified risk category for the user. For example, the movement evaluator 508 can display a text label for the classified risk category or a color representing the classified risk category.
  • Figure 5B illustrates a second example system 520 including the user sensors 502 situated to capture motion of the user 504.
  • the movement data is transmitted to a remote server 522 implementing the movement evaluator 508.
  • the movement evaluator 508 on the remote server 522 is configured to classify movement patterns, e.g., for storage and analysis or to return classifications for local display.
  • the user sensors 502 can transmit the movement data to the user device 506, which can then transmit the movement data to the remote server 522 (e.g., using a wireless router 524 and a data communications network 526 such as the Internet).
  • the user sensors 502 are configured with appropriate hardware for communicating directly with the wireless router 524 so that the user device 506 is optional.
  • the remote server 522 sends results from the classification to the user device 506 for display to the user 504.
  • the movement task may be any appropriate task for assessing lower extremity motion quality, e.g., a jumping task; a running, jogging, or walking task; a cutting and sprinting task; a squatting task; a weight lifting task; a medicine ball toss task; or a combination of the listed tasks.
  • Figures 6A-C illustrate example movement data from a countermovement jump (Figure 6A), a medicine ball throw (Figure 6B), and an agility test (Figure 6C).
  • Figures 7A-B illustrate example movement data from a (Figure 7A) chest-mounted tri-axial accelerometer and from a (Figure 7B) chest-mounted tri-axial gyroscope during treadmill running.
  • the following five features were calculated from the time series for all six degrees of freedom: root mean square (RMS), economy of acceleration/angular velocity (ECON), step regularity (REG1 ), stride regularity (REG2), and symmetry (SYM).
  • the ECON variable was calculated by dividing RMS by the treadmill speed.
  • the REG1 and REG2 variables were calculated using an unbiased autocorrelation procedure at a phase shift equal to the average step time and stride time, respectively.
  • the SYM variable was calculated as the percent difference between REG1 and REG2.
  • the average peak vertical acceleration and the coefficient of variation for the vertical acceleration were also calculated.
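  A sketch of these gait features for a single axis. The autocorrelation normalization and the exact form of the SYM percent difference are assumptions, since the text leaves them unspecified.

```python
import numpy as np

def regularity(signal, lag):
    """Unbiased autocorrelation coefficient at a phase shift of `lag`
    samples; values near 1.0 indicate highly regular repetition."""
    x = signal - signal.mean()
    n = len(x)
    r = np.sum(x[:n - lag] * x[lag:]) / (n - lag)
    return float(r / np.var(x))

def running_features(accel, fs, speed, step_time, stride_time):
    """RMS, ECON, REG1 (step), REG2 (stride), and SYM for one axis."""
    rms = float(np.sqrt(np.mean(accel ** 2)))
    econ = rms / speed                      # economy: RMS per unit speed
    reg1 = regularity(accel, int(round(step_time * fs)))
    reg2 = regularity(accel, int(round(stride_time * fs)))
    sym = (reg2 - reg1) / reg2 * 100.0      # assumed form of "percent difference"
    return rms, econ, reg1, reg2, sym

# Example: synthetic vertical acceleration with a 0.35 s step period
# and a weaker 0.70 s stride-to-stride component (slight asymmetry).
fs = 200.0
t = np.arange(0, 20, 1 / fs)
accel = np.sin(2 * np.pi * t / 0.35) + 0.3 * np.sin(2 * np.pi * t / 0.70)
rms, econ, reg1, reg2, sym = running_features(accel, fs, speed=3.0,
                                              step_time=0.35, stride_time=0.70)
```

  For this asymmetric signal, stride regularity (REG2) exceeds step regularity (REG1), so SYM comes out positive, which is the pattern one would expect from an uneven gait.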
  • Figure 8 is a flowchart of an example method 800 for assessing lower extremity movement quality.
  • the method 800 can be performed by a movement evaluator implemented on a computer system.
  • the method 800 includes receiving movement data from one or more user sensors during a user's performance of a movement task (802).
  • receiving the movement data can include one or more of: receiving the movement data from one or more force plates during the user's performance of the movement task, receiving the movement data from one or more pressure sensors during the user's performance of the movement task, receiving the movement data from one or more video cameras during the user's performance of the movement task, and receiving the movement data from one or more wearable electronic fitness monitors worn by the user during the user's performance of the movement task.
  • a wearable electronic fitness monitor can include a combination of one or more accelerometers, gyroscopes, magnetometers, pressure sensors, GPS sensors, RFID sensors, HR monitors, VO2 monitors, and SmO2 monitors.
  • the wearable electronic fitness device can be worn in any appropriate position on the user, for example, in clothing or apparel, in a chest strap, in a sports bra, in a waistband or belt, in footwear, in headphones or headgear, in a watch or armband, or in a leg band or leg strap.
  • the method 800 includes extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk (804). Extracting the one or more movement features can include performing a principle component analysis to reduce the dimensionality of a feature set extracted from the movement data.
  • the movement features may characterize a degree of fatigue.
  • the movement task includes a jump-landing task and extracting the movement features includes identifying one or more movement phases of the jump-landing task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
  • the movement phases of the jump-landing task can include one or more takeoff phases, flight phases, and stance phases.
  • the user sensors can include an accelerometer on the user, and extracting the movement features from the jump-landing task can include extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
  • the user sensors can include a force plate and the jump-landing task can include a combination of one or more events comprising jumping and landing on the one or more force plates, and extracting movement features can include extracting the movement features to characterize timing, loading, and asymmetrical characteristics from the one or more force plates.
  • the movement task includes a drop-landing task
  • extracting the movement features includes identifying one or more movement phases of the drop-landing task in the movement data and extracting the movement features from the movement phases of the movement data.
  • the movement phases of the drop-landing task can include one or more takeoff phases, flight phases, and stance phases.
  • the user sensors can include an accelerometer on the user, and extracting the movement features from the drop-landing task can include extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
• the user sensors can include a force plate and the drop-landing task can include landing on the one or more force plates, and extracting the movement features can include extracting the movement features to characterize timing, loading, and asymmetrical characteristics from the one or more force plates.
  • the movement task includes a countermovement jump task
  • extracting the movement features includes identifying one or more movement phases of the countermovement jump task in the movement data and extracting the movement features from the movement phases of the movement data.
  • the movement phases of a countermovement jump task can include one or more takeoff phases, flight phases, and stance phases.
  • the user sensors can include an accelerometer on the user, and extracting the movement features from the countermovement jump task can include extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
  • the user sensors can include a force plate and the countermovement jump task can include a combination of one or more events including jumping and landing on the one or more force plates, and extracting the one or more movement features can include extracting the one or more movement features to characterize timing, loading, and asymmetrical characteristics from the one or more force plates.
• the movement task includes a running, jogging, or walking task. In some examples, the movement task includes one of a jumping task; a running, jogging, or walking task; a cutting and sprinting task; a squatting task; a weight lifting task; and a medicine ball toss task.
  • the user sensors can include a combination of one or more accelerometers and gyroscopes on the user, and extracting the movement features can include extracting the movement features to characterize timing, loading, efficiency, regularity, and asymmetrical characteristics.
  • the method 800 includes classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features (806).
  • Classifying the movement pattern can include supplying the movement features to a machine learning classifier trained using pre-classified training data for the movement task.
• the machine learning classifier can be a support vector machine (SVM) configured to construct a hyperplane to maximize, based on the pre-classified training data, a margin of separation between the risk categories.
  • the method 800 can include displaying, on a display device, an indicator for the classified risk category for the user.
• Boser, B.E., Guyon, I.M., Vapnik, V.N., 1992. A training algorithm for optimal margin classifiers, in: Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp. 144-152.
• Padua, D.A., et al., 2009. The landing error scoring system (LESS) is a valid and reliable clinical assessment tool of jump-landing biomechanics: The JUMP-ACL study. The American Journal of Sports Medicine 37, 1996-2002.

Abstract

A system for assessing lower extremity movement quality includes one or more user sensors, at least one processor, and a movement evaluator implemented using the at least one processor. The movement evaluator is configured, by virtue of appropriate programming, for receiving movement data from the one or more user sensors during a user's performance of a movement task; extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk; and classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features.

Description

METHODS, SYSTEMS, AND NON-TRANSITORY COMPUTER READABLE MEDIA FOR ASSESSING LOWER EXTREMITY MOVEMENT QUALITY
PRIORITY CLAIM
This application claims the benefit of U.S. Provisional Patent
Application Serial No. 62/533,523 filed July 17, 2017, the disclosure of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This specification relates generally to assessing lower extremity movement quality, e.g., by applying machine learning algorithms to movement data from user sensors to classify movement patterns that are associated with lower extremity injury risk.
BACKGROUND
Clinical assessment tools can provide a simple and efficient method for identifying problems in the musculoskeletal system that may result in athletic injury. Sports medicine clinicians can often use the results of such assessment tools to develop effective intervention programs for injury prevention and rehabilitation (Padua et al., 2009, 2011, 2015). Many individuals are at risk of developing anterior cruciate ligament (ACL) or other serious lower extremity injuries, and those who experience such injuries often struggle to return to their normal active lifestyle and suffer from pain and disability (Noyes et al., 1983a,b; Brophy et al., 2012; Moksnes et al., 2013). Effective screening methods such as the Landing Error Scoring System (LESS) have been developed to identify movement patterns during a jump-landing task that are related to lower extremity injury risk (Padua et al., 2009, 2011, 2015). Abnormal lower extremity movement patterns are modifiable, so identifying injury risk patterns prior to injury can result in effective interventions. However, screening methods such as the LESS are reliant on subjective assessments from trained professionals, which limits the scope and scale of their applicability. Accordingly, there exists a need for an objective and autonomous system to classify lower extremity movement patterns, and for methods and systems for enabling assessments to take place in a field testing environment as opposed to in a controlled laboratory setting.
SUMMARY
A system for assessing lower extremity movement quality includes one or more user sensors, at least one processor, and a movement evaluator implemented using the at least one processor. The movement evaluator is configured, by virtue of appropriate programming, for receiving movement data from the one or more user sensors during a user's performance of a movement task; extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk; and classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features.
The subject matter described in this specification may be implemented in hardware, software, firmware, or combinations of hardware, software and/or firmware. In some examples, the subject matter described in this specification may be implemented using a non-transitory computer-readable medium storing computer executable instructions that when executed by one or more processors of a computer cause the computer to perform operations.
Computer-readable media suitable for implementing the subject matter described in this specification include non-transitory computer-readable media, such as disk memory devices, chip memory devices, programmable logic devices, random access memory (RAM), read only memory (ROM), optical read/write memory, cache memory, magnetic read/write memory, flash memory, and application-specific integrated circuits. In addition, a computer-readable medium that implements the subject matter described in this specification may be located on a single device or computing platform or may be distributed across multiple devices or computing platforms.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates an example of a participant performing a jump-landing task;
Figure 2 illustrates visual examples of excellent and poor lower extremity movement patterns;
Figures 3A-B illustrate examples of the raw data collected from the accelerometer and force plates during a jump-landing trial;
Figures 4A-B show ROC curves derived from the accelerometer-based SVM and force plate-based SVM;
Figures 5A-B illustrate example systems for assessing lower extremity movement quality;
Figures 6A-C illustrate example movement data from a countermovement jump, a medicine ball throw, and an agility test;
Figures 7A-B illustrate example movement data from a chest-mounted tri-axial accelerometer and from a chest-mounted tri-axial gyroscope during treadmill running; and
Figure 8 is a flowchart of an example method for assessing lower extremity movement quality.
DESCRIPTION
This specification describes methods and systems for assessing lower extremity movement quality, e.g., by applying machine learning algorithms to movement data from user sensors to classify a movement pattern that is associated with lower extremity injury risk.
For purposes of illustration, this specification describes a study performed to develop an accurate method to classify lower extremity movement patterns during a jump-landing task using a support vector machine (SVM). The methods and systems for assessing lower extremity movement quality, however, may be used in various other appropriate settings than those described in the study and may be implemented using any appropriate technology.
In the study, forty female athletes each performed three trials of a jump-landing task. Two digital video cameras were used to record the frontal and sagittal perspectives during the jump-landing tasks. The video data was evaluated to classify the lower extremity movement patterns of the participants as either excellent (n = 20) or poor (n = 20) using the Landing Error Scoring System (LESS) assessment method. Data was also collected from two force plates and a chest-mounted single-axis accelerometer. Fourteen features were extracted from the force plate data and four features were extracted from the accelerometer data. A principal component analysis (PCA) was applied to the force plate data to reduce the dimensionality of the feature vector. Five principal components (PC's) were retained, explaining 94.4% of the variance in the force plate data.
Two separate linear SVM classifiers were trained using the accelerometer data and the force plate data, respectively, with the LESS assessment providing the classification labels during training and evaluation. The accelerometer-based SVM had an overall classification accuracy of 80.0%, and the force plate-based SVM had an overall classification accuracy of 87.5%. These findings suggest that a linear SVM can successfully classify lower extremity movement patterns during a jump-landing task using force plate and accelerometer data, with force plate data providing greater classification accuracy.
1. Introduction
Clinical assessment tools can provide a simple and efficient method for identifying problems in the musculoskeletal system that may result in athletic injury. Sports medicine clinicians can often use the results of such assessment tools to develop effective intervention programs for injury prevention and rehabilitation (Padua et al., 2009, 2011, 2015). Many individuals are at risk of developing anterior cruciate ligament (ACL) or other serious lower extremity injuries, and those who experience such injuries often struggle to return to their normal active lifestyle and suffer from pain and disability (Noyes et al., 1983a,b; Brophy et al., 2012; Moksnes et al., 2013). Effective screening methods such as the Landing Error Scoring System (LESS) have been developed to identify movement patterns during a jump-landing task that are related to lower extremity injury risk (Padua et al., 2009, 2011, 2015). Abnormal lower extremity movement patterns are modifiable, so identifying injury risk patterns prior to injury can result in effective interventions. However, screening methods such as the LESS are reliant on subjective assessments from trained professionals, which limits the scope and scale of their applicability. The development of an objective and autonomous system to classify lower extremity movement patterns during a jump-landing task would provide a powerful and objective clinical assessment tool. Integrating such a system with portable and inexpensive sensors would provide further benefits by enabling assessments to take place in a field testing environment as opposed to in a controlled laboratory setting.
Bittencourt et al. (2016) recently advocated for a paradigm shift in the field of sports injury prevention, encouraging researchers to move away from a reductionist view of injury factor identification to a more complex systems approach of injury pattern recognition. One method for accomplishing this objective is through the application of machine learning algorithms, which have the ability to recognize complex patterns and create data-driven prediction models (Shalev-Shwartz and Ben-David, 2014). Machine learning techniques have been applied to a wide assortment of biomechanical studies including gait recognition (Jonic et al., 1999; Begg and Kamruzzaman, 2005; Clermont et al., 2017), runner classification (Maurer et al., 2012; Kobsar et al., 2014), patient ailments and treatments (Silver et al., 2006; Deluzio and Astephen, 2007; Muniz et al., 2010; Labbe et al., 2011), and fall detection (Maki, 1997; Luštrek and Kaluža, 2009; Phinyomark et al., 2015). Such classification methods provide an objective and analytical alternative to methods that have previously relied on subjective criteria.
A commonly used machine learning algorithm is the support vector machine (SVM), which constructs a hyperplane to maximize the margin of separation between a classified data set (Boser et al., 1992; Cortes and Vapnik, 1995). Begg and Kamruzzaman (2005) used an SVM to analyze how kinematic and kinetic data extracted from the gait of young and elderly adults could be used to classify gait patterns. Fukuchi et al. (2011) used an SVM to detect age-related changes in running kinematics. Chan et al. (2010) used an SVM to classify ankle sprains. Wu and Wang (2008) implemented a principal component analysis (PCA) for feature vector dimensionality reduction and utilized an SVM with nonlinear kernels (polynomial and radial basis function) to classify gait patterns. Other supervised learning approaches such as information gain and artificial neural networks (ANN's) have been used to examine the effects of midsole resilience and upper shoe structure on running kinematics (Onodera et al., 2017). Muniz et al. (2010) compared the performance of logistic regression (LR), ANN, and SVM to classify the gaits of subjects diagnosed with Parkinson's disease. Bennetts et al. (2013) applied k-means clustering to study peak plantar pressure distributions for diabetic patients. Kobsar et al. (2014) used tri-axial accelerometer data and a discriminant analysis to classify runners based on their experience level and training background. However, to date, we are unaware of any study that has applied machine learning algorithms to classify lower extremity movement patterns during a jump-landing task that are associated with increased lower extremity injury risk.
The purpose of this study was to develop an accurate and autonomous method for classifying lower extremity movement patterns during a jump-landing task using a linear SVM. Video analysis was implemented to classify the participants' lower extremity movement patterns using the LESS assessment method. Data was collected from two force plates and a chest-mounted single-axis accelerometer. Two separate linear SVM classifiers were trained using the accelerometer data and the force plate data, respectively, with the LESS assessment providing the classification labels during training and evaluation.
2. Methods
2. 1. Participants
Forty female athletes were recruited for this study. The participants were all physically active, participated in field or court sports, had no history of lower extremity injuries, and were between the ages of 18-25. The participants were evaluated while performing a jump-landing task using the LESS assessment method (Padua et al., 2009, 2011, 2015), with one group exhibiting excellent lower extremity movement patterns (n = 20; age: 20.6 ± 1.9 yr; mass: 64.5 ± 7.8 kg; height: 1.67 ± 0.07 m) and the other group exhibiting poor lower extremity movement patterns (n = 20; age: 20.4 ± 1.3 yr; mass: 60.9 ± 6.1 kg; height: 1.69 ± 0.07 m). Informed written consent was obtained from each participant, and the testing protocol was approved by the institutional review board (IRB# 14-3298).
2.2. Data collection
Each participant performed three trials of a jump-landing task from a 30 cm platform placed at a distance of one-half their body height from a landing target consisting of two force plates located in front of the jump platform. The participants were instructed to jump forward so that both feet left the platform simultaneously, to land in the middle of the force plates with one foot on each force plate, and to jump for maximal height immediately after landing.
Figure 1 illustrates an example of a participant performing the jump-landing task by: (a) jumping down from the platform; (b) landing on the force plates; (c) immediately jumping vertically as high as possible.
The participants were allowed to practice until they felt comfortable with the jump-landing task and were able to complete it correctly. Mistrials were excluded and repeated if the participant did not perform the jump-landing correctly. Two digital video cameras (HERO3+; GoPro, Inc.; San Mateo, CA) were positioned 3 m away from the participant to record the frontal and sagittal perspectives during the jump-landing tasks. Two expert raters evaluated the videos using free computer software (QuickTime; Apple, Inc.; Cupertino, CA) and classified each participant as exhibiting either excellent or poor lower extremity movement patterns during the jump-landing task using the LESS assessment method (Padua et al., 2009, 2011, 2015). If the two evaluators disagreed on the classification assessment, then a third evaluator was consulted. Figure 2 illustrates visual examples of excellent and poor lower extremity movement patterns. In Figure 2, a participant demonstrates excellent lower extremity movement patterns by: (a) exhibiting no medial knee displacement in the frontal plane at maximum knee flexion; (b) exhibiting a "soft" landing with large displacement of the trunk, hips, and knees in the sagittal plane. A participant demonstrates poor lower extremity movement patterns by: (c) exhibiting medial knee displacement in the frontal plane at maximum knee flexion; (d) exhibiting a "stiff" landing with very little displacement of the trunk, hips, and knees in the sagittal plane.
Data was collected from the two force plates (FP4060-10; Bertec Corp.; Columbus, OH) and a chest-mounted inertial measurement unit (Armour39; Under Armour, Inc.; Baltimore, MD). The force plates collected tri-axial ground reaction force and moment data for each foot, but only the vertical ground reaction force (VGRF) data was used in this study. The inertial measurement unit contained a tri-axial accelerometer, tri-axial gyroscope, and tri-axial magnetometer, but only the data collected from the accelerometer axis aligned with the vertical orientation of the participant was used in this study. The force plate data was sampled at 1000 Hz and the accelerometer data was sampled at 200 Hz. A zero-phase second-order high-pass Butterworth filter with a cutoff frequency of 0.2 Hz was applied to the accelerometer data to remove the DC offset. All aspects of postprocessing, data analysis, and statistical analysis were carried out using MATLAB software (MATLAB R2016a; MathWorks, Inc.; Natick, MA).
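As a concrete illustration of the filtering step, the sketch below applies a zero-phase second-order high-pass Butterworth filter at 0.2 Hz to a synthetic 200 Hz signal using SciPy; the signal and variable names are illustrative stand-ins, not the study's data:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 200.0      # accelerometer sampling rate (Hz), per the study
CUTOFF = 0.2    # high-pass cutoff frequency (Hz)

def remove_dc_offset(accel, fs=FS, cutoff=CUTOFF):
    """Zero-phase second-order high-pass Butterworth filter.

    filtfilt runs the filter forward and backward, which cancels phase
    distortion -- the "zero-phase" property described in the text.
    """
    b, a = butter(2, cutoff / (fs / 2.0), btype="highpass")
    return filtfilt(b, a, accel)

# Synthetic signal: a 1 Hz oscillation riding on a constant 3 g DC offset.
t = np.arange(0, 10, 1 / FS)
raw = 0.5 * np.sin(2 * np.pi * 1.0 * t) + 3.0
filtered = remove_dc_offset(raw)
```

After filtering, the constant offset is removed while the oscillatory content well above 0.2 Hz passes through essentially unchanged.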
Examples of the raw data collected from the accelerometer and force plates during a jump-landing trial can be seen in Figures 3A-B. Figures 3A-B illustrate examples of the raw data collected from the (Figure 3A) accelerometer and (Figure 3B) force plates during a jump-landing trial. The shaded regions on the plots represent the stance phase. In Figure 3A, labels are provided to distinguish between the three main portions of the jump-landing task. In Figure 3B, the left force plate data is represented by the solid line and the right force plate data is represented by the dashed line. These data sets came from a participant who exhibited poor lower extremity movement patterns.
2.3. Feature extraction
The features used in the data analysis were calculated from the stance phase of the jump-landing task, as this is a critical portion of the test for classifying lower extremity movement patterns (Padua et al., 2009, 2011, 2015). The stance phase occurred when the participant landed on the force plates following the horizontal jump from the platform and began transitioning into the subsequent vertical jump. The stance phase was formally defined to begin when the first foot landed on the ground and to end when the last foot left the ground. Four features were calculated from the stance phase of the accelerometer data: ground contact time (TIME), pseudo-impulse during the first half (IMP1), pseudo-impulse during the second half (IMP2), and peak acceleration (PEAK). These features were selected to capture the timing and loading characteristics of the stance phase. The pseudo-impulse variables were defined as the area under the curve of the acceleration profile over the desired time interval (with a baseline value of -1 g). For reference, the stance phase and pseudo-impulse calculated from the accelerometer data are depicted in Fig. 3A.
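A minimal sketch of these four accelerometer features, assuming a vertical acceleration trace in g with known stance-phase sample bounds; the helper names and the toy trace are ours, not the study's:

```python
import numpy as np

def _trapz(y, x):
    # Trapezoidal integration (portable across NumPy versions).
    return float(np.sum((y[:-1] + y[1:]) / 2.0 * np.diff(x)))

def accel_features(accel_g, t, i_start, i_end):
    """Four stance-phase features from a vertical acceleration trace (in g).

    i_start/i_end bound the stance phase; the pseudo-impulse is the area
    under the acceleration curve relative to a -1 g baseline, per the text.
    """
    a = accel_g[i_start:i_end]
    ts = t[i_start:i_end]
    mid = len(a) // 2
    return {
        "TIME": ts[-1] - ts[0],                   # ground contact time
        "IMP1": _trapz(a[:mid] + 1.0, ts[:mid]),  # first-half pseudo-impulse
        "IMP2": _trapz(a[mid:] + 1.0, ts[mid:]),  # second-half pseudo-impulse
        "PEAK": float(a.max()),                   # peak acceleration
    }

# Toy trace: -1 g (free fall baseline) with a 0.5 s stance plateau at +3 g.
fs = 200.0
t = np.arange(0, 5, 1 / fs)
accel = np.full_like(t, -1.0)
accel[100:200] = 3.0
feats = accel_features(accel, t, 100, 200)
```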
Fourteen features were calculated from the stance phase of the force plate data: ground contact time (TIME), maximum impulse during the first half (IMPMax1), total impulse during the first half (IMPTot1), maximum impulse during the second half (IMPMax2), total impulse during the second half (IMPTot2), maximum impulse over the entire time duration (IMPMax), maximum peak VGRF (PEAKMax), average peak VGRF (PEAKAvg), minimum time to peak VGRF (PEAKMinTime), average time to peak VGRF (PEAKAvgTime), maximum slope to peak VGRF (PEAKMaxSlope), absolute difference between the ground contact times of the two feet (TIMEΔ), absolute time difference between the landings of the two feet at the beginning of the stance phase (TIMEΔ1), and absolute time difference between the takeoffs of the two feet at the end of the stance phase (TIMEΔ2). These features were selected to capture the timing, loading, and asymmetrical characteristics of the stance phase. All of the variables involving VGRF measurements were normalized by body mass (IMPMax1, IMPTot1, IMPMax2, IMPTot2, IMPMax, PEAKMax, PEAKAvg, PEAKMaxSlope). All of the variables involving maximum or minimum values were based on comparisons between the right and left force plates (IMPMax1, IMPMax2, IMPMax, PEAKMax, PEAKMinTime, PEAKMaxSlope). All of the variables involving total values were based on the sum of the right and left force plates (IMPTot1, IMPTot2). All of the variables involving average values were based on the average of the right and left force plates (PEAKAvg, PEAKAvgTime).
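The timing and asymmetry subset of these features can be illustrated with a short sketch; here ground contact is detected with an assumed 20 N threshold, since the study does not specify its event-detection details (threshold and function names are illustrative assumptions):

```python
import numpy as np

THRESH_N = 20.0  # assumed contact-detection threshold (N); not from the study

def contact_bounds(vgrf, fs):
    """Return (landing time, takeoff time) in seconds for one force plate."""
    on = np.flatnonzero(vgrf > THRESH_N)
    return on[0] / fs, on[-1] / fs

def timing_features(vgrf_left, vgrf_right, fs=1000.0):
    """Timing/asymmetry subset of the fourteen force-plate features."""
    l_land, l_off = contact_bounds(vgrf_left, fs)
    r_land, r_off = contact_bounds(vgrf_right, fs)
    return {
        "TIME": max(l_off, r_off) - min(l_land, r_land),     # first landing to last takeoff
        "TIME_D": abs((l_off - l_land) - (r_off - r_land)),  # contact-time difference
        "TIME_D1": abs(l_land - r_land),                     # landing asymmetry
        "TIME_D2": abs(l_off - r_off),                       # takeoff asymmetry
    }

# Toy VGRF traces: left foot down 0.100-0.599 s, right foot down 0.120-0.579 s.
fs = 1000.0
vl = np.zeros(1000)
vl[100:600] = 800.0
vr = np.zeros(1000)
vr[120:580] = 780.0
feats = timing_features(vl, vr, fs)
```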
2.4. Principal component analysis (PCA)
The number of features contained in the accelerometer feature vector (n = 4) was small relative to the sample size of participants (n = 40), so those features could be used directly in the subsequent SVM analysis. However, the number of features contained in the force plate feature vector was relatively large (n = 14), so a PCA was implemented to reduce the dimensionality of the force plate feature vector and thereby decrease the risk of overfitting. PCA is a statistical procedure that creates a set of orthogonal principal components (PC's) to describe the original set of data (Jolliffe, 2002). Prior to implementing the PCA, all variable distributions were normalized to have a mean of 0 and a standard deviation of 1. After implementing the PCA, the system's eigenvalues (λ's) were analyzed to determine the appropriate number of PC's to retain, and a varimax rotation was then applied to the retained PC's to obtain a simple structure and increase interpretability (see Kobsar et al. (2014) for a similar application).
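A sketch of this dimensionality-reduction step in NumPy, retaining components whose eigenvalue exceeds 1.0 and applying a varimax rotation to the loadings; the synthetic data and function names are ours (the study itself used MATLAB):

```python
import numpy as np

def standardize(X):
    # Normalize each feature to mean 0 and standard deviation 1.
    return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

def pca_kaiser(X):
    """PCA on standardized features; retain PC's with eigenvalue > 1.0."""
    Z = standardize(X)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    k = int(np.sum(eigvals > 1.0))
    loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])  # component loadings
    return Z @ eigvecs[:, :k], loadings, eigvals

def varimax(L, n_iter=50):
    """Orthogonal varimax rotation of a loading matrix (simple structure)."""
    p, k = L.shape
    R = np.eye(k)
    for _ in range(n_iter):
        B = L @ R
        U, _, Vt = np.linalg.svd(
            L.T @ (B ** 3 - B @ np.diag((B ** 2).sum(axis=0)) / p))
        R = U @ Vt
    return L @ R

# Synthetic feature matrix: 40 "participants", three correlated features
# plus three independent ones, so one dominant component emerges.
rng = np.random.default_rng(0)
base = rng.normal(size=(40, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(40, 3)),
               rng.normal(size=(40, 3))])
scores, loadings, eigvals = pca_kaiser(X)
rotated = varimax(loadings)
```

Because varimax is an orthogonal rotation, it redistributes variance among the retained components without changing the total squared loadings.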
2.5. Support vector machine (SVM) classification
Two separate linear SVM classifiers were trained, one using the accelerometer feature vectors and the other using the force plate feature vectors (after implementing the PCA). An SVM is a supervised machine learning algorithm that constructs a hyperplane to maximize the margin of separation between a classified data set (Boser et al., 1992; Cortes and Vapnik, 1995). The variable distributions contained in the feature vectors were normalized to have a mean of 0 and a standard deviation of 1 to avoid artificially exaggerating the importance of certain features, and 10-fold cross-validation was applied to the data sets to help prevent overfitting during the training process. Each individual jump-landing trial was used in the training process (n = 120). However, for classification purposes, the SVM scores of the individual trials were averaged for each participant (n = 40).
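The training setup can be sketched with scikit-learn; the data here is a synthetic stand-in (40 hypothetical participants with three trials each and invented features), since the study's feature vectors are not reproduced in this text:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for the study's layout: 40 participants x 3 trials,
# 20 labeled 1 and 20 labeled 0; the class-1 group's features are shifted.
rng = np.random.default_rng(0)
participant = np.repeat(np.arange(40), 3)
y = (participant < 20).astype(int)
X = rng.normal(size=(120, 4)) + y[:, None] * 1.5

# Standardize features, then fit a linear SVM, as described in the text.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))

# Per-trial decision scores under 10-fold cross-validation.
trial_scores = cross_val_predict(clf, X, y, cv=10, method="decision_function")

# Average the three trial scores per participant before classifying.
avg_scores = np.array([trial_scores[participant == p].mean() for p in range(40)])
pred = (avg_scores > 0).astype(int)
accuracy = (pred == (np.arange(40) < 20).astype(int)).mean()
```

Averaging the per-trial scores before thresholding mirrors the study's choice to classify participants rather than individual trials.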
2.6. Statistical analysis
Descriptive statistics (mean ± SD) were calculated for all of the variables contained in the feature vectors, as well as for all of the participants' demographic variables (age, mass, and height). The differences between the feature vectors of the two subject groups were examined using p-values calculated from an unpaired two-tailed Student's t-test with statistical significance set at α = 0.05. Cohen's d was calculated to evaluate the effect size of these differences (Cohen, 1988).
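For reference, the two statistics can be computed as follows; the pooled-standard-deviation form of Cohen's d is assumed, and the sample values are invented for illustration:

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation (Cohen, 1988)."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                     / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

# Invented ground-contact-time samples (s) for two hypothetical groups.
excellent = np.array([0.42, 0.45, 0.50, 0.48, 0.44])
poor = np.array([0.30, 0.28, 0.33, 0.31, 0.29])
t_stat, p_value = stats.ttest_ind(excellent, poor)  # unpaired, two-tailed
d = cohens_d(excellent, poor)
```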
Receiver operating characteristic (ROC) curves were generated using the SVM scores from both the individual trials and the averaged trials, and the area under the curve (AUC) was calculated for both cases (individual and averaged trials) to facilitate comparisons. An optimal cutoff point was selected on the averaged ROC curves to determine overall classification accuracy (ACC), true positive rate (TPR), and false positive rate (FPR). 95% confidence intervals (CI's) were calculated for each AUC.
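A sketch of the ROC/AUC step with scikit-learn; the scores are synthetic, and the cutoff here is chosen by Youden's J statistic, which is one common criterion (the study does not state which optimality criterion it used):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical averaged SVM scores for 40 participants, 20 per class.
rng = np.random.default_rng(1)
y = np.array([1] * 20 + [0] * 20)  # 1 = positive class
scores = np.concatenate([rng.normal(2.0, 1.0, 20), rng.normal(-2.0, 1.0, 20)])

fpr, tpr, thresholds = roc_curve(y, scores)
auc = roc_auc_score(y, scores)

# Pick the cutoff maximizing Youden's J = TPR - FPR, then report accuracy.
cutoff = thresholds[np.argmax(tpr - fpr)]
acc = float(np.mean((scores >= cutoff).astype(int) == y))
```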
3. Results
A statistical comparison between the feature vectors of the two subject groups can be seen in Table 1 .
Table 1
The feature vector variables can be sorted into tiers related to their effect size. In general, the TIME and IMP variables have very large effect sizes, the PEAK variables have moderate effect sizes (with the exception of PEAKMaxSlope, which has a very small effect size), and the TIMEΔ variables have small effect sizes.
The PCA was applied to the force plate feature vectors, and 5 PC's were retained following an analysis of the system's eigenvalues (all PC's retained had a corresponding λ > 1.0). These 5 PC's explained 94.4% of the variance in the force plate data. The list of PC's associated with the force plate data and their corresponding eigenvalues and variable loadings following varimax rotation can be seen in Table 2. Cross-loadings are not shown.
Table 2
A statistical comparison between the feature vectors containing the normalized PC's of the two subject groups can be seen in Table 3.
Table 3
The PC's can be sorted into tiers related to their effect size, and these tiers are similar to the ones established prior to implementing the PCA. PC1 is associated with the TIME and IMP variables and has a very large effect size, PC2,3 are associated with the PEAK variables and have moderate effect sizes, and PC4,5 are associated with the TIMEΔ variables and have small effect sizes.
The SVM classification results are depicted in the ROC curves shown in Figures 4A-B. Figures 4A-B show ROC curves derived from the (Figure 4A) accelerometer-based SVM and (Figure 4B) force plate-based SVM. The solid lines represent the ROC curves generated using the SVM scores from the averaged trials (n = 40) and the dashed lines represent the ROC curves generated using the SVM scores from the individual trials (n = 120). The black circles represent the optimal cutoff points on the ROC curves. Note that the force plate-based SVM has two optimal cutoff points. The optimal true positive rates (TPR), false positive rates (FPR), and overall classification accuracies (ACC) are shown on the plots, along with the areas under the curve and their corresponding 95% CI's associated with the averaged trials (AUC1) and individual trials (AUC2).
The accelerometer-based SVM classifier had ACC = 80.0%, TPR = 80.0%, and FPR = 20.0%. The AUC associated with the averaged trials was 0.87 (95% CI = 0.75, 0.99) and the AUC associated with the individual trials was 0.86 (95% CI = 0.79, 0.93). The force plate-based SVM classifier had two optimal cutoff points on its ROC curve, both with ACC = 87.5%. The trade-off between the two cutoff points was related to their TPR and FPR, with one of the optimal cutoff points resulting in TPR = 75.0% and FPR = 0.0% and the other optimal cutoff point resulting in TPR = 85.0% and FPR = 10.0%. The AUC associated with the averaged trials was 0.88 (95% CI = 0.77, 0.99) and the AUC associated with the individual trials was 0.89 (95% CI = 0.83, 0.95).
4. Discussion
Both the force plate-based SVM and the accelerometer-based SVM were able to successfully classify lower extremity movement patterns with a high level of accuracy. Therefore, both methods have the potential to provide value as objective and autonomous clinical assessment tools. The classification accuracy of the force plate-based SVM (ACC = 87.5%) was greater than the classification accuracy of the accelerometer-based SVM (ACC = 80.0%). However, this is not surprising, since the force plates collected data directly at the location of impact during the jump-landing task, while the accelerometer collected data that was an indirect and dissipated representation of the impact event. Additionally, the force plate data was collected from two sensors as opposed to the accelerometer data being collected from one, and the force plate data was sampled at a higher frequency than the accelerometer data. However, the accuracy of the accelerometer-based SVM was still very close to the accuracy achieved by the force plate-based SVM. This is impressive, especially when considering the orders of magnitude price difference between force plates and accelerometers. Additionally, a force plate-based assessment method would typically be limited to applications in a laboratory setting, while a wearable accelerometer-based assessment method could be implemented at a much wider scale in a field testing environment.
The force plate-based SVM has two optimal cutoff points on its ROC curve, as shown in Fig. 4B, which indicates that a judgment call must be made if this algorithm were to be implemented in practice. Most researchers agree that lower extremity injury prevention programs pose no risk to participants, while ACL and other serious lower extremity injuries have devastating consequences (Padua et al., 2015). Therefore, it is recommended that a cutoff point should be selected to prioritize minimizing the potential for false positive predictions. For this case, the force plate-based SVM classifier would have FPR = 0.0% and TPR = 75.0%, indicating that 100% of the participants who exhibited poor lower extremity movement patterns would be correctly classified at the cost of incorrectly classifying 25% of the participants who exhibited excellent lower extremity movement patterns. Again, since lower extremity injury prevention programs are not harmful to the participants, this seems like an acceptable trade-off.
A traditional analysis of the variables contained in the feature vectors of the two subject groups might suggest that only the TIME, IMP, and PEAKAvgTime variables have value for classifying lower extremity movement patterns, as these are the only variables contained in the feature vectors with significant differences between the two subject groups. However, it is important to consider that looking at the significant differences and effect sizes of individual variables is a reductionist approach that fails to account for complex patterns and interactions between variables. The SVM methods that were leveraged in this study searched the data for these patterns and variable relationships and generated sophisticated predictive models with excellent accuracy.
Machine learning algorithms have been applied to a large variety of biomechanical studies in recent years. However, to date, we believe that this is the first study that has applied machine learning algorithms to classify movement patterns during a jump-landing task that are associated with lower extremity injury risk. We also believe that the systems and methods employed in our study could be adapted in future studies to classify different movement patterns related to other injury risks.
One limitation of this study is the relatively small sample size (n = 40) and narrow demographics of the subject pool, as all of the participants were young female athletes. Further validation with a larger sample size and broader demographics should be performed before applying this SVM classification approach for clinical assessment purposes. An additional limitation of this study was the fact that only VGRF data was analyzed from the force plates, even though six degrees of freedom (tri-axial ground reaction forces and moments) were available. Similarly, only the data collected from the accelerometer axis aligned with the vertical orientation of the participant was used in this study, even though the chest-mounted inertial measurement unit captured nine degrees of freedom (tri-axial accelerometer, tri-axial gyroscope, and tri-axial magnetometer). The decision to use only the VGRF and single-axis acceleration data was due to the relatively small sample size of participants, as increasing the size of the feature vectors could have resulted in overfitting. However, additional degrees of freedom undoubtedly contain valuable information, and further studies with larger sample sizes could utilize the extra features to increase classification accuracy.
The results of this study demonstrate that a linear SVM can accurately classify movement patterns during a jump-landing task that are associated with lower extremity injury risk. The objective and autonomous nature of this screening methodology eliminates the subjective limitations associated with many current clinical assessment tools. Integrating such a system with portable and inexpensive sensors would provide further benefits by enabling assessments to take place in a field testing environment as opposed to in a controlled laboratory setting.
Figures 5A-B illustrate example systems for assessing lower extremity movement quality. Figure 5A shows a first example system 500 including one or more user sensors 502 that may be worn by a user 504 or trained on the user 504 or otherwise situated to capture motion of the user 504. The system 500 includes a user device 506 and a movement evaluator 508 implemented on the user device 506. The user device 506 can be any appropriate computing device and typically includes at least one processor, memory storing instructions for the processor, and a display. The user device 506 may include a user input device and a wired or wireless communication system.
For example, the user device 506 may be a laptop, tablet, or mobile phone. The user sensors 502 or the user device 506 or both can be housed in a wearable fitness monitor. For example, the user sensors 502 may be embedded in a shoe or ankle bracelet or other user garment, and the user device 506 may be a smart watch. In some cases, some of the user sensors 502 are not wearable, e.g., where the user sensors 502 include a force plate situated on the ground for the user 504 to land on.
The movement evaluator 508 is programmed for receiving movement data from the user sensors 502 during the user's performance of a movement task and extracting one or more movement features from the movement data. Each movement feature characterizes a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk. The movement evaluator 508 is programmed for classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features.
For example, classifying the movement pattern can include supplying the one or more movement features to a machine learning classifier trained using pre-classified training data for the movement task. The machine learning classifier can include a support vector machine (SVM) configured to construct a hyperplane to maximize, based on the pre-classified training data, a margin of separation between the risk categories. Extracting the one or more movement features can include performing a principal component analysis (PCA) to reduce the dimensionality of a feature set extracted from the movement data.
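As an illustration of this pipeline, the following is a minimal numpy sketch (not the actual implementation of the system described here) of PCA dimensionality reduction followed by a maximum-margin linear classifier; in practice a library implementation such as scikit-learn's PCA and linear SVC would typically be used instead of the toy subgradient training loop shown.

```python
import numpy as np

def pca_reduce(X, k):
    """Project feature vectors onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by explained variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Toy linear SVM fit by subgradient descent on the regularized
    hinge loss; labels y must be -1 or +1."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:   # point inside the margin: push it out
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                        # only apply the regularizer
                w -= lr * lam * w
    return w, b

def predict(X, w, b):
    """Assign each feature vector to a risk category by hyperplane side."""
    return np.sign(X @ w + b)
```

Usage would follow the text above: reduce the training feature vectors with `pca_reduce`, fit the SVM on the pre-classified training data, then classify new movement patterns with `predict`.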
The movement evaluator 508 can be programmed to display, on a display of the user device 506, an indicator for the classified risk category for the user. For example, the movement evaluator 508 can display a text label for the classified risk category or a color representing the classified risk category.
Figure 5B illustrates a second example system 520 including the user sensors 502 situated to capture motion of the user 504. The movement data is transmitted to a remote server 522 implementing the movement evaluator 508. The movement evaluator 508 on the remote server 522 is configured to classify movement patterns, e.g., for storage and analysis or to return classifications for local display. For example, the user sensors 502 can transmit the movement data to the user device 506, which can then transmit the movement data to the remote server 522 (e.g., using a wireless router 524 and a data communications network 526 such as the Internet). In some examples, the user sensors 502 are configured with appropriate hardware for communicating directly with the wireless router 524 so that the user device 506 is optional. In some examples, the remote server 522 sends results from the classification to the user device 506 for display to the user 504.
In general, the movement task may be any appropriate task for assessing lower extremity motion quality, e.g., a jumping task; a running, jogging, or walking task; a cutting and sprinting task; a squatting task; a weight lifting task; a medicine ball toss task; or a combination of the listed tasks. For example, the following list illustrates optional examples of user sensors and movement features that can be used in various implementations for various movement tasks.
Jumping (Jump-Landing, Drop-Landing, Countermovement Jump)
1. Collect Data from Force Plates, IMUs, and/or Pressure Sensors
2. Extract Features
a. Takeoff Phase
i. Peaks, Timing, Slopes, Areas under the Curve, Asymmetry
b. Flight Phase 1
i. Timing, Stability, Asymmetry
c. Stance Phase 1
i. Peaks, Timing, Slopes, Areas under the Curve, Asymmetry
d. Flight Phase 2
i. Timing, Stability, Asymmetry
e. Stance Phase 2
i. Peaks, Timing, Slopes, Areas under the Curve, Asymmetry
3. Run PCA or other similar feature reduction algorithm (optional)
4. Train Predictor Algorithm
a. Jump Performance Quality
b. Landing Performance Quality
c. Movement Quality
d. Balance Index

Running, Jogging, and/or Walking
1. Collect Data from Force Plates, IMUs, Pressure Sensors, GPS, RFID, HR monitors, VO2 monitors, and/or SmO2 monitors
2. Extract Features
a. Foot Strike Features
i. Pace, Cadence, Peaks, Ground Contact Time, Stride Length
b. Time Series Features
i. RMS, Speed-Normalized RMS, Step/Stride Regularity and Symmetry
3. Run PCA or other similar feature reduction algorithm (optional)
4. Train Predictor Algorithm
a. Movement Quality
b. Fatigue Index

Agility and Quickness (Cutting and Sprinting)
1. Collect Data from Force Plates, IMUs, and/or Pressure Sensors
2. Extract Features
a. Foot Strike Features
i. Pace, Cadence, Peaks, Ground Contact Time, Stride Length
b. Time Series Features
i. RMS, Speed-Normalized RMS, Step/Stride Regularity and Symmetry
c. Agility Features (during change of direction or stop/start)
i. Peaks, Timing, Slopes, Areas under the Curve, Asymmetry
3. Run PCA or other similar feature reduction algorithm (optional)
4. Train Predictor Algorithm
a. Agility Index
b. Movement Quality

Squatting (Single-Leg and Double-Leg)
1. Collect Data from Force Plates, IMUs, and/or Pressure Sensors
2. Extract Features
a. Cadence, Peaks, Timing, Slopes, Areas under the Curve, Regularity, Asymmetry
3. Run PCA or other similar feature reduction algorithm (optional)
4. Train Predictor Algorithm
a. Movement Quality
b. Balance Index

Weight Lifting
1. Collect Data from Force Plates, IMUs, Pressure Sensors, HR monitors, VO2 monitors, and/or SmO2 monitors
2. Extract Features
a. Cadence, Peaks, Timing, Slopes, Areas under the Curve, Regularity, Asymmetry
3. Run PCA or other similar feature reduction algorithm (optional)
4. Train Predictor Algorithm
a. Performance Quality
b. Movement Quality
c. Fatigue Index
d. Balance Index

Medicine Ball Toss
1. Collect Data from Force Plates, IMUs, and/or Pressure Sensors
2. Extract Features
a. Peaks, Timing, Slopes, Areas under the Curve, Asymmetry
3. Run PCA or other similar feature reduction algorithm (optional)
4. Train Predictor Algorithm
a. Performance Quality
b. Movement Quality
c. Fatigue Index
d. Balance Index
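For the phase-based tasks listed above, feature extraction first requires segmenting the recorded signal into its movement phases. The sketch below is a hypothetical illustration (not the actual implementation) that segments a vertical ground reaction force trace into stance and flight phases by thresholding against a near-zero force level; the 20 N threshold is an assumed value for "foot on plate".

```python
import numpy as np

def segment_phases(vgrf, threshold=20.0):
    """Split a VGRF trace into contiguous stance/flight phases.

    vgrf: vertical ground reaction force samples (newtons).
    Returns a list of (start, end, is_stance) index tuples, where
    is_stance is True while the force exceeds the threshold.
    """
    on_ground = vgrf > threshold
    phases = []
    start = 0
    # A phase boundary occurs wherever the contact state flips.
    for i in range(1, len(on_ground)):
        if on_ground[i] != on_ground[i - 1]:
            phases.append((start, i, bool(on_ground[i - 1])))
            start = i
    phases.append((start, len(on_ground), bool(on_ground[-1])))
    return phases
```

Peaks, slopes, areas under the curve, and asymmetry features could then be computed within each returned phase.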
Figures 6A-C illustrate example movement data from a countermovement jump (Figure 6A), a medicine ball throw (Figure 6B), and an agility test (Figure 6C).
Figures 7A-B illustrate example movement data from a (Figure 7A) chest-mounted tri-axial accelerometer and from a (Figure 7B) chest-mounted tri-axial gyroscope during treadmill running. The following five features were calculated from the time series for all six degrees of freedom: root mean square (RMS), economy of acceleration/angular velocity (ECON), step regularity (REG1), stride regularity (REG2), and symmetry (SYM). The ECON variable was calculated by dividing RMS by the treadmill speed. The REG1 and REG2 variables were calculated using an unbiased autocorrelation procedure at a phase shift equal to the average step time and stride time, respectively. The SYM variable was calculated as the percent difference between REG1 and REG2. The average peak vertical acceleration and the coefficient of variation for the vertical acceleration were also calculated. These features were selected to capture the timing, loading, efficiency, regularity, and asymmetrical characteristics of the movement profile.
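These feature definitions can be expressed compactly in code. The following is a hypothetical sketch for a single sensor axis; the exact autocorrelation normalization and the percent-difference convention used for SYM are assumptions, not the study's verified formulas.

```python
import numpy as np

def time_series_features(a, speed, step_lag, stride_lag):
    """Compute RMS, ECON, REG1, REG2, and SYM for one sensor axis.

    a: time series for one accelerometer/gyroscope axis.
    speed: treadmill speed.
    step_lag, stride_lag: average step and stride times, in samples.
    """
    a = a - a.mean()
    rms = np.sqrt(np.mean(a ** 2))
    econ = rms / speed  # economy: RMS divided by treadmill speed

    def autocorr(x, lag):
        # Unbiased autocorrelation coefficient at the given phase shift.
        n = len(x) - lag
        return (x[:n] @ x[lag:]) / (n * x.var())

    reg1 = autocorr(a, step_lag)    # step regularity
    reg2 = autocorr(a, stride_lag)  # stride regularity
    # SYM as a percent difference between step and stride regularity.
    sym = 100.0 * abs(reg1 - reg2) / max(abs(reg1), abs(reg2))
    return rms, econ, reg1, reg2, sym
```

For a strongly periodic signal, REG2 approaches 1 at a phase shift of one full stride, while asymmetry between left and right steps lowers REG1 relative to REG2 and drives SYM away from zero.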
Figure 8 is a flowchart of an example method 800 for assessing lower extremity movement quality. The method 800 can be performed by a movement evaluator implemented on a computer system.
The method 800 includes receiving movement data from one or more user sensors during a user's performance of a movement task (802). For example, receiving the movement data can include one or more of: receiving the movement data from one or more force plates during the user's performance of the movement task, receiving the movement data from one or more pressure sensors during the user's performance of the movement task, receiving the movement data from one or more video cameras during the user's performance of the movement task, and receiving the movement data from one or more wearable electronic fitness monitors worn by the user during the user's performance of the movement task.
A wearable electronic fitness monitor can include a combination of one or more accelerometers, gyroscopes, magnetometers, pressure sensors, GPS sensors, RFID sensors, HR monitors, VO2 monitors, and SmO2 monitors. The wearable electronic fitness device can be worn in any appropriate position on the user, for example, in clothing or apparel, in a chest strap, in a sports bra, in a waistband or belt, in footwear, in headphones or headgear, in a watch or armband, or in a leg band or leg strap.
The method 800 includes extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk (804). Extracting the one or more movement features can include performing a principal component analysis to reduce the dimensionality of a feature set extracted from the movement data. The movement features may characterize a degree of fatigue.
In some examples, the movement task includes a jump-landing task and extracting the movement features includes identifying one or more movement phases of the jump-landing task in the movement data and extracting the one or more movement features from the movement phases of the movement data. The movement phases of the jump-landing task can be one or more takeoff phases, flight phases, and stance phases. The user sensors can include an accelerometer on the user, and extracting the movement features from the jump-landing task can include extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase. The user sensors can include a force plate and the jump-landing task can include a combination of one or more events comprising jumping and landing on the one or more force plates, and extracting movement features can include extracting the movement features to characterize timing, loading, and asymmetrical characteristics from the one or more force plates.
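A concrete (hypothetical) sketch of these accelerometer-derived stance features follows: given a vertical acceleration trace and the stance-phase boundaries, the ground contact time, the two pseudo-impulses, and the peak acceleration can be computed as shown. The half-split convention and the simple rectangular integration are assumptions for illustration.

```python
import numpy as np

def stance_features(accel, stance_start, stance_end, fs):
    """Extract the stance-phase features named above.

    accel: vertical acceleration samples.
    stance_start, stance_end: stance-phase boundaries (sample indices).
    fs: sampling rate in Hz.
    """
    stance = accel[stance_start:stance_end]
    dt = 1.0 / fs
    ground_contact_time = len(stance) * dt
    mid = len(stance) // 2
    # "Pseudo-impulse": area under the acceleration curve (integrating
    # acceleration rather than force, hence "pseudo").
    imp1 = stance[:mid].sum() * dt   # first half of stance
    imp2 = stance[mid:].sum() * dt   # second half of stance
    peak = stance.max()
    return ground_contact_time, imp1, imp2, peak
```

The resulting four values would form part of the feature vector supplied to the classifier.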
In some examples, the movement task includes a drop-landing task, and extracting the movement features includes identifying one or more movement phases of the drop-landing task in the movement data and extracting the movement features from the movement phases of the movement data. The movement phases of the drop-landing task can include one or more takeoff phases, flight phases, and stance phases. The user sensors can include an accelerometer on the user, and extracting the movement features from the drop-landing task can include extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase. The user sensors can include a force plate and the drop-landing task can include landing on the one or more force plates, and extracting the movement features can include extracting the movement features to characterize timing, loading, and asymmetrical characteristics from the one or more force plates.
In some examples, the movement task includes a countermovement jump task, and extracting the movement features includes identifying one or more movement phases of the countermovement jump task in the movement data and extracting the movement features from the movement phases of the movement data. The movement phases of a countermovement jump task can include one or more takeoff phases, flight phases, and stance phases. The user sensors can include an accelerometer on the user, and extracting the movement features from the countermovement jump task can include extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase. The user sensors can include a force plate and the countermovement jump task can include a combination of one or more events including jumping and landing on the one or more force plates, and extracting the one or more movement features can include extracting the one or more movement features to characterize timing, loading, and asymmetrical characteristics from the one or more force plates.
In some examples, the movement task includes a running, jogging, or walking task. In some examples, the movement task includes one of a jumping task; a running, jogging, or walking task; a cutting and sprinting task; a squatting task; a weight lifting task; and a medicine ball toss task. The user sensors can include a combination of one or more accelerometers and gyroscopes on the user, and extracting the movement features can include extracting the movement features to characterize timing, loading, efficiency, regularity, and asymmetrical characteristics.
The method 800 includes classifying the movement pattern into a classified risk category for lower extremity injury for the user based on the one or more movement features (806). Classifying the movement pattern can include supplying the movement features to a machine learning classifier trained using pre-classified training data for the movement task. The machine learning classifier can be a support vector machine (SVM) configured to construct a hyperplane to maximize, based on the pre-classified training data, a margin of separation between the risk categories. The method 800 can include displaying, on a display device, an indicator for the classified risk category for the user.
It is understood that various details of the presently disclosed subject matter may be changed without departing from the scope of the presently disclosed subject matter. Furthermore, the foregoing description is for the purpose of illustration only, and not for the purpose of limitation.
References
Each of the following references is hereby incorporated herein by reference in its entirety.

Begg, R., Kamruzzaman, J., 2005. A machine learning approach for automated recognition of movement patterns using basic, kinetic and kinematic gait data. Journal of Biomechanics 38, 401-408.
Bennetts, C.J., Owings, T.M., Erdemir, A., Botek, G., Cavanagh, P.R., 2013. Clustering and classification of regional peak plantar pressures of diabetic feet. Journal of Biomechanics 46, 19-25.
Bittencourt, N.F.N., Meeuwisse, W.H., Mendonça, L.D., Nettel-Aguirre, A., Ocarino, J.M., Fonseca, S.T., 2016. Complex systems approach for sports injuries: moving from risk factor identification to injury pattern recognition - narrative review and new concept. British Journal of Sports Medicine 50, 1309-1314.
Boser, B.E., Guyon, I.M., Vapnik, V.N., 1992. A training algorithm for optimal margin classifiers, in: Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp. 144-152.
Brophy, R.H., Schmitz, L., Wright, R.W., Dunn, W.R., Parker, R.D., Andrish, J.T., McCarty, E.C., Spindler, K.P., 2012. Return to play and future ACL injury risk after ACL reconstruction in soccer athletes from the multicenter orthopaedic outcomes network (MOON) group. The American Journal of Sports Medicine 40, 2517-2522.
Chan, Y., Fong, D.T., Chung, M.M., Li, W., Liao, W., Yung, P.S., Chan, K., 2010. Identification of ankle sprain motion from common sporting activities by dorsal foot kinematics data. Journal of Biomechanics 43, 1965-1969.
Clermont, C.A., Osis, S.T., Phinyomark, A., Ferber, R., 2017. Kinematic gait patterns in competitive and recreational runners. Journal of Applied Biomechanics, 1-26.
Cohen, J., 1988. Statistical Power Analysis for the Behavioral Sciences. Lawrence Erlbaum Associates, Hillsdale.
Cortes, C., Vapnik, V., 1995. Support-vector networks. Machine Learning 20, 273-297.
Deluzio, K.J., Astephen, J.L., 2007. Biomechanical features of gait waveform data associated with knee osteoarthritis: An application of principal component analysis. Gait & Posture 25, 86-93.

Fukuchi, R.K., Eskofier, B.M., Duarte, M., Ferber, R., 2011. Support vector machines for detecting age-related changes in running kinematics. Journal of Biomechanics 44, 540-542.
Jolliffe, I.T., 2002. Principal Component Analysis. 2nd ed., Springer, New York.
Jonic, S., Jankovic, T., Gajic, V., Popovic, D., 1999. Three machine learning techniques for automatic determination of rules to control locomotion. IEEE Transactions on Biomedical Engineering 46, 300-310.
Kobsar, D., Osis, S.T., Hettinga, B.A., Ferber, R., 2014. Classification accuracy of a single tri-axial accelerometer for training background and experience level in runners. Journal of Biomechanics 47, 2508-2511.
Labbe, D.R., de Guise, J.A., Mezghani, N., Godbout, V., Grimard, G., Baillargeon, D., Lavigne, P., Fernandes, J., Ranger, P., Hagemeister, N., 2011. Objective grading of the pivot shift phenomenon using a support vector machine approach. Journal of Biomechanics 44, 1-5.
Luštrek, M., Kaluža, B., 2009. Fall detection and activity recognition with machine learning. Informatica 33, 197-204.
Maki, B.E., 1997. Gait changes in older adults: Predictors of falls or indicators of fear? Journal of the American Geriatrics Society 45, 313-320.
Maurer, C., Federolf, P., von Tscharner, V., Stirling, L., Nigg, B.M., 2012. Discrimination of gender-, speed- and shoe-dependent movement patterns in runners using full-body kinematics. Gait & Posture 36, 40-45.
Moksnes, H., Engebretsen, L., Eitzen, I., Risberg, M.A., 2013. Functional outcomes following a non-operative treatment algorithm for anterior cruciate ligament injuries in skeletally immature children 12 years and younger. A prospective cohort with 2 years follow-up. British Journal of Sports Medicine 47, 488-494.
Muniz, A., Liu, H., Lyons, K., Pahwa, R., Liu, W., Nobre, F., Nadal, J., 2010. Comparison among probabilistic neural network, support vector machine and logistic regression for evaluating the effect of subthalamic stimulation in Parkinson disease on ground reaction force during gait. Journal of Biomechanics 43, 720-726.

Noyes, F.R., Mooar, P.A., Matthews, D.S., Butler, D.L., 1983a. The symptomatic anterior cruciate-deficient knee. Part I: The long-term functional disability in athletically active individuals. Journal of Bone & Joint Surgery 65, 154-162.
Noyes, F.R., Mooar, P.A., Matthews, D.S., Butler, D.L., 1983b. The symptomatic anterior cruciate-deficient knee. Part II: The results of rehabilitation, activity modification, and counseling on functional disability. Journal of Bone & Joint Surgery 65, 163-174.
Onodera, A.N., Neto, W.P.G., Roveri, M.I., Oliveira, W.R., Sacco, I.C., 2017. Immediate effects of EVA midsole resilience and upper shoe structure on running biomechanics: A machine learning approach. PeerJ 5, e3026.
Padua, D.A., Boling, M.C., DiStefano, L.J., Onate, J.A., Beutler, A.I., Marshall, S.W., 2011. Reliability of the landing error scoring system-real time, a clinical assessment tool of jump-landing biomechanics. Journal of Sport Rehabilitation 20, 145-156.
Padua, D.A., DiStefano, L.J., Beutler, A.I., de la Motte, S.J., DiStefano, M.J., Marshall, S.W., 2015. The landing error scoring system as a screening tool for an anterior cruciate ligament injury-prevention program in elite-youth soccer athletes. Journal of Athletic Training 50, 589-595.
Padua, D.A., Marshall, S.W., Boling, M.C., Thigpen, C.A., Garrett, W.E., Beutler, A.I., 2009. The landing error scoring system (LESS) is a valid and reliable clinical assessment tool of jump-landing biomechanics: The JUMP-ACL study. The American Journal of Sports Medicine 37, 1996-2002.
Phinyomark, A., Osis, S., Hettinga, B.A., Ferber, R., 2015. Kinematic gait patterns in healthy runners: A hierarchical cluster analysis. Journal of Biomechanics 48, 3897-3904.
Shalev-Shwartz, S., Ben-David, S., 2014. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, New York.
Silver, A.E., Lungren, M.P., Johnson, M.E., O'Driscoll, S.W., An, K., Hughes, R.E., 2006. Using support vector machines to optimally classify rotator cuff strength data and quantify post-operative strength in rotator cuff tear patients. Journal of Biomechanics 39, 973-979.

Wu, J., Wang, J., 2008. PCA-based SVM for automatic recognition of gait patterns. Journal of Applied Biomechanics 24, 83-87.

Claims

What is claimed is:
1. A method for assessing lower extremity movement quality, the method comprising:
receiving, by a movement evaluator comprising at least one processor, movement data from one or more user sensors during a user's performance of a movement task;
extracting, by the movement evaluator, one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk; and
classifying, by the movement evaluator, the movement pattern into a classified risk category of a plurality of risk categories for lower extremity injury for the user based on the one or more movement features.
2. The method of claim 1, wherein classifying the movement pattern comprises supplying the one or more movement features to a machine learning classifier trained using pre-classified training data for the movement task.
3. The method of claim 2, wherein the machine learning classifier comprises a support vector machine (SVM) configured to construct a hyperplane to maximize, based on the pre-classified training data, a margin of separation between the risk categories.
4. The method of claim 1, wherein extracting the one or more movement features comprises performing a principal component analysis (PCA) to reduce the dimensionality of a feature set extracted from the movement data.
5. The method of claim 1, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more force plates during the user's performance of the movement task.
6. The method of claim 1, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more pressure sensors during the user's performance of the movement task.
7. The method of claim 1, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more video cameras during the user's performance of the movement task.
8. The method of claim 1, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more wearable electronic fitness monitors worn by the user during the user's performance of the movement task.
9. The method of claim 8, wherein the wearable electronic fitness monitor worn by the user comprises a combination of one or more accelerometers, gyroscopes, magnetometers, pressure sensors, GPS sensors, RFID sensors, HR monitors, VO2 monitors, and SmO2 monitors.
10. The method of claim 1, wherein the movement task comprises a jump-landing task, and wherein extracting the one or more movement features comprises identifying one or more movement phases of the jump-landing task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
11. The method of claim 10, wherein the movement phases of the jump-landing task comprise one or more takeoff phases, flight phases, and stance phases.
12. The method of claim 10, wherein the one or more user sensors comprise an accelerometer on the user, and wherein extracting the one or more movement features from the jump-landing task comprises extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
13. The method of claim 10, wherein the one or more user sensors comprise a force plate and the jump-landing task comprises a combination of one or more events comprising jumping and landing on the one or more force plates, and wherein extracting the one or more movement features comprises extracting the one or more movement features to characterize a plurality of timing, loading, and asymmetrical characteristics from the one or more force plates.
14. The method of claim 1, wherein the movement task comprises a drop-landing task, and wherein extracting the one or more movement features comprises identifying one or more movement phases of the drop-landing task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
15. The method of claim 14, wherein the movement phases of the drop-landing task comprise one or more takeoff phases, flight phases, and stance phases.
16. The method of claim 14, wherein the one or more user sensors comprise an accelerometer on the user, and wherein extracting the one or more movement features from the drop-landing task comprises extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
17. The method of claim 14, wherein the one or more user sensors comprise a force plate and the drop-landing task comprises landing on the one or more force plates, and wherein extracting the one or more movement features comprises extracting the one or more movement features to characterize a plurality of timing, loading, and asymmetrical characteristics from the one or more force plates.
18. The method of claim 1, wherein the movement task comprises a countermovement jump task, and wherein extracting the one or more movement features comprises identifying one or more movement phases of the countermovement jump task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
19. The method of claim 18, wherein the movement phases of a countermovement jump task comprise one or more takeoff phases, flight phases, and stance phases.
20. The method of claim 18, wherein the one or more user sensors comprise an accelerometer on the user, and wherein extracting the one or more movement features from the countermovement jump task comprises extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
21. The method of claim 18, wherein the one or more user sensors comprise a force plate and the countermovement jump task comprises a combination of one or more events comprising jumping and landing on the one or more force plates, and wherein extracting the one or more movement features comprises extracting the one or more movement features to characterize a plurality of timing, loading, and asymmetrical characteristics from the one or more force plates.
22. The method of claim 1, wherein the movement task comprises one of a jumping task; a running, jogging, or walking task; a cutting and sprinting task; a squatting task; a weight lifting task; and a medicine ball toss task.
23. The method of claim 1, comprising displaying, on a display device, an indicator for the classified risk category for the user.
24. A system for assessing lower extremity movement quality, the system comprising:
one or more user sensors;
at least one processor; and
a movement evaluator implemented using at least one processor and configured to perform operations comprising:
receiving movement data from the one or more user sensors during a user's performance of a movement task;
extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk; and
classifying the movement pattern into a classified risk category of a plurality of risk categories for lower extremity injury for the user based on the one or more movement features.
25. The system of claim 24, wherein classifying the movement pattern comprises supplying the one or more movement features to a machine learning classifier trained using pre-classified training data for the movement task.
26. The system of claim 25, wherein the machine learning classifier comprises a support vector machine (SVM) configured to construct a hyperplane to maximize, based on the pre-classified training data, a margin of separation between the risk categories.
27. The system of claim 24, wherein extracting the one or more movement features comprises performing a principal component analysis (PCA) to reduce a dimensionality of a feature set extracted from the movement data.
28. The system of claim 24, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more force plates during the user's performance of the movement task.
29. The system of claim 24, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more pressure sensors during the user's performance of the movement task.
30. The system of claim 24, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more video cameras during the user's performance of the movement task.
31. The system of claim 24, wherein receiving the movement data from the one or more user sensors comprises receiving the movement data from one or more wearable electronic fitness monitors worn by the user during the user's performance of the movement task, wherein the one or more wearable electronic fitness monitors house the one or more user sensors, the at least one processor, or both.
32. The system of claim 31, wherein the wearable electronic fitness monitor worn by the user comprises a combination of one or more accelerometers, gyroscopes, magnetometers, pressure sensors, GPS sensors, RFID sensors, HR monitors, VO2 monitors, and SmO2 monitors.
33. The system of claim 24, wherein the movement task comprises a jump-landing task, and wherein extracting the one or more movement features comprises identifying one or more movement phases of the jump-landing task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
34. The system of claim 33, wherein the movement phases of the jump-landing task comprise one or more takeoff phases, flight phases, and stance phases.
35. The system of claim 33, wherein the one or more user sensors comprise an accelerometer on the user, and wherein extracting the one or more movement features from the jump-landing task comprises extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
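The four accelerometer features recited in claim 35 (ground contact time, a pseudo-impulse over each half of the stance phase, and peak acceleration) can be sketched as follows. A simple amplitude threshold delimits the stance phase here, exploiting the near-0 g reading of a body-worn accelerometer during flight; this is a simplifying assumption, as the claims do not specify how phases are identified.

```python
import numpy as np

def stance_features(accel: np.ndarray, fs: float, thresh: float = 0.5):
    """Compute the four claimed accelerometer features for a single
    stance phase. `accel` is vertical acceleration in g; samples
    above `thresh` are treated as the stance phase."""
    stance = np.flatnonzero(accel > thresh)
    start, end = stance[0], stance[-1] + 1
    phase = accel[start:end]
    mid = len(phase) // 2
    dt = 1.0 / fs
    return {
        "contact_time": len(phase) * dt,                  # ground contact time
        "pseudo_impulse_first": phase[:mid].sum() * dt,   # loading, 1st half
        "pseudo_impulse_second": phase[mid:].sum() * dt,  # loading, 2nd half
        "peak_acceleration": phase.max(),                 # impact severity
    }

# Simulated trace: flight (~0 g), a 0.25 s stance with an early
# impact spike, then flight again (sampled at 200 Hz)
fs = 200.0
flight = np.full(40, 0.05)
stance_phase = 1.0 + 3.0 * np.exp(-np.linspace(-2, 4, 50) ** 2)
trace = np.concatenate([flight, stance_phase, flight])

feats = stance_features(trace, fs)
```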
36. The system of claim 33, wherein the one or more user sensors comprise one or more force plates and the jump-landing task comprises a combination of one or more events comprising jumping and landing on the one or more force plates, and wherein extracting the one or more movement features comprises extracting the one or more movement features to characterize a plurality of timing, loading, and asymmetrical characteristics from the one or more force plates.
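The timing, loading, and asymmetry characteristics recited in claim 36 can be illustrated with a hypothetical bilateral force-plate feature set. The asymmetry index shown is one common normalization (inter-limb peak difference over the larger peak), not one mandated by the claim.

```python
import numpy as np

def force_plate_features(grf_left: np.ndarray, grf_right: np.ndarray, fs: float):
    """Timing, loading, and asymmetry features from a bilateral
    landing on two force plates (GRF in body weights; hypothetical
    feature set for illustration)."""
    peak_l, peak_r = grf_left.max(), grf_right.max()
    return {
        # timing: time-to-peak per limb, in seconds
        "time_to_peak_left": grf_left.argmax() / fs,
        "time_to_peak_right": grf_right.argmax() / fs,
        # loading: peak vertical GRF per limb
        "peak_left": peak_l,
        "peak_right": peak_r,
        # asymmetry: normalized inter-limb difference in peak loading
        "asymmetry_index": abs(peak_l - peak_r) / max(peak_l, peak_r),
    }

# Simulated bilateral GRF traces at 1 kHz: ramp up to a peak, then decay
fs = 1000.0
grf_left = np.concatenate([np.linspace(0.0, 2.0, 50), np.linspace(2.0, 1.0, 50)])
grf_right = np.concatenate([np.linspace(0.0, 2.5, 50), np.linspace(2.5, 1.25, 50)])

feats = force_plate_features(grf_left, grf_right, fs)
```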
37. The system of claim 24, wherein the movement task comprises a drop-landing task, and wherein extracting the one or more movement features comprises identifying one or more movement phases of the drop-landing task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
38. The system of claim 37, wherein the movement phases of the drop-landing task comprise one or more takeoff phases, flight phases, and stance phases.
39. The system of claim 37, wherein the one or more user sensors comprise an accelerometer on the user, and wherein extracting the one or more movement features from the drop-landing task comprises extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
40. The system of claim 37, wherein the one or more user sensors comprise one or more force plates and the drop-landing task comprises landing on the one or more force plates, and wherein extracting the one or more movement features comprises extracting the one or more movement features to characterize a plurality of timing, loading, and asymmetrical characteristics from the one or more force plates.
41. The system of claim 24, wherein the movement task comprises a countermovement jump task, and wherein extracting the one or more movement features comprises identifying one or more movement phases of the countermovement jump task in the movement data and extracting the one or more movement features from the movement phases of the movement data.
42. The system of claim 41, wherein the movement phases of a countermovement jump task comprise one or more takeoff phases, flight phases, and stance phases.
43. The system of claim 41, wherein the one or more user sensors comprise an accelerometer on the user, and wherein extracting the one or more movement features from the countermovement jump task comprises extracting one or more of a combination of a ground contact time during the stance phase, a pseudo-impulse during the first half of the stance phase, a pseudo-impulse during the second half of the stance phase, and a peak acceleration during the stance phase.
44. The system of claim 41, wherein the one or more user sensors comprise one or more force plates and the countermovement jump task comprises a combination of one or more events comprising jumping and landing on the one or more force plates, and wherein extracting the one or more movement features comprises extracting the one or more movement features to characterize a plurality of timing, loading, and asymmetrical characteristics from the one or more force plates.
45. The system of claim 24, wherein the movement task comprises one of a jumping task; a running, jogging, or walking task; a cutting and sprinting task; a squatting task; a weight lifting task; and a medicine ball toss task.
46. The system of claim 24, comprising a display device, wherein the operations comprise displaying, on the display device, an indicator for the classified risk category for the user.
47. A non-transitory computer readable medium storing executable instructions that when executed by at least one processor of a computer control the computer to perform operations comprising:
receiving movement data from one or more user sensors during a user's performance of a movement task;
extracting one or more movement features from the movement data, each movement feature characterizing a respective aspect of a movement pattern of the user's performance that is associated with lower extremity injury risk; and
classifying the movement pattern into a classified risk category of a plurality of risk categories for lower extremity injury for the user based on the one or more movement features.
PCT/US2018/042451 2017-07-17 2018-07-17 Methods, systems, and non-transitory computer readable media for assessing lower extremity movement quality WO2019018371A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/632,054 US20200147451A1 (en) 2017-07-17 2018-07-17 Methods, systems, and non-transitory computer readable media for assessing lower extremity movement quality

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762533523P 2017-07-17 2017-07-17
US62/533,523 2017-07-17

Publications (1)

Publication Number Publication Date
WO2019018371A1 true WO2019018371A1 (en) 2019-01-24

Family

ID=65016128

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/042451 WO2019018371A1 (en) 2017-07-17 2018-07-17 Methods, systems, and non-transitory computer readable media for assessing lower extremity movement quality

Country Status (2)

Country Link
US (1) US20200147451A1 (en)
WO (1) WO2019018371A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11877870B2 (en) * 2019-08-05 2024-01-23 Consultation Semperform Inc Systems, methods and apparatus for prevention of injury
CN110782991B (en) * 2019-10-23 2022-06-10 吉林大学 Real-time evaluation method for assisting rehabilitation exercise of heart disease patient
US20210393166A1 (en) * 2020-06-23 2021-12-23 Apple Inc. Monitoring user health using gait analysis
JP7173102B2 (en) * 2020-07-01 2022-11-16 カシオ計算機株式会社 Information processing device, information processing method and program
US20220042801A1 (en) * 2020-08-07 2022-02-10 The Regents Of The University Of California Methods and systems for adaptive pedestrian inertial navigation

Citations (4)

Publication number Priority date Publication date Assignee Title
US20040127337A1 (en) * 1997-03-12 2004-07-01 Nashner Lewis M. Reducing errors in screening-test administration
US20050015002A1 (en) * 2003-07-18 2005-01-20 Dixon Gary S. Integrated protocol for diagnosis, treatment, and prevention of bone mass degradation
US20110230791A1 (en) * 2008-08-28 2011-09-22 Koninklijke Philips Electronics N.V. Fall detection and/or prevention systems
US20160300347A1 (en) * 2014-01-02 2016-10-13 Accelerated Conditioning And Learning, Llc Dynamic movement assessment system and method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US8363891B1 (en) * 2012-03-26 2013-01-29 Southern Methodist University System and method for predicting a force applied to a surface by a body during a movement
US10959647B2 (en) * 2015-12-30 2021-03-30 Seismic Holdings, Inc. System and method for sensing and responding to fatigue during a physical activity

Non-Patent Citations (1)

Title
DARIN A. PADUA ET AL.: "The Landing Error Scoring System (LESS) Is a Valid and Reliable Clinical Assessment Tool of Jump-Landing Biomechanics", THE AMERICAN JOURNAL OF SPORTS MEDICINE, vol. 37, no. 10, 2 September 2009 (2009-09-02), pages 1996 - 2002, XP055571860 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN112468956A (en) * 2020-11-12 2021-03-09 西安邮电大学 Human activity monitoring method for indoor positioning and motion state
CN112468956B (en) * 2020-11-12 2022-10-11 西安邮电大学 Human activity monitoring method for indoor positioning and motion state

Also Published As

Publication number Publication date
US20200147451A1 (en) 2020-05-14

Similar Documents

Publication Publication Date Title
US20200147451A1 (en) Methods, systems, and non-transitory computer readable media for assessing lower extremity movement quality
Barth et al. Biometric and mobile gait analysis for early diagnosis and therapy monitoring in Parkinson's disease
Zhang et al. Classifying lower extremity muscle fatigue during walking using machine learning and inertial sensors
Hebenstreit et al. Effect of walking speed on gait sub phase durations
Phinyomark et al. Gender differences in gait kinematics in runners with iliotibial band syndrome
Prateek et al. Modeling, detecting, and tracking freezing of gait in Parkinson disease using inertial sensors
Sethi et al. A comprehensive survey on gait analysis: History, parameters, approaches, pose estimation, and future work
Ahamed et al. Subject-specific and group-based running pattern classification using a single wearable sensor
Phinyomark et al. Do intermediate-and higher-order principal components contain useful information to detect subtle changes in lower extremity biomechanics during running?
US20180092572A1 (en) Gathering and Analyzing Kinetic and Kinematic Movement Data
Barth et al. Combined analysis of sensor data from hand and gait motor function improves automatic recognition of Parkinson's disease
Hemmatpour et al. A review on fall prediction and prevention system for personal devices: evaluation and experimental results
Lee et al. Gender recognition using optimal gait feature based on recursive feature elimination in normal walking
Howcroft et al. Prospective elderly fall prediction by older-adult fall-risk modeling with feature selection
Sama et al. Analyzing human gait and posture by combining feature selection and kernel methods
US11006860B1 (en) Method and apparatus for gait analysis
Matsushita et al. Recent use of deep learning techniques in clinical applications based on gait: a survey
Kour et al. A survey of knee osteoarthritis assessment based on gait
WO2019095055A1 (en) Method and system utilizing pattern recognition for detecting atypical movements during physical activity
Strohrmann et al. A data-driven approach to kinematic analysis in running using wearable technology
Senanayake et al. A knowledge-based intelligent framework for anterior cruciate ligament rehabilitation monitoring
Kaur et al. A Vision-Based Framework for Predicting Multiple Sclerosis and Parkinson's Disease Gait Dysfunctions—A Deep Learning Approach
Carrier et al. Validation of garmin fenix 3 HR fitness tracker biomechanics and metabolics (VO2max)
Majumder et al. A wireless smart-shoe system for gait assistance
Faisal et al. Characterization of knee and gait features from a wearable tele-health monitoring system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18834966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18834966

Country of ref document: EP

Kind code of ref document: A1