
Recent Advances in Electrical & Electronic Engineering

ISSN (Print): 2352-0965
ISSN (Online): 2352-0973

Research Article

3D Visual Motion Amplitude Tracking Simulation Method for Sports

Author(s): Ni Zhuo and Anil Sharma*

Volume 14, Issue 7, 2021

Published on: 09 July, 2021

Page: [718 - 726] Pages: 9

DOI: 10.2174/2352096514666210709161646

Abstract

Background: Producing character motion in 3D animation is difficult, and motion-pose capture technology is an effective way to address this problem. The technique reduces the workload of character motion control, speeds up animation development, and is used in virtual training grounds and for real-time motion tracking. However, the 3D visual motion technique cannot efficiently acquire complete contour features. A 3D visual motion amplitude tracking method for sports is therefore studied in order to effectively improve the training quality of athletes. When the current method is used for motion amplitude tracking, the marker points in adjacent monocular sequence motion amplitude images cannot be calculated.

Methodology: This leads to large tracking errors in the 3D visual motion amplitude. To address this, a 3D motion amplitude tracking method for sports based on inverse kinematics is proposed. The method locates the marker points that appear in adjacent monocular sequence motion amplitude images and combines them with a rotation angle method to predict the position of the motion amplitude in the image.
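The abstract does not give the prediction equations, but the following minimal Python sketch illustrates the general idea under stated assumptions: a limb marker from the previous monocular frame is rotated about a joint centre by an estimated joint angle (a rotation-angle / inverse-kinematics step) and then projected with the pinhole camera model named in the keywords. All names here (rotation_matrix_z, predict_marker_position, the intrinsic matrix K, and the example coordinates) are illustrative assumptions, not the authors' implementation.

import numpy as np

def rotation_matrix_z(theta):
    # Rotation about the z-axis by angle theta (radians).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def predict_marker_position(marker_xyz, joint_center, delta_theta, K):
    # marker_xyz   : (3,) marker position in the previous frame (camera coordinates)
    # joint_center : (3,) centre of the joint assumed to drive the marker
    # delta_theta  : estimated inter-frame joint rotation angle (radians)
    # K            : (3, 3) pinhole camera intrinsic matrix
    # Returns the predicted pixel coordinates (u, v) in the next frame.
    rotated = rotation_matrix_z(delta_theta) @ (marker_xyz - joint_center) + joint_center
    homogeneous = K @ rotated          # pinhole projection: [u, v, 1]^T ~ K [X, Y, Z]^T
    return homogeneous[:2] / homogeneous[2]

# Illustrative call: a knee marker after an assumed 5-degree joint rotation.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
marker = np.array([0.20, 0.10, 2.50])
joint  = np.array([0.00, 0.00, 2.50])
print(predict_marker_position(marker, joint, np.radians(5.0), K))

The predicted pixel location could then be compared with the marker detected in the next frame; any full implementation of the authors' method would require details beyond this abstract.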

Results: The motion amplitude is obtained and checked against the optimization area of the 3D visual motion amplitude; this is used as the basis to complete the tracking of the 3D visual motion amplitude in sports.

Conclusion: Experimental simulations performed in MATLAB demonstrate the high tracking accuracy of the proposed method.

Keywords: Sports, three-dimensional vision, motion range tracking, motion amplitude, real-time tracking, camera imaging plane, 3D visual motion, pinhole camera model.
