Abstract
Background: In the internet era, there is a pressing need to access and manage the huge volume of multimedia data in an effective manner. A shot is a sequence of frames captured by a single camera over uninterrupted space and time. Shot detection is useful for various applications such as video browsing, video indexing, content-based video retrieval and video summarization.
Objective: To detect the shot transitions in a video within a short duration. The proposed method compares visual frame features such as correlation, histogram and texture features only within the candidate region frames, instead of comparing every frame in the video file.
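As a minimal sketch (not the authors' exact implementation) of the frame features the abstract mentions, the snippet below computes a correlation value and a color-histogram similarity between two frames; the bin count and the use of gray-level histograms are illustrative assumptions.

```python
import cv2
import numpy as np

def frame_features(frame_a, frame_b, bins=64):
    """Return (correlation, histogram similarity) between two BGR frames."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Pearson correlation of pixel intensities between the two frames.
    corr = np.corrcoef(gray_a.ravel(), gray_b.ravel())[0, 1]

    # Normalized gray-level histograms compared with the correlation metric.
    hist_a = cv2.calcHist([gray_a], [0], None, [bins], [0, 256])
    hist_b = cv2.calcHist([gray_b], [0], None, [bins], [0, 256])
    cv2.normalize(hist_a, hist_a)
    cv2.normalize(hist_b, hist_b)
    hist_sim = cv2.compareHist(hist_a, hist_b, cv2.HISTCMP_CORREL)

    return corr, hist_sim
```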
Methods: The proposed method analyses candidate frames by searching for frame feature values that match the abrupt detector, and then recursively locates the exact cut transition frame within the data cube until the correct transition frame is detected. If the feature values match the gradual detector, the method reports the gradual transition range; otherwise, the algorithm compares the frames within the next data cube to detect the shot transition.
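The following is a hedged sketch of how such a recursive data-cube search could be organized: a block (data cube) of frames is examined only at its boundaries; if the boundary dissimilarity fires the abrupt detector, the block is split and searched recursively until the exact cut frame is isolated, otherwise a gradual transition range is reported. The function names, the dissimilarity measure (built on the hypothetical frame_features above) and both thresholds are assumptions for illustration, not the paper's exact procedure.

```python
def dissimilarity(frames, i, j):
    corr, _ = frame_features(frames[i], frames[j])
    return 1.0 - corr  # high value means the two frames differ strongly

def search_cube(frames, start, end, cut_thresh=0.6, grad_thresh=0.3):
    """Return ('cut', index), ('gradual', (start, end)), or None for frames[start..end]."""
    if end - start < 1:
        return None
    d = dissimilarity(frames, start, end)
    if d < grad_thresh:                      # boundary frames similar: no transition in this cube
        return None
    if end - start == 1:                     # adjacent frames: a cut lies between them
        return ('cut', end) if d >= cut_thresh else ('gradual', (start, end))
    mid = (start + end) // 2
    if d >= cut_thresh:                      # abrupt detector fired: recurse into the half-cubes
        return search_cube(frames, start, mid, cut_thresh, grad_thresh) \
            or search_cube(frames, mid, end, cut_thresh, grad_thresh)
    return ('gradual', (start, end))         # gradual detector: report the transition range
```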
Results: The overall average detection rates over all transitions obtained by the proposed Data-cube Search Based Shot Boundary Detection technique are 92.06% precision, 96.92% recall and 93.94% F1-measure, the highest detection accuracy among the methods compared.
Conclusion: The proposed shot transition detection method uses the correlation value in its search procedure and requires less computation time than existing methods, which compare every single frame and use multiple features such as color, edge, motion and texture features in the wavelet domain.
Keywords: Shot boundary detection, video summarization, content-based video retrieval, abrupt transition detection, gradual transition detection, frame features.