The exponential growth of visual and pictorial content drives an increasing need for image similarity quantification in various computer vision applications. The similarity of two images is often measured with respect to attributes such as shape, color, or texture, and the attributes used in the measure are application-dependent. For example, texture attributes are widely used in constructing images of natural scenes for virtual reality environments in what is known as texture synthesis. In example-based texture synthesis, texture images are generated or extended such that they have the same textural feel and pattern as an example texture image without naively copying it. The synthesis process can be optimized by maximizing the textural similarity between the example and synthesized images.
In this paper, we introduce a non-parametric texture similarity measure based on the singular value decomposition of the curvelet coefficients followed by a content-based truncation of the singular values. This measure focuses on the textural and directional content of images. Such content is critical for image perception, and its similarity plays a vital role in various computer vision applications. We evaluate the effectiveness of the proposed measure using a retrieval experiment, in which it outperforms state-of-the-art texture similarity metrics on both the CUReT and PerTex texture databases.
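The core idea, SVD of a directional transform's coefficients followed by energy-based truncation of the singular values, can be sketched as follows. This is an illustrative approximation only: the paper uses the curvelet transform, which is replaced here by 2D FFT magnitudes as a stand-in for directional frequency content, and the truncation rule (keep singular values up to a retained-energy fraction) and cosine comparison are assumptions, not the paper's exact formulation.

```python
import numpy as np

def truncated_singular_values(coeffs, energy=0.95):
    """Keep the leading singular values carrying `energy` of the total energy.

    A simple content-adaptive truncation: images with concentrated
    structure keep few singular values, complex textures keep more.
    """
    s = np.linalg.svd(coeffs, compute_uv=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(cum, energy)) + 1
    return s[:k]

def texture_similarity(img_a, img_b, energy=0.95):
    """Compare two grayscale images via truncated singular values.

    NOTE: np.abs(np.fft.fft2(...)) is a placeholder for the curvelet
    coefficients used in the actual paper.
    """
    ca = np.abs(np.fft.fft2(img_a))
    cb = np.abs(np.fft.fft2(img_b))
    sa = truncated_singular_values(ca, energy)
    sb = truncated_singular_values(cb, energy)
    # Align lengths, then take cosine similarity of the two spectra;
    # singular values are nonnegative, so the score lies in [0, 1].
    n = min(len(sa), len(sb))
    sa, sb = sa[:n], sb[:n]
    return float(np.dot(sa, sb) /
                 (np.linalg.norm(sa) * np.linalg.norm(sb) + 1e-12))
```

A measure of this shape scores an image against itself as 1 and degrades as the singular-value profiles of the transform coefficients diverge.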
- M. Alfarraj, Y. Alaudah, and G. AlRegib, “Content-adaptive Non-parametric Texture Similarity Measure,” 2016 IEEE Workshop on Multimedia Signal Processing (MMSP 2016), Montreal, Canada, Sep. 21-23, 2016. [PDF] [PPT (Poster/Slide)] [Bib] [Code]