Search for: complexity-measures

    Estimating watermarking capacity in gray scale images based on image complexity

    Article, EURASIP Journal on Advances in Signal Processing, Volume 2010, December 2010; ISSN 1687-6172. Yaghmaee, F.; Jamzad, M.; Sharif University of Technology
    2010
    Abstract
    Capacity is one of the most important parameters in image watermarking. Various works have addressed this subject under different assumptions about the image and the communication channel, but there is no general agreement on how to estimate watermarking capacity. In this paper, we suggest a method to find the capacity of images based on their complexity. We propose a new method to estimate image complexity based on the concept of a Region Of Interest (ROI). Our experiments on 2000 images showed that the proposed measure agrees best with watermarking capacity in comparison with other complexity measures. In addition, we propose a new method to calculate capacity using the proposed image... 
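
    The record above describes mapping a block-wise complexity measure to a watermark capacity estimate. A minimal Python sketch of that general idea follows; the local-variance proxy for the ROI-based complexity, the block size, and the two-level bits-per-block rule are illustrative assumptions, not the authors' formulation.

        import numpy as np

        def block_complexity_map(gray, block=16):
            """Local-variance proxy for block complexity (illustrative, not the paper's ROI measure)."""
            h, w = gray.shape
            h, w = h - h % block, w - w % block
            blocks = gray[:h, :w].reshape(h // block, block, w // block, block)
            return blocks.std(axis=(1, 3))           # one complexity score per block

        def estimate_capacity_bits(gray, block=16, bits_busy=4, bits_smooth=1, thresh=None):
            """Map block complexity to an embeddable-bit budget (assumed two-level rule)."""
            cmap = block_complexity_map(gray, block)
            if thresh is None:
                thresh = cmap.mean()                 # busy vs. smooth split at the mean
            per_block = np.where(cmap > thresh, bits_busy, bits_smooth)
            return int(per_block.sum())              # total capacity estimate in bits

        # Usage: capacity = estimate_capacity_bits(image_as_2d_uint8_array)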

    Galloping in fast-growth natural merge sorts

    Article, 49th EATCS International Conference on Automata, Languages, and Programming (ICALP 2022), 4-8 July 2022, Volume 229, 2022; ISSN 1868-8969; ISBN 9783959772358. Ghasemi, E.; Jugé, V.; Khalighinejad, G.; CNRS; Inria; Nomadic Lab; Université Paris Cité; Sharif University of Technology
    Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing, 2022
    Abstract
    We study the impact of sub-array merging routines on merge-based sorting algorithms. More precisely, we focus on the galloping sub-routine that TimSort uses to merge monotonic (non-decreasing) sub-arrays, hereafter called runs, and on the impact on the number of element comparisons performed if one uses this sub-routine instead of a naive merging routine. The efficiency of TimSort and of similar sorting algorithms has often been explained by using the notion of runs and the associated run-length entropy. Here, we focus on the related notion of dual runs, which was introduced in the 1990s, and the associated dual run-length entropy. We prove, for this complexity measure, results that are... 
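
    The galloping sub-routine mentioned above is the exponential-search merge step TimSort switches to once one run keeps "winning". The Python sketch below is a simplified illustration of that idea (CPython's real implementation merges in place and adapts the MIN_GALLOP threshold); it only shows how repeated wins trigger a bulk copy located by exponential plus binary search.

        import bisect

        MIN_GALLOP = 7  # TimSort's default threshold before entering gallop mode

        def gallop_right(key, arr, start):
            """Exponential search: smallest index i >= start such that arr[i] > key."""
            offset = 1
            while start + offset < len(arr) and arr[start + offset] <= key:
                offset *= 2                               # grow the probe interval geometrically
            lo = start + offset // 2
            hi = min(start + offset, len(arr))
            return bisect.bisect_right(arr, key, lo, hi)  # binary search inside the bracket

        def merge_with_galloping(a, b):
            """Merge two sorted runs, bulk-copying via gallop_right after repeated wins."""
            out, i, j = [], 0, 0
            wins_a = wins_b = 0
            while i < len(a) and j < len(b):
                if a[i] <= b[j]:
                    out.append(a[i]); i += 1; wins_a += 1; wins_b = 0
                    if wins_a >= MIN_GALLOP:              # a keeps winning: gallop through it
                        k = gallop_right(b[j], a, i)
                        out.extend(a[i:k]); i = k; wins_a = 0
                else:
                    out.append(b[j]); j += 1; wins_b += 1; wins_a = 0
                    if wins_b >= MIN_GALLOP:
                        k = gallop_right(a[i], b, j)
                        out.extend(b[j:k]); j = k; wins_b = 0
            out.extend(a[i:]); out.extend(b[j:])
            return out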

    A complexity-based approach in image compression using neural networks

    Article, World Academy of Science, Engineering and Technology, Volume 35, 2009, Pages 684-694; ISSN 2010-376X. Veisi, H.; Jamzad, M.; Sharif University of Technology
    2009
    Abstract
    In this paper we present an adaptive method for image compression that is based on the complexity level of the image. The basic compressor/de-compressor structure of this method is a multilayer perceptron artificial neural network. In the adaptive approach, different back-propagation artificial neural networks are used as compressor and de-compressor; the image is divided into blocks, the complexity of each block is computed, and one network is selected for each block according to its complexity value. Three complexity measure methods, called Entropy, Activity and Pattern-based, are used to determine the level of complexity of image blocks, and their ability in complexity estimation... 
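
    As a rough illustration of the block-wise routing the abstract describes, the sketch below computes an entropy-based complexity score per block and dispatches each block to one of several compressor networks. The thresholds and the networks dict with an encode() method are hypothetical placeholders, not the paper's trained MLPs.

        import numpy as np

        def block_entropy(block):
            """Shannon entropy of the gray-level histogram of one block (bits per pixel)."""
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            p = hist[hist > 0] / block.size
            return float(-(p * np.log2(p)).sum())

        def select_network(block, networks, thresholds=(3.0, 5.0)):
            """Pick a compressor network by the block's entropy class (thresholds are illustrative)."""
            e = block_entropy(block)
            if e < thresholds[0]:
                return networks["low"]      # smooth block: smallest hidden layer
            if e < thresholds[1]:
                return networks["medium"]
            return networks["high"]         # busy block: network that preserves more detail

        def compress_image(gray, networks, block=8):
            """Split the image into blocks and route each block to a complexity-matched network."""
            h, w = gray.shape
            coded = []
            for y in range(0, h - h % block, block):
                for x in range(0, w - w % block, block):
                    b = gray[y:y + block, x:x + block]
                    net = select_network(b, networks)
                    coded.append(net.encode(b.ravel() / 255.0))   # hypothetical encode() method
            return coded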

    Introducing a new method for estimation image complexity according to calculate watermark capacity

    Article, 2008 4th International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP 2008), Harbin, 15-17 August 2008, Pages 981-984; ISBN 9780769532783. Yaghmaee, F.; Jamzad, M.; Sharif University of Technology
    2008
    Abstract
    One of the most important parameters in evaluating a watermarking algorithm is its capacity. In fact, capacity has a paradoxical relation with two other important parameters: image quality and robustness. Some work has been done on watermarking capacity and a little on image complexity. Most work on watermarking capacity is based on information theory, and the capacity values calculated by these methods are very tolerant. In this paper we propose a new method for calculating image complexity based on the Region Of Interest (ROI) concept. We then analyze three complexity measures, named Image Compositional Complexity (ICC), Quad tree and the ROI method, with three different... 
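
    Of the three complexity measures named above, the quad-tree one is the easiest to sketch: recursively split any block whose intensity variance exceeds a threshold and use the number of leaves as the complexity score. The variance threshold, minimum block size, and normalization below are illustrative choices, not the values used in the paper.

        import numpy as np

        def quadtree_leaves(block, var_thresh=50.0, min_size=4):
            """Count quad-tree leaves: split any block whose variance exceeds the threshold."""
            h, w = block.shape
            if block.var() <= var_thresh or min(h, w) <= min_size:
                return 1                                   # homogeneous enough: one leaf
            h2, w2 = h // 2, w // 2
            return (quadtree_leaves(block[:h2, :w2], var_thresh, min_size) +
                    quadtree_leaves(block[:h2, w2:], var_thresh, min_size) +
                    quadtree_leaves(block[h2:, :w2], var_thresh, min_size) +
                    quadtree_leaves(block[h2:, w2:], var_thresh, min_size))

        def quadtree_complexity(gray, **kwargs):
            """Normalize the leaf count so images of different sizes are comparable."""
            max_leaves = gray.size / (kwargs.get("min_size", 4) ** 2)
            return quadtree_leaves(gray, **kwargs) / max_leaves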

    Image compression with neural networks using complexity level of images

    Article, ISPA 2007 - 5th International Symposium on Image and Signal Processing and Analysis, Istanbul, 27-29 September 2007, Pages 282-287; ISBN 9789531841160. Veisi, H.; Jamzad, M.; Sharif University of Technology
    2007
    Abstract
    This paper presents a complexity-based image compression method using neural networks. In this method, different multi-layer perceptron ANNs are used as compressor and de-compressor. Each image is divided into blocks, the complexity of each block is computed using complexity measure methods, and one network is selected for each block according to its complexity value. Three complexity measure methods, called Entropy, Activity and Pattern-based, are used to determine the level of complexity of image blocks, and their abilities are evaluated and compared. Selection of a network for each image block is based on its complexity value or the Best-SNR criterion. Best-SNR chooses one of the trained...
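
    The Best-SNR criterion mentioned above can be pictured as follows: run a block through every trained compressor/de-compressor pair and keep the network whose reconstruction gives the highest SNR. The encode()/decode() interface in this sketch is a hypothetical stand-in for the paper's trained MLPs.

        import numpy as np

        def snr_db(original, reconstructed):
            """Signal-to-noise ratio in dB between a block and its reconstruction."""
            noise = original - reconstructed
            return 10.0 * np.log10(np.sum(original ** 2) / max(np.sum(noise ** 2), 1e-12))

        def best_snr_network(block, networks):
            """Best-SNR selection: try every trained network and keep the best reconstructor."""
            x = block.ravel() / 255.0
            scores = {name: snr_db(x, net.decode(net.encode(x)))   # hypothetical encode/decode API
                      for name, net in networks.items()}
            return max(scores, key=scores.get)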