Assignment 1a and 1b

Project by Aniruddha Paturkar (ID: 2017H1240115P)

Assignment 1a.

Aim: This experiment compares the amount of information in various types of images and checks whether any information is lost when the format of the original image is changed.

Note: In this experiment, the entropy of each image was calculated both with the direct MATLAB command and with the standard entropy formula. Since both values are identical, they are not listed separately.

Formula: Entropy is given by

H(X) = -∑ p(xi) · log2(p(xi))

where
H(X) = entropy, i.e. the average information per source letter,
p(xi) = probability of occurrence of the symbol xi,
X = the set of possible output symbols from the source.

Result: The entropy values obtained for the various types of images are summarized in the table below.

Sr. no.   Image     Entropy of original .tiff format   Entropy of modified .jpg format
1         Natural   7.6462                             7.6461
2         Medical   7.3568                             7.3711
3         Remote    6.6296                             6.6258
4         Binary    1.1776                             1.27
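As a rough illustration of how such values can be obtained, the MATLAB sketch below computes the entropy of one image with both the Image Processing Toolbox entropy command and the formula above, before and after saving it as .jpg. The filenames are placeholders (not the actual assignment images), and a greyscale image is assumed; the original assignment script is not reproduced here.

% Sketch: compare entropy of a .tiff image with its .jpg-converted copy.
% 'natural.tif' is a placeholder filename; a greyscale (single-channel) image is assumed.
I_tiff = imread('natural.tif');          % original image
imwrite(I_tiff, 'natural.jpg');          % convert to .jpg format
I_jpg  = imread('natural.jpg');

% Direct MATLAB command (Image Processing Toolbox)
H_tiff_cmd = entropy(I_tiff);
H_jpg_cmd  = entropy(I_jpg);

% Manual calculation: H(X) = -sum(p(xi) * log2(p(xi)))
p = imhist(I_tiff) / numel(I_tiff);      % probability of each grey level
p = p(p > 0);                            % drop zero-probability levels (0*log2(0) is taken as 0)
H_tiff_formula = -sum(p .* log2(p));

fprintf('TIFF: command = %.4f, formula = %.4f, JPG: command = %.4f\n', ...
        H_tiff_cmd, H_tiff_formula, H_jpg_cmd);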

Conclusion: After the comparison, we find that the histograms (see next page) and entropies of the original .tiff images and the converted .jpg images are almost identical. Hence, it is concluded that there is no significant loss of information when the .tiff images are converted into .jpg format. Further, the natural image has the highest entropy, followed by the medical image; the remote image has a somewhat lower entropy, while the binary image has the lowest. Hence, as the amount of information in an image, i.e. the number of grey levels, decreases, its entropy decreases.
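As a point of reference (not part of the original measurements): the maximum possible entropy of an image with L grey levels is log2(L) bits per pixel, e.g. log2(256) = 8 bits for an 8-bit image, so images with fewer usable grey levels have a correspondingly lower entropy ceiling.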

The figure on the next page shows the histogram plots for the various types of images.

Assignment 1b.

Aim: This experiment compares a synthetic greyscale image with its binary counterpart obtained by thresholding; the normalized threshold used was 0.8. The entropies of the two images are compared to see the effect on the average amount of information per symbol that each image carries.

Note: In this experiment, the entropy of each image was calculated both with the direct MATLAB command and with the standard entropy formula. Since both values are identical, they are not listed separately.

Formula: Entropy is given by

H(X) = -∑ p(xi) · log2(p(xi))

where
H(X) = entropy, i.e. the average information per source letter,
p(xi) = probability of occurrence of the symbol xi,
X = the set of possible output symbols from the source.

Result: The entropy values obtained for the two images are summarized in the table below.

Sr. no.   Entropy of original greyscale image   Entropy of modified binary image (normalized threshold = 0.8)
1         1.5289                                0.8742
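A minimal MATLAB sketch of this step is given below, assuming a greyscale input image with a placeholder filename. The binarization is shown with imbinarize and a global normalized threshold of 0.8; the original assignment may instead have used the older im2bw command, which behaves equivalently for a fixed threshold.

% Sketch: threshold a synthetic greyscale image at a normalized level of 0.8
% and compare entropies. 'synthetic.tif' is a placeholder filename.
I_grey = imread('synthetic.tif');        % synthetic greyscale image
BW     = imbinarize(I_grey, 0.8);        % binary image, normalized threshold 0.8

H_grey = entropy(I_grey);                % entropy of the greyscale image
H_bin  = entropy(BW);                    % entropy of the two-level binary image

fprintf('Greyscale: %.4f, Binary: %.4f\n', H_grey, H_bin);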

Conclusion: It is concluded that when the greyscale image is converted into a binary image, its entropy decreases significantly, since the number of grey levels is reduced to only two, namely black and white.
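For a two-level image in which a fraction p of the pixels are white, the formula reduces to H = -p·log2(p) - (1-p)·log2(1-p), which cannot exceed 1 bit; the measured value of 0.8742 is therefore consistent with an unequal split of black and white pixels after thresholding.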

The figure on the next page shows the histogram plots for both images.
