
Entropy

In video coding, entropy is the average amount of information carried by a symbol in a message. It is a function of the statistical model used to describe that message and can be reduced by increasing the complexity of the model so that it better reflects the actual distribution of source symbols. Entropy is a measure of the information content of a message and sets the lower bound on how far the message can be losslessly compressed.
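
As an illustration, the following is a minimal sketch of estimating entropy from a message's observed symbol frequencies using the Shannon formula H = -sum(p(s) * log2 p(s)). The function name and the example messages are assumptions made for illustration only; a real codec would use a richer source model than simple symbol counts.

from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    # Estimate entropy in bits per symbol from observed symbol frequencies.
    # This is a zero-order model (symbols treated as independent); a more
    # complex model of the source generally yields a lower entropy estimate.
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A highly repetitive message has low entropy, so it compresses well;
# a uniformly distributed one needs the full bit budget per symbol.
print(shannon_entropy("aaaaaaab"))   # about 0.54 bits/symbol
print(shannon_entropy("abcdefgh"))   # 3.0 bits/symbol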

