Information Processing Ser.: Information Processing: the Maximum Entropy Principle by David Blower (2013, Trade Paperback)

Seller: Better World Books West (388175), 99.1% positive feedback
Price: US $25.91 (approximately RM 108.55) + $19.00 shipping
Estimated delivery: Wed, 26 Nov - Tue, 9 Dec
Returns: 30 days return. Buyer pays for return shipping. If you use an eBay shipping label, it will be deducted from your refund amount.
Condition: Very Good

About this product

Product Identifiers

Publisher: CreateSpace
ISBN-10: 1482359510
ISBN-13: 9781482359510
eBay Product ID (ePID): 167517992

Product Key Features

Number of Pages: 608
Language: English
Publication Name: Information Processing: the Maximum Entropy Principle
Subject: Probability & Statistics / Bayesian Analysis
Publication Year: 2013
Type: Textbook
Subject Area: Mathematics
Author: David Blower
Series: Information Processing Ser.
Format: Trade Paperback

Dimensions

Item Height: 1.4 in
Item Weight: 45.5 oz
Item Length: 10 in
Item Width: 7 in

Additional Product Features

Intended Audience: Trade
Synopsis: How does an Information Processor assign legitimate numerical values to probabilities? One very powerful method to achieve this goal is through the Maximum Entropy Principle. Let a model insert information into a probability distribution by specifying constraint functions and their averages. Then, maximize the amount of missing information that remains after taking this step. The quantitative measure of the amount of missing information is Shannon's information entropy. Examples are given showing how the Maximum Entropy Principle assigns numerical values to the probabilities in coin tossing, dice rolling, statistical mechanics, and other inferential scenarios. The Maximum Entropy Principle also eliminates the mystery as to the origin of the mathematical expressions underlying all probability distributions. The MEP derivation for the Gaussian and generalized Cauchy distributions is shown in detail. The MEP is also related to Fisher information and the Kullback-Leibler measure of relative entropy. The initial examples shown are a prelude to a more in-depth discussion of Information Geometry.
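
The synopsis describes the MaxEnt recipe: fix the averages of chosen constraint functions, then maximize Shannon's entropy over all distributions consistent with those averages. As a rough illustration of that recipe (not taken from the book), the short Python sketch below works the dice-rolling case it mentions, assuming a single hypothetical constraint, a prescribed mean face value of 4.5; the resulting assignment has the exponential form p_i proportional to exp(-lam * x_i), with the multiplier lam found numerically.

import numpy as np
from scipy.optimize import brentq

# Maximum Entropy assignment for a six-sided die whose only known
# information is a prescribed average face value (4.5 here, a number
# chosen purely for illustration). Maximizing Shannon entropy subject
# to sum(p) = 1 and sum(p * x) = 4.5 gives p_i = exp(-lam * x_i) / Z.

faces = np.arange(1, 7)      # outcomes x_i = 1..6
target_mean = 4.5            # constraint function average

def mean_for(lam):
    """Average face value under p_i proportional to exp(-lam * x_i)."""
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return p @ faces

# Solve for the Lagrange multiplier that reproduces the constrained mean.
lam = brentq(lambda l: mean_for(l) - target_mean, -10.0, 10.0)

w = np.exp(-lam * faces)
p = w / w.sum()              # MaxEnt probabilities
H = -(p * np.log(p)).sum()   # Shannon entropy of the assignment

print("lambda  =", round(lam, 4))
print("p(1..6) =", np.round(p, 4))
print("entropy =", round(H, 4))

Pinning the mean at the unconstrained value of 3.5 makes the multiplier vanish and returns the uniform distribution, which is the usual sanity check for a MaxEnt solver.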