{{main|Information theory|Coding theory}}
Information theory, closely related to [[probability]] and [[statistics]], concerns the quantification of information. It was developed by [[Claude Shannon]] to find fundamental limits on [[signal processing]] operations such as compressing data and reliably storing and communicating data.<ref>{{cite web |date=October 14, 2002 |last=P. Collins |first=Graham |title=Claude E. Shannon: Founder of Information Theory |url=http://www.scientificamerican.com/article.cfm?id=claude-e-shannon-founder |work=Scientific American |access-date=December 12, 2014}}</ref>
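The central quantity Shannon introduced is entropy, which measures the average information content of a source in bits. A minimal sketch in Python (the function name <code>shannon_entropy</code> is illustrative, not from any particular library):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # Terms with zero probability contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A certain outcome carries no information.
print(shannon_entropy([1.0]))  # 0.0
```

Entropy sets the fundamental limit Shannon identified: no lossless compression scheme can, on average, encode a source in fewer bits per symbol than its entropy.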
Coding theory is the study of the properties of [[code]]s (systems for converting information from one form to another) and their fitness for a specific application. Codes are used for [[data compression]], [[cryptography]], [[error detection and correction]], and more recently also for [[Linear network coding|network coding]]. Codes are studied for the purpose of designing efficient and reliable [[data transmission]] methods.<ref>Van-Nam Huynh; Vladik Kreinovich; Songsak Sriboonchitta; 2012. Uncertainty Analysis in Econometrics with Applications. Springer Science & Business Media. p. 63. {{ISBN|978-3-642-35443-4}}.</ref>
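One of the simplest error-detecting codes is the even-parity check: a single bit is appended so that every valid codeword contains an even number of 1s, which lets the receiver detect any single-bit error. A minimal sketch in Python (function names are illustrative):

```python
def add_parity_bit(bits):
    """Append an even-parity bit so the codeword has an even number of 1s."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """Return True if the word has even parity (no single-bit error detected)."""
    return sum(word) % 2 == 0

word = add_parity_bit([1, 0, 1, 1])  # [1, 0, 1, 1, 1]
print(check_parity(word))            # True

corrupted = word[:]
corrupted[0] ^= 1                    # flip one bit in transmission
print(check_parity(corrupted))       # False: error detected
```

A parity bit can detect, but not locate or correct, a single flipped bit; error-correcting codes such as Hamming codes add enough redundancy to do both.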


=== Answering the question ===