Mark (Prof) Reynolds

Master Solutions Architect

Data Engineer

Backend Developer

Integrated Systems Engineer

Systems and Industrial Integration


Information Theory and Information Flow

(originally posted on blogspot January 28, 2010)

Information is the core, the root, of any business. But what exactly is information? Many will immediately begin describing computer databases. But computer databases are only a small portion of information theory.

Information is a concrete substance in the sense that it is a quantity: it is sought, it can be sold, and it must be protected.

Wikipedia’s definition: “Information is any kind of event that affects the state of a dynamical system. In its most restricted technical sense, it is an ordered sequence of symbols. As a concept, however, information has many meanings. Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, and representation.” (http://en.wikipedia.org/wiki/Information)

Information Theory, then, is not the study of bits and bytes. It is the study of information itself, and in particular the quantification of the information. Fundamental to Information Theory is the acquisition of information, along with the extraction of the true information from the extraneous. In Electrical Engineering, this process is addressed by signal conditioning and noise filtering. In the mathematical sciences (specifically the probability sciences), the acquisition of information is the investigation into the probability of events and the correlation of events, both as simultaneous events and as cause-and-effect events. Process control looks to the acquisition of information to lead to better control of the processes.
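As a minimal illustration of what "quantification of the information" means, here is a short Python sketch that computes Shannon entropy, the average information content of a stream in bits per symbol; the symbol streams are invented for the example:

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Average information content, in bits per symbol,
    of a sequence of discrete symbols."""
    counts = Counter(symbols)
    total = len(symbols)
    # H = -sum(p * log2(p)) over the observed symbol probabilities
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A biased stream carries less information per symbol than a uniform one.
print(shannon_entropy("AAAAAAAB"))  # ~0.544 bits/symbol
print(shannon_entropy("ABCDABCD"))  # 2.0 bits/symbol
```

A heavily biased stream is more predictable, so each symbol carries less information; that intuition is exactly what the entropy formula quantifies.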

So the acquisition of a clear signal, the predictive nature of that information, and the utilization of that information are at the root of information theory.

C. E. Shannon's 1948 paper "A Mathematical Theory of Communication" introduced the concept of Information Theory to modern science. His tenet is that communication systems (the means for dispersal of information) are composed of five parts (sketched in code after the list):

  1. An information source (radio signal, DNA, industrial meter)
  2. A transmitter
  3. The channel (medium used to transmit)
  4. A receiver
  5. A recipient
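Shannon did not cast these five parts as code, of course, but a rough Python sketch of the pipeline may make the roles concrete; the bit-level encoding and the random bit-flip noise model here are assumptions chosen purely for illustration:

```python
import random

def transmitter(message):
    """Encode the source message into a channel signal (here, a bit list)."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(signal, noise_rate=0.01):
    """The physical medium: each bit may be flipped by noise."""
    return [b ^ 1 if random.random() < noise_rate else b for b in signal]

def receiver(signal):
    """Decode the channel signal back into a message for the recipient."""
    chars = []
    for i in range(0, len(signal), 8):
        byte = signal[i:i + 8]
        chars.append(chr(int("".join(map(str, byte)), 2)))
    return "".join(chars)

source = "WELL-7 PRESSURE 2150 PSI"   # the information source (industrial meter)
received = receiver(channel(transmitter(source)))
print(received)  # the recipient's copy, possibly corrupted by channel noise
```

The channel stage is where noise enters, which is why so much engineering effort goes into conditioning and filtering the signal around it.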

Since the information flow must be as distraction- and noise-free as possible, digital systems are often employed for industrial and parameterized data. Considerations then focus on data precision, latency, clarity, and storage.
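One reason digital systems tolerate a noisy channel is that redundancy can be added to the data so corruption is detectable. Here is a minimal sketch using a single even-parity bit per byte, a deliberately simple scheme chosen for illustration (real systems use checksums or error-correcting codes):

```python
def add_parity(byte_bits):
    """Append an even-parity bit so the receiver can detect single-bit errors."""
    return byte_bits + [sum(byte_bits) % 2]

def check_parity(bits):
    """Return True if the protected byte arrived with even parity intact."""
    return sum(bits) % 2 == 0

word = [1, 0, 1, 1, 0, 0, 1, 0]
protected = add_parity(word)
print(check_parity(protected))           # True: clean transmission

corrupted = protected[:]
corrupted[3] ^= 1                        # a single noise-induced bit flip
print(check_parity(corrupted))           # False: corruption detected
```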

Interestingly, the science of cryptography actually seeks obfuscation: data purity, but hidden. Within cryptography, the need for precise, timely, and clear information is as important as ever, but the encapsulation of that information into husks of meaningless drivel is the objective.
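As a toy illustration of "data purity, but hidden," the sketch below applies a repeating-key XOR cipher. It is emphatically not a secure construction, and the key and message are invented, but it makes the point: the ciphertext looks like meaningless drivel, yet the information underneath survives exactly:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with a repeating key; applying it twice restores the data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"FLOW RATE 340 BBL/DAY"
key = b"not-a-real-key"

ciphertext = xor_cipher(message, key)      # meaningless to an observer
print(ciphertext)
print(xor_cipher(ciphertext, key))         # the pure data, recovered exactly
```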

But then the scientist (as well as the code breaker) is attempting to achieve just the opposite: finding patterns, tendencies, and clues. These patterns, tendencies, and clues are the substance of the third phase of the Data → Information → Knowledge → Understanding → Wisdom progression. And finding these patterns, tendencies, and clues is what gives the industrial information user the ability to improve performance and, as a result, profitability.
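As a small example of hunting for such patterns, the sketch below computes the Pearson correlation between two hypothetical industrial measurement series; the variable names and readings are invented for illustration:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical readings: does pump speed track with output pressure?
pump_rpm = [1200, 1250, 1300, 1310, 1400, 1450]
pressure = [2100, 2140, 2230, 2220, 2350, 2400]
print(pearson(pump_rpm, pressure))  # close to 1.0: strongly correlated
```

A coefficient near 1 or -1 flags a pattern worth investigating as a possible cause-effect relationship; a coefficient near 0 suggests the two series carry independent information.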

The Digital Oilfield is a prime example of the search for more and better information. As the product becomes harder to recover – shale gas, undersea petroleum, horizontal drilling, etc. – the importance of the ability to mine patterns, tendencies, and clues is magnified.

Hence the Digital Oilfield both lags in its awakening to the need for information and leads in the resources to uncap that information.
