Archive for the ‘Predictive Analytics’ category

Artificial Intelligence vs Algorithms

February 9, 2012

I first considered aspects of artificial intelligence (AI) in the 1980s while working for General Dynamics as an Avionics Systems Engineer on the F-16. Over the following three decades, I continued to follow the concept until I came to a realization – AI, as actually deployed, is just an algorithm. Certainly the goals of AI will one day be reached, but the metric by which we would recognize AI when it manifests is not well defined.

Mark Reynolds is currently at Southwestern Energy where he works in the Fayetteville Shale Drilling group as a Staff Drilling Data Analyst. In this position, he draws on his experience in data processing, data analysis, and data presentation to improve Southwestern Energy’s work in the natural gas production and mid-stream market.

Recently, Mark has been working toward improved data collection, retention, and utilization in the real-time drilling environment.

www.ProfReynolds.com

Consider the Denver International Airport. Its baggage handling system was state of the art and touted as AI based, yet it delayed the airport’s opening by 16 months and cost $560M to fix. In the end, the entire system was replaced with a more stable system based not on a learning or deductive system, but upon much more basic routing and planning algorithms that could be deterministically designed and tested.

Consider the Houston traffic light system. Mayors have been elected on the promise to apply state-of-the-art computer intelligence: interconnected traffic lights, traffic prediction, automatic traffic redirection. Yet the desired AI materialized as identifiable computer algorithms with definitive behavior and expectations. Certainly an improvement, but not a thinking machine. The closest thing to automation is the remote triggering feature used by the commuter rail and emergency vehicles.

So algorithms form the basis for computer advancement. These algorithms, applied with human interaction, capture the new lessons so necessary to improving computer behavior. Toward this objective, distinct fields of study are untangling interrelated elements – clustering, neural networks, case-based reasoning, and predictive analytics are just a few.

When AI can be achieved, it will be revolutionary. But until that time, deterministic algorithms, data mining, and predictive analytics will be at the core of qualitative and quantitative advancement.


Predictive Analytics

September 9, 2011

Predictive analytics is used in actuarial science, financial services, insurance, telecommunications, retail, travel, healthcare, pharmaceuticals and other fields (Wikipedia). But operations – manufacturing, processing, etc., have been a little slower to encompass the concept. A drilling engineer friend of mine says “just put my hand on the brake lever and I’ll drill that well”. He probably can, but few of the rest of us can, or want to.

We want to see operating parameters, performance metrics, and process trends – all because we want the information and knowledge needed to assimilate understanding and invoke our skill set (wisdom). In this scenario, we are responding to stimulus; we are applying “reactive analytics”. But systems get more complex, operations become more intertwined, and performance expectations become razor-thin. With this complexity grows the demand for better assistance from technology. In this case, the software performs the integrated analysis and the result is “predictive analytics”. And with predictive analytics come the close cousins: decision models and decision trees.

Sullivan McIntyre, in his article From Reactive to Predictive Analytics, makes an observation about predictive analytics in social media that is mirrored in operations:

There are three key criteria for making social data useful for making predictive inferences:

  • Is it real-time? (Or as close to real-time as possible)
  • Is it metadata rich?
  • Is it integrated?

Once these criteria are established – along with the nature of the real-time data and the migration of historical data into the real-time stream – predictive analytics becomes achievable.
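These three criteria can be sketched as a simple gate on incoming readings. The structure below is a hypothetical illustration – the field names, the five-second freshness window, and the notion of counting merged sources are all assumptions, not a prescribed implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataPoint:
    """A single reading from a real-time feed (hypothetical structure)."""
    value: float
    timestamp: datetime                            # when the reading was taken
    metadata: dict = field(default_factory=dict)   # sensor id, units, context
    source_ids: set = field(default_factory=set)   # feeds merged into this point

def meets_criteria(point: DataPoint, max_lag_seconds: float = 5.0) -> bool:
    """Check the three criteria: near real-time, metadata rich, integrated."""
    lag = (datetime.now(timezone.utc) - point.timestamp).total_seconds()
    is_recent = lag <= max_lag_seconds          # real-time (or close to it)
    is_rich = len(point.metadata) > 0           # carries descriptive metadata
    is_integrated = len(point.source_ids) > 1   # combines more than one source
    return is_recent and is_rich and is_integrated
```

A point that is fresh, annotated, and merged from multiple feeds passes; a bare, context-free value does not – which is exactly the distinction the criteria draw.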

The Value of Real-Time Data, Part 2

September 1, 2011

Previously, predictive analytics was summarized as “system anticipates” (https://profreynolds.wordpress.com/2011/08/31/the-value-of-real-time-data/). But that left a lot unsaid. Predictive analytics is a combination of statistical analysis, behaviour clustering, and system modeling. No one piece of predictive analytics can exist in a vacuum; the real-time system must be statistically analyzed, its behaviour grouped or clustered, and finally a system modeled that can use real-time data to anticipate the future – near term and longer.

Examples of predictive analytics in everyday life include credit scores, hurricane forecasts, etc. In each case, past events are analyzed, clustered, and then predicted.
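That analyze-cluster-predict cycle can be shown with a toy nearest-centroid model. The feature vectors and risk labels below are fabricated for illustration; real scoring models are far more elaborate:

```python
# Past events are reduced to feature vectors, grouped into labeled clusters,
# and a new event is scored by the cluster whose centroid it falls nearest to.

def centroid(points):
    """Mean of a list of (x, y) feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def predict(history, new_point):
    """history: {label: [(x, y), ...]}; returns the label of the nearest centroid."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    centroids = {label: centroid(pts) for label, pts in history.items()}
    return min(centroids, key=lambda label: dist2(centroids[label], new_point))

history = {
    "low_risk":  [(700, 1), (750, 0), (720, 2)],   # e.g. (score, missed payments)
    "high_risk": [(550, 6), (500, 8), (580, 5)],
}
print(predict(history, (710, 1)))   # falls nearest the low-risk centroid
```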

The result of predictive analytics is, therefore, a decision tool. And the decision tree will, to some degree, take into account a predictive analysis.

The output of Predictive Analytics will be descriptive or analytic – subjective or objective. Both outputs are reasonable and viable. Looking at the hurricane predictions, there are analytical computer models (including the so-called spaghetti models) that seek to propose a definitive resulting behaviour; then there are descriptive models that seek to produce a visualization and comprehension of the discrete calculations. By extension, one can generalize that descriptive predictions must be the result of multiple analytic predictions. Perhaps this is true.
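The spaghetti-model picture suggests one way descriptive predictions can emerge from analytic ones: run several definitive models, then describe the spread of their outputs. The model functions and inputs below are invented purely for illustration:

```python
# Three analytic models, each yielding a single definitive prediction.
models = [
    lambda x: 2.0 * x + 1.0,    # analytic model A
    lambda x: 1.5 * x + 2.5,    # analytic model B
    lambda x: 2.5 * x - 0.5,    # analytic model C
]

def ensemble_describe(x):
    """The descriptive output is the spread of the analytic predictions."""
    predictions = sorted(m(x) for m in models)
    return {"low": predictions[0],
            "mid": predictions[len(predictions) // 2],
            "high": predictions[-1]}

print(ensemble_describe(10.0))   # {'low': 17.5, 'mid': 21.0, 'high': 24.5}
```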

Returning to the idea that predictive analytics is composed of statistical analysis, clustering analysis, and finally system modelling, we see that a sub-field of analytics could be considered: reactive analytics. Reactive analytics seeks to understand the statistical analysis, and even the clustering analysis, with an eye to adapting processes and procedures – but not in real-time. Reactive Analytics is, therefore, the Understanding portion of the Data-Information hierarchy (https://profreynolds.wordpress.com/2011/08/31/the-data-information-hierarcy-part-3/), while Predictive Analytics is the Wisdom portion.

The Value of Real-Time Data

August 31, 2011

Real-Time data is a challenge to any process-oriented operation. But the functionality of the data is difficult to describe in a way that team members not well versed in data management can grasp. Toward that end, four distinct phases of data have been identified:

  1. Real-Time: streaming data
    visualized and considered – system responds
  2. Forensic: captured data
    archived, condensed – system learns
  3. Data Mining: consolidated data
    hashed and clustered – system understands
  4. Predictive Analytics: patterned data
    compared and matched – system anticipates
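The four phases above can be thought of as stages a measurement passes through, from immediate response to anticipation. The sketch below is illustrative only – the threshold, the naive trend extrapolation, and all names are assumptions, not a real drilling system:

```python
def real_time(reading, threshold=100.0):
    """Phase 1: visualize and respond -- flag an out-of-range reading now."""
    return "ALARM" if reading > threshold else "OK"

archive = []

def forensic(reading):
    """Phase 2: capture and condense -- retain the reading for later analysis."""
    archive.append(reading)

def data_mining(data):
    """Phase 3: consolidate -- summarize archived behaviour."""
    return {"count": len(data), "mean": sum(data) / len(data)}

def predictive(data, horizon=1):
    """Phase 4: pattern-match -- naive linear extrapolation of the trend."""
    slope = data[-1] - data[-2]
    return data[-1] + slope * horizon

for r in [90.0, 95.0, 98.0]:
    forensic(r)                     # system learns
print(real_time(98.0))              # OK -- system responds
print(data_mining(archive))         # summary -- system understands
print(predictive(archive))          # 101.0 -- system anticipates
```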

A more detailed explanation of these phases follows:

Control and Supervision: Real-time data is used to provide direct HMI (human-machine-interface) and permit the human computer to monitor / control the operations from his console. The control and supervision phase of real-time data does not, as part of its function, record the data. (However, certain data logs may be created for legal or application development purposes.) Machine control and control feedback loops require, as a minimum, real-time data of sufficient quality to provide steady operational control.
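A control feedback loop of this kind can be reduced to a few lines: each cycle reads a value, compares it to a setpoint, and emits a correction, storing nothing – mirroring the point above that this phase does not record data. The setpoint, gain, and readings are invented for illustration:

```python
def control_loop(readings, setpoint=50.0, gain=0.5):
    """Yield a proportional correction for each incoming reading."""
    for value in readings:
        error = setpoint - value      # how far we are from the target
        yield gain * error            # correction applied to the actuator

for correction in control_loop([40.0, 55.0, 50.0]):
    print(correction)                 # 5.0, -2.5, 0.0
```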

Forensic Analysis and Lessons Learned: Captured data (and, to a lesser extent, data and event logs) are utilized to investigate specific performance metrics and operations issues. Generally, this data is kept in some form for posterity, but it may be filtered, processed, or purged. Nevertheless, the forensic utilization does represent post-operational analytics. Forensic analysis is also critical to prepare an operator for an upcoming similar process – similar in function, geography, or sequence.

Data Mining: Data mining is used to research previous operational events to locate trends, identify areas for improvement, and prepare for upcoming operations. Data mining is used to identify a bottleneck or problem area as well as to correlate events that are less than obvious.

Proactive / Predictive Analytics: The utilization of data streams, both present and previous, in an effort to predict the immediate (or distant) future requires historical data, data mining, and the application of learned correlations. Data mining may provide correlated events and properties, but predictive analytics provides the conversion of those correlations into positive, immediate performance and operational changes. (This utilization is not explicitly artificial intelligence, but the two are closely related.)

Real-Time Data in an Operations/Process Environment

May 16, 2011

The operations/process environment differs from the administrative and financial environments in that operations is charged with getting the job done. As such, the requirements placed on computers, information systems, instrumentation, controls, and data are different too. Data is never ‘in balance’, data always carries uncertainty, and the process cannot stop. Operations personnel have learned to perform their jobs while waiting for systems to come online, waiting for systems to upgrade, or even waiting for systems to be invented.

Once online, systems must be up 100% of the time, but aren’t. Systems must process data from a myriad of sources, but those sources are frequently intermittent or sporadic. Thus the processing, utilization, storage, and analysis of real-time data is a challenge totally unlike that seen in administrative or financial systems.
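One common way to cope with an intermittent source is to carry the last known value forward when the feed drops out. A simple sketch – real systems would also age out stale values rather than hold them indefinitely:

```python
def fill_forward(stream, missing=None):
    """Replace gaps (None) in a stream with the most recent good value."""
    last = missing
    filled = []
    for value in stream:
        if value is not None:
            last = value              # remember the latest good reading
        filled.append(last)           # gaps inherit the last known value
    return filled

print(fill_forward([98.2, None, None, 99.1, None]))
# [98.2, 98.2, 98.2, 99.1, 99.1]
```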

Real time systems must address distinct channels of data flow – from the immediate to the analysis of terabytes of archived data.

Control and Supervision: Real-time data is used to provide direct HMI (human-machine-interface) and permit the human computer to monitor / control the operations from his console. The control and supervision phase of real-time data does not, as part of its function, record the data. (However, certain data logs may be created for legal or application development purposes.) Machine control and control feedback loops require, as a minimum, real-time data of sufficient quality to provide steady operational control.

Forensic Analysis and Lessons Learned: Captured data (and, to a lesser extent, data and event logs) are utilized to investigate specific performance metrics and operations issues. Generally, this data is kept in some form for posterity, but it may be filtered, processed, or purged. Nevertheless, the forensic utilization does represent post-operational analytics. Forensic analysis is also critical to prepare an operator for an upcoming similar process – similar in function, geography, or sequence.

Data Mining: Data mining is used to research previous operational events to locate trends, identify areas for improvement, and prepare for upcoming operations. Data mining is used to identify a bottleneck or problem area as well as to correlate events that are less than obvious.

Proactive / Predictive Analytics: The utilization of data streams, both present and previous, in an effort to predict the immediate (or distant) future requires historical data, data mining, and the application of learned correlations. Data mining may provide correlated events and properties, but predictive analytics provides the conversion of those correlations into positive, immediate performance and operational changes. (This utilization is not explicitly artificial intelligence, but the two are closely related.)

The data-information-knowledge-understanding-wisdom paradigm: Within the data → wisdom paradigm, real-time data is just that – data. The entire tree breaks out as:

  • data – raw, untempered data from the operations environment (elemental data filtering and data quality checks are, nevertheless, required).
  • information – presentation of the data in human comprehensible formats – the control and supervision phase of real-time data.
  • knowledge – forensic analytics, data mining, and correlation analysis
  • understanding – proactive and forward-looking changes in behavior characteristic of the proactive / predictive analytics phase.
  • wisdom – the wisdom phase remains the domain of the human computer.

Related Posts:

Data Mining and Data, Information, Understanding, Knowledge
https://profreynolds.wordpress.com/2011/01/30/data-mining-and-data-information-understanding-knowledge/

The Digital Oilfield, Part 1
https://profreynolds.wordpress.com/2011/01/30/the-digital-oilfield-part-1/

The Data-Information Hierarchy
https://profreynolds.wordpress.com/2011/01/31/the-data-information-hierarcy/

