Archive for the ‘Industry and Applications’ category

Did they actually say that?

February 10, 2012

On occasion we should all read these quotes and realize that humor is rampant!

“Computers in the future may weigh no more than 1.5 tons.”
~Popular Mechanics, forecasting the relentless march of science, 1949

“I think there is a world market for maybe five computers.”
~Thomas Watson, chairman of IBM, 1943

“I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won’t last out the year.”
~The editor in charge of business books for Prentice Hall, 1957

“But what … is it good for?”
~Engineer at the Advanced Computing Systems Division of IBM, 1968, commenting on the microchip.

“There is no reason anyone would want a computer in their home.”
~Ken Olson, president, chairman and founder of Digital Equipment Corp., 1977

“This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.”
~Western Union internal memo, 1876.

“The wireless music box has no imaginable commercial value. Who would pay for a message sent to nobody in particular?”
~David Sarnoff’s associates in response to his urgings for investment in the radio in the 1920s.

“The concept is interesting and well-formed, but in order to earn better than a ‘C,’ the idea must be feasible.”
~A Yale University management professor in response to Fred Smith’s paper proposing reliable overnight delivery service. (Smith went on to found Federal Express Corp.)

“Who the **** wants to hear actors talk?”
~H. M. Warner, Warner Brothers, 1927.
[When celebrities start talking politics, I find myself asking the same thing – Prof Reynolds]

“I’m just glad it’ll be Clark Gable who’s falling on his face and not Gary Cooper.”
~Gary Cooper on his decision not to take the leading role in “Gone With The Wind.”

“A cookie store is a bad idea. Besides, the market research reports say America likes crispy cookies, not soft and chewy cookies like you make.”
~Response to Debbi Fields’ idea of starting Mrs. Fields’ Cookies.

“We don’t like their sound, and guitar music is on the way out.”
~Decca Recording Co. rejecting the Beatles, 1962.

“Heavier-than-air flying machines are impossible.”
~Lord Kelvin, president, Royal Society, 1895.

“If I had thought about it, I wouldn’t have done the experiment. The literature was full of examples that said you can’t do this.”
~Spencer Silver, on the work that led to the unique adhesives for 3-M “Post-It” Notepads.

“So we went to Atari and said, ‘Hey, we’ve got this amazing thing, even built with some of your parts, and what do you think about funding us? Or we’ll give it to you. We just want to do it. Pay our salary, we’ll come work for you.’ And they said, ‘No.’ So then we went to Hewlett-Packard, and they said, ‘Hey, we don’t need you. You haven’t got through college yet.'”
~Apple Computer Inc. founder Steve Jobs on attempts to get Atari and H-P interested in his and Steve Wozniak’s personal computer.

“Professor Goddard does not know the relation between action and reaction and the need to have something better than a vacuum against which to react. He seems to lack the basic knowledge ladled out daily in high schools.”
~1921 New York Times editorial about Robert Goddard’s revolutionary rocket work.

“Drill for oil? You mean drill into the ground to try and find oil? You’re crazy.”
~Drillers whom Edwin L. Drake tried to enlist for his project to drill for oil in 1859.

“Stocks have reached what looks like a permanently high plateau.”
~Irving Fisher, Professor of Economics, Yale University, 1929.

“Airplanes are interesting toys but of no military value.”
~Marechal Ferdinand Foch, Professor of Strategy, Ecole Superieure de Guerre.

“Everything that can be invented has been invented.”
~Charles H. Duell, Commissioner, U.S. Office of Patents, 1899.
[Possibly my favorite – Prof Reynolds]

“Louis Pasteur’s theory of germs is ridiculous fiction.”
~Pierre Pachet, Professor of Physiology at Toulouse, 1872

“The abdomen, the chest, and the brain will forever be shut from the intrusion of the wise and humane surgeon.”
~Sir John Eric Ericksen, British surgeon, appointed Surgeon-Extraordinary to Queen Victoria, 1873.

“640K ought to be enough for anybody.”
~Bill Gates, 1981


Making of a Fly

February 6, 2012

While watching a TED video about algorithms, I heard mention of an unrealistic price on Amazon. Apparently two retailers’ pricing computers had fallen into an out-of-control feedback loop.

One company, with lots of good customer-experience points, is in the habit of selling products a little higher than the competition. Anyone’s guess why, but facts are facts – it routinely prices merchandise about 25% higher than the competition (relying on those customer-experience points to pull customers away, perhaps?).

Well, the competition routinely prices its merchandise a little lower than the highest-priced competitor: about 1% less.

So these computer programs began a game of one-upmanship. A $10.00 product was listed for $12.70 by the first company. Later in the day, the second company’s computer listed the same product for 1% less – $12.57. The process repeated: $15.96 and $15.80, then $20.07 and $19.87. It continued until the book was listed for $23,698,655.93, plus shipping. (All numbers are illustrative.)

This story illustrates one of the challenges of automated feedback loops. An engineering instructor once explained it this way: if the loop gain is greater than 1, the system will either oscillate or latch up.
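Just to see the arithmetic, here is a minimal Python sketch of that runaway loop. It is not the retailers’ actual code; the 27% markup, 1% undercut, starting price, and ceiling are simply the illustrative numbers from the example above.

    def simulate(start_price=10.00, markup=1.27, undercut=0.99, ceiling=25_000_000):
        """Each cycle, seller A reprices at `markup` times seller B's price,
        then seller B reprices at 1% under seller A's price."""
        price_b = start_price
        cycle = 0
        while price_b < ceiling:
            price_a = round(price_b * markup, 2)    # seller A prices above the competition
            price_b = round(price_a * undercut, 2)  # seller B undercuts seller A by 1%
            cycle += 1
            print(f"cycle {cycle:3d}: A = ${price_a:,.2f}   B = ${price_b:,.2f}")

    simulate()

The effective loop gain here is 1.27 × 0.99, about 1.257 per cycle – greater than 1, so the listed price grows without bound instead of settling.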

More on feedback controls for real systems another day.

Read more here: https://www.google.com/#q=making+of+a+fly

Making new IT work for the business

September 23, 2011

I found an EXCELLENT article in the Digital Energy Journal by Dutch Holland. In it he explores different strategies for transforming operational requirements into successful initiatives.

Without stealing too much of his well-articulated article, the five approaches normally used are:

  • The by-the-book business analyst
  • The business-experienced analyst
  • The businessman CIO
  • The IT expert inside the business
  • The operations-led interface

I encourage anyone attempting to implement an operations-centric technological solution to read his article.

http://www.findingpetroleum.com/n/Making_new_IT_work_for_the_business/d1a1861b.aspx

“When trying to connect technology innovation with business, an intelligent interface between the two is required. It must be able to translate business opportunity into technical requirements; innovate, test and evaluate; and seamlessly implement new technology into the business.” ~Dutch Holland

The Physics of Measurements

June 13, 2011

An NPR report this morning addressed the new rotary-action mechanical heart. Instead of pulsing, it moves blood continuously, like a conventional rotary pump. (KUHF News Article) As interesting as this mechanical marvel is, an obscure quote discussing the process of measurement deserves some recognition.

Dr. Billy Cohn at the Texas Heart Institute says “If you listened to [the cow’s] chest with a stethoscope, you wouldn’t hear a heartbeat. If you examined her arteries, there’s no pulse. If you hooked her up to an EKG, she’d be flat-lined. By every metric we have to analyze patients, she’s not living. But here you can see she’s a vigorous, happy, playful calf licking my hand.”

The point to be made here is that neither the process of measurement nor the device performing the measurement is the process or event being measured; both are intrusive and will affect the reading. In fact, the measurement will always impact, in some way, the activity or event being measured. For example, an electronic voltage measurement must withdraw a small stream of electrons to perform that measurement. This small stream represents power – measurable, demonstrable power that can, and does, modify the electronics being measured. Likewise in health care, a blood pressure cuff will, in a very real way, alter the blood flow and the resulting blood pressure reading. In fact, users of blood pressure cuffs are told that, after two failed attempts at getting a clear reading, they should stop trying because the results will be skewed.
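To make that loading effect concrete, here is a back-of-the-envelope sketch in Python. The resistor values and the meter’s input resistance are assumed purely for illustration; the point is that the act of measuring shifts the very value being measured.

    def parallel(r1, r2):
        """Equivalent resistance of two resistors in parallel."""
        return r1 * r2 / (r1 + r2)

    V_SUPPLY = 10.0            # volts across a simple two-resistor divider
    R_TOP = R_BOTTOM = 1e6     # ohms; the circuit being measured
    R_METER = 10e6             # ohms; the voltmeter's input resistance (assumed)

    true_v = V_SUPPLY * R_BOTTOM / (R_TOP + R_BOTTOM)        # 5.000 V, unmetered
    loaded_r = parallel(R_BOTTOM, R_METER)                   # the meter sits in parallel
    measured_v = V_SUPPLY * loaded_r / (R_TOP + loaded_r)    # about 4.76 V

    print(f"unmetered voltage: {true_v:.3f} V")
    print(f"metered voltage:   {measured_v:.3f} V")
    print(f"loading error:     {100 * (true_v - measured_v) / true_v:.1f} %")

That small “stream of electrons” costs about a quarter of a volt in this example – exactly the kind of impact the engineer must later correct for.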

Generally, measurements are performed either in true real time or after the fact. The electrical voltage in the previous example is measured in real time. But the speed of a vehicle is actually determined after the fact, by measuring either the distance travelled over a specific interval or the number of related events (such as magnetic pickup pulses from a rotating wheel). Similarly, blood pressure may be an instantaneous measurement, but pulse rate is actually the number of pulses detected over a period of time (e.g., 15 seconds) or the time between pulses (a reciprocal measurement).

Having said that, most physical-world measurements (including voltage, blood pressure, vehicle speed, etc.) are actually filtered and processed so that random events or mis-measurements are effectively removed from the resulting display. For example, a doctor may count 14 beats during a 15-second period, leading him to declare a pulse rate of 14×4, or 56 beats per minute. But what if he barely missed a pulse before starting the 15-second window, and barely missed the next pulse at the end? Perhaps the correct reading should be 59. The error is compounded further if he miscounts the number of pulses or misreads the second hand of his watch.

Typically, normal measurement inaccuracy is compensated for through various corrections, filters, gap filling, and spurious-event removal. In this heart-rate example, the actual reading can be improved by:

  • removing outliers (measured events that do not cluster); typically the ‘top 5%’ and ‘bottom 5%’ of events are removed as extraneous
  • filtering the results (more complex than simple averages, but with the same goal in mind)
  • gap-filling (inserting an imagined pulse that was not sensed while a patient fidgets)
  • spurious removal (ignoring an unexpected beat)
  • increasing the size of the sample set

Setting aside the fact that an unexpected beat should not be ignored by the doctor’s staff, the above illustrates how measurements are routinely refined by post-measurement processing.
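As a toy illustration of those processing steps, here is a short Python sketch. The beat-to-beat intervals are invented for this example, and a real monitor uses far more sophisticated filtering, but the idea of trimming outliers and averaging what remains is the same.

    def pulse_rate(intervals, trim_fraction=0.05):
        """Estimate beats per minute from beat-to-beat intervals (seconds),
        trimming the extreme 5% at each end before averaging."""
        ordered = sorted(intervals)
        trim = int(len(ordered) * trim_fraction)
        kept = ordered[trim:len(ordered) - trim] if trim else ordered
        mean_interval = sum(kept) / len(kept)
        return 60.0 / mean_interval

    # Twenty intervals of roughly one second, with one spurious 0.3 s detection
    # and one 2.1 s gap where a beat was missed while the patient fidgeted.
    intervals = [1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00, 1.02, 0.3, 1.01,
                 0.99, 1.00, 2.1, 0.98, 1.02, 1.00, 0.99, 1.01, 1.00, 0.98]

    print(f"{pulse_rate(intervals):.0f} beats per minute")   # ~60, despite the bad samples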

Finally, adjustments must be made to compensate for the impact of the measurement action or of the environment. In the case of the voltage measurement, the engineer may have to mathematically adjust the actual reading to reflect the true, unmetered performance. The experimental container may also require compensation. Youth swimming does this all of the time – some swim meets occur in pools that are 25 yards long while others occur in pools that are 25 meters long. Although this does not change the outcome of any individual race, it does affect the coach’s tracking of contestant improvement and the league’s tracking of record setting, both seasonal and over time.

So the cow in the NPR article may be grazing in the meadow while clinically dead; neither point is in dispute – the cow is alive, and the cow is clinically dead. The fault lies in the measurement. And once again, otherwise valid and reliable scientific principles and techniques are suddenly neither valid nor reliable.

The Big Crew Change

May 17, 2011

“The Big Crew Change” is an approaching event within the oil and gas industry when the mantle of leadership will move from the “calculators and memos” generation to the “connected and Skype” generation. In a blog post four years ago, Rembrandt observed:

“The retirement of the workforce in the industry is normally referred to as “the big crew change”. People in this sector normally retire at the age of 55. Since the average age of an employee working at a major oil company or service company is 46 to 49 years old, there will be a huge change in personnel in the coming ten years, hence the “big crew change”. This age distribution is a result of the oil crises in ‘70s and ‘80s as shown in chart 1 & 2 below. The rising oil price led to a significant increase in the inflow of petroleum geology students which waned as prices decreased.”

Furthermore, a Society of Petroleum Engineers study found:

“There are insufficient personnel or ‘mid-careers’ between 30 and 45 with the experience to make autonomous decisions on critical projects across the key areas of our business: exploration, development and production. This fact slows the potential for a safe increase in production considerably.”

A study undertaken by Texas Tech University makes several points about the state of education and the employability of graduates during this crew change:

  • Employment levels at historic lows
  • 50% of current workers will retire in 6 years
  • Job prospects: ~100% placement for the past 12 years
  • Salaries: Highest major in engineering for new hires

The big challenge: Knowledge Harvesting. “The loss of experienced personnel combined with the influx of young employees is creating unprecedented knowledge retention and transfer problems that threaten companies’ capabilities for operational excellence, growth, and innovation.” (Case Study: Knowledge Harvesting During the Big Crew Change).

In a blog post by Otto Plowman, “Retaining knowledge through the Big Crew Change”, we see that:

“Finding a way to capture the knowledge of experienced employees is critical, to prevent “terminal leakage” of insight into decisions about operational processes, best practices, and so on. Using optimization technology is one way that producers can capture and apply this knowledge.”

When the retiring workforce fails to convey the important (critical) lessons learned, the gap is filled by data warehouses, knowledge systems, adaptive intelligence, and innovation. Perhaps the biggest challenge is innovation. Innovation will drive the industry through the next several years. Proactive intelligence, coupled with terabyte upon terabyte of data, will form the basis.

The future: the nerds will take over from the wildcatters.

Real-Time Data in an Operations/Process Environment

May 16, 2011

The operations/process environment differs from the administrative and financial environments in that operations is charged with getting the job done. As such, the requirements placed on computers, information systems, instrumentation, controls, and data are different too. Data is never ‘in balance’, data always carries uncertainty, and the process cannot stop. Operations personnel have learned to perform their jobs while waiting for systems to come online, waiting for systems to upgrade, or even waiting for systems to be invented.

Once online, systems must be up 100% of the time, but aren’t. Systems must process data from a myriad of sources, but those sources are frequently intermittent or sporadic. Thus the processing, utilization, storage, and analysis of real-time data pose a challenge totally unlike that seen in administrative or financial systems.

Real-time systems must address distinct channels of data flow – from the immediate to the analysis of terabytes of archived data.

Control and Supervision: Real-time data is used to provide a direct HMI (human-machine interface) and permit the human computer to monitor and control the operations from his console. The control and supervision phase of real-time data does not, as part of its function, record the data. (However, certain data logs may be created for legal or application-development purposes.) Machine control and control feedback loops require, as a minimum, real-time data of sufficient quality to provide steady operational control.
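As a generic sketch of that idea (not any particular vendor’s HMI or control system – the setpoint, gain, and validity limits below are invented for illustration), a minimal loop might read a sample, reject obviously bad data, and apply a simple proportional correction:

    import random

    SETPOINT = 75.0               # desired process value, illustrative units
    GAIN = 0.2                    # proportional gain
    VALID_RANGE = (0.0, 150.0)    # crude quality gate on the raw sample

    def read_sensor(actual):
        """Stand-in for a real-time data source: adds noise and the odd bad sample."""
        if random.random() < 0.05:
            return float("nan")                     # a dropped or garbage reading
        return actual + random.gauss(0, 0.5)

    actual = 40.0
    for step in range(20):
        sample = read_sensor(actual)
        if sample != sample or not (VALID_RANGE[0] <= sample <= VALID_RANGE[1]):
            continue                                # don't act on bad data
        correction = GAIN * (SETPOINT - sample)     # proportional control action
        actual += correction                        # the process responds
        print(f"step {step:2d}: reading {sample:6.2f} -> correction {correction:+5.2f}")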

Forensic Analysis and Lessons Learned: Captured data (and, to a lesser extent, data and event logs) are utilized to investigate specific performance metrics and operations issues. Generally, this data is kept in some form for posterity, but it may be filtered, processed, or purged. Nevertheless, the forensic utilization does represent post-operational analytics. Forensic analysis is also critical to prepare an operator for an upcoming similar process – similar in function, geography, or sequence.

Data Mining: Data mining is used to research previous operational events in order to locate trends, identify areas for improvement, and prepare for upcoming operations. Data mining is used to identify a bottleneck or problem area as well as to correlate events that are less than obvious.

Proactive / Predictive Analytics: The utilization of data streams, both present and past, in an effort to predict the immediate (or distant) future requires historical data, data mining, and the application of learned correlations. Data mining may provide correlated events and properties, but predictive analytics provides the conversion of those correlations into positive, immediate performance and operational changes. (This utilization is not, explicitly, artificial intelligence, but the two are closely related.)

The data-information-knowledge-understanding-wisdom paradigm: Within the data-to-wisdom paradigm, real-time data is just that – data. The entire tree breaks out as follows (a small illustrative sketch appears after the list):

  • data – raw, untempered data from the operations environment (elemental data filtering and data quality checks are, nevertheless, required).
  • information – presentation of the data in human comprehensible formats – the control and supervision phase of real-time data.
  • knowledge – forensic analytics, data mining, and correlation analysis
  • understanding – proactive and forward-looking changes in behavior characteristic of the proactive / predictive analytics phase.
  • wisdom – the wisdom phase remains the domain of the human computer.
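A small illustration of the first three rungs, with invented channel names and numbers (Python 3.10 or later for statistics.correlation):

    from statistics import mean, correlation   # correlation requires Python 3.10+

    # data: raw, quality-checked samples from two (made-up) channels
    pump_rate = [312, 315, 311, 330, 328, 335, 340, 338, 345, 350]            # strokes/min
    standpipe = [1810, 1822, 1808, 1905, 1898, 1930, 1955, 1950, 1988, 2010]  # psi

    # information: a human-comprehensible summary for the console
    print(f"pump rate  average {mean(pump_rate):7.1f} strokes/min")
    print(f"standpipe  average {mean(standpipe):7.1f} psi")

    # knowledge: a correlation the data-mining / forensic phase would surface
    print(f"pump rate vs standpipe correlation: {correlation(pump_rate, standpipe):.2f}")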

Related Posts:

Data Mining and Data, Information, Understanding, Knowledge
https://profreynolds.wordpress.com/2011/01/30/data-mining-and-data-information-understanding-knowledge/

The Digital Oilfield, Part 1
https://profreynolds.wordpress.com/2011/01/30/the-digital-oilfield-part-1/

The Data-Information Hierarchy
https://profreynolds.wordpress.com/2011/01/31/the-data-information-hierarcy/

The Digital Oilfield, Part 2

January 30, 2011

(originally posted on blogspot January 18, 2010)

“The improved operational performance promised by a seamless digital oil field is alluring, and the tasks required to arrive at a realistic implementation are more specialized than might be expected.” (http://www.epmag.com/archives/digitalOilField/1936.htm)

Seamless integrated operations require a systematic view of the entire exploration process. But the drilling operation may be the largest generator of diverse operational and performance data, and it may produce more downstream data and information than any other process. Additionally, the drilling process is one of the most legally exposed processes in energy production – BP’s recent Gulf disaster is an excellent example.

The seamless, integrated, digital oilfield is data-centric. Data is at the start of the process, and data is at the end of the process. But data is not the objective. In fact, data is an impediment to information and knowledge. But data is the base of the information and knowledge tree – data begets information, information begets knowledge, knowledge begets wisdom. Bringing data up to the next level (information) or the subsequent level (knowledge) requires a systematic and root knowledge of the data available within an organization, the data which should be available within an organization, and the meaning of that data.

Data mining is the overarching term used in many circles to define the process of developing information and knowledge. In particular, data mining takes data to the level of knowledge. Converting data to information is often no more complex than producing a pie chart or an x-y scatter chart. But that information requires extensive operational experience to analyze and understand. Data mining takes data into the knowledge tier. Data mining will extract the tendencies of operational metrics to foretell an outcome.
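As a minimal, hypothetical illustration of that last step – the metric names and numbers below are invented, and a straight-line fit is the simplest possible model – a trend in an operational metric can be projected forward to foretell when it crosses an alarm threshold:

    # Hours into a run and a (made-up) vibration metric for a pump
    hours = [0, 4, 8, 12, 16, 20, 24]
    vibration = [2.1, 2.4, 2.9, 3.1, 3.6, 3.9, 4.3]     # arbitrary units
    THRESHOLD = 6.0                                      # alarm level (assumed)

    # Ordinary least-squares fit of a straight line, by hand
    n = len(hours)
    mean_x = sum(hours) / n
    mean_y = sum(vibration) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(hours, vibration))
             / sum((x - mean_x) ** 2 for x in hours))
    intercept = mean_y - slope * mean_x

    hours_to_alarm = (THRESHOLD - intercept) / slope
    print(f"trend: {slope:.3f} units/hour; threshold reached near hour {hours_to_alarm:.0f}")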

Fortunately, there are several bright and shining examples of entrepreneurs developing the data-to-knowledge conversion. One bright and promising star is Verdande’s DrillEdge product (http://www.verdandetechnology.com/products-a-services/drilledge.html). Although this blog does not support or advocate this technology as a matter of policy, it does illustrate an example of forward thinking and systematic data-to-knowledge development.

A second example is PetroLink’s modular data acquisition and processing model (http://www.petrolink.com). This product utilizes modular vendor-agnostic data accumulation tools (in particular interfacing to Pason and MD-TOTCO), modular data repositories, modular equation processing, and modular displays. All of this is accomplished through the WITSML standards (http://en.wikipedia.org/wiki/WITSML).

Future blogs will consider the movement of data, latency, reliability, and synchronization.

