Monday, March 7, 2011

Process Effectiveness Versus Finance Engineering

In the 1970s, while working on my dissertation, I used a "time-sharing mini-computer" (with far less capability and performance than today's medium-smart phones). I noticed that when all 16 users worked the system to the point that it was using more than 50% of its "potential" Central Processing Unit (CPU) cycles, the throughput leveled off.  That is, at 60% the throughput was essentially the same as, or slightly less than, at 50%.  By the time the system was using 65% of its CPU cycles, the throughput had tailed off measurably.  And by the time the system reached 85%, the users (students) were going out for coffee after pressing the return key; literally, 90% or more of the CPU cycles were being used up by the administrative and management processes of the system itself.  Finally, when 90% of the CPU cycles were in use, there was no measurable throughput.

The thought has often occurred to me that, from a finance engineering perspective, running a computer at 90% of its rated CPU cycles would be considered much leaner and more economical than running it at 50% of those cycles.  Yet my "informal" study (I did the research, but never wrote a paper) demonstrated that the system was most effective when it was run at 50 to 60 percent of capacity.
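The pattern described above shows up in basic queueing theory as well. As a minimal sketch (assuming a simple M/M/1 queueing model, which is an illustration and not a model of the original time-sharing system or its overhead behavior), the mean response time grows without bound as utilization approaches 100%:

```python
# Illustrative M/M/1 queueing model (an assumption for illustration only --
# not a reconstruction of the original system): mean response time
# T = 1 / (mu - lambda) grows without bound as utilization rho = lambda/mu
# approaches 1, which is one classical reason "leaner" high utilization
# can make a system far less effective.

def mean_response_time(rho: float, service_rate: float = 1.0) -> float:
    """Mean time a job spends in an M/M/1 system at utilization rho."""
    if not 0.0 <= rho < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    arrival_rate = rho * service_rate
    return 1.0 / (service_rate - arrival_rate)

if __name__ == "__main__":
    for rho in (0.50, 0.60, 0.85, 0.90, 0.95):
        t = mean_response_time(rho)
        print(f"utilization {rho:.0%}: mean response time {t:.1f}x service time")
```

In this toy model, response time at 90% utilization is ten times the bare service time, and it doubles again by 95% — a rough analogue of the students going out for coffee after pressing the return key.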

Since then, I have seen many other systems that produce more when they are not run flat-out all the time.  I've often wondered why transactional managers and finance engineers were never taught that.  The only answer I can come up with is that measuring effectiveness is difficult.
