From Part 1
There have been four ages of mankind. The first was the Age of Speech; for the first time humans could “learn by listening” rather than “learn by doing”; that is, data could be accumulated, communicated, and stored through verbal communication. Over the course of 300,000 years, speech also transformed hunting and gathering into an economic architecture of small, somewhat settled communities. Settlement produced the first significant increase in economic activity and wealth per capita, and the first academics, in the form of the shaman serving the tribal organization.
The second, the Age of Writing, produced a quantum leap in the data and information that could be accumulated, communicated, and stored, over a period of at least 6,500 years. During this time, economic activity evolved from everyone working to survive to a diversity of jobs and trades, and to the economic stratification of political organizations. Again, the total wealth of humanity took a leap of orders of magnitude as the economic architectures of city states, then countries, and then empires evolved. The academics evolved from the shaman to priests, clerics, researchers, and mathematicians, and to universities (e.g., the Museum at Alexandria, ~300 BC, and the University of Bologna, 1088) and libraries.
The third, the Age of Print, started with Gutenberg’s press in 1455, but blossomed with Luther’s radical admonition, around 1517, that everyone should “read” the Bible. Suddenly, the quantity of information and knowledge took a leap of several orders of magnitude as all types of ideas were accumulated, communicated, and stored.
This created many changes in sociopolitical organizations and much cultural upheaval. After major wars (e.g., the Thirty Years’ War, the Hundred Years’ War, WWI, and WWII, with continuous warring in between) and “the age of exploration and exploitation (colonization),” which created the footings of globalization, in 1776 economic architecture was formalized into the mass-production industrial architecture called Capitalism. Capitalism, with its risk/reward and mass production, has created far more wealth for humanity, though several new religions have destroyed a significant percentage of this wealth; religions like Fascism (a throwback to feudalism), Communism, Socialism, and Liberalism (all of which replace “personal responsibility” with “social responsibility” as their major article of faith).
Now humanity is on the cusp of the Digital Age. It, too, will create orders of magnitude more data, information, and knowledge. And it promises another giant leap in wealth, but with commensurate risks of barbarism and wars, from all sides, unless there can be integration of cultures, not “cultural diversity”. History graphically demonstrates that cultures always clash and that “diversity” cultures implode from the clash. I will discuss these issues later, but first Part 2 will discuss the inception and gestation of the Digital Age.
Part 2: The Digital Age: A Personal Perspective of How Services Oriented Architecture Evolved
“Intel giveth,
Microsoft taketh away”.
A saying by computer
software developers circa 1975
From its start to today, the Digital Age could be called the
Age of Assembly. I know because I was
there.
Starting in earnest sometime in the late 1960s, humanity entered a new age, the Digital Age! And, as on the previous three occasions, humanity has not realized the potential of this change in information technology.
Computing Power
In 1965, Gordon Moore, who later co-founded Intel, observed that “the number of transistors in a dense integrated circuit doubles approximately every two years.” Effectively, this means that raw computing power doubles approximately every two years.
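To make the arithmetic concrete, a doubling every two years compounds to a factor of 2^(n/2) over n years. A minimal sketch in C (the year range is just an illustrative assumption, not a claim about any particular chip):

    #include <math.h>
    #include <stdio.h>

    /* Moore's Law as compound growth: one doubling every two years.
       The year range below is an illustrative assumption only. */
    int main(void) {
        int start_year = 1965;
        int end_year   = 2015;
        double doublings = (end_year - start_year) / 2.0;  /* 25 doublings */
        double factor = pow(2.0, doublings);                /* 2^(n/2)     */
        printf("%.0f doublings -> roughly a %.0fx increase in transistor count\n",
               doublings, factor);
        return 0;
    }

Fifty years of doubling every two years works out to roughly a thirty-million-fold increase, which is why the rest of this story is possible at all.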
Coding Cost Efficiency
But the Digital Age requires three technologies. In addition to faster and more powerful hardware, it requires the same abilities as the other ages: the ability to accumulate and analyze the raw data, to communicate the data, and to store the results of the analysis.
The ability of a digital system to accumulate and analyze
data is based on its programming. In
1956, I played with my first computer, or what was referred to at that time as
a computer. This computer was programmed
with a wire board—a board with a matrix of connectors; the programming consisted of wires connecting the “proper” connectors together. Data was entered using Hollerith cards (referred to as punch cards).
By 1960, programming had graduated to computer coding using punch cards. I coded my first program in Symbolic Programming System (SPS), a form of Assembler—the first step up from coding in machine language (1s and 0s)—as a member of my high school’s math club. An Assembler is little more than a useful translator into machine code that makes it simpler for the coder to create code and to identify bugs—both of which greatly increase the coder’s and the code’s effectiveness, and the cost efficiency of creating code.
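To illustrate what that translation amounts to (the mnemonics and opcodes below are invented for illustration, not the real SPS or IBM encodings), an assembler is essentially a lookup pass that turns human-readable mnemonics into numeric machine codes:

    #include <stdio.h>
    #include <string.h>

    /* A toy "assembler" pass: translate mnemonics into numeric opcodes.
       The mnemonics and opcodes are invented for illustration; they are
       NOT the real SPS or IBM machine encodings. */
    struct op { const char *mnemonic; int opcode; };

    static const struct op table[] = {
        { "LOAD",  0x01 },
        { "ADD",   0x02 },
        { "STORE", 0x03 },
        { "HALT",  0x0F },
    };

    static int assemble(const char *mnemonic) {
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (strcmp(table[i].mnemonic, mnemonic) == 0)
                return table[i].opcode;
        return -1;  /* unknown mnemonic: exactly the kind of bug it helps catch */
    }

    int main(void) {
        const char *source[] = { "LOAD", "ADD", "STORE", "HALT" };
        for (size_t i = 0; i < 4; i++)
            printf("%-6s -> opcode 0x%02X\n", source[i], assemble(source[i]));
        return 0;
    }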
In 1964, I took the first of three computer courses offered by the Math Department of the university I attended. This class included programming in machine code (literal ons and offs), SPS, and Fortran 1. The latter was the first use of the concept of Services Oriented Architecture (SOA).
Fortran 1 (Formula Translation 1) was among the earliest scientific programming languages. These languages were made up of a set of computer commands (functions): read (get input), print (provide output), do mathematical calculations (add, subtract, etc.), and perform logical steps (loop, branch, and so on). Actually, it’s much more complex than this, but I’m not quite ready to take a swan dive into the minutiae of computer software and hardware design and architecture.
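As a rough sketch of the read/calculate/loop/print pattern those commands made possible (written in modern C purely for illustration, not in Fortran 1 syntax):

    #include <stdio.h>

    /* The read / calculate / loop / print pattern that early scientific
       languages such as Fortran 1 made easy to express.  Modern C is used
       here purely for illustration; this is not Fortran 1 syntax. */
    int main(void) {
        double value, sum = 0.0;
        int count = 0;

        /* read: accept numbers until the end of input */
        while (scanf("%lf", &value) == 1) {
            sum += value;   /* do a mathematical calculation */
            count++;        /* loop bookkeeping              */
        }

        /* print: provide the output */
        if (count > 0)
            printf("average of %d values = %f\n", count, sum / count);
        return 0;
    }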
By 1965, a coder could create hundreds of instructions per hour, rather than a couple of dozen as in 1956. Since the computing power of the hardware was (and is) dramatically increasing, making the coder more cost efficient made sense. Additionally, it meant that computers could handle much larger and more complex tasks.
Data Storage and Data Communications
Between 1964 and 1980, two other technological developments occurred that led to the start of the Digital Age: data storage devices and data communications.
In 1964, I first saw, wrote code for, and used a storage device called a disk drive. Prior to this, data was stored either on cards or on tape drives. Like the CPU, data storage hardware has continued to follow Moore’s Law. Today, the average smartphone has 100 to 1,000 times the storage that the “mainframe” computer had when I was working on my Ph.D., and that mainframe’s storage (300 MB) cost tens of thousands of times more and took up a room bigger than a basketball court.
So the abilities to store and analyze data and information
have become much more effective and cost efficient. So has the technology’s ability to
communicate data, information, and knowledge.
In 1836, Samuel Morse demonstrated telegraphic communications, the first machine-language method for communicating information. It took until the mid-1970s for telegraphic communications to evolve into a wide variety of data communications hardware, software, and communications protocols. It then took the next 20 years for these to coalesce into the hardware, software, and communications protocols we know as the “Internet” and the “web”.
During the same 20 years, the final element of the Digital Age evolved: Services Oriented Architecture.
Services Oriented Architecture
At the dawn of digital-age programming, all programs were simply an ordered set of machine instructions, with no loops and no logic. When computer languages like FORTRAN 1 first evolved, they were created with a good deal of branch logic to allow code to be reused. Why? Because the memory and data storage on the machine were so small. So, at the time, it made sense to reuse sections of code that were already in the program rather than rewrite the same code.
Inevitably this led to what “computer scientists”, people who taught programming rather than wrote programs for a living, officially called “unstructured” programming or, in the slang of the day, “spaghetti code”. In production at the time, unstructured programs were much faster in execution on the hardware available. However, they were also much more difficult to understand, especially if they weren’t properly documented. This meant that they were hard to debug and hard to update or upgrade.
According to the computer scientists, the chief culprit of unstructured programming was the unconditional branch, called the “goto” statement. This statement indicated the location of the next statement that the computer should execute, and this, in general, was back at some location earlier in the program. This meant that, in following the program, the reader jumped around rather than going from top to bottom, making it harder for the inexperienced to follow. So, like all liberal bureaucrats, they outlawed unconditional branching.
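A small, contrived example (in modern C, which still has a goto) shows what they objected to; the reader has to chase the labels to discover that the first version does nothing more than the structured loop below it:

    #include <stdio.h>

    /* A contrived fragment of "spaghetti" control flow built from gotos.
       It only sums the numbers 1 through 10, but a reader has to chase
       the labels to discover that. */
    int main(void) {
        int i = 1, sum = 0;
    top:
        if (i > 10) goto done;
        sum += i;
        i++;
        goto top;        /* unconditional branch back up the program */
    done:
        printf("sum = %d\n", sum);

        /* The structured equivalent reads from top to bottom: */
        sum = 0;
        for (i = 1; i <= 10; i++)
            sum += i;
        printf("sum = %d\n", sum);
        return 0;
    }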
Then they replaced the goto statement with an unconditional branch to another program, initially called a subroutine. This fragmented the program and intuitively created what could have become the services of SOA, but didn’t. Instead, these fragments became the Dynamic Link Library (DLL) of computer functions: utilities of the operating system that can be used to support all programs on a computing system.
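In code terms, the shift was from a backward jump to a call that returns: the reusable section becomes a named subroutine that any part of the program, or any program linking against a shared library of such functions, can invoke. A minimal sketch (the function is illustrative, not from any particular library):

    #include <stdio.h>

    /* The reusable section of code becomes a named subroutine (a C function
       here).  In a DLL or shared library, functions like this are compiled
       once and made available to every program on the system.  The function
       itself is illustrative, not from any particular library. */
    static double average(const double *values, int count) {
        double sum = 0.0;
        for (int i = 0; i < count; i++)
            sum += values[i];
        return count > 0 ? sum / count : 0.0;
    }

    int main(void) {
        double run_a[] = { 1.0, 2.0, 3.0 };
        double run_b[] = { 10.0, 20.0, 30.0, 40.0 };

        /* The code is reused by calling it, not by branching back into it. */
        printf("average A = %f\n", average(run_a, 3));
        printf("average B = %f\n", average(run_b, 4));
        return 0;
    }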
Instead of structured programs, I opted for modular programming. These were smaller programs, each of which performed a given function for the overall program. For the application that I wrote as part of my Ph.D., I used this architecture, and I submit that it is the basis for SOA-based applications.
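A minimal sketch of what I mean by modular programming (the module names are illustrative, not the actual Ph.D. application): each module exposes one well-defined function to the rest of the program, which is essentially what a “service” later became in SOA.

    #include <stdio.h>

    /* Each "module" exposes one well-defined function to the rest of the
       program.  In a real project these would be separate compilation
       units; the names here are purely illustrative. */

    /* input module: acquire the raw data */
    static int read_readings(double *buffer, int max) {
        int n = 0;
        while (n < max && scanf("%lf", &buffer[n]) == 1)
            n++;
        return n;
    }

    /* analysis module: turn the data into a result */
    static double highest(const double *data, int n) {
        double best = data[0];
        for (int i = 1; i < n; i++)
            if (data[i] > best)
                best = data[i];
        return best;
    }

    /* reporting module: present the result */
    static void report(double result) {
        printf("highest reading = %f\n", result);
    }

    int main(void) {
        double readings[100];
        int n = read_readings(readings, 100);
        if (n > 0)
            report(highest(readings, n));   /* the program assembles the modules */
        return 0;
    }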
Be that as it may, I’ve described how I see the concepts behind SOA evolving from the machine language of the late 1950s, to assembler, then to programming languages with DLLs, and then to modular programming.
Each time, the number of machine instructions increased by orders of magnitude, meaning that a single instruction to call a “subroutine” or function could generate 50 to 500 or more instructions, and each of those instructions could, in turn, generate 1,000 to 10,000 more. In fact, one time I did some research and found that one 20-line program actually generated 63 megabytes of code.
There was still a problem.
It really helped the cost efficiency of the coder to be able to create
massive amounts of code fast, but many times the customer for whom the program
was being created wanted to interlink that program with other programs. Most often these programs were created in
different computer languages (e.g., FORTRAN, COBOL, PL/I) and on various brands
of computers (e.g., IBM, DEC, HP, Sun, Silicon Graphics).
Obviously, the problem was that all of these various
software and hardware suppliers were competing for business and therefore
making it hard to interlink their products with competitors’ products. The reason was simple: to force their customers to buy only their products.
Today you can see the identical strategy, with Apple, Microsoft, and
others building their own “ecosystems” to ensure their customers stay their
customers.
In the early 1980s, customers began to recognize this,
especially with the advent of data networks.
This set off a series of international standards committees covering all components, from data and how to store it to data communications.
For example, I was on a number of Open Systems Interconnection (OSI) data communications committees starting in 1982. One of the members of the team I led had been a coauthor of the Standard Generalized Markup Language (SGML). Both HTML and XML, the languages of the web, were derived from SGML. I was peripherally involved with STEP (PDES) for engineering data, X.400 for e-mail, and X.500 Directory Services, from which LDAP was derived. This was all between 1982 and 2009.
Another step in creating SOA for information and knowledge-based systems (the so-called “Big Data” systems) was a standardized interface for the Services (i.e., components or functions).
Various organizations, including the W3C, worked on this issue and came up with Web Services. Web Services use XML in a defined and standardized manner to enable software functions to communicate without having to write an interfacing routine between and among functions.
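As a rough illustration (the service and element names below are hypothetical, not taken from any specific W3C or SOAP schema), a request to a Web Service travels as self-describing XML, so the caller and the called function need only agree on the message format, not on each other’s language or platform:

    #include <stdio.h>

    /* A hypothetical XML request of the kind a Web Service might exchange.
       The service name and element names are invented for illustration;
       they are not taken from any specific W3C or SOAP schema. */
    int main(void) {
        const char *request =
            "<serviceRequest>\n"
            "  <service>averageService</service>\n"
            "  <values>\n"
            "    <value>1.0</value>\n"
            "    <value>2.0</value>\n"
            "    <value>3.0</value>\n"
            "  </values>\n"
            "</serviceRequest>\n";

        /* Any caller, in any language, on any machine, that can produce this
           text can use the service; no pairwise interfacing routine is needed. */
        printf("%s", request);
        return 0;
    }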
The final step in creating SOA was the design and development of a reference architecture or model denoting the components of its supporting infrastructure, the assembly process, and the functions needed to create a robust application using SOA. I was a team member on the OASIS team that created this model.
So, from personal experience, I’ve seen the international standards for data formats and data communications develop. All of the above are the precursors for Services Oriented Architecture as used for a new economic architecture.
Mass Customization Using Services Oriented Architecture--Part 3, Coming Soon
Part 3 will discuss an economic version of Services Oriented
Architecture and how it will reformulate business organizations in the Digital
Age.