Computation By Design
All present-day information technology is powered by general-purpose microprocessors. These microprocessors, based on the Von Neumann architecture, perform computation at their core. Complex information systems are made up of collections of programs. Each program takes input, changes state, and produces output. Programs are compiled and stored as an instruction stream in RAM; the processor fetches each instruction from that stream and executes it, manipulating the data in the data stream. Some well-known use cases: a DBMS to store data, programming languages to process data, ERP to run a business, mail to send and receive messages, chat to exchange quick messages, and so on.
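To make the fetch-decode-execute cycle concrete, here is a minimal sketch in Python. The instruction set, program, and memory layout are invented purely for illustration; no real processor works at this level of simplicity.

    # A minimal sketch of the Von Neumann fetch-decode-execute cycle.
    # The instruction set and program below are invented for illustration.

    memory = {"x": 5, "y": 7, "out": 0}   # the data stream
    program = [                           # the instruction stream
        ("LOAD", "x"),       # load memory["x"] into the accumulator
        ("ADD", "y"),        # add memory["y"] to the accumulator
        ("STORE", "out"),    # write the accumulator back to memory
        ("HALT", None),
    ]

    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = program[pc]    # fetch the next instruction
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc    # the instruction manipulates the data stream
        elif op == "HALT":
            break

    print(memory["out"])   # prints 12

The loop is the whole story: one stream of instructions repeatedly transforming one store of data, which is exactly the "computation" these machines are designed around.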
Present-day programs and processors have their roots in the theoretical foundations of computing: the Turing machine, the Von Neumann architecture, and the lambda calculus. Present technology is based on the premise that anything computable can be computed on the abstract Turing machine model. The Von Neumann architecture is the engineering realization of this Turing machine, and the languages that program these machines are grounded in mathematics, specifically the lambda calculus. This makes our present machine and language models Computation By Design.
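These lambda-calculus roots still show through in today's languages. As an illustration, here is a sketch in Python of Church numerals, the classic lambda-calculus encoding of numbers as pure functions; it is a teaching device, not production code.

    # Church numerals: numbers encoded purely as functions,
    # the classic lambda-calculus construction (illustrative only).

    zero = lambda f: lambda x: x                        # apply f zero times
    succ = lambda n: lambda f: lambda x: f(n(f)(x))     # apply f once more
    add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    one = succ(zero)
    two = succ(one)

    # Convert a Church numeral back to a Python int by counting applications.
    to_int = lambda n: n(lambda k: k + 1)(0)

    print(to_int(add(two)(two)))   # prints 4

Everything here is function definition and function application, the only two operations the lambda calculus has, yet it suffices to express arithmetic.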
This Computation By Design is both the strength of present-day IT systems and their weakness. The premise that every system in the world can be represented as a computation is its fundamental limitation. We now experience this limitation through a range of deficiencies: vulnerable information systems, poor software productivity, high defect rates, poor software project success rates, and more.
Instructions per second (IPS) is the key performance measure for machines built on Compute By Design technology. It expresses the total instruction throughput of a processor: a rating of 15 KIPS, for example, means the processor can execute 15,000 instructions per second. Although such numbers are large, they do not tell buyers what value they are actually getting; that depends entirely on how well the application programs are written to exploit the available throughput. In practice, IPS is an engineering number that cannot be relied on for business purchasing decisions.
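A back-of-the-envelope sketch makes the point. All numbers below are hypothetical: the same rated IPS yields very different application-level throughput depending on how many instructions each unit of useful work costs.

    # Why raw IPS is not a business metric (hypothetical numbers).

    ips = 15_000   # rated throughput: 15 KIPS

    # The same processor yields very different application-level
    # throughput depending on how the program is written.
    instructions_per_transaction = {
        "well-optimized program": 500,
        "poorly written program": 5_000,
    }

    for name, cost in instructions_per_transaction.items():
        print(f"{name}: {ips / cost:.0f} transactions/second")

    # well-optimized program: 30 transactions/second
    # poorly written program: 3 transactions/second

The buyer cares about transactions per second, not instructions per second, and the conversion factor between the two lives in the software, not the hardware.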
The Compute By Design foundations that have served us so far are becoming a bottleneck. Application needs are increasingly shifting away from this fundamental computing paradigm. With the miniaturization of devices and the increasingly networked nature of the information topology, the older model of compute infrastructure concentrated in a single computer in a data center is no longer tenable, given what we now expect of software and computer hardware in general.
Application demands have moved beyond pure computation and data processing. We now expect our information systems to be secure and to offer privacy even while supporting new sharing and workflow demands, and all of this is expected to work even when the systems are built across a network of devices. Vulnerable information systems, poor software productivity, poor information systems quality, the increasingly arthritic condition of our software, the lack of privacy primitives, and the data-intensive nature of our AI systems (which reflects the past and is not conditioned for the VUCA world) are all symptoms of our reliance on present Compute By Design systems.
To borrow Andrew Grove's phrase, we are at a strategic inflection point in computing. Compute By Design has reached its limits. We need a new beginning.