Computation By Design

From Connection Machine

Revision as of 12:41, 25 November 2020

All present-day information technology is powered by general-purpose microprocessors. These microprocessors, based on the Von Neumann architecture, perform computation at their core: they take instructions, operate on data, and produce output. The processor fetches instructions from the instruction stream and manipulates data in the data stream.
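The fetch-and-execute cycle described above can be sketched in a few lines. This is a minimal illustration, not a real instruction set; the opcodes and memory layout are invented for the example, but it shows instructions and data sharing one memory in the Von Neumann style.

```python
# Minimal sketch of a Von Neumann-style machine: instructions and data
# share one memory, and the processor repeatedly fetches the next
# instruction and applies it to the data.
def run(memory, n_instructions):
    """Execute the instruction stream stored at the front of memory.

    memory[0:n_instructions] holds (opcode, address, operand) tuples;
    the rest of memory is the data those instructions manipulate.
    """
    pc = 0  # program counter: index of the next instruction
    while pc < n_instructions:
        opcode, addr, operand = memory[pc]   # fetch
        if opcode == "ADD":                  # decode + execute
            memory[addr] += operand
        elif opcode == "MUL":
            memory[addr] *= operand
        pc += 1                              # advance in the instruction stream
    return memory

# Two instructions followed by one data cell: compute (5 + 3) * 2.
mem = [("ADD", 2, 3), ("MUL", 2, 2), 5]
print(run(mem, 2)[2])  # prints 16
```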

Programs are written as a list of instructions together with the sequence in which they execute. These programs are compiled and stored as an instruction stream in RAM. The processor takes each instruction from the instruction stream and executes it, manipulating the data in the data stream. This paradigm of data transformation is used to model complex information systems: a DBMS to store data, programming languages to process data, ERP to run a business, mail to send and receive messages, and chat to exchange quick messages are some of the well-known use cases.

High-level languages are used to develop these systems. Their purpose is to reduce the number of lines coded and to increase the ease of development: COBOL for business, Java for object-oriented programming, C for systems engineering, and so on. These languages made it easier to express the domain, and their programs were compiled to machine language so they could execute directly on the processor. Yet even though they made it easier to develop systems, they retained the sequentiality of the underlying model.
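The sequentiality that high-level languages retain can be seen in a toy example (values invented for illustration): each statement sees the state left by the one before it, so the order of the steps is part of the program's meaning.

```python
# Sequential semantics: each statement operates on the state produced
# by the previous one, so reordering the steps changes the result.
balance = 100
balance = balance - 30   # step 1: withdraw 30
balance = balance * 2    # step 2: double the remainder
print(balance)           # prints 140

# The same two steps in the opposite order give a different answer:
balance = 100
balance = balance * 2    # step 1: double first
balance = balance - 30   # step 2: then withdraw
print(balance)           # prints 170
```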

Any technology should have a science behind it to make it sustainable and able to express all kinds of situations. That science defines the design constraints of the technology. In the case of present-day processor technology, that science is the Turing machine, the Von Neumann architecture, and the programming language. The present technology rests on the assumption that anything computable can be computed on a Turing machine. This abstract Turing machine is implemented as a Von Neumann architecture, and the languages that program it are based on mathematics, i.e. the lambda calculus. The machine thus developed is "computation by design", because the core models that define it are computational.
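To make the abstract model concrete, here is a minimal Turing machine simulator. The machine itself (a bit-flipper) and its transition table are invented for this sketch; only the mechanics — a state, a tape, a head, and a transition function — are the standard parts of the model.

```python
# A minimal Turing machine: a head moves over a tape, and a transition
# table maps (state, symbol) to (new state, symbol to write, move).
def turing_machine(tape, transitions, state="start", blank="_"):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != blank)

# Example machine: flip every bit, halting at the blank end of the tape.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(turing_machine("1011", flip))  # prints 0100
```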

This is the strength of present-day IT systems and their Achilles heel. The assumption that every system in the world can be represented as a computation is also its fundamental limitation. We are now experiencing this foundational limitation through various deficiencies: vulnerable information systems, poor software productivity, high defect rates, poor software project success rates, and so on.

This "compute by design" foundation is reflected in the key performance measure (KPM) of computing power: instructions per second (IPS). It gives the total throughput a particular processor can sustain. For example, a 15 KIPS processor has a total throughput of 15,000 instructions per second. Although such numbers can be huge, they do not tell a buyer what value he or she is actually buying; that depends entirely on how well the application programs are written to exploit this potential throughput. In practice, IPS is an engineering number that cannot be relied on to make business purchases.
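The arithmetic behind the IPS figure is simple, which is part of the point made above: the number says how fast instructions can flow, not what value they deliver. A worked example (the job size here is a hypothetical value chosen for illustration):

```python
# Throughput arithmetic for the 15 KIPS example in the text.
ips = 15_000             # 15 KIPS = 15,000 instructions per second
instructions = 450_000   # hypothetical instruction count for one job

seconds = instructions / ips
print(seconds)  # prints 30.0

# This 30 seconds is an idealized floor: it ignores memory stalls,
# I/O waits, and how well the program actually exploits the machine.
```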

But this "compute by design" foundation, which has served us so far, is becoming a bottleneck. Application needs are increasingly shifting away from this fundamental computing paradigm. With the miniaturization of devices and the increasingly networked nature of the information topology, the older model of compute infrastructure, a single computer in a data center, is no longer tenable. This follows from what we now expect of software and computer hardware in general.

Application demands have now moved beyond pure computation and data processing. We expect our information systems to be secure and to offer privacy even as they support new sharing and workflow demands, and to do all this while being built across a network of devices. The symptoms of the mismatch are the vulnerability of information systems, poor software productivity, poor information-systems quality, the increasingly arthritic condition of our software systems, the lack of privacy primitives in our software, and the data-intensive nature of our AI systems (which reflects the past and is not conditioned for the VUCA world).

To borrow Andrew Grove's phrase, we are at an inflection point in computing. Compute by design has reached its limits. We need a new beginning.