Computation By Design

<p>All present-day information technology is powered by general-purpose microprocessors. These microprocessors, based on the [https://en.wikipedia.org/wiki/Von_Neumann_architecture Von Neumann architecture], do computation at their core: they take instructions, operate on data and produce an output. The processor fetches instructions from the instruction stream and manipulates data in the data stream.</p>
<p>Complex information systems, such as a DBMS to store data, languages to process data, an ERP to run a business, mail to send and receive messages, and chat to exchange quick messages, are made up of many programs. These programs are compiled and stored as an instruction stream in [https://en.wikipedia.org/wiki/Random-access_memory RAM] and executed by microprocessors: the processor takes each instruction from the instruction stream and executes it, manipulating data in the data stream (also in RAM) as it does so.</p>
  
<p>Programs are written to specify the list of instructions and the sequence in which they advance. This paradigm of data transformation is used to model complex information systems: each of the systems named above is, underneath, a sequence of instructions transforming data.</p>
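<p>The fetch and execute cycle described above can be sketched in a few lines. This is a minimal illustration, not a model of any real processor: the tiny instruction set (LOAD, ADD, STORE, HALT), the program and the data values are all invented for the example.</p>

<pre>
# A minimal sketch of the fetch-execute cycle: one instruction stream and one
# data stream, both held in the same memory, in the von Neumann style.
# The instruction set below is invented purely for illustration.

data = {"a": 2, "b": 3, "result": 0}            # the data stream
program = [                                     # the instruction stream
    ("LOAD", "a"),        # acc <- a
    ("ADD", "b"),         # acc <- acc + b
    ("STORE", "result"),  # result <- acc
    ("HALT", None),
]

acc, pc = 0, 0                                  # accumulator and program counter
while True:
    op, arg = program[pc]                       # fetch the next instruction
    pc += 1
    if op == "LOAD":                            # execute: read from the data stream
        acc = data[arg]
    elif op == "ADD":
        acc += data[arg]
    elif op == "STORE":                         # execute: write to the data stream
        data[arg] = acc
    elif op == "HALT":
        break

print(data["result"])                           # prints 5
</pre>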
<p>Present-day programs and processors have their roots in theoretical foundations of computing such as the [https://en.wikipedia.org/wiki/Turing_machine Turing machine], the [https://en.wikipedia.org/wiki/Von_Neumann_architecture Von Neumann architecture] and the [https://en.m.wikipedia.org/wiki/Lambda_calculus lambda calculus]. The present technology rests on the premise that anything computable can be computed on the abstract Turing machine model. The Von Neumann architecture is the engineering model of this Turing machine, and the languages that program these machines are based on mathematics, i.e. the lambda calculus. This makes present machines and languages <i>Computation By Design</i>.</p>
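<p>As a small, purely illustrative example of the lambda-calculus view, the sketch below encodes numbers and addition as nothing but function application (Church numerals), written here as Python lambdas for convenience.</p>

<pre>
# Church numerals: data and arithmetic expressed purely as functions,
# the core idea of the lambda calculus. Illustrative only.
ZERO = lambda f: lambda x: x                       # apply f zero times
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))    # apply f one more time
ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Interpret a Church numeral by counting how many times f is applied.
    return n(lambda k: k + 1)(0)

TWO   = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
print(to_int(ADD(TWO)(THREE)))                     # prints 5
</pre>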
  
<p>High-level languages are used to develop these systems. Their purpose is to reduce the number of lines of code and to increase the ease of development: for example COBOL for business, Java for object-oriented programming, C for systems engineering, and so on. These languages made it easier to express the domain, and they are compiled down to machine language so the result can be executed directly on the processor. Even though this eased development, it retained the sequential nature of the languages.</p>
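<p>The sketch below makes the point about sequentiality concrete, using Python purely as a stand-in for a high-level language: a one-line, declarative-looking expression is still compiled into a strictly sequential stream of instructions (here Python bytecode rather than processor machine code, but the shape is the same).</p>

<pre>
import dis

def order_total(prices, tax_rate):
    # One high-level line expressing "what" we want...
    return sum(prices) * (1 + tax_rate)

# ...is compiled into a linear sequence of instructions that the
# (virtual) machine steps through one after another.
dis.dis(order_total)

print(order_total([10.0, 30.0], 0.25))   # prints 50.0
</pre>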
<p>Due to the miniaturization of devices and the increasingly networked nature of the information topology, application demands have now moved away from pure computation and data processing. Information systems are expected to be <i>Secure by Default</i> and to <i>Offer Privacy by Default</i> even while serving new sharing and workflow demands. The vulnerability of information systems, poor software productivity, poor information-system quality, the increasingly arthritic condition of our software systems, the lack of privacy primitives in our software systems and the data-intensive nature of our AI systems (which reflects the past and is not conditioned for the [https://en.wikipedia.org/wiki/Volatility,_uncertainty,_complexity_and_ambiguity VUCA] world) are symptoms of our reliance on the present <i>Compute By Design</i> based systems.</p>
  
 
 
<p>This computational foundation is the strength of present-day IT systems and also their Achilles heel. The premise that every system in the world can be represented as a computation is its fundamental limitation. We now experience this foundational limitation through various deficiencies: vulnerable information systems, poor software productivity, high defect rates, poor software project success rates, and so on.</p>
 
 
 
<p>This "compute by design" foundations is measured by  Key performance measure (KPM) that measure the computing power. It is illustrated as the [[https://en.wikipedia.org/wiki/Instructions_per_second Instruction per second]]. This gives the total throughput that can go thru a particular processor. So for example if the IPS is 15 KIPS then that processor has a total throughput of 15K instructions a second. Although this number is huge, it does not really reflect to a buyer that it is the value he/she is buying. It all depends on how the application programs are written to take advantage of this throughput possibility. This tells how in real life this is just a engineering number that cannot be relied to make business purchases. </p>
 
 
 
<p>But this "compute by design" foundation, which has served us so far, is becoming a bottleneck. Application needs are increasingly shifting away from this fundamental computing paradigm towards new needs. Due to the miniaturization of devices and the increasingly networked nature of the information topology, the older model of compute infrastructure, a single computer in a data center, is no longer tenable. This follows from the changed expectations placed on software and computer hardware described above.</p>
 
  
<p style="text-align:center; color:red"><b><u>Computation By Design</u> transforms data.</b></p>
 
 
<p>To borrow [https://en.wikipedia.org/wiki/Andrew_Grove Andrew Grove]'s phrase, we are at an inflection point in computing: <b><i>compute by design</i></b> has reached its limits. We need a new beginning.</p>
 
