Wednesday, August 15, 2007

Starters for IT


So to begin, here are some laws, i.e. the rules of the IT game:

Moore's Law is the empirical observation, made in 1965, that the number of transistors on an integrated circuit for minimum component cost doubles every 24 months. It is attributed to Gordon E. Moore (born 1929), a co-founder of Intel. Although it is sometimes quoted as every 18 months, Intel's official Moore's Law page, as well as an interview with Gordon Moore himself, states that it is every two years. Under the assumption that chip "complexity" is proportional to the number of transistors, regardless of what they do, the law has largely stood the test of time to date. Moore's Law is not just about the density of transistors that can be achieved, but about the density of transistors at which the cost per transistor is the lowest.
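As a quick, back-of-the-envelope illustration of what one doubling every two years implies, here is a small Python sketch; the 1971 starting point of roughly 2,300 transistors (the Intel 4004) is an assumption used only for illustration, not a figure from the law itself:

# Rough projection of Moore's Law: one doubling every two years.
# Starting point (Intel 4004, ~2,300 transistors in 1971) is an
# illustrative assumption.

START_YEAR = 1971
START_TRANSISTORS = 2300
DOUBLING_PERIOD_YEARS = 2  # "every two years" per Moore/Intel

def projected_transistors(year):
    """Return the projected transistor count for a given year."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2007):
    print(year, f"{projected_transistors(year):,.0f}")

Running it shows the count climbing from thousands to hundreds of millions by 2007, which is the whole point of the observation.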

Wirth's law in computing was made popular by Niklaus Wirth in 1995. The law states:
Software gets slower faster than hardware gets faster.

or

Software is decelerating faster than hardware is accelerating.

Hardware is clearly getting faster over time, and some of that development is quantified by Moore's law; Wirth's law points out that this does not imply that work is actually getting done faster. Programs tend to get bigger and more complicated over time, and sometimes programmers even rely on Moore's law to justify writing slow code, thinking that it won't be a problem because the hardware will get faster anyway.

As an example of Wirth's law, one can observe that booting a modern PC with a modern operating system usually takes longer than it did to boot a PC five or ten years ago. Software is being used to create software: by using integrated development environments, compilers and code libraries, programmers are further and further divorced from the machine and closer and closer to the software user's needs. This results in many layers of interpretation, which take a complex requirement in human-understandable form and convert it into a very large number of the extremely limited set of instructions that a computer can actually perform.

Gates' Law is a humorous and ironic observation that the speed of commercial software generally slows by fifty percent every 18 months. Though the law's name refers to Bill Gates, Gates himself did not actually formulate it. Rather, the term is attributed to Gates based on the noted tendency of Microsoft products to slow down with each successive feature or patch.

Bell's Law of Computer Classes: "Roughly every decade a new, lower priced computer class forms based on a new programming platform, network, and interface resulting in new usage and the establishment of a new industry."
Established market-class computers, a.k.a. platforms, are introduced and continue to evolve at roughly a constant price. As of 2005 the computer classes include: mainframes (1960s); minicomputers (1970s); personal computers and workstations, evolving into networks enabled by Local Area Networking or Ethernet (1980s); the web browser client-server structure enabled by the Internet (1990s); web services, e.g. Microsoft's .NET or the Grid (2000s); cell-phone-sized devices (circa 2000); and Wireless Sensor Networks, a.k.a. motes (after circa 2005). Bell predicts home and body area networks will form by 2010.

Brooks's law was stated by Fred Brooks in his 1975 book The Mythical Man-Month as "Adding manpower to a late software project makes it later." Likewise, Brooks memorably stated, "The bearing of a child takes nine months, no matter how many women are assigned." While Brooks's law is often quoted, the line before it in The Mythical Man-Month is almost never quoted: "Oversimplifying outrageously, we state Brooks's Law." One reason for the seeming contradiction is that software projects are complex engineering endeavors, and new workers on the project must first become educated in the work. Another significant reason is that communication overhead increases as the number of people increases.
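The communication-overhead point can be made concrete by counting pairwise communication channels, which grow as n(n - 1)/2 with team size. A minimal Python sketch, with team sizes chosen purely for illustration:

# Pairwise communication channels on a team of n people: n(n - 1)/2.
# Adding people adds channels much faster than it adds hands.

def channels(n):
    """Number of distinct person-to-person communication paths."""
    return n * (n - 1) // 2

for team_size in (3, 5, 10, 20):  # illustrative team sizes
    print(f"{team_size:2d} people -> {channels(team_size):3d} channels")

Going from 5 to 20 people multiplies the headcount by 4 but the channels by 19, which is why new people can slow a late project down further.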

Metcalfe's law states that the value of a telecommunications network is proportional to the square of the number of users of the system (n^2). First formulated by Robert Metcalfe in regard to Ethernet, Metcalfe's law explains many of the network effects of communication technologies and networks such as the Internet and the World Wide Web.
The law has often been illustrated using the example of fax machines: A single fax machine is useless, but the value of every fax machine increases with the total number of fax machines in the network, because the total number of people with whom each user may send and receive documents increases.

Since a user cannot connect to itself, the reasoning goes, the actual calculation is the number of diagonals and sides of an n-gon (see also the triangular numbers):

n(n - 1)/2

However, that value, which simplifies to (n^2 - n)/2, is still proportional (in the big-O sense) to the square of the number of users, so it remains consistent with Metcalfe's original law.
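A small Python sketch of the same idea; the network sizes below are illustrative assumptions, not figures from the law itself:

# Metcalfe's law: the value of a network of n users is taken to be
# proportional to the number of possible pairwise connections.

def metcalfe_value(n):
    """Number of distinct user-to-user connections, n(n - 1)/2."""
    return n * (n - 1) // 2

# Doubling the number of users roughly quadruples the value.
for users in (10, 100, 1000, 10000):  # illustrative network sizes
    print(f"{users:6d} users -> {metcalfe_value(users):12,d} connections")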

Metcalfe's Law can be applied to more than just telecommunications devices: it can be applied to almost any computer system whose users exchange data.


Reed's law is the assertion of David P. Reed that the utility of large networks, particularly social networks, can scale exponentially with the size of the network.

The reason for this is that the number of possible sub-groups of network participants is 2^N - N - 1, where N is the number of participants. This grows much more rapidly than either

  • the number of participants, N, or
  • the number of possible pair connections, N(N - 1)/2 (which follows Metcalfe's law),

so that even if the utility of groups available to be joined is very small on a per-group basis, eventually the network effect of potential group membership can dominate the overall economics of the system.

Given a set A of N people, it has 2^N possible subsets. This is not difficult to see, since we can form each possible subset by simply choosing, for each element of A, one of two possibilities: whether to include that element or not.

However, this includes the (one) empty set, and N singletons, which are not properly subgroups. So 2^N - N - 1 subsets remain, which is still exponential, like 2^N.
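To see how quickly the group count outruns both the number of participants and the number of pairs, here is a short Python comparison of the three quantities above; the participant counts are illustrative assumptions:

# Compare the three growth rates discussed above:
#   linear:  N               (number of participants)
#   pairs:   N(N - 1)/2      (Metcalfe's law)
#   groups:  2^N - N - 1     (Reed's law: non-trivial sub-groups)

def pairs(n):
    return n * (n - 1) // 2

def groups(n):
    return 2 ** n - n - 1

for n in (5, 10, 20, 30):  # illustrative participant counts
    print(f"N={n:2d}  pairs={pairs(n):6d}  groups={groups(n):14,d}")

Even at N = 30 the number of possible sub-groups is already over a billion, which is the sense in which group-forming value can dominate the economics of the system.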

From David P. Reed's "The Law of the Pack" (Harvard Business Review, February 2001, pp. 23-24):

"Even Metcalfe's Law understates the value created by a group-forming network as it grows. Let's say you have a GFN with n members. If you add up all the potential two-person groups, three-person groups, and so on that those members could form, the number of possible groups equals 2^n. So the value of a GFN increases exponentially, in proportion to 2^n. I call that Reed's Law. And its implications are profound."


That's all for today... We will be getting deeper as we go on...
