Tuesday, September 4, 2012

Day 1 - Team J



Business Analysis: A potted history


Pre-History to 1890's

            The earliest aids to calculation were marked tally sticks, going back to at least 35,000 BC. They were followed by newer tools such as the abacus, the concept of zero and then of the fraction, algebra, logarithms and calculus, as well as a variety of mechanical, water-powered and steam-driven devices, including textile looms. Up until the late 1800's, such devices were mainly used for astrology, accounting and taxes, inventories, astronomy, navigation, reproducible pattern weaving and clocks.

As far back as Aristotle, society and business have been putting proposed theories and solutions on paper for the division and departmentalisation of labour. These theories and solutions are important to the BA, as the BA is often responsible for mapping processes and reengineering or modifying them.
In 1776, Adam Smith published the first example of a business process, the production of a pin. He showed that by identifying the steps in a process, you could apply the division of labour, assigning specialists to each step and so improving both the quality of the product and the speed of its production.
Then in 1890, the US Census was undertaken. Because of the growth in population, if it had been collated in the same manner as the smaller 1880 US Census, which had taken 7 years to complete, the 1890 US Census would have taken more than 10 years, finishing after the next census was due. Herman Hollerith therefore developed a device of electro-mechanical relays that used punched cards as memory, a concept borrowed from the textile industry and its looms. The collation of the data was completed in 6 weeks, though the exercise ended up costing almost double that of the 1880 US Census. It was also the first large-scale census that attempted to gain some profile of those being counted, beyond just age, location and sex, allowing more in-depth analysis of the data. The 1890 census was also the genesis of what was to become IBM.

1900 to 1940's

            From 1900 to the 1940's, physics, applied mathematics, chemistry and electronics all contributed significant advances in understanding and inventions, leading to radio, radar and television, as well as the birth of binary computers. By 1939, the first vacuum tube computer had been created at Iowa State University in the US. A common use of computing aids through this period was calculating the trajectories of shells fired from artillery and of bombs dropped from planes.

Vilfredo Pareto, an economist, observed in 1906 that 20% of the population owned 80% of the property in Italy. He then compared this observation with other countries and found the results to be similar. This was later generalized as the Pareto principle. It is often applied in business analysis, to focus on the 20% of functionality that will provide 80% of the returns, and in software engineering, where 80% of defects are often found in 20% of the code; the percentages are more figurative than literal. Essentially, most of the gains from any work effort can be achieved by addressing only a small proportion of the problem space, beyond which you begin to run into the law of diminishing returns.
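As a rough illustration of how the principle is applied, the short Python sketch below (using purely hypothetical defect counts) ranks modules by recorded defects and picks out the "vital few" that account for roughly 80% of the total - the kind of quick analysis a BA might run when deciding where to focus effort.

# A minimal sketch, with hypothetical data: find the few modules that
# account for ~80% of all recorded defects (the Pareto principle).
defects_per_module = {
    "billing": 120, "auth": 95, "reports": 40, "search": 22,
    "profile": 10, "admin": 8, "help": 3, "export": 2,
}

total = sum(defects_per_module.values())
running = 0
vital_few = []

# Take modules in descending order of defect count until ~80% is covered.
for module, count in sorted(defects_per_module.items(),
                            key=lambda item: item[1], reverse=True):
    running += count
    vital_few.append(module)
    if running / total >= 0.8:
        break

print(f"{len(vital_few)} of {len(defects_per_module)} modules account for "
      f"{running / total:.0%} of defects: {vital_few}")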
Henry Gantt, in the release of his Gantt charts in 1910, is acknowledged as having been heavily influenced by Taylor's work. The first such chart was actually created by Karol Adamiecki in 1896, but Adamiecki did not publish it until 1931, as the Harmonogram or Harmonograf, so today we refer to them as Gantt charts.
In 1926, Henry Ford introduced the concept of Just In Time, describing it as "dock to factory floor". As its name implies, it is a process where stock is ordered only as required, with little stock held, thus reducing storage costs but potentially increasing purchasing and transport costs. It also increases the risk to the manufacturer if anything goes wrong with the supplier or the transport link.


1950's

            1951 saw the first business computer, used by Lyons Tea for processing their payroll. By the mid 1950's, there were estimated to be in excess of 100 computers in the world, mostly research platforms in universities or machines supporting governmental needs such as census counting and Cold War defence early warning systems. The late 1950's saw IBM release FORTRAN, the brainchild of John W. Backus and the first high level language, while in 1958 John McCarthy, then at the Massachusetts Institute of Technology, developed LISP, another high level language favoured by Artificial Intelligence programmers. IBM released the first dot matrix printer and Texas Instruments invented the Integrated Circuit.

The Algorithmic Language (ALGOL) was released in 1958 as a result of collaboration between US and European researchers. While it never made much impact commercially, it became the de facto standard for describing software algorithms for the next 30 years, and served as the basis of many computing languages that followed.

 
1960's

            The term software engineering started being used in the late 1950's and early 1960's, beginning the conceptualization of software development as an engineering discipline. It is the youngest of the engineering disciplines.

Computers based on transistors are regarded as second generation computers, with the first transistorised computer appearing in 1953, though the era usually refers to computers created between 1959 and 1964. This period saw an explosion of computer languages being released, including COBOL in 1961. Large business organisations were beginning to adopt computers, primarily to run their payrolls. The first computer game was written in 1962 and the mouse was invented in 1963. Computers based on integrated circuits are regarded as third generation computers, with the era covering computers created between 1964 and 1972. Interactive computing with a graphical interface, mouse, full screen word processing and hypertext was demonstrated in 1968. In 1969, ARPANET, the forerunner of the current Internet, was created by the US Department of Defense as research into networking and as a way of hardening its defensive capabilities through redundancy in the event of a nuclear attack.

1970's

            Fourth generation computers, based on Large Scale Integration (LSI) circuits, refer to computers created from 1972 to the present. In 1970, IBM released the first RAM chip of 128 bytes, and Niklaus Wirth released the Pascal computing language. 1971 saw the first portable electronic calculator released by Texas Instruments. With the release of the first microprocessor by Intel in 1971 and Ethernet in 1973, the 1970's saw a range of small household personal computers released to the public and a proliferation of Local Area Networks (LAN) to connect people's computers. The Altair, Apple I, Z80, Commodore, Apple II, Tandy TRS-80, etc. were all released over a period of a few years. The 1970's also saw the release of the C programming language by Dennis Ritchie, in which Unix was rewritten, and of Microsoft's implementation of the Basic programming language, along with the founding of Microsoft to market it.

By the mid 1970's, the concept of information economies had begun to emerge. Increasing value was placed on the generation of useful information from raw data. This in turn led to the emergence of data warehouses and Management Information Systems (MIS), giving rise to Information Management (IM). Coupled with these systems, manual workflow processes were developed to manage the movement of documents through organisations, which over the following decades transitioned to IT platforms with the inclusion of document management and imaging systems.

1980's

            The 1980's saw the release of graphical computing platforms such as Apple's Lisa in 1983 and Microsoft Windows in 1985. 1989 saw the invention of the World Wide Web - combining hypertext and networking.
By the 1980's, the concept of an Information Revolution or an Information Age was being used to describe the use of computers and information. A key role for a business analyst is to understand what information can be extracted from data, and the value of that information. They also need to understand how to use that information to drive further process change, improving returns to the business or launching new initiatives.
The management manuals of the Toyota Production System (TPS) were roughly translated into English in 1980, introducing the term Lean Manufacturing, though it wasn't until the 1990's that the term came into wide use, with the publication of the best seller The Machine That Changed the World: The Story of Lean Production. As part of the rebuilding of Japan, Taiichi Ohno, an engineer at Toyota, visited the US and took inspiration from the writings of Henry Ford and Frederick Taylor, the principles put in place by W. Edwards Deming, who was involved in the Japanese reconstruction, and the just in time inventory system of an American supermarket. The focus of Lean Manufacturing is on reducing waste, whether that be idle time or material. Both Scrum and eXtreme Programming have taken inspiration from Lean Manufacturing principles.
Bill Smith of Motorola pioneered Six Sigma in 1986. It developed as a means of keeping factory defects within a certain tolerance and has become a primary element of many organisations' Total Quality Management (TQM) initiatives. Its applicability and use as a process for managing software development to control quality have been seriously questioned.

In 1987, the British government persuaded the International Standards Organisation (ISO) to adopt as ISO 9000 the British standard BS 5750, whose roots lay in quality standards in place in Britain since World War II. It was also influenced by existing US and other defence military standards. The emphasis was on conformance to procedures rather than the process of management. ISO 9001, ISO 9002 and ISO 9003 (which have subsequently been combined into ISO 9001) apply to the quality assurance of software development processes.

1990's

            1991 saw the release of Linux and of the Pretty Good Privacy encryption program. 1993 saw the release of Windows NT 3.1, and the year heralded the explosive growth of the Internet with the release of the graphical web browser NCSA Mosaic, followed by Netscape Navigator the year after. Java was released by Sun in 1995, and in the same year Microsoft released Windows 95, which under the covers was MS-DOS 7.0. The Yahoo! and AltaVista search engines were also launched in 1995, followed by Google in 1998. The mid to late 1990's was the era of chasing the next killer application. The chase for the next big thing led to over-inflated stock prices for many IT and web based businesses, often without any sound business case. The undercurrent of hysteria around addressing any Y2K issues also led to inflated IT salaries. When the New Year of 2000 arrived with no significant issues caused by the date change, even in those countries that had largely ignored the issue, the result was somewhat anticlimactic. Then the dot-com bubble burst in March 2000.

2000's

            Since the dot-com crisis, many larger organisations in Western nations have offshored and outsourced much of their IT. Some companies are beginning to take it further by doing so in multiple countries, to take advantage of development and support around the clock and to ensure that any local crisis will not unduly affect them. This has also led to a large number of people from lower cost countries being sponsored by companies to work for short durations in the country where the company is geographically located.
For a business analyst, knowledge, empathy and tolerance of other cultures and nationalities are very important, but they are also a difficult and challenging aspect of modern large corporations. An important skill is to be able to communicate effectively using a range of tools - email, whiteboards, messaging, phone, conference calls, video calls, shared applications, etc. It is also beneficial to be aware of the potential sensitivities and frictions that may exist in such environments. Another consideration in distributed teams is the increased management overhead needed to address the negative impact on communications.
The early 2000's saw the emergence of Business Process Management (BPM) systems, growing out of workflow systems; the associated notation languages are still evolving. These systems and tools led to the discipline now known as BPM, which is about continuous evolutionary change in line with Non-Linear Management, as distinct from the large one-off changes of Michael Hammer's Business Process Reengineering, an example of Linear Management.

There has also been a move away from the specialist division of labour, with a corresponding impact on the traditional span of control. Teams are less hierarchical and more cross functional. There is an increasing tendency to organise teams based on vertical capabilities rather than horizontal ones. So rather than a team of BAs, a team of developers and a team of testers, a team is made up of BAs, developers, testers and members from other departments, and the team is responsible for the whole life cycle of a functional piece of work. The members of these teams are managed using matrix management, first introduced in the early 1970's to manage a pool of resources who cycle across projects.

By,
Ankit Agrawal
Team - J



Article Reference – http://businessanalyst.wikia.com
