Thursday, March 18, 2010

The evolution of telecommunications: from wired to wireless.

Evolution in telecommunications

Wired communication refers to transmission over a wired communication channel, such as a fiber-optic cable. Communication technologies that do not use wires, such as Wi-Fi, are known as wireless.

Early signaling and Telegraphy

Semaphore - a type of signaling in which visual cues represent letters or words.
Morse code - the transmission of a series of short and long pulses (dots and dashes) representing characters (a small encoding sketch follows below).
Duplexing - transmitting signals simultaneously in both directions along the same wire.
Multiplexing - transmitting multiple signals simultaneously over one circuit.
1856 - Western Union Telegraph Company was founded.
1861 - Over two thousand telegraph offices operated across the United States.
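The dot-and-dash mapping is easy to picture in code. Below is a minimal sketch in Python, with only a handful of letters filled in for illustration; the real code covers the full alphabet and digits.

```python
# Minimal Morse code sketch: characters map to short (.) and long (-) pulses.
# Only a few letters are included here for illustration.
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "A": ".-", "N": "-.", "H": "....",
}

def to_morse(message):
    """Encode a message, separating letters with spaces and words with ' / '."""
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in message.upper().split()
    )

print(to_morse("SOS"))  # ... --- ...
```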



Telephone technology

1878 - The first telephone exchange opened in New Haven, Connecticut, connecting 21 separate lines.

In 1913, N.J. Reynolds, a Western Electric engineer, developed a better automatic switch: the crossbar switch. It used a grid of horizontal and vertical bars with electromagnets at their ends. The horizontal bars could rotate up and down to connect to specific vertical bars and thus complete circuits. The original version could complete 10 simultaneous connections; by the 1970s a single crossbar switch could handle 35,000 connections.



In the mid-20th century AT&T integrated electronics into crossbar switches. 1965 – The first electronic switching system was put into service, able to handle up to 65,000 two-way voice circuits. Until 1970, all telephone switches depended on a continuous physical connection to complete and maintain a call.

1976 – A new electronic switching device was put into service. Time division switching - a transmission technique in which samples from multiple incoming lines are digitized, then each sample is placed on the same shared circuit in a predetermined sequence before finally being delivered to the correct outbound line.
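A rough sketch of that interleaving idea is shown below (plain Python, with made-up line names and sample values): samples from each incoming line take turns on the one shared circuit in a fixed order, and the receiving side redistributes them by slot position.

```python
# Time-division switching sketch: interleave samples from several incoming lines
# onto one shared circuit in a fixed, repeating sequence, then demultiplex them
# back to the correct outbound lines by position.

def multiplex(lines):
    """lines: dict of line_id -> list of digitized samples (equal length)."""
    order = sorted(lines)                      # the predetermined sequence
    n = len(next(iter(lines.values())))
    frames = []
    for t in range(n):                         # one frame per sampling instant
        frames.append([lines[line_id][t] for line_id in order])
    return order, frames

def demultiplex(order, frames):
    """Rebuild each outbound line's sample stream from its slot position."""
    return {line_id: [frame[i] for frame in frames]
            for i, line_id in enumerate(order)}

if __name__ == "__main__":
    incoming = {"A": [1, 2, 3], "B": [10, 20, 30], "C": [7, 8, 9]}
    order, shared_circuit = multiplex(incoming)
    print(shared_circuit)                                   # [[1, 10, 7], [2, 20, 8], [3, 30, 9]]
    print(demultiplex(order, shared_circuit) == incoming)   # True
```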



Space division switching - manipulating the physical space between two lines, thereby closing a circuit to connect a call.
Local switching center (often called a local office) - a place where multiple phone lines from homes and businesses in one geographic area converge and terminate.
Tandem switching center - an exchange where lines from multiple local offices converge and terminate.
Toll switching center - an exchange where lines from multiple tandem switching centers converge and terminate.



Wireless Technology

Telegraphs and telephones are examples of wireline, or wire-bound technology, because they rely on physically connected wires to transmit and receive signals. Wireless technology - relies on the atmosphere to transmit and receive signals.



Examples of wireless technology are phones, radios, televisions and satellite communications.

1894 - Italian inventor Guglielmo Marconi developed a method of transmitting electromagnetic signals through the air. His invention relied on an induction coil.

An induction coil is made by winding wire in one or more layers around a metal rod to form a coil and then applying a charge; the charged wire induces an electromagnetic field that generates a voltage. Marconi connected an induction coil to a telegraph key. Each time the key was pressed, the coil discharged a voltage through the air between two brass surfaces. Metal filings in a glass cylinder became charged and cohered, and the length of time they cohered translated into short and long pulses. The pulses were relayed to a Morse code printer. Marconi's invention used the same type of signals sent and received by a telegraph.

Vacuum tube - a sealed container made of glass, metal, or ceramic that contains, in a vacuum, a charged plate that transmits current to a filament. Audion - patented in 1907 by Lee De Forest, a type of vacuum tube that contains an additional electrode between the positive and negative electrodes, which boosts or amplifies a signal. This was the first instance of signal amplification, and it formed the basis for all subsequent radio and television advances. 1912 - Edwin Armstrong improved the Audion: he discovered that by feeding the signal back through the tube, the power of the Audion could be increased.

Continued experimentation resulted in the invention of frequency modulation (FM), the technology used in FM radio and other forms of wireless technology. In frequency modulation, one wave containing the information to be transmitted (for example, on a classical FM radio station, a violin concerto) is combined with another wave, called a carrier wave, whose frequency is constant. Frequency is the number of times each second that a sine wave completes a full cycle.
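A small illustrative sketch of frequency modulation follows (Python with NumPy; the carrier and message frequencies below are made-up numbers, not real broadcast values): the carrier's phase is advanced according to the information signal, so its instantaneous frequency swings around the constant carrier frequency.

```python
# Frequency modulation sketch: a constant-frequency carrier is shifted in
# frequency by the information signal. All numbers are illustrative only.
import numpy as np

fs = 8_000                                   # samples per second
t = np.arange(0, 0.01, 1 / fs)               # 10 ms of signal

f_carrier = 1_000                            # carrier frequency (constant), Hz
f_message = 50                               # frequency of the information signal, Hz
beta = 5.0                                   # modulation index (how far the frequency swings)

message = np.sin(2 * np.pi * f_message * t)  # the information to be transmitted
# Standard FM form: cos(2*pi*fc*t + beta*sin(2*pi*fm*t))
fm_signal = np.cos(2 * np.pi * f_carrier * t + beta * message)

print(fm_signal[:5])                         # first few samples of the modulated wave
```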

The advent of FM radio afforded the best clarity of all wireless technologies then available; walkie-talkies also use frequency modulation. 1946 - Bell Laboratories connected the first wireless car phone to the St. Louis network. 1962 - The Telstar satellite successfully transmitted television and telephone conversations across the Atlantic for the first time.

Geosynchronous - means that satellites orbit the earth at the same rate as the earth turns. Uplink - a broadcast from an earth-based transmitter to an orbiting satellite. At the satellite, a transponder receives the uplink, then transmits the signals to another earth-based location in a downlink.
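As a hedged aside, the altitude such an orbit requires can be worked out from Kepler's third law and standard physical constants; the short calculation below gives roughly 35,800 km above the equator.

```python
# Rough calculation of geosynchronous (geostationary) orbit altitude from
# Kepler's third law: r = (G*M*T^2 / (4*pi^2))^(1/3), then subtract Earth's radius.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # mass of the Earth, kg
T = 86_164             # sidereal day, seconds (one rotation of the Earth)
R_EARTH = 6_371_000    # mean radius of the Earth, m

r = (G * M_EARTH * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1000
print(f"Orbital radius ~{r/1000:,.0f} km, altitude ~{altitude_km:,.0f} km")
# Roughly 42,160 km radius, i.e. about 35,800 km above the surface.
```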



The evolution of data management technology: from traditional files to the data warehouse.

Evolution of data management technology

Computers can now store all forms of information: records, documents, images, sound recordings, videos, scientific data, and many new data formats. Society has made great strides in capturing, storing, managing, analyzing, and visualizing this data. These tasks are generically called data management. This article sketches the evolution of data management systems. There have been six distinct phases in data management. Initially, data was manually processed. The next step used punched-card equipment and electromechanical machines to sort and tabulate millions of records. The third phase stored data on magnetic tape and used stored-program computers to perform batch processing on sequential files. The fourth phase introduced the concept of a database schema and on-line navigational access to the data. The fifth step automated access to relational databases and added distributed and client server processing. We are now in the early stages of sixth-generation systems that store richer data types, notably documents, images, voice, and video data. These sixth-generation systems are the storage engines for the emerging Internet and intranets.

Early data management systems automated traditional information processing. Today they allow fast, reliable, and secure access to globally distributed data. Tomorrow's systems will access and summarize richer forms of data. It is argued that multimedia databases will be a cornerstone of cyberspace.



Traditional approaches to master data management

The enterprise application

Traditional approaches to master data include the use of existing enterprise applications, data warehouses and even middleware. Some organizations approach the master data issue by leveraging dominant and seemingly domain-centric applications, such as a customer relationship management (CRM) application for the customer domain or an enterprise resource planning (ERP) application for the product domain. However, CRM and ERP, among other enterprise applications, have been designed and implemented to automate specific business processes such as customer on-boarding, procure-to-pay and order-to-cash, not to manage data across these processes. The result is that a specific data domain, such as customer or product, may actually reside within multiple processes, and therefore multiple applications.

In this scenario using application masters, it is difficult to determine which iteration of customer, product or account—if any—is complete and correct. Additional complexity occurs as organizations attempt to maintain the correct copy of the data, and identify and understand all of the systems that can update a particular domain, those that consume portions of the updates, and the frequency rate at which this consumption occurs. It quickly becomes apparent to organizations that have undergone such a project that the process-automating application cannot also manage data across the enterprise.



The data warehouse

Alternatively, some enterprise initiatives have attempted to repurpose new or existing data warehouses to serve as a master data repository. As data warehouses aggregate enterprise information, the warehouse is often viewed as a starting point for companies attempting to master their data. However, data warehouses are designed to optimize reporting and analysis and to drive sophisticated insight for the business. This design, while effective for its primary use, cannot scale well within an operational environment, even in the case of dynamic warehousing, when measured against the needs of most businesses today. Based on its fundamental design, the data warehouse also lacks data management capabilities. Essential functionality such as operational business services, collaborative workflows and real-time analytics, all critical to success in these types of master data implementations, requires large amounts of custom coding. Similarly, data management capabilities such as data changes that trigger events and an intelligent understanding of the unique views required by consuming systems are also absent from a data warehouse.

Integration middleware

Enterprise information integration (EII) and enterprise application integration (EAI) technologies used to federate and synchronize systems and data have also been presented as substitutes for data management products. Although these solutions can tie together disparate pieces of architecture either at the data tier (EII) or at the application tier (EAI), they do not provide either a physical or virtual repository to manage these key data elements. And much like warehouses, they lack data management functionality. The management of data processes poses yet another challenge. Choosing to build this functionality within the middleware technology can affect performance in its core competency: the integration of applications and data. Without a true master data solution to complement it, the implementation of EII and EAI technology can actually add to the architectural complexity of the business and perpetuate master data problems with point-to-point integration. In most cases, these methods fail because they are designed to treat data symptoms, such as fragmented data or systems that are out of sync, rather than the root cause of the master data problem: data is tightly coupled to applications and business processes, and it is not managed by a single, independent resource that can capture the complete and current enterprise understanding of the domain (customer, product, account or supplier).



While EII and EAI technologies specialize in specific functions such as data federation, data quality or aggregate analytics, they do not manage the essential data processes or the data changes that can initiate other processes such as quality and data stewardship. Attempting to manage these data processes virtually can mean that an essential fact, like the correct address of a customer, must be determined on every transaction; for example, determining whether address 1 from system A or address 2 from system B is correct. It is also necessary to persist this information, because the data is created and changed over time (this time frame is known as the information life cycle), to capture net new data like privacy preferences, and to deliver this information in context to all of the relevant consumers, typically on demand via business services.

The problem with traditional approaches

The following example illustrates the problem. A customer contact occurs in the call center. This action initiates an address change to a customer record. The address change is immediately reflected in the CRM application, but the billing system is not updated. The customer’s bill for that month is sent to the wrong address, and the analytics are skewed because the data warehouse did not receive the required change. The ERP system, on the other hand, has a third address, confusing data stewards and forcing another customer contact to try to correct the error. The result is a poor customer service experience. No single application has the ability to manage the “golden copy” of this customer information to ensure all systems receive the necessary changes, as well as to trigger duplicate suspect processing (matching the customer with an already existing address), event handling (such as alerting a data steward to the problem) and analysis of whether a product offer should be made due to the change. While existing systems are automating their associated business processes, this dynamic data is actually driving process changes of its own. Integration technology or a data warehouse in combination with extensive customization may provide the ability to link some of these applications and data elements. But does this integration occur frequently enough to avoid discrepancies across the enterprise? What if the address change was originally made to the billing system when the customer received the last invoice statement? Will this information be overwritten by the dated CRM address? What happens with the addition of another channel, such as a self-service Web application that also has an address update capability?
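The "golden copy" idea is easier to see in a toy sketch. The class and system names below are hypothetical, not any vendor's API; the point is only that one hub owns the master record and pushes every address change to all subscribing systems, so CRM, billing, the warehouse and ERP cannot drift apart.

```python
# Toy sketch of a "golden copy" master data hub (hypothetical names):
# one hub owns the customer master record; every consuming system subscribes
# and is notified whenever the address changes, so no system is left stale.

class MasterCustomerHub:
    def __init__(self):
        self.golden = {}          # customer_id -> master record
        self.subscribers = []     # systems to notify on change (CRM, billing, DW, ...)

    def subscribe(self, system):
        self.subscribers.append(system)

    def update_address(self, customer_id, new_address, source_system):
        record = self.golden.setdefault(customer_id, {})
        record["address"] = new_address
        record["last_updated_by"] = source_system
        # Propagate the change so every downstream copy stays in sync.
        for system in self.subscribers:
            system.apply_address(customer_id, new_address)

class DownstreamSystem:
    def __init__(self, name):
        self.name = name
        self.addresses = {}

    def apply_address(self, customer_id, address):
        self.addresses[customer_id] = address
        print(f"{self.name}: customer {customer_id} address set to {address}")

if __name__ == "__main__":
    hub = MasterCustomerHub()
    for name in ("CRM", "Billing", "DataWarehouse", "ERP"):
        hub.subscribe(DownstreamSystem(name))
    # The call-centre address change now reaches every system, not just CRM.
    hub.update_address("C-1001", "12 New Street, Kota Bharu", source_system="CallCenter")
```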



The evolution of master data management solutions

In general, master data management (MDM) solutions should offer the following:
• Consolidate data locked within the native systems and applications
• Manage common data and common data processes independently with functionality for use in business processes
• Trigger business processes that originate from data change
• Provide a single understanding of the domain—customer, product, account, location— for the enterprise

MDM products, however, address these four requirements very differently. Some products decouple data linked to source systems so they can dynamically create a virtual view of the domain, while others include the additional ability to physically store master data and to persist and propagate this information. Some products are not designed for a specific usage style, while others provide a single usage of this master data. More mature products provide all of the usage types required in today's complex business (collaborative, operational and analytic) as out-of-the-box functionality. These mature products also provide intelligent data management by recognizing changes in the information and triggering additional processes as necessary. Finally, MDM products vary in their domain coverage, ranging from specializing in a single domain such as customer or product to spanning multiple and integrated domains. Those that span multiple domains help to harness not only the value of each domain, but also the value between domains, also known as relationships. Relationships may include customers to their locations, to their accounts or to the products they have purchased. This combination of multiple domains, multiple usage styles and the full range of capabilities, from creating a virtual view to performing in a transactional environment, is known as multiform master data management. Figure 2 of the referenced IBM white paper depicts these different solutions and their placement in functional maturity versus MDM evolutionary stages.



Ref:
1. Mary Roth (Senior Engineer and Manager, IBM Silicon Valley Lab), "From Data Management to Information Integration: A Natural Evolution."

2. "IBM Multiform Master Data Management: The Evolution of MDM Applications,"
http://www.itworldcanada.com/WhitePaperLibrary/PdfDownloads/IBM-LI-Evolution_of_MDM.pdf

The comparison between general purpose application software and function-specific application software.

General purpose application software is much broader in use. Word processors, for example, can handle every form of writing, aside from calligraphy. Spreadsheet programs like Excel handle a significant portion of data-processing problems (with databases taking the rest).

Examples of general purpose applications:


Word Processing Software: This software enables the users to create and edit documents. The most popular examples of this type of software are MS-Word, WordPad, Notepad and some other text editors.


Database Software: A database is a structured collection of data. A computer database relies on database software to organize the data and enable users to perform database operations. Database software allows users to store and retrieve data from databases. Examples are Oracle, MS Access, etc. (A small store-and-retrieve sketch follows after this list.)


Spreadsheet Software: Excel, Lotus 1-2-3 and Apple Numbers are some examples of spreadsheet software. Spreadsheet software allows users to perform calculations. They simulate paper worksheets by displaying multiple cells that make up a grid.


Multimedia Software: This software allows users to create and play audio and video media. Audio converters, players, burners, video encoders and decoders are some forms of multimedia software. Examples of this type of software include RealPlayer and Windows Media Player.


Presentation Software: Software used to display information in the form of a slide show is known as presentation software. This type of software provides three main functions: editing that allows insertion and formatting of text, the ability to include graphics, and the execution of slide shows. Microsoft PowerPoint is the best-known example of presentation software.
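Returning to the database category above, here is a small hedged illustration of "store and retrieve" using Python's built-in sqlite3 module with a made-up table; commercial products such as Oracle or MS Access expose the same idea through SQL.

```python
# Minimal store-and-retrieve example using Python's built-in sqlite3 module.
# The table and rows are made up purely for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")            # throwaway in-memory database
conn.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, town TEXT)")
conn.executemany(
    "INSERT INTO students (name, town) VALUES (?, ?)",
    [("Jason", "Pasir Pekan"), ("Oshin", "Sungai Siput")],
)

# Retrieve: structured queries pull back exactly the rows we ask for.
for row in conn.execute("SELECT name, town FROM students WHERE town = ?", ("Sungai Siput",)):
    print(row)                                # ('Oshin', 'Sungai Siput')

conn.close()
```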

Function-specific application software is very specific in its use. Engineering programs often fall under this category - there is a program that does slope stability analysis and nothing else, for instance. Special purpose software may also be created in house and tailored to the specific needs of the company.

Examples of function-specific application software:


Engineering application software: computer-aided design (CAD) packages such as AutoCAD


Business application software: Point of Sale (POS) systems


Accounting application software: User Business System (UBS accounting)

In general, special purpose software is intended to perform a very specific function, while general purpose software is intended to perform a broader class of functions.

IS Hardware: The evolution of computer systems, from mainframe computer systems to microcomputer systems.


  1. First Generation (1939-1954) - vacuum tube
  2. Second Generation Computers (1954-1959) - transistor
  3. Third Generation Computers (1959-1971) - IC
  4. Fourth Generation (1971-1991) - microprocessor
  5. Fifth Generation (1991 and Beyond)
http://history.sandiego.edu/GEN/recording/computer1.html



In the late 1970s computer systems could be classified into microcomputers, minicomputers and mainframe computers:

A microcomputer: a single-user computer system (cost £2,000 to £5,000) based on an 8-bit microprocessor (Intel 8080, Zilog Z80, Motorola 6800). These were used for small industrial (e.g. small control systems), office (e.g. word-processing, spreadsheets) and program development (e.g. schools, colleges) applications.

A minicomputer: a medium-sized multi-user system (cost £20,000 to £200,000) used within a department or a laboratory. Typically it would support 4 to 16 concurrent users depending upon its size and area of application, e.g. CAD in a design office.

A mainframe computer: a large multi-user computer system (cost £500,000 upwards) used as the central computer service of a large organisation, e.g. Gas Board customer accounts. Large organisations could have several mainframe and minicomputer systems, possibly on different sites, linked by a communications network.



Introduction to Digital Computer System
by Dr Sheikh Sharif Iqbal
http://faculty.kfupm.edu.sa/EE/sheikhsi/EE_390_Digital_System_Engineering/Extra_lecture_1.pdf

Early computing devices were mechanical machines with gears and levers (such as the abacus and the slide rule, which are still available today, often as toys).

“Colossus Mark-I” was the 1st electronic computer, used during World War 2 (1943) to decipher military codes. It had 1,500 vacuum tubes and could process 5,000 characters/second.

“ENIAC” was a reprogrammable digital computer, built in 1946 to calculate artillery firing tables for the US Army. This huge device had 17,468 vacuum tubes, 1,500 relays, 70,000 resistors and 10,000 capacitors, and consumed almost 150 kilowatts of power. (ENIAC was able to store a maximum of twenty 10-digit decimal numbers and could discriminate the sign of a number, compare quantities for equality, add, subtract, multiply, divide, and extract square roots.)
*(ENIAC = Electronic Numerical Integrator and Computer)

Vacuum tubes look like light bulbs and are generally used to amplify, or otherwise modify, a signal by controlling the movement of electrons in an evacuated space. Today the vacuum tube has been replaced by the much smaller and less expensive transistor, either as a discrete device or in an integrated circuit.


“UNIVAC” was the 1st commercial mainframe computer, used by the US government in 1951. It used 5,200 vacuum tubes that required 125 kW of power to perform 1,900 operations per second. (It was able to store 1,000 words of 12 characters and ran on a clock of 2.25 MHz. This 13-ton device was priced in the hundreds of thousands of dollars.)
*(UNIVAC = UNIVersal Automatic Computer)

“IBM 650” was the 1st mass-produced computer: about 2,000 of these computers were manufactured and supplied from 1953 to 1962. (It consisted of a console unit, a power unit, a rotating drum memory and a card reader unit, and processed 17,000 instructions per second.)

Complete history: in the same period, several other companies in different countries were also involved in developing digital computers; a fuller description can be found at http://library.thinkquest.org/18268/History/hist_c_50s.htm


In 1954, the advent of commercial silicon-based junction transistors introduced the 2nd generation of computers, in which vacuum tubes were replaced with junction transistors.

In 1955, the 1st fully transistorized digital computer, “TRADIC”, was manufactured using almost 700-800 transistors and 10,000 diodes. (This 3-cubic-foot device was about twenty times faster than the vacuum tube computers and required less than 100 watts of power.)
*(TRADIC = TRAnsistor DIgital Computer or TRansistorized Airborne DIgital Computer)

The IBM 7000 series was the 1st transistorized computer line made by IBM, capable of one million instructions per second.

In 1958, the invention of integrated circuits (ICs) by Jack Kilby, followed by semiconductor ICs by Robert Noyce (1959), led to a new generation of commercial computers called mini-computers. (Through the 1960s these scaled-down versions of mainframe computers, equipped with keyboards and monitors, were popular due to their reasonable price (USD 10,000 and upwards), acceptable size and performance.)

In 1971, the release of the 1st commercial microprocessor (Intel 4004) revolutionized the computer industry and resulted in a new generation of micro-computers, also known as personal computers (PCs). (Later, in 1973, a French company developed “Micral”, a low-precision personal computer. Due to its high cost of $1,750, this computer was never marketed.)



Microprocessor: Also known as central processing unit (CPU) or the brain of the personal computer. It is an integrated circuit built on a tiny piece of silicon and contains thousands, or even millions, of transistors, which are interconnected via superfine traces of aluminum. These transistors work together to store and manipulate data so that the microprocessor can perform a wide variety of useful functions as dictated by software.

In 1977, the 1st desktop computer, “TRS-80”, was released with most of the required accessories and an inbuilt programming language, BASIC. (At a price of 600 USD, nearly 10,000 of these machines were sold around the world.)

In 1981, IBM’s 1st PC was marketed. It ran on Intel’s 4.77 MHz 8088 microprocessor and came with the MS-DOS operating system.



Technical specifications of the Intel 4004:
  1. Maximum clock speed of 740 kHz, executing ≈60,000 instructions/sec.
  2. A register set of 16 registers of 4 bits each.
  3. Separate program and data storage, with a single multiplexed 4-bit bus for transferring 12-bit addresses, 8-bit instructions and 4-bit data words.
  4. An instruction set of 46 instructions.
  5. A Japanese company used them to build calculators.

These days, microprocessors (μP) are implemented using very large-scale integration (VLSI) circuit technology, in which millions of transistors are interconnected via superfine traces of aluminum and work together to store and manipulate information as dictated by the controlling software.

The evolution of microprocessors has been known to follow Moore's Law, and from their humble beginning as processors in calculators (Intel 4004), microprocessors today dominate most digital systems, from mainframes to the smallest handheld computers.



Moore’s Law: Gordon E. Moore was a co-founder of Intel. His empirical observation, which appeared in Electronics magazine on 19 April 1965, stated that the complexity of integrated circuits, with respect to minimum component cost, doubles roughly every 24 months, and that in future all the components of a CPU could be built on a single wafer. This rule has been followed, more or less, since the early 1970s.
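As a back-of-the-envelope illustration of that doubling rule (a sketch only; taking the Intel 4004's roughly 2,300 transistors as the starting point is an assumption, not a figure stated above):

```python
# Back-of-the-envelope Moore's Law projection: start from the Intel 4004's
# roughly 2,300 transistors (1971) and double every 24 months, as stated above.
START_YEAR, START_TRANSISTORS = 1971, 2_300
DOUBLING_PERIOD_YEARS = 2

for year in range(1971, 2011, 10):
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    projected = START_TRANSISTORS * 2 ** doublings
    print(f"{year}: ~{projected:,.0f} transistors")
# Real chips drift from this simple curve, but the exponential trend holds.
```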

Tuesday, February 2, 2010

Introduction of Members

Hi, we are group APPLE. Our group consists of 5 members and here is a little introduction on our members.

1)
Tan Jason
900916-03-5435
1000829
Pasir Pekan, Kelantan
Quote: I'm a small figure in a big world hoping to one day beat the likes of Bill Gates or Warren Buffett and be the richest dude alive. Might be a long shot but still, who minds a lil' day dream?

2)
Ong Chia Yee
901219-03-5338
1001150
Kota Bharu, Kelantan
Quote: I hate to work but I want to be rich enough to buy anything I want anywhere and anytime one day. This should be my wildest dream.

3)
Ho Oshin
880629-08-5338
1002018
Sungai Siput, Perak
Quote: I love shopping. I wish to have lots of money so that I can buy whatever I want. Besides, I think that sleeping and eating are the most enjoyable activities in my life!

4)
Cheng Lee San
880115-08-5904
1002017
Sungai Siput, Perak
Quote: I like travelling and I wish to be rich so I can travel around the world and eat all types of food from every country. I think these are the most enjoyable things in my life.

5)
Peter Chrysologus Boon Yew Hann
880730-06-5227
1001097
Kuantan, Pahang
Quote: Unlike my other group members who day dream to be rich and wealthy, I on the other hand plan to be richer than them. *evil laughs* I find lots of things enjoyable in life; from chilling out with friends to doing the most adventurous stuff like sky-diving, bungee-jumping, etc. My wildest dream would probably be having a really picture-perfect family of my own, a wife and a couple of kids maybe. It would be my lifetime adventure.