Sunday, September 14, 2008

Online Wallet


Online Wallet

An online wallet is a program or web service that allows users to store and manage their online shopping information, such as logins, passwords, shipping addresses and credit card details, in one central place.
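
As a rough illustration only, the following minimal sketch shows the kind of record such a wallet keeps in one central place; the class and field names are assumptions invented for this example, not the schema of any real wallet service.

from dataclasses import dataclass, field

# Illustrative sketch of an online wallet's central record; names are invented
# for this example and do not describe any real service.
@dataclass
class WalletEntry:
    site: str              # e.g. "example-shop.com" (placeholder)
    login: str
    password: str          # a real service would encrypt this, never keep plain text
    shipping_address: str
    card_number: str       # likewise tokenised or encrypted in practice

@dataclass
class OnlineWallet:
    owner: str
    entries: list = field(default_factory=list)

    def add(self, entry: WalletEntry) -> None:
        self.entries.append(entry)

    def find(self, site: str) -> list:
        # Look up everything stored for a given shop in the one central place.
        return [e for e in self.entries if e.site == site]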

Friday, August 29, 2008

The Internet Society (ISOC)

The Internet Society (ISOC) is a nonprofit organisation founded in 1992 to provide leadership in Internet-related standards, education, and policy. With offices in Washington, USA, and Geneva, Switzerland, it is dedicated to ensuring the open development, evolution and use of the Internet for the benefit of people throughout the world.

The Internet Society provides leadership in addressing issues that confront the future of the Internet, and is the organisational home for the groups responsible for Internet infrastructure standards, including the Internet Engineering Task Force (IETF) and the Internet Architecture Board (IAB).

The Internet Society acts not only as a global clearinghouse for Internet information and education but also as a facilitator and coordinator of Internet-related initiatives around the world. For over 15 years ISOC has run international network training programs for developing countries and these have played a vital role in setting up the Internet connections and networks in virtually every country connecting to the Internet during this time.

The Internet Society has more than 80 organisational and more than 28,000 individual members in over 90 chapters around the world. ISOC has also created regional bureaus to better serve the regional Internet community. The Latin American and Caribbean bureau is located in Buenos Aires, Argentina, the African bureau in Addis Ababa, Ethiopia and the South and Southeast Asian bureau in Suva, Fiji.

Through its sponsored events, developing-country training workshops, tutorials, public policy, and regional and local chapters, the Internet Society serves the needs of the growing global Internet community. From commerce to education to social issues, our goal is to enhance the availability and utility of the Internet on the widest possible scale.

The Society's individual and organisation members are bound by a common stake in maintaining the viability and global scaling of the Internet. They comprise the companies, government agencies, and foundations that have created the Internet and its technologies, as well as innovative new entrepreneurial organisations that help maintain that dynamic. Visit their home pages to see how Internet innovators are creatively using the network.

At the start of 2008, ISOC launched a set of longer-term, strategic activities called "initiatives". The initiatives that will drive ISOC's activities in 2008-2010 are:

* Enabling Access
* InterNetWorks
* Trust & Identity

The Society is governed by its Board of Trustees, elected by its membership around the world.

How to Contact the Society:

Internet Society International Secretariat
1775 Wiehle Ave., Suite 102
Reston, VA 20190
USA
Tel: +1 703 326 9880
Fax: +1 703 326 9881

Join ISOC!

Please check our Contact page for a list of departments and e-mail addresses.

Wednesday, August 20, 2008

Online Broadband

Online Broadband
What type of broadband should I use to play online on a PS3 here in the Philippines?

You will need to subscribe to a wired broadband connection. PLDT has myDSL, Globe has Globe Broadband, Eastern Telecoms has evoDSL and Bayan Telecommunications has Bayan DSL. There are also Greendot, SkyCable and myDestiny.

Fixed wireless services like Smart Bro or Globe's Speak n Surf are not recommended for gaming because they have higher latency: the response time to the gaming server is slower because of the delays introduced by the wireless link to the cell sites.
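
For readers who want to check this for themselves, here is a minimal Python sketch that times how long a TCP connection takes to a given host, which gives a rough feel for the latency difference between a DSL line and a fixed wireless link. The host name used is a placeholder, not an actual game server.

import socket
import time

def connect_latency_ms(host: str, port: int = 80, timeout: float = 3.0) -> float:
    # Time a single TCP connection set-up as a rough latency estimate.
    start = time.time()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.time() - start) * 1000.0

if __name__ == "__main__":
    # "example.com" is a placeholder; substitute the server you actually play on.
    print(f"{connect_latency_ms('example.com'):.1f} ms")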

PLDT and Globe have affordable plans from 512Kbps to 1.2Mbps, which cost between Php995 and Php1,995 a month.
Online Broadband

Tuesday, August 19, 2008

Information technology (IT) -Online Internet

Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.

Today, the term information technology has ballooned to encompass many aspects of computing and technology, and it is more recognizable than ever before. The information technology umbrella can be quite large, covering many fields. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties IT professionals perform include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.

When computer and communications technologies are combined, the result is information technology, or "infotech". Information technology (IT) is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. When speaking of information technology as a whole, the use of computers and information is usually implied.

Information technology (IT)

Sunday, August 17, 2008

Internet Access

Internet Online ICT
Common methods of home access include dial-up, landline broadband (over coaxial cable, fiber optic or copper wires), Wi-Fi, satellite and 3G technology cell phones.

Public places to use the Internet include libraries and Internet cafes, where computers with Internet connections are available. There are also Internet access points in many public places such as airport halls and coffee shops, in some cases just for brief use while standing. Various terms are used, such as "public Internet kiosk", "public access terminal", and "Web payphone".
Many hotels now also have public terminals, though these are usually fee-based. These terminals are widely used for purposes such as ticket booking, bank transactions and online payments. Wi-Fi provides wireless access to computer networks, and therefore can do so to the Internet itself. Hotspots providing such access include Wi-Fi cafes, where would-be users need to bring their own wireless-enabled devices such as a laptop or PDA.
These services may be free to all, free to customers only, or fee-based. A hotspot need not be limited to a confined location. A whole campus or park, or even an entire city can be enabled. Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services covering large city areas are in place in London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. The Internet can then be accessed from such places as a park bench.[9]

Apart from Wi-Fi, there have been experiments with proprietary mobile wireless networks like Ricochet, various high-speed data services over cellular phone networks, and fixed wireless services.
High-end mobile phones such as smartphones generally come with Internet access through the phone network. Web browsers such as Opera are available on these advanced handsets, which can also run a wide variety of other Internet software. More mobile phones have Internet access than PCs, though this is not as widely used. An Internet access provider and protocol matrix differentiates the methods used to get online.
Internet Online ICT
Source: Wikipedia

Sunday, August 10, 2008

ICT Solutions - How To Build A Website

ICT

Step 1: Choose a web hosting company. This is the company that hosts your web pages on its servers on your behalf in exchange for a monthly fee, known as the web hosting fee. The hosting fee can be anything from US$3 to US$100 or more, depending on the services provided and your website's size and traffic. There are several companies which provide free hosting, like Geocities and Bravenet.
But these companies put their ads on your web pages in return for the free service. Select your hosting company on the basis of: monthly hosting fees, server uptime, bandwidth, disk space, number of password-protected user accounts, website management tools and services, number of sub-domains, and the tools and scripts supported. Hiring a local, reputable host is recommended, as it becomes easier and cheaper to interact with them in case of any technical snag or other website-related issues.

Step 2: Register a domain name. You can register it for up to 10 years. Usually there is an additional fee for registration, but there are some hosts which provide free domain name registration, like Yahoo web hosting.

Step 3: Almost all website hosting companies provide website development tools and services. There is an additional fee for website design, development and promotion. However, for a website's promotion it is better to hire a separate company that specialises in Internet marketing. This is because site promotion is very important: what's the use of setting up a site if nobody visits it in the first place? Since promotion is a continuous process and requires a significant amount of expenditure, getting value for your money becomes very important. Generally, companies which are also involved in design and development are not well equipped or skilled enough for effective website promotion.

Step 4: It is important for you to know how a site is developed and by whom. A typical web team consists of designers, graphic designers, developers, content developers, Internet marketers and a team leader. A designer creates the layout or blueprint of a website, i.e. how the site will look. His job is to make the site attention-grabbing and visually appealing. A graphic designer is responsible for creating graphic elements like buttons, animations, etc. A developer's job is to build the functionality of the site, i.e. how it should respond to a particular action by a visitor. For example, when a visitor fills in a form and clicks the 'Submit' button, the visitor's details should automatically get stored in a database after proper validation, as the sketch below illustrates. For these actions to take place, a developer writes programs or code, so he is basically a programmer.
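
As a rough sketch of that "validate, then store" step, the Python example below checks a submitted name and e-mail and writes them to a small SQLite database using only the standard library. The table and field names are assumptions made for illustration; a real site's developer would adapt them to the actual form.

import re
import sqlite3

def validate(form: dict) -> bool:
    # Minimal validation: a name must be present and the e-mail must look sane.
    email_ok = re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", form.get("email", "")) is not None
    return bool(form.get("name")) and email_ok

def save(form: dict, db_path: str = "visitors.db") -> None:
    # Store the validated details; table and columns are illustrative only.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS visitors (name TEXT, email TEXT)")
    conn.execute("INSERT INTO visitors (name, email) VALUES (?, ?)",
                 (form["name"], form["email"]))
    conn.commit()
    conn.close()

if __name__ == "__main__":
    submitted = {"name": "Ada", "email": "ada@example.com"}   # sample form data
    if validate(submitted):
        save(submitted)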

Content developers are generally copywriters, whose job is to develop interesting content for the website. Internet marketers are responsible for promoting the site through search engine optimisation (SEO), Google and Yahoo ad words, affiliate marketing, e-mail marketing, banner advertisements, etc. The team leader works as a client-servicing executive. He acts as a bridge between the client and the web team: he gathers the requirements from the client and then gets them executed through his team. He is also responsible for coordination among the team members. All team members report to him.

Step 5: Acquire as much knowledge as possible at a personal level about website hosting, design, development and promotion before hiring any company. This will help you make better choices and monitor the work done by others more effectively.

ICT
Article Source: http://EzineArticles.com/?expert=Ankita_Sharma

Thursday, August 7, 2008

Need networking ideas on getting new business going

Create a MySpace page network

1. Buy a domain
2. Set up web hosting
3. Create a website
4. Publish your website

For beginners I think Yahoo web hosting is the best and easiest to use. Yahoo has a question-and-answer section that can solve most problems and a direct phone line for additional support.
http://smallbusiness.yahoo.com/
http://help.yahoo.com/l/us/yahoo/smallbu...
http://help.yahoo.com/l/us/yahttp://smallbusiness.yahoo.com/contactus...
http://geocities.yahoo.com/

Network security

Network security
People coming out of college into this job are making 100-120 thousand dollars a year. I am currently enrolled in a computer repair and networking class, and I was just wondering what it takes to make network security a career (somewhat of an anti-hacker, if you will).

However, the two classes you listed are not really enough. In addition, I'd take computer security classes and consider Novell, Microsoft and Cisco certifications as well. Ideally, you should have a college degree in computer security.
Network security

Wednesday, August 6, 2008

Networks that led to the Internet

Networks that led to the Internet

ARPANET
[Image: Len Kleinrock and the first IMP.[4]]
Promoted to head of the information processing office at DARPA, Robert Taylor intended to realize Licklider's ideas of an interconnected networking system. Bringing in Larry Roberts from MIT, he initiated a project to build such a network. The first ARPANET link was established between the University of California, Los Angeles and the Stanford Research Institute at 22:30 on October 29, 1969. By 5 December 1969, a 4-node network was connected by adding the University of Utah and the University of California, Santa Barbara. Building on ideas developed in ALOHAnet, the ARPANET grew rapidly. By 1981, the number of hosts had grown to 213, with a new host being added approximately every twenty days.[5][6]

ARPANET became the technical core of what would become the Internet, and a primary tool in developing the technologies used. ARPANET development was centered around the Request for Comments (RFC) process, still used today for proposing and distributing Internet Protocols and Systems. RFC 1, entitled "Host Software", was written by Steve Crocker from the University of California, Los Angeles, and published on April 7, 1969. These early years were documented in the 1972 film Computer Networks: The Heralds of Resource Sharing. International collaborations on ARPANET were sparse. For various political reasons, European developers were concerned with developing the X.25 networks. Notable exceptions were the Norwegian Seismic Array (NORSAR) in 1972, followed in 1973 by Sweden with satellite links to the Tanum Earth Station and University College London.

X.25 and public access

Following on from ARPA's research, packet switching network standards were developed by the International Telecommunication Union (ITU) in the form of X.25 and related standards. In 1974, X.25 formed the basis for the SERCnet network between British academic and research sites, which later became JANET. The initial ITU Standard on X.25 was approved in March 1976. This standard was based on the concept of virtual circuits.

The British Post Office, Western Union International and Tymnet collaborated to create the first international packet switched network, referred to as the International Packet Switched Service (IPSS), in 1978. This network grew from Europe and the US to cover Canada, Hong Kong and Australia by 1981. By the 1990s it provided a worldwide networking infrastructure.[7]

Unlike ARPAnet, X.25 was also commonly available for business use. Telenet offered its Telemail electronic mail service, but this was oriented to enterprise use rather than the general email of ARPANET.

The first dial-in public networks used asynchronous TTY terminal protocols to reach a concentrator operated by the public network. Some public networks, such as CompuServe used X.25 to multiplex the terminal sessions into their packet-switched backbones, while others, such as Tymnet, used proprietary protocols. In 1979, CompuServe became the first service to offer electronic mail capabilities and technical support to personal computer users. The company broke new ground again in 1980 as the first to offer real-time chat with its CB Simulator.

Monday, July 7, 2008

Internet


Internet History
ICT

Prior to the widespread inter-networking that led to the Internet, most communication networks were limited by their nature to only allow communications (ICT) between the stations on the network, and the prevalent computer networking method was based on the central mainframe method. In the 1960s, computer researchers Levi C. Finch and Robert W. Taylor pioneered calls for a joined-up global network to address interoperability problems. Concurrently, several research programs began to research principles of networking between separate physical networks, and this led to the development of packet switching. These included Donald Davies (NPL), Paul Baran (RAND Corporation), and Leonard Kleinrock's MIT and UCLA research programs.



This led to the development of several packet switched networking solutions in the late 1960s and 1970s, including ARPANET, and X.25. Additionally, public access and hobbyist networking systems grew in popularity, including UUCP. They were however still disjointed separate networks, served only by limited gateways between networks. This led to the application of packet switching to develop a protocol for inter-networking, where multiple different networks could be joined together into a super-framework of networks. By defining a simple common network system, the Internet protocol suite, the concept of the network could be separated from its physical implementation. This spread of inter-network began to form into the idea of a global inter-network that would be called 'The Internet', and this began to quickly spread as existing networks were converted to become compatible with this. This spread quickly across the advanced telecommunication networks of the western world, and then began to penetrate into the rest of the world as it became the de-facto international standard and global network. However, the disparity of growth led to a digital divide that is still a concern today.

Following commercialisation and introduction of privately run Internet Service Providers in the 1980s, and its expansion into popular use in the 1990s, the Internet has had a drastic impact on culture and commerce. This includes the rise of near-instant communication by e-mail, text-based discussion forums, and the World Wide Web. Investor speculation in new markets provided by these innovations would also lead to the inflation and collapse of the Dot-com bubble, a major market collapse. But despite this, growth of the Internet continued, and still does.

Before the Internet
ICT
In the 1950s and early 1960s, prior to the widespread inter-networking that led to the Internet, most communication networks were limited by their nature to only allow communications between the stations on the network. Some networks had gateways or bridges between them, but these bridges were often limited or built specifically for a single use. One prevalent computer networking method was based on the central mainframe method, simply allowing its terminals to be connected via long leased lines. This method was used in the 1950s by Project RAND to support researchers such as Herbert Simon, in Pittsburgh, Pennsylvania, when collaborating across the continent with researchers in Sullivan, Illinois, on automated theorem proving and artificial intelligence.

Three terminals and an ARPA
ICT

A fundamental pioneer in the call for a global network, J.C.R. Licklider, articulated the ideas in his January 1960 paper, Man-Computer Symbiosis. "A network of such [computers], connected to one another by wide-band communication lines [which provided] the functions of present-day libraries together with anticipated advances in information storage and retrieval and [other] symbiotic functions."
—J.C.R. Licklider, [1]

In October 1962, Licklider was appointed head of the United States Department of Defense's Advanced Research Projects Agency, now known as DARPA, within the information processing office. There he formed an informal group within DARPA to further computer research. As part of the information processing office's role, three network terminals had been installed: one for System Development Corporation in Santa Monica, one for Project Genie at the University of California, Berkeley and one for the Compatible Time-Sharing System project at the Massachusetts Institute of Technology (MIT). Licklider's identified need for inter-networking would be made obviously evident by the problems this caused. "For each of these three terminals, I had three different sets of user commands. So if I was talking online with someone at S.D.C. and I wanted to talk to someone I knew at Berkeley or M.I.T. about this, I had to get up from the S.D.C. terminal, go over and log into the other terminal and get in touch with them. [...] I said, it's obvious what to do (But I don't want to do it): If you have these three terminals, there ought to be one terminal that goes anywhere you want to go where you have interactive computing. That idea is the ARPAnet."

—Robert W. Taylor, co-writer with Licklider of "The Computer as a Communications Device", in an interview with the New York Times, [2]

Packet switching

At the tip of the inter-networking problem lay the issue of connecting separate physical networks to form one logical network, with much wasted capacity inside the assorted separate networks. During the 1960s, Donald Davies (NPL), Paul Baran (RAND Corporation), and Leonard Kleinrock (MIT) developed and implemented packet switching. The notion that the Internet was developed to survive a nuclear attack has its roots in the early theories developed by RAND, but is an urban legend, not supported by any Internet Engineering Task Force or other document. Early networks used for the command and control of nuclear forces were message switched, not packet-switched, although current strategic military networks are, indeed, packet-switching and connectionless. Baran's research had approached packet switching from studies of decentralisation to avoid combat damage compromising the entire network.[3]

Networks that led to the Internet

ARPANET
[Image: Len Kleinrock and the first IMP.[4]]
Promoted to head of the information processing office at DARPA, Robert Taylor intended to realize Licklider's ideas of an interconnected networking system. Bringing in Larry Roberts from MIT, he initiated a project to build such a network. The first ARPANET link was established between the University of California, Los Angeles and the Stanford Research Institute at 22:30 on October 29, 1969. By 5 December 1969, a 4-node network was connected by adding the University of Utah and the University of California, Santa Barbara. Building on ideas developed in ALOHAnet, the ARPANET grew rapidly. By 1981, the number of hosts had grown to 213, with a new host being added approximately every twenty days.[5][6]

ARPANET became the technical core of what would become the Internet, and a primary tool in developing the technologies used. ARPANET development was centered around the Request for Comments (RFC) process, still used today for proposing and distributing Internet Protocols and Systems. RFC 1, entitled "Host Software", was written by Steve Crocker from the University of California, Los Angeles, and published on April 7, 1969. These early years were documented in the 1972 film Computer Networks: The Heralds of Resource Sharing. International collaborations on ARPANET were sparse. For various political reasons, European developers were concerned with developing the X.25 networks. Notable exceptions were the Norwegian Seismic Array (NORSAR) in 1972, followed in 1973 by Sweden with satellite links to the Tanum Earth Station and University College London.

X.25 and public access

Following on from ARPA's research, packet switching network standards were developed by the International Telecommunication Union (ITU) in the form of X.25 and related standards. In 1974, X.25 formed the basis for the SERCnet network between British academic and research sites, which later became JANET. The initial ITU Standard on X.25 was approved in March 1976. This standard was based on the concept of virtual circuits.

The British Post Office, Western Union International and Tymnet collaborated to create the first international packet switched network, referred to as the International Packet Switched Service (IPSS), in 1978. This network grew from Europe and the US to cover Canada, Hong Kong and Australia by 1981. By the 1990s it provided a worldwide networking infrastructure.[7]

Unlike ARPAnet, X.25 was also commonly available for business use. Telenet offered its Telemail electronic mail service, but this was oriented to enterprise use rather than the general email of ARPANET.

The first dial-in public networks used asynchronous TTY terminal protocols to reach a concentrator operated by the public network. Some public networks, such as CompuServe used X.25 to multiplex the terminal sessions into their packet-switched backbones, while others, such as Tymnet, used proprietary protocols. In 1979, CompuServe became the first service to offer electronic mail capabilities and technical support to personal computer users. The company broke new ground again in 1980 as the first to offer real-time chat with its CB Simulator.

There were also the America Online (AOL) and Prodigy dial in networks and many bulletin board system (BBS) networks such as FidoNet. FidoNet in particular was popular amongst hobbyist computer users, many of them hackers and amateur radio operators.

UUCP

In 1979, two students at Duke University, Tom Truscott and Jim Ellis, came up with the idea of using simple Bourne shell scripts to transfer news and messages on a serial line with nearby University of North Carolina at Chapel Hill. Following public release of the software, the mesh of UUCP hosts forwarding on the Usenet news rapidly expanded. UUCPnet, as it would later be named, also created gateways and links between FidoNet and dial-up BBS hosts. UUCP networks spread quickly due to the lower costs involved, and ability to use existing leased lines, X.25 links or even ARPANET connections. By 1981 the number of UUCP hosts had grown to 550, nearly doubling to 940 in 1984.

Merging the networks and creating the Internet

TCP/IP
Main article: Internet protocol suite

[Image: Map of the TCP/IP test network in January 1982]
With so many different network methods, something was needed to unify them. Robert E. Kahn of DARPA and ARPANET recruited Vinton Cerf of Stanford University to work with him on the problem. By 1973, they had worked out a fundamental reformulation, where the differences between network protocols were hidden by using a common internetwork protocol, and instead of the network being responsible for reliability, as in the ARPANET, the hosts became responsible. Cerf credits Hubert Zimmerman, Gerard LeLann and Louis Pouzin (designer of the CYCLADES network) with important work on this design.[8]

At this time, the earliest known use of the term Internet was by Vinton Cerf, who wrote:

“Specification of Internet Transmission Control Program.”
—"Request for Comments No. 675", Network Working Group, electronic text (1974)[9]

With the role of the network reduced to the bare minimum, it became possible to join almost any networks together, no matter what their characteristics were, thereby solving Kahn's initial problem. DARPA agreed to fund development of prototype software, and after several years of work, the first somewhat crude demonstration of a gateway between the Packet Radio network in the SF Bay area and the ARPANET was conducted. On November 22, 1977[10] a three network demonstration was conducted including the ARPANET, the Packet Radio Network and the Atlantic Packet Satellite network—all sponsored by DARPA. Stemming from the first specifications of TCP in 1974, TCP/IP emerged in mid-late 1978 in nearly final form. By 1981, the associated standards were published as RFCs 791, 792 and 793 and adopted for use. DARPA sponsored or encouraged the development of TCP/IP implementations for many operating systems and then scheduled a migration of all hosts on all of its packet networks to TCP/IP. On 1 January 1983, TCP/IP protocols became the only approved protocol on the ARPANET, replacing the earlier NCP protocol.[11]

ARPANET to Several Federal Wide Area Networks: MILNET, NSI, and NSFNet

After the ARPANET had been up and running for several years, ARPA looked for another agency to hand off the network to; ARPA's primary mission was funding cutting edge research and development, not running a communications utility. Eventually, in July 1975, the network had been turned over to the Defense Communications Agency, also part of the Department of Defense. In 1983, the U.S. military portion of the ARPANET was broken off as a separate network, the MILNET. MILNET subsequently became the unclassified but military-only NIPRNET, in parallel with the SECRET-level SIPRNET and JWICS for TOP SECRET and above. NIPRNET does have controlled security gateways to the public Internet.

The networks based around the ARPANET were government funded and therefore restricted to noncommercial uses such as research; unrelated commercial use was strictly forbidden. This initially restricted connections to military sites and universities. During the 1980s, the connections expanded to more educational institutions, and even to a growing number of companies such as Digital Equipment Corporation and Hewlett-Packard, which were participating in research projects or providing services to those who were. Several other branches of the U.S. government, the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), and the Department of Energy (DOE) became heavily involved in internet research and started development of a successor to ARPANET. In the mid-1980s all three of these branches developed the first Wide Area Networks based on TCP/IP. NASA developed the NASA Science Network, NSF developed CSNET and DOE evolved the Energy Sciences Network or ESNet. More explicitly, NASA developed a TCP/IP based Wide Area Network, NASA Science Network (NSN), in the mid-1980s connecting space scientists to data and information stored anywhere in the world. In 1989, the DECnet-based Space Physics Analysis Network (SPAN) and the TCP/IP-based NASA Science Network (NSN) were brought together at NASA Ames Research Center creating the first multiprotocol wide area network called the NASA Science Internet, or NSI. NSI was established to provide a total integrated communications infrastructure to the NASA scientific community for the advancement of earth, space and life sciences. As a high-speed, multiprotocol, international network, NSI provided connectivity to over 20,000 scientists across all seven continents.

In 1984 NSF developed CSNET exclusively based on TCP/IP. CSNET connected with ARPANET using TCP/IP, and ran TCP/IP over X.25, but it also supported departments without sophisticated network connections, using automated dial-up mail exchange. This grew into the NSFNet backbone, established in 1986, and intended to connect and provide access to a number of supercomputing centers established by the NSF.[12]

Transition toward an Internet

The term "Internet" was adopted in the first RFC published on the TCP protocol (RFC 675[13]: Internet Transmission Control Program, December 1974). It was around the time when ARPANET was interlinked with NSFNet, that the term Internet came into more general use,[14] with "an internet" meaning any network using TCP/IP. "The Internet" came to mean a global and large network using TCP/IP. Previously "internet" and "internetwork" had been used interchangeably, and "internet protocol" had been used to refer to other networking systems such as Xerox Network Services.[15]

As interest in wide spread networking grew and new applications for it arrived, the Internet's technologies spread throughout the rest of the world. TCP/IP's network-agnostic approach meant that it was easy to use any existing network infrastructure, such as the IPSS X.25 network, to carry Internet traffic. In 1984, University College London replaced its transatlantic satellite links with TCP/IP over IPSS. Many sites unable to link directly to the Internet started to create simple gateways to allow transfer of e-mail, at that time the most important application. Sites which only had intermittent connections used UUCP or FidoNet and relied on the gateways between these networks and the Internet. Some gateway services went beyond simple e-mail peering, such as allowing access to FTP sites via UUCP or e-mail.

TCP/IP becomes worldwide

The first ARPANET connection outside the US was established to NORSAR in Norway in 1973, just ahead of the connection to Great Britain. These links were all converted to TCP/IP in 1982, at the same time as the rest of the Arpanet.

CERN, the European internet, the link to the Pacific and beyond

Between 1984 and 1988 CERN began installation and operation of TCP/IP to interconnect its major internal computer systems, workstations, PC's and an accelerator control system. CERN continued to operate a limited self-developed system CERNET internally and several incompatible (typically proprietary) network protocols externally. There was considerable resistance in Europe towards more widespread use of TCP/IP and the CERN TCP/IP intranets remained isolated from the Internet until 1989.

In 1988 Daniel Karrenberg, from CWI in Amsterdam, visited Ben Segal, CERN's TCP/IP Coordinator, looking for advice about the transition of the European side of the UUCP Usenet network (much of which ran over X.25 links) over to TCP/IP. In 1987, Ben Segal had met with Len Bosack from the then still small company Cisco about purchasing some TCP/IP routers for CERN, and was able to give Karrenberg advice and forward him on to Cisco for the appropriate hardware. This expanded the European portion of the Internet across the existing UUCP networks, and in 1989 CERN opened its first external TCP/IP connections.[16] This coincided with the creation of Réseaux IP Européens (RIPE), initially a group of IP network administrators who met regularly to carry out co-ordination work together. Later, in 1992, RIPE was formally registered as a cooperative in Amsterdam.

At the same time as the rise of internetworking in Europe, ad hoc networking to ARPA and in-between Australian universities formed, based on various technologies such as X.25 and UUCPNet. These were limited in their connection to the global networks, due to the cost of making individual international UUCP dial-up or X.25 connections. In 1989, Australian universities joined the push towards using IP protocols to unify their networking infrastructures. AARNet was formed in 1989 by the Australian Vice-Chancellors' Committee and provided a dedicated IP based network for Australia.

The Internet began to penetrate Asia in the late 1980s. Japan, which had built the UUCP-based network JUNET in 1984, connected to NSFNet in 1989. It hosted the annual meeting of the Internet Society, INET'92, in Kobe. Singapore developed TECHNET in 1990, and Thailand gained a global Internet connection between Chulalongkorn University and UUNET in 1992.[17]

Digital divide

While developed countries with technological infrastructures were joining the Internet, developing countries began to experience a digital divide separating them from the Internet. On an essentially continental basis, they are building organizations for Internet resource administration and sharing operational experience, as more and more transmission facilities go into place.

Africa

At the beginning of the 1990s, African countries relied upon X.25 IPSS and 2400 baud modem UUCP links for international and internetwork computer communications. In 1996 a USAID funded project, the Leland initiative, started work on developing full Internet connectivity for the continent. Guinea, Mozambique, Madagascar and Rwanda gained satellite earth stations in 1997, followed by Côte d'Ivoire and Benin in 1998.

Africa is building an Internet infrastructure. AfriNIC, headquartered in Mauritius, manages IP address allocation for the continent. As in the other Internet regions, there is an operational forum, the Internet Community of Operational Networking Specialists.[18]

There are a wide range of programs both to provide high-performance transmission plant, and the western and southern coasts have undersea optical cable. High-speed cables join North Africa and the Horn of Africa to intercontinental cable systems. Undersea cable development is slower for East Africa; the original joint effort between New Partnership for Africa's Development (NEPAD) and the East Africa Submarine System (Eassy) has broken off and may become two efforts.[19]

Asia and Oceania

The Asia Pacific Network Information Centre (APNIC), headquartered in Australia, manages IP address allocation for the continent. APNIC sponsors an operational forum, the Asia-Pacific Regional Internet Conference on Operational Technologies (APRICOT).[20]

In 1991, the People's Republic of China saw its first TCP/IP college network, Tsinghua University's TUNET. The PRC went on to make its first global Internet connection in 1995, between the Beijing Electro-Spectrometer Collaboration and Stanford University's Linear Accelerator Center. However, China went on to implement its own digital divide by implementing a country-wide content filter.[21]

Latin America

As with the other regions, the Latin American and Caribbean Internet Addresses Registry (LACNIC) manages the IP address space and other resources for its area. LACNIC, headquartered in Uruguay, operates DNS root, reverse DNS, and other key services.

Opening the network to commerce

The interest in commercial use of the Internet became a hotly debated topic. Although commercial use was forbidden, the exact definition of commercial use could be unclear and subjective. UUCPNet and the X.25 IPSS had no such restrictions, which would eventually see the official barring of UUCPNet use of ARPANET and NSFNet connections. Some UUCP links still remained connecting to these networks however, as administrators turned a blind eye to their operation. During the late 1980s, the first Internet service provider (ISP) companies were formed. Companies like PSINet, UUNET, Netcom, and Portal Software were formed to provide service to the regional research networks and provide alternate network access, UUCP-based email and Usenet News to the public. The first dial-up ISP on the West Coast was Best Internet[22] (now Verio), which opened in 1986. The first dial-up ISP in the East was world.std.com, opened in 1989.

This caused controversy amongst university users, who were outraged at the idea of noneducational use of their networks. Eventually, it was the commercial Internet service providers who brought prices low enough that junior colleges and other schools could afford to participate in the new arenas of education and research.

By 1990, ARPANET had been overtaken and replaced by newer networking technologies and the project came to a close. In 1994, the NSFNet, now renamed ANSNET (Advanced Networks and Services) and allowing non-profit corporations access, lost its standing as the backbone of the Internet. Both government institutions and competing commercial providers created their own backbones and interconnections. Regional network access points (NAPs) became the primary interconnections between the many networks and the final commercial restrictions ended.

IETF and a standard for standards

The Internet has developed a significant subculture dedicated to the idea that the Internet is not owned or controlled by any one person, company, group, or organization. Nevertheless, some standardization and control is necessary for the system to function. The liberal Request for Comments (RFC) publication procedure engendered confusion about the Internet standardization process, and led to more formalization of official accepted standards.

The IETF started in January 1986 as a quarterly meeting of U.S. government funded researchers. Representatives from non-government vendors were invited starting with the fourth IETF meeting in October of that year. Acceptance of an RFC by the RFC Editor for publication does not automatically make the RFC into a standard. It may be recognized as such by the IETF only after experimentation, use, and acceptance have proved it to be worthy of that designation. Official standards are numbered with a prefix "STD" and a number, similar to the RFC naming style. However, even after becoming a standard, most are still commonly referred to by their RFC number.

In 1992, the Internet Society, a professional membership society, was formed and the IETF was transferred to operation under it as an independent international standards body.

NIC, InterNIC, IANA and ICANN

The first central authority to coordinate the operation of the network was the Network Information Centre (NIC) at Stanford Research Institute (SRI) in Menlo Park, California. In 1972, management of these issues was given to the newly created Internet Assigned Numbers Authority (IANA). In addition to his role as the RFC Editor, Jon Postel worked as the manager of IANA until his death in 1998.

As the early ARPANET grew, hosts were referred to by names, and a HOSTS.TXT file would be distributed from SRI International to each host on the network. As the network grew, this became cumbersome. A technical solution came in the form of the Domain Name System, created by Paul Mockapetris. The Defense Data Network—Network Information Center (DDN-NIC) at SRI handled all registration services, including the top-level domains (TLDs) of .mil, .gov, .edu, .org, .net, .com and .us, root nameserver administration and Internet number assignments under a United States Department of Defense contract.[23] In 1991, the Defense Information Systems Agency (DISA) awarded the administration and maintenance of DDN-NIC (managed by SRI up until this point) to Government Systems, Inc., who subcontracted it to the small private-sector Network Solutions, Inc.[24][25]
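
To see the Domain Name System doing the job HOSTS.TXT once did, here is a short sketch: Python's standard library asks the system resolver, which in turn queries DNS servers, for the addresses behind a name. The hostname shown is just an example.

import socket

def resolve(hostname: str) -> list:
    # getaddrinfo consults the system resolver, which queries the DNS,
    # rather than a locally distributed HOSTS.TXT file.
    infos = socket.getaddrinfo(hostname, None)
    return sorted({info[4][0] for info in infos})

if __name__ == "__main__":
    print(resolve("www.example.com"))   # example hostname only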

Since at this point in history most of the growth on the Internet was coming from non-military sources, it was decided that the Department of Defense would no longer fund registration services outside of the .mil TLD. In 1993 the U.S. National Science Foundation, after a competitive bidding process in 1992, created the InterNIC to manage the allocations of addresses and management of the address databases, and awarded the contract to three organizations. Registration Services would be provided by Network Solutions; Directory and Database Services would be provided by AT&T; and Information Services would be provided by General Atomics.[26]

In 1998 both IANA and InterNIC were reorganized under the control of ICANN, a California non-profit corporation contracted by the US Department of Commerce to manage a number of Internet-related tasks. The role of operating the DNS system was privatized and opened up to competition, while the central management of name allocations would be awarded on a contract tender basis.

Use and culture
E-mail and Usenet

E-mail is often called the killer application of the Internet. However, it actually predates the Internet and was a crucial tool in creating it. E-mail started in 1965 as a way for multiple users of a time-sharing mainframe computer to communicate. Although the history is unclear, among the first systems to have such a facility were SDC's Q32 and MIT's CTSS.[27]

The ARPANET computer network made a large contribution to the evolution of e-mail. There is one report[28] indicating experimental inter-system e-mail transfers on it shortly after ARPANET's creation. In 1971 Ray Tomlinson created what was to become the standard Internet e-mail address format, using the @ sign to separate user names from host names.[29]

A number of protocols were developed to deliver e-mail among groups of time-sharing computers over alternative transmission systems, such as UUCP and IBM's VNET e-mail system. E-mail could be passed this way between a number of networks, including ARPANET, BITNET and NSFNet, as well as to hosts connected directly to other sites via UUCP.

In addition, UUCP allowed the publication of text files that could be read by many others. The News software developed by Steve Daniel and Tom Truscott in 1979 was used to distribute news and bulletin board-like messages. This quickly grew into discussion groups, known as newsgroups, on a wide range of topics. On ARPANET and NSFNet similar discussion groups would form via mailing lists, discussing both technical issues and more culturally focused topics (such as science fiction, discussed on the sflovers mailing list).

From gopher to the WWW

As the Internet grew through the 1980s and early 1990s, many people realized the increasing need to be able to find and organize files and information. Projects such as Gopher, WAIS, and the FTP Archive list attempted to create ways to organize distributed data. Unfortunately, these projects fell short in being able to accommodate all the existing data types and in being able to grow without bottlenecks.[citation needed]

One of the most promising user interface paradigms during this period was hypertext. The technology had been inspired by Vannevar Bush's "Memex"[30] and developed through Ted Nelson's research on Project Xanadu and Douglas Engelbart's research on NLS.[31] Many small self-contained hypertext systems had been created before, such as Apple Computer's HyperCard. Gopher became the first commonly-used hypertext interface to the Internet. While Gopher menu items were examples of hypertext, they were not commonly perceived in that way.

In 1989, whilst working at CERN, Tim Berners-Lee invented a network-based implementation of the hypertext concept. By releasing his invention to public use, he ensured the technology would become widespread.[32] For his work in developing the World Wide Web, Berners-Lee received the Millennium Technology Prize in 2004. One early popular web browser, modeled after HyperCard, was ViolaWWW.

Scholars generally agree,[citation needed] however, that the turning point for the World Wide Web began with the introduction[33] of the Mosaic web browser[34] in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a funding program initiated by then-Senator Al Gore's High Performance Computing and Communication Act of 1991, also known as the Gore Bill.[35] Indeed, Mosaic's graphical interface soon became more popular than Gopher, which at the time was primarily text-based, and the WWW became the preferred interface for accessing the Internet. (Gore's reference to his role in "creating the Internet", however, was ridiculed in his presidential election campaign. See the full article Al Gore and information technology.)

Mosaic was eventually superseded in 1994 by Andreessen's Netscape Navigator, which replaced Mosaic as the world's most popular browser. While it held this title for some time, eventually competition from Internet Explorer and a variety of other browsers almost completely displaced it. Another important event held on January 11, 1994, was The Superhighway Summit at UCLA's Royce Hall. This was the "first public conference bringing together all of the major industry, government and academic leaders in the field [and] also began the national dialogue about the Information Superhighway and its implications."[36]

24 Hours in Cyberspace, the "largest one-day online event" (February 8, 1996) up to that date, took place on the then-active website, cyber24.com.[37][38] It was headed by photographer Rick Smolan.[39] A photographic exhibition was unveiled at the Smithsonian Institution's National Museum of American History on 23 January 1997, featuring 70 photos from the project.[40]

Search engines

Even before the World Wide Web, there were search engines that attempted to organize the Internet. The first of these was the Archie search engine from McGill University in 1990, followed in 1991 by WAIS and Gopher. All three of those systems predated the invention of the World Wide Web but all continued to index the Web and the rest of the Internet for several years after the Web appeared. There are still Gopher servers as of 2006, although there are a great many more web servers.

As the Web grew, search engines and Web directories were created to track pages on the Web and allow people to find things. The first full-text Web search engine was WebCrawler in 1994. Before WebCrawler, only Web page titles were searched. Another early search engine, Lycos, was created in 1993 as a university project, and was the first to achieve commercial success. During the late 1990s, both Web directories and Web search engines were popular—Yahoo! (founded 1995) and Altavista (founded 1995) were the respective industry leaders.
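
To make the difference between title-only search and full-text search concrete, here is a small inverted-index sketch; the sample pages are invented and the code has nothing to do with WebCrawler's actual implementation.

from collections import defaultdict

def build_index(pages: dict) -> dict:
    # Index every word of every page, so queries match body text, not just titles.
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

pages = {
    "a.example/home": "welcome to our home page about gardening",   # made-up pages
    "b.example/tips": "seasonal tips for the keen gardener and gardening beginner",
}
index = build_index(pages)
print(sorted(index["gardening"]))   # matches words anywhere in the page text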

By August 2001, the directory model had begun to give way to search engines, tracking the rise of Google (founded 1998), which had developed new approaches to relevancy ranking. Directory features, while still commonly available, became after-thoughts to search engines.

Database size, which had been a significant marketing feature through the early 2000s, was similarly displaced by emphasis on relevancy ranking, the methods by which search engines attempt to sort the best results first. Relevancy ranking first became a major issue circa 1996, when it became apparent that it was impractical to review full lists of results. Consequently, algorithms for relevancy ranking have continuously improved. Google's PageRank method for ordering the results has received the most press, but all major search engines continually refine their ranking methodologies with a view toward improving the ordering of results. As of 2006, search engine rankings are more important than ever, so much so that an industry has developed ("search engine optimizers", or "SEO") to help web-developers improve their search ranking, and an entire body of case law has developed around matters that affect search engine rankings, such as use of trademarks in metatags. The sale of search rankings by some search engines has also created controversy among librarians and consumer advocates.
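
As an illustration of relevancy ranking, here is a toy version of the PageRank idea mentioned above, computed by power iteration on a tiny made-up link graph. It is a sketch of the published algorithm, not Google's production system; the damping factor of 0.85 is the value suggested in the original PageRank paper.

def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    # links maps each page to the list of pages it links to.
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # A page with no outgoing links spreads its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

if __name__ == "__main__":
    graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}   # made-up link graph
    print(pagerank(graph))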

Dot-com bubble

The suddenly low price of reaching millions worldwide, and the possibility of selling to or hearing from those people at the same moment when they were reached, promised to overturn established business dogma in advertising, mail-order sales, customer relationship management, and many more areas. The web was a new killer app—it could bring together unrelated buyers and sellers in seamless and low-cost ways. Visionaries around the world developed new business models, and ran to their nearest venture capitalist. Of course some of the new entrepreneurs were truly talented at business administration, sales, and growth; but the majority were just people with ideas, and didn't manage the capital influx prudently. Additionally, many dot-com business plans were predicated on the assumption that by using the Internet, they would bypass the distribution channels of existing businesses and therefore not have to compete with them; when the established businesses with strong existing brands developed their own Internet presence, these hopes were shattered, and the newcomers were left attempting to break into markets dominated by larger, more established businesses. Many did not have the ability to do so.

The dot-com bubble burst on March 10, 2000, when the technology heavy NASDAQ Composite index peaked at 5048.62 (intra-day peak 5132.52), more than double its value just a year before. By 2001, the bubble's deflation was running full speed. A majority of the dot-coms had ceased trading, after having burnt through their venture capital and IPO capital, often without ever making a profit.

Worldwide Online Population Forecast

In its "Worldwide Online Population Forecast, 2006 to 2011," JupiterResearch anticipates that a 38 percent increase in the number of people with online access will mean that, by 2011, 22 percent of the Earth's population will surf the Internet regularly.

JupiterResearch says the worldwide online population will increase at a compound annual growth rate of 6.6 percent during the next five years, far outpacing the 1.1 percent compound annual growth rate for the planet's population as a whole. The report says 1.1 billion people currently enjoy regular access to the Web.
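
The two figures quoted above fit together: growing 1.1 billion users at 6.6 percent a year for five years gives roughly the 38 percent overall increase the report forecasts, as this small check shows.

# Sanity-check of the JupiterResearch figures quoted above.
current_users = 1.1e9        # 1.1 billion regular users today, per the report
cagr = 0.066                 # 6.6 percent compound annual growth rate
years = 5

projected = current_users * (1 + cagr) ** years
increase = projected / current_users - 1
print(f"Projected 2011 users: {projected / 1e9:.2f} billion ({increase:.0%} increase)")
# Prints roughly 1.51 billion, about a 38% increase.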

North America will remain on top in terms of the number of people with online access. According to JupiterResearch, online penetration rates on the continent will increase from the current 70 percent of the overall North American population to 76 percent by 2011. However, Internet adoption has "matured," and its adoption pace has slowed, in more developed countries including the United States, Canada, Japan and much of Western Europe, notes the report.

As the online population of the United States and Canada grows by about only 3 percent, explosive adoption rates in China and India will take place, says JupiterResearch. The report says China should reach an online penetration rate of 17 percent by 2011 and India should hit 7 percent during the same time frame. This growth is directly related to infrastructure development and increased consumer purchasing power, notes JupiterResearch.

By 2011, Asians will make up about 42 percent of the world's population with regular Internet access, 5 percent more than today, says the study.

Penetration levels similar to North America's are found in Scandinavia and bigger Western European nations such as the United Kingdom and Germany, but JupiterResearch says that a number of Central European countries "are relative Internet laggards." Brazil "with its soaring economy," is predicted by JupiterResearch to experience a 9 percent compound annual growth rate, the fastest in Latin America, but China and India are likely to do the most to boost the world's online penetration in the near future.

For the study, JupiterResearch defined "online users" as people who regularly access the Internet by "dedicated Internet access" devices. Those devices do not include cell phones.[41]

Historiography

Some concerns have been raised over the historiography of the Internet's development. This is due to lack of centralised documentation for much of the early developments that led to the Internet.

"The Arpanet period is somewhat well documented because the corporation in charge - BBN - left a physical record. Moving into the NSFNET era, it became an extraordinarily decentralised process. The record exists in people's basements, in closets. [...] So much of what happened was done verbally and on the basis of individual trust."
—Doug Gale, [42]

Footnotes

1.^ J. C. R. Licklider (1960). "Man-Computer Symbiosis".
2.^ An Internet Pioneer Ponders the Next Revolution. Retrieved on November 25, 2005.
3.^ About RAND. Paul Baran and the Origins of the Internet. Retrieved on January 14, 2006.
4.^ "The history of the Internet," http://www.lk.cs.ucla.edu/personal_history.html
5.^ Hafner, Katie (1998). Where Wizards Stay Up Late: The Origins Of The Internet. Simon & Schuster. ISBN 0-68-483267-4.
6.^ Ronda Hauben (2001). "From the ARPANET to the Internet".
7.^ Events in British Telecomms History. Retrieved on November 25, 2005.
8.^ Barry M. Leiner, Vinton G. Cerf, David D. Clark, Robert E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel, Larry G. Roberts, Stephen Wolff (2003). "A Brief History of Internet".
9.^ "The Yale Book of Quotations" (2006), Yale University Press, edited by Fred R. Shapiro.
10.^ Computer History Museum and Web History Center Celebrate 30th Anniversary of Internet Milestone. Retrieved on November 22, 2007.
11.^ Jon Postel, NCP/TCP Transition Plan, RFC 801.
12.^ David Roessner, Barry Bozeman, Irwin Feller, Christopher Hill, Nils Newman (1997). "The Role of NSF's Support of Engineering in Enabling Technological Innovation".
13.^ RFC 675 - Specification of Internet Transmission Control Program.
14.^ Tanenbaum, Andrew S. (1996). Computer Networks. Prentice Hall. ISBN 0-13-394248-1.
15.^ Mike Muuss (5 January 1983). "Aucbvax.5690 TCP-IP Digest, Vol 1 #10". fa.tcp-ip. (Web link).
16.^ Ben Segal (1995). "A Short History of Internet Protocols at CERN".
17.^ Internet History in Asia. 16th APAN Meetings/Advanced Network Conference in Busan. Retrieved on December 25, 2005.
18.^ ICONS webpage.
19.^ Nepad, Eassy partnership ends in divorce, (South African) Financial Times FMTech, 2007.
20.^ APRICOT webpage.
21.^ A brief history of the Internet in China. China celebrates 10 years of being connected to the Internet. Retrieved on December 25, 2005.
22.^ Best Internet Communications: Press Release: Low Cost Web Site.
23.^ DDN NIC. IAB Recommended Policy on Distributing Internet Identifier Assignment. Retrieved on December 26, 2005.
24.^ GSI-Network Solutions. Transition of NIC Services. Retrieved on December 26, 2005.
25.^ Thomas v. NSI, Civ. No. 97-2412 (TFH), Sec. I.A. (DCDC April 6, 1998).
26.^ NIS Manager Award Announced. NSF Network Information Services Awards. Retrieved on December 25, 2005.
27.^ The Risks Digest. Great moments in e-mail history. Retrieved on April 27, 2006.
28.^ The History of Electronic Mail. Retrieved on December 23, 2005.
29.^ The First Network Email. Retrieved on December 23, 2005.
30.^ Vannevar Bush (1945). "As We May Think".
31.^ Douglas Engelbart (1962). "Augmenting Human Intellect: A Conceptual Framework".
32.^ The Early World Wide Web at SLAC: Documentation of the Early Web at SLAC. Retrieved on November 25, 2005.
33.^ Mosaic Web Browser History - NCSA, Marc Andreessen, Eric Bina.
34.^ NCSA Mosaic - September 10, 1993 Demo.
35.^ Vice President Al Gore's ENIAC Anniversary Speech.
36.^ UCLA Center for Communication Policy.
37.^ Mirror of Official site map.
38.^ Mirror of Official Site.
39.^ "24 Hours in Cyberspace" (and more).
40.^ The human face of cyberspace, painted in random images.
41.^ Brazil, Russia, India and China to Lead Internet Growth Through 2011.
42.^ An Internet Pioneer Ponders the Next Revolution. Illuminating the net's Dark Ages. Retrieved on February 26, 2008.

( from www.wikipedia.org)

Thursday, July 3, 2008

Networking


Networks can be categorized in several different ways. One approach defines the type of network according to the geographic area it spans. Local area networks (LANs), for example, typically span a single home, school, or office building, whereas wide area networks (WANs) reach across cities, states, or even the whole world. The Internet is the world's largest public WAN.



Network Design
Computer networks also differ in their design. The two types of high-level network design are called client-server and peer-to-peer. Client-server networks feature centralized server computers that store email, Web pages, files, and applications. On a peer-to-peer network, conversely, all computers tend to support the same functions. Client-server networks are much more common in business, and peer-to-peer networks are much more common in homes.
A network's topology represents its layout or structure from the point of view of data flow. In so-called bus networks, for example, all of the computers share and communicate across one common conduit, whereas in a star network all data flows through one centralized device. Common types of network topologies include bus, star, ring and mesh.
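As a small illustration (not part of the original article), a topology can be written down as an adjacency list showing which devices connect to which; the device names below are invented for the example:

  # Hypothetical star topology: every host connects only to a central switch.
  star = {
      "switch": ["pc1", "pc2", "pc3"],
      "pc1": ["switch"],
      "pc2": ["switch"],
      "pc3": ["switch"],
  }

  # Hypothetical ring topology: each host connects to exactly two neighbours.
  ring = {
      "pc1": ["pc2", "pc4"],
      "pc2": ["pc1", "pc3"],
      "pc3": ["pc2", "pc4"],
      "pc4": ["pc3", "pc1"],
  }

  # In the star, all data flows through the single most-connected device.
  hub = max(star, key=lambda node: len(star[node]))
  print(hub)   # prints "switch"

The same idea extends to bus and mesh layouts; only the pattern of connections changes.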


Network Protocols
In networking, the communication language used by computer devices is called the protocol. Yet another way to classify computer networks is by the set of protocols they support. Networks often implement multiple protocols to support specific applications. Popular protocols include TCP/IP, the most common protocol found on the Internet and in home networks.
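As a minimal sketch of TCP/IP in a client-server design, the following uses Python's standard socket module (the address, port number, and message are arbitrary choices for the example; run the server in one process and the client in another):

  import socket

  HOST, PORT = "127.0.0.1", 50007   # arbitrary local address and port for the example

  def run_server():
      # Accept one connection and echo back whatever is received.
      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
          srv.bind((HOST, PORT))
          srv.listen(1)
          conn, _addr = srv.accept()
          with conn:
              data = conn.recv(1024)
              conn.sendall(data)

  def run_client():
      # Connect to the server, send a message, and print the echoed reply.
      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
          cli.connect((HOST, PORT))
          cli.sendall(b"hello over TCP/IP")
          print(cli.recv(1024))

Here TCP provides the reliable, ordered byte stream between the two programs, while IP carries the individual packets between the hosts, which is the division of labour the TCP/IP protocol suite is named for.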
Wired vs Wireless Networking
Many of the same network protocols, like TCP/IP, work in both wired and wireless networks. Networks built with Ethernet cables predominated in businesses, schools, and homes for several decades, but wireless alternatives have recently emerged as the preferred technology for building new computer networks.

( from www.wikipedia.org)

Computer Application

COMPUTER HARDWARE

Computer hardware is the physical part of a computer, including its digital circuitry, as distinguished from the computer software that executes within the hardware. The hardware of a computer is infrequently changed, in comparison with software and data, which are "soft" in the sense that they are readily created, modified or erased on the computer. Firmware is a special type of software that rarely, if ever, needs to be changed and so is stored on hardware devices such as read-only memory (ROM), where it is not readily changed (and is, therefore, "firm" rather than just "soft").
Most computer hardware is not seen by normal users. It is in embedded systems in automobiles, microwave ovens, electrocardiograph machines, compact disc players, and other devices. Personal computers, the computer hardware familiar to most people, form only a small minority of computers (about 0.2% of all new computers produced in 2003).


Typical PC hardware

A typical personal computer consists of a case or chassis (in a tower or desktop form factor) and the following parts:


Motherboard

Motherboard - the "body" of the computer: the main circuit board through which all other components interface.
Central processing unit (CPU) - Performs most of the calculations which enable a computer to function, sometimes referred to as the "brain" of the computer.
Computer fan - Used to lower the temperature of the computer; a fan is almost always attached to the CPU, and the computer case will generally have several fans to maintain a constant airflow. Liquid cooling can also be used to cool a computer, though it focuses more on individual parts rather than the overall temperature inside the chassis.
Random Access Memory (RAM) - Fast-access memory that is cleared when the computer is powered-down. RAM attaches directly to the motherboard, and is used to store programs that are currently running.
Firmware - software stored in read-only memory (ROM), such as the Basic Input-Output System (BIOS) or, in newer systems, Extensible Firmware Interface (EFI) compliant firmware.
Internal Buses - Connections to various internal components.
PCI
PCI-E
USB
HyperTransport
CSI (expected in 2008)
AGP (being phased out)
VLB (outdated)
External Bus Controllers - used to connect to external peripherals, such as printers and input devices. These ports may also be based upon expansion cards, attached to the internal buses.
parallel port (outdated)
serial port (outdated)
USB
FireWire
SCSI (On Servers and older machines)
PS/2 (For mice and keyboards, being phased out and replaced by USB.)
ISA (outdated)
EISA (outdated)
MCA (outdated)

Power supply

Main article: Computer power supply
Includes the case's power switch and (usually) a cooling fan, and supplies power to run the rest of the computer. The most common older types of power supplies are AT and Baby AT, but the current standards for PCs are ATX and Micro ATX.

Storage controllers

Controllers for the hard disk, CD-ROM and other drives (such as internal Zip and Jaz drives) are conventionally IDE/ATA on a PC; the controllers sit directly on the motherboard (on-board) or on expansion cards, such as a disk array controller. IDE is usually integrated, unlike SCSI, which is found in most servers. The floppy drive interface is a legacy MFM interface which is now slowly disappearing. All these interfaces are gradually being phased out in favour of SATA and SAS.

Video display controller

Main article: Graphics card
Produces the output for the visual display unit. This will either be built into the motherboard or attached in its own separate slot (PCI, PCI-E, PCI-E 2.0, or AGP), in the form of a Graphics Card.

Removable media devices

Main article: Computer storage
CD (compact disc) - the most common type of removable media, inexpensive but has a short life-span.
CD-ROM Drive - a device used for reading data from a CD.
CD Writer - a device used for both reading and writing data to and from a CD.
DVD (digital versatile disc) - a popular type of removable media that is the same dimensions as a CD but stores up to 6 times as much information. It is the most common way of transferring digital video.
DVD-ROM Drive - a device used for reading data from a DVD.
DVD Writer - a device used for both reading and writing data to and from a DVD.
DVD-RAM Drive - a device used for rapid writing and reading of data from a special type of DVD.
Blu-ray - a high-density optical disc format for the storage of digital information, including high-definition video.
BD-ROM Drive - a device used for reading data from a Blu-ray disc.
BD Writer - a device used for both reading and writing data to and from a Blu-ray disc.
HD DVD - a high-density optical disc format and successor to the standard DVD. It was a discontinued competitor to the Blu-ray format.
Floppy disk - an outdated storage device consisting of a thin disk of a flexible magnetic storage medium.
Zip drive - an outdated medium-capacity removable disk storage system, first introduced by Iomega in 1994.
USB flash drive - a flash memory data storage device integrated with a USB interface, typically small, lightweight, removable, and rewritable.
Tape drive - a device that reads and writes data on a magnetic tape, usually used for long term storage.

Internal storage

Hardware that keeps data inside the computer for later use and remains persistent even when the computer has no power.
Hard disk - for medium-term storage of data.
Solid-state drive - a device similar to hard disk, but containing no moving parts.
Disk array controller - a device to manage several hard disks, to achieve performance or reliability improvement.

Sound card

Main article: Sound card
Enables the computer to output sound to audio devices, as well as accept input from a microphone. Most modern computers have sound cards built-in to the motherboard, though it is common for a user to install a separate sound card as an upgrade.

Networking

Main article: Computer networks
Connects the computer to the Internet and/or other computers.
Modem - for dial-up connections
Network card - for DSL/Cable internet, and/or connecting to other computers.
Direct cable connection - uses a null modem cable to connect two computers via their serial ports, or a Laplink cable to connect them via their parallel ports.
Connections to the Internet may be either dial-up or broadband.

( from www.wikipedia.org)

Computer

A programmable machine. The two principal characteristics of a computer are:
It responds to a specific set of instructions in a well-defined manner.
It can execute a prerecorded list of instructions (a program).

Modern computers are electronic and digital.
The actual machinery (wires, transistors, and circuits) is called hardware; the instructions and data are called software.
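To make the distinction concrete, here is a toy sketch of a "computer" executing a prerecorded list of instructions; the three-instruction machine below is invented purely for illustration and does not correspond to any real hardware:

  # A toy machine that executes a prerecorded list of instructions (a program).
  # The instruction set (LOAD, ADD, PRINT) is made up for this example.
  def run(program):
      accumulator = 0
      for instruction, operand in program:
          if instruction == "LOAD":
              accumulator = operand
          elif instruction == "ADD":
              accumulator += operand
          elif instruction == "PRINT":
              print(accumulator)

  # The program is the "software"; the run() function plays the role of the hardware.
  run([("LOAD", 2), ("ADD", 3), ("PRINT", None)])   # prints 5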

All general-purpose computers require the following hardware components:
memory : Enables a computer to store, at least temporarily, data and programs.
mass storage device : Allows a computer to permanently retain large amounts of data. Common mass storage devices include disk drives and tape drives.
input device : Usually a keyboard and mouse, the input device is the conduit through which data and instructions enter a computer.
output device : A display screen, printer, or other device that lets you see what the computer has accomplished.
central processing unit (CPU): The heart of the computer, this is the component that actually executes instructions.

In addition to these components, many others make it possible for the basic components to work together efficiently.

For example, every computer requires a bus that transmits data from one part of the computer to another.
Computers can be generally classified by size and power as follows, though there is considerable overlap:
personal computer : A small, single-user computer based on a microprocessor. In addition to the microprocessor, a personal computer has a keyboard for entering data, a monitor for displaying information, and a storage device for saving data.
workstation : A powerful, single-user computer. A workstation is like a personal computer, but it has a more powerful microprocessor and a higher-quality monitor.
minicomputer : A multi-user computer capable of supporting from 10 to hundreds of users simultaneously.
mainframe : A powerful multi-user computer capable of supporting many hundreds or thousands of users simultaneously.
supercomputer : An extremely fast computer that can perform hundreds of millions of instructions per second.


Introduction of ICT

Information Technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.
Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term is more recognizable than ever before. The information technology umbrella can be quite large, covering many fields. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems. When computer and communications technologies are combined, the result is information technology, or "infotech". Information Technology (IT) is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. When speaking of Information Technology (IT) as a whole, it is understood that the use of computers and the handling of information are closely associated.
