Imagine never having to buy a new computer again. Imagine never having to set foot in a store to buy software again. Imagine a pocket PC with the power of a desktop computer. These ideas are the basis for a revolutionary worldwide change in both networking and computing. From a communication system designed to survive nuclear war, the Internet has evolved into a diverse communication tool. As it continues to expand toward its limits, the need for an entirely new Internet has become apparent. The Internet2 is an ongoing effort to connect computers and share data in ways never attempted in the past. As this new Internet develops, it will likely have far-reaching effects on society, including its economics and culture.

In 1957, the Soviet Union launched the first satellite, Sputnik, into space. As Russians marveled at their accomplishment, Americans stared at the sky in fear. The feeling of being invulnerable to enemy missile attacks evaporated overnight, and the need for some sort of attack defense system became apparent to the US government. If a nuclear attack were to happen, “post-nuclear America would need a command-and-control network, linked from city to city, state to state, base to base” (Sterling, sec. 1). The Advanced Research Projects Agency (ARPA) was formed by the government to advance the United States’ technology and to prevent the country from being “surprised” by an attack. The agency employed top scientists in America’s defense effort, and it was soon evident that “advanced computing would come to dominate their work” (Griffiths, par. 3).

Leonard Kleinrock, an MIT scientist, was appointed to start a computer research program dedicated to creating a network of computers that could send data back and forth amongst themselves. This development led to ARPANET (Advanced Research Projects Agency Network), a synthesis of these scientists’ best ideas. It would send and receive messages by breaking them into packets of data and moving those packets across the network. Each packet was separately addressed, starting at one point and ending at another, but could take any route through the network between the two (Sterling, sec. 1). ARPANET would also “apply state-of-the-art technology to US defense and to avoid being surprised by technological advances of the enemy” (Griffiths, par. 2).

In “December, 1971 ARPANET linked 23 host computers to each other” (Griffiths, par. 5). This was a huge step in computer networking. It soon became apparent that scientists were using ARPANET for more than just work purposes, treating it as a high-speed “electronic post-office” (Sterling, sec. 1). To many people’s surprise, “the main traffic on ARPANET was not long-distance computing. Instead, it was news and personal messages. Researchers were using ARPANET to collaborate on projects, to trade notes on work, and eventually, to downright gossip and schmooze” (Sterling, sec. 1).

ARPANET soon became popular due to its ease of use. People no longer needed the same type of computer to be on the network; the only requirement was that the computer speak the “computer language” the network used, called the Network Control Protocol (NCP). If a computer was on the ARPANET, it most likely used NCP. NCP broke the information being sent over the network into pieces, and the receiving computer then gathered and reassembled these packets. Not every computer on the ARPANET used NCP, however, which meant that not all computers could exchange data with one another. As ARPANET grew, NCP was “superceded by a higher-level, more sophisticated standard known as TCP/IP (Transmission Control Protocol/Internet Protocol)” (Sterling, sec. 1). It was clear that TCP/IP should become the new standard because it was far superior to the older protocols. With this switch, all computers became network compatible, and all users could now send and receive packets of data from one another.
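The packet idea described above can be sketched in a few lines: a message is split into separately numbered pieces, the pieces may arrive in any order, and the receiver reassembles them by sequence number. This is a simplified illustration only, not actual NCP or TCP/IP code; the chunk size and function names are invented for the example.

```python
import random

def split_into_packets(message: bytes, size: int = 8):
    """Break a message into numbered packets, as NCP and TCP/IP do."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """The receiver sorts packets by sequence number and rejoins the data."""
    return b"".join(data for _, data in sorted(packets))

packets = split_into_packets(b"Packets may take any route through the network.")
random.shuffle(packets)        # simulate packets arriving out of order
message = reassemble(packets)  # the original message, restored
```

Because each packet carries its own sequence number, the route it takes and the order it arrives in do not matter, which is exactly what made the network resilient.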

In 1972, ARPANET was publicly demonstrated for the first time, a crucial stage in the development of networking; the TCP/IP standard that later replaced NCP would turn it into the Internet we know. The Internet was very affordable for the resources it made available: it did not charge for access time or long-distance phone service, so users only had to worry about their own computer and their own section of line (Griffiths, par. 21). Electronic mail (e-mail) became available in 1972 as well. E-mail was regularly compared to the US Mail, “which is scornfully known by Internet regulars as ‘snailmail’” since e-mail was, and is, much faster (Griffiths, par. 27). The Internet was still mainly science-led, but it was gradually starting to catch the eye of ordinary people. In 1984, the Domain Name System (DNS) was introduced, a tiered addressing scheme that maps human-readable names, with extensions such as .edu, .com, .gov, and .org, to the numeric addresses computers use. Instead of manually typing in numeric addresses, users could now reach sites by name, making it far easier to find them. By 1989, “the number of hosts surpassed 100,000 for the first time and had climbed to 300,000 a year later” (Griffiths, par. 17).
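The tiered structure DNS introduced can be illustrated with a toy resolver: a name like www.mit.edu is read right to left, walking from the top-level domain (.edu) down to the individual host. This is a sketch with a made-up name table and made-up addresses, not the real distributed DNS protocol.

```python
# A tiny in-memory name table standing in for the distributed DNS database.
# The addresses here are invented for illustration.
NAME_TABLE = {
    "edu": {"mit": {"www": "18.0.0.1"}},
    "gov": {"nasa": {"www": "192.0.2.7"}},
}

def resolve(name: str) -> str:
    """Walk the name right to left: top-level domain, then each label below."""
    node = NAME_TABLE
    for label in reversed(name.split(".")):
        node = node[label]
    return node

resolve("www.mit.edu")  # walks edu -> mit -> www and returns "18.0.0.1"
```

The hierarchy is the point: each level only needs to know about the level directly beneath it, which is what lets the real system scale to millions of hosts.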

In the early 1990s the World Wide Web (WWW), still in use today, was released. The World Wide Web is a network of sites that can be searched and retrieved using a special protocol known as the Hypertext Transfer Protocol (HTTP) (Griffiths, par. 8). This was the first step toward browsers such as Internet Explorer and Netscape Navigator. File transferring also became a huge hit and remains one of the main uses of the Internet today. With files getting bigger each day, the time required to transfer data between people keeps growing. Researchers have been developing a concept called the Internet2 to break this bottleneck.

With the current Internet facing many problems that cannot be easily solved, researchers have been scrambling to create a better Internet. Unfortunately, the current Internet is required for everyday use and is not suitable for testing new Internet technologies. Thanks to the support of numerous universities and private researchers, new high-speed networks, such as the National LambdaRail (NLR) and the Abilene Network, have been installed covering most urban areas of the continental United States. The National LambdaRail has been constructed using ultra-fast fiber-optic cables currently capable of transferring data at up to 400 Gb/s (gigabits per second). Luis Villazon of Maximum PC explains, “Each link is made up of a pair of humongous fiber-optic cables that use Dense Wave Division Multiplexing (DWDM) to support a total of 40 simultaneous channels each at 10 Gb/s using different wavelengths” (Villazon, 41). Similar to splitting sunlight into its individual colors, DWDM works by using a grating (a prism-like surface that reflects light at different angles) to separate the light into 40 different colors, each of which serves as an independent channel (Fiber Optic, sec. 2). Because these optic cables can support 40 different channels, this one network can be used to test multiple projects at once without their interfering with one another.
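The capacity figure quoted above follows directly from multiplying the channel count by the per-channel rate, as this small check shows (units are gigabits per second, matching the Villazon quote):

```python
channels = 40        # wavelengths carried by one DWDM fiber pair
rate_gbps = 10       # gigabits per second per channel
total_gbps = channels * rate_gbps
print(total_gbps)    # 400 Gb/s aggregate, matching the NLR figure
```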

Researchers are working on many different projects on NLR, including expanding Internet Protocol address space (IP addresses are the unique numbers that identify a computer on a network) and increasing the reliability of Internet packet delivery (Quality of Service). Villazon points out,

“One of the projects even treats the entire LambdaRail as a local-area network. It actually uses Ethernet protocols over a network that’s 3,000 miles from end to end. Imagine a network where you could run Microsoft Word in New York, but save your documents on a file server in L.A. Or switch on a diskless workstation in Houston and boot it with an operating system loaded directly from Microsoft in Seattle” (Villazon, 41).

With the speed of the network growing rapidly, effort has shifted to designing a new, more efficient Internet, dubbed the Internet2. While in its developmental stage, this system remains more commonly known as “The Grid.” The Grid is a radical new concept that allows not only Internet packets to be shared (as on the current Internet), but also CPU (central processing unit) processing power. The system works by distributing work across the network, taking advantage of idle computers whose CPUs are not in use. This also means that as more computers are connected to a Grid, the overall processing power of the system increases. This is very beneficial for companies because they can cut the overhead cost of buying and maintaining extra computers. Werner Ederer, IBM’s Program Manager for Grid, explains, “take any medium-sized organization, because their load distribution is not predictable, they have to keep up to 40 times more computing power than they need” (Villazon, 43).
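The load-sharing idea behind the Grid can be sketched with a local worker pool standing in for the network: jobs are handed to whichever workers are idle, so adding workers adds capacity. This is only an analogy; real Grid middleware schedules jobs across separate machines, not threads on one computer, and the job here is an invented stand-in.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_job(n: int) -> int:
    """Stand-in for a CPU-heavy task a Grid node might run."""
    return sum(i * i for i in range(n))

jobs = [10_000, 20_000, 30_000, 40_000]

# The pool hands each job to whichever worker is free,
# just as the Grid borrows whichever CPUs happen to be idle.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(simulate_job, jobs))
```

Raising `max_workers` (or, on a real Grid, connecting more machines) increases throughput without any one computer needing to be faster, which is exactly the overhead saving Ederer describes.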

The TeraGrid is an example of how this advanced Grid technology can be put to use. It is a multimillion-dollar “computing and storage infrastructure designed to engage the science and engineering community … to catalyze new discoveries” (TeraGrid ‘About’, par. 1). It is composed of five sites across the United States, and as of spring 2004 was capable of processing 20 teraflops (trillions of operations per second) and storing nearly a petabyte, or roughly one million gigabytes, of data. The system continues to expand its capabilities, as one news report explains,

“The systems currently in production represent the first of two deployments, with the completed TeraGrid scheduled to provide over 20 teraflops of capability. The phase two hardware, which will add more than 11 teraflops of capacity, was installed in December 2003 and is scheduled to be available to the research community in spring of 2004.” (First Phase, par. 3)

All of this computing power means better modeling, including climate maps, simulations of stars, fluid dynamics, and even simulations as complex as the Big Bang.

With Grid computing networks in place, many aspects of daily life will be affected. The network will likely be set up in a fashion similar to today’s power grid: users will be able not only to buy processing power as needed from a centralized network, but also to sell their own spare processing power back to it. With the ability to buy as much CPU power as needed, handheld devices such as Pocket PCs will be able to harness the power of a present-day supercomputer. No longer will our devices be limited by the speed of their internal CPUs.

Ask any experienced Internet user to name its most exciting feature, and the answer is almost always the same: “its future.” Currently, a basic foundation for what will be the second Internet, or “Internet2,” has been laid out and is now going through test phases. As it slowly approaches its public debut, questions arise: Just what does the future hold for the Internet? How will society and everyday life be affected by the changeover to this new system? What are the possibilities for the new Internet2? It seems the possibilities are endless. As the second Internet grows, expands, and changes, the culture and society around it will grow and change with it. The changes will range from big to small, touching everything from the way we vote to the way our education system is run. For example, experts surveyed about what the future of the Internet holds for education “predict virtual classes will become more widespread, with students grouped by interest and skill in the future, rather than by age” (Boese, par. 10). This is just a fraction of what could happen when the Internet2 is widespread.

The experts surveyed made other predictions as well. Most predicted that “the most radical changes caused by the Internet2 will hit news organizations and publishing, citing blogs as a catalyst” (Boese, par. 7), because they believe the way the media relays information to the public will change drastically. Some of the experts surveyed were even so bold as to say,

“Soon being offline will not be an option. ... There will be huge demand for: security, wireless access, and entertainment. Advertisers will continue to flee print and broadcast media, fracturing that market and forcing them into niches. When everything is available to everyone at the same time there will be no dominant killer-advertising channel.” (Boese, par. 22)

This is significant because it will radically change the way we experience the entertainment industry and the media. The television industry will also be greatly affected by the Internet2; experts surveyed believe “video blogs would replace television channels” (Boese, par. 8). Gilmour defines blogs as,

“an online journal comprised of links and postings in reverse chronological order, meaning the most recent posting appears at the top of the page. As Meg Hourihan, co-founder of Pyra Labs, the blogging software company acquired by Google in February 2003, has noted, weblogs are ‘post-centric’ -- the posting is the key unit -- rather than ‘page-centric,’ as with more traditional websites. Weblogs typically link to other websites and blog postings, and many allow readers to comment on the original post, thereby allowing audience discussions.” (Gilmour, Cited in ‘What is a Blog’)

Applying this format to video as well as text would mean the demand for TV shows would decrease drastically. That decrease would reduce income from ad revenue, ultimately leading to the demise of television.

Already, scientists from America and Europe have used the Internet2 to “transfer 859 gigabytes of data in less than 17 minutes,” a record speed that is “equivalent to transferring a full-length DVD movie in four seconds” (Kuchinskas, par. 7). Having widespread access to such speeds could mean never having to wait for massive amounts of data again. This is significant: it could mean never having to go to a computer store to buy software again, almost entirely eliminating the need for compact discs. Regardless of the amount of data, software could be sent to your computer almost instantly. Whatever the future truly holds for this new Internet, one thing is certain: life as we know it will be changed forever.
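The quoted record implies a sustained rate that back-of-the-envelope arithmetic can check, taking the source's figures at face value (859 gigabytes in 17 minutes, and roughly 4.7 GB for a DVD):

```python
data_gb = 859                 # gigabytes transferred in the record run
seconds = 17 * 60             # "less than 17 minutes"
rate_gb_per_s = data_gb / seconds
print(round(rate_gb_per_s, 2))   # prints 0.84 (GB per second, sustained)

# At that rate a 4.7 GB DVD takes 4.7 / 0.84, about 5.6 seconds,
# the same order as the "full-length DVD movie in four seconds" comparison.
```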

It is amazing how a simple communication system has evolved into a diverse and powerful tool used by millions each day. A network originally intended to keep communications alive after a nuclear attack is now used every day. Due to the limitations of the current World Wide Web, there is a strong need for an entirely new network. Grid computing continues to aid numerous research projects by providing scientists with the necessary computing power. The Internet2 will revolutionize the way we connect by sharing both Internet packets and processing power. It continues to grow with support from numerous research communities and will soon be woven into the fabric of society.

Authors: Evan Demorest, Eric Kubacki, Petir Abdal

Works Cited

Boese, Christene. “The Internet imagined: 'We are immigrants to the future.'” January 26, 2005. March 15, 2005.

“The Fiber Optic Association.” 2003. April 4, 2005.

“First Phase of TeraGrid Goes into Production.” January 26, 2004. TeraGrid. March 15, 2005.

Griffiths, Richard T. “From ARPANET to the World Wide Web.” October 2002. March 15, 2005.

Kuchinskas, Susan. “Scientists Set Internet2 Speed Record.” September 2, 2004. March 17, 2005.

“National LambdaRail Homepage.” n.d. March 15, 2005.

Sterling, Bruce. “Short History of the Internet.” Fantasy and Science Fiction. February 1993. March 20, 2003.

“TeraGrid ’About’.” n.d. March 23, 2005.

Villazon, Luis. “Internet 2: Son of Internet.” Maximum PC. Mar. 2005: 40-44.

“What is a Blog?” n.d. March 17, 2005.