IILM INSTITUTE FOR HIGHER EDUCATION, GURGAON
TABLE OF CONTENTS
1. INTRODUCTION
2. STAGES OF DEVELOPMENT
3. CURRENT SNAPSHOT
4. FAR REACHING APPLICATIONS
5. INTERNET MENACE
6. FUTURE VISION
7. REFERENCES
INTRODUCTION
WHAT IS THE INTERNET?
The Internet, in simplest terms, is the large group of millions of computers around the world that are all connected to one another. These computers are connected by phone lines, fiber optic lines, coaxial cable, satellites, and wireless connections. Key characteristics and services of the Internet are:
• The largest network of networks in the world.
• Uses TCP/IP protocols and packet switching.
• Runs on any communications substrate.
• The World Wide Web (the web or WWW)
• Electronic Mail (E-Mail)
• File Transfer Protocol (FTP)
• Internet Relay Chat (IRC)
• USENET (a news service)
The World Wide Web: The WWW is the reason the Internet has become as popular as it has. This is the part of the Internet that the majority of users see: the websites and the pages that make them up. The web is the most widely used service of the Internet, accessed through a web browser like Internet Explorer or Netscape Navigator. These pieces of software are gradually integrating other parts of the Internet into them (most notably email and FTP), so that eventually we will have one interface to the entire array of services the Internet offers. The web is an immense collection of web pages, linked together with hypertext links. Thousands of new pages of information are added to the ever-growing web every hour. Each page is placed on a server, a computer continually connected to the rest of the web. The information is then available to anyone else with access to the Internet. Web pages can have a mixture of text, graphics and multimedia. Nowadays, there is information on practically anything you could be interested in available somewhere on the web. You can use a search engine to find what you want.
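As an illustration of how a page and its hypertext links are retrieved, the short Python sketch below fetches a page over HTTP and picks out the link targets embedded in it. It uses only the standard library; the URL is just a placeholder, not a site mentioned in this report.

```python
# A minimal sketch of fetching a web page and finding its hypertext links.
# The URL below is a placeholder.
import re
import urllib.request

url = "http://example.com/"          # hypothetical page address
with urllib.request.urlopen(url) as response:
    html = response.read().decode("utf-8", errors="replace")

# Hypertext links are ordinary <a href="..."> markers inside the page.
links = re.findall(r'href="([^"]+)"', html)
print(f"Fetched {len(html)} characters; found {len(links)} links:")
for link in links:
    print(" ", link)
```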
E-mail: Electronic mail works in much the same way as traditional mail (now charmingly labelled 'snail mail') does. Anyone can sign up for an email address, and people can then send you messages or attach files from their computer and send those too. The main benefit of email is the near-instantaneous delivery of messages: you can send an email to the other side of the world and it will arrive in less than a minute. You can also sign up to weekly newsletters and have the information you want delivered right to your computer.
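The sketch below shows, in hedged form, how a program sends such a message using Python's standard smtplib and email modules. The server name, port, addresses and credentials are all placeholders; a real mail provider's settings would be needed.

```python
# A minimal sketch of sending an email programmatically.
# All names and credentials below are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"        # hypothetical sender
msg["To"] = "bob@example.org"            # hypothetical recipient
msg["Subject"] = "Hello from the other side of the world"
msg.set_content("Email usually arrives in well under a minute.")

# Most providers require an authenticated, TLS-protected connection.
with smtplib.SMTP("smtp.example.com", 587) as server:   # placeholder server
    server.starttls()
    server.login("alice@example.com", "app-password")    # placeholder login
    server.send_message(msg)
```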
File Transfer Protocol: While web pages are transferred between computers using the HTTP protocol, other types of files are sent using FTP. People can share files, such as music and videos, with each other and the rest of the world by uploading them to a server and allowing others to download them to their own computers.
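A hedged sketch of the download side of this, using Python's standard ftplib module, is shown below. The host, login and file name are invented placeholders rather than real servers.

```python
# A minimal sketch of an FTP download with Python's ftplib.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:          # hypothetical FTP server
    ftp.login("anonymous", "guest@example.com")
    ftp.cwd("/pub")                          # change to a shared directory
    with open("sample.mp3", "wb") as local_file:
        # RETR streams the remote file; each chunk is written locally.
        ftp.retrbinary("RETR sample.mp3", local_file.write)
print("Download complete")
```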
Internet Relay Chat:IRC is a service that allows you to connect to your chosen channel and talk in real-time to people with the same interests as you. You can download » mIRC and start chatting right away.
USENET:USENET (Unix User Network) is a system of bulletin boards where you and anyone else can post messages and people will read and reply to them. As with IRC, you will find boards set up for all sorts of groups of people. The search engine Google has set up a web-interface for these discussion boards.
STAGES OF DEVELOPMENT
• STAGE 1: Three terminals and an ARPA
• STAGE 2: Packet switching
• STAGE 3: Networks that led to the Internet
  o 3.1 ARPANET
  o 3.2 X.25 and public access
  o 3.3 UUCP
• STAGE 4: Merging the networks and creating the Internet
  o 4.1 TCP/IP
  o 4.2 ARPANET to several federal wide area networks: MILNET, NSI, and NSFNet
  o 4.3 The transition toward an Internet
• STAGE 5: TCP/IP becomes worldwide
  o 5.1 CERN, the European internet, the link to the Pacific and beyond
  o 5.2 A digital divide
    - 5.2.1 Africa
    - 5.2.2 Asia and Oceania
    - 5.2.3 Latin America
• STAGE 6: Opening the network to commerce
  o 6.1 The IETF and a standard for standards
  o 6.2 NIC, InterNIC, IANA and ICANN
• STAGE 7: Use and culture
  o 7.1 Email and Usenet—the growth of the text forum
  o 7.2 A world library—from Gopher to the WWW
  o 7.3 Finding what you need—the search engine
  o 7.4 The dot-com bubble
  o 7.5 Worldwide online population forecast
1. Three terminals and an ARPA: The Advanced Research Projects Agency (ARPA) was renamed the Defense Advanced Research Projects Agency (DARPA) in 1972. A fundamental pioneer in the call for a global network, J.C.R. Licklider, articulated the idea in his January 1960 paper, Man-Computer Symbiosis: "A network of such [computers], connected to one another by wide-band communication lines" which provided "the functions of present-day libraries together with anticipated advances in information storage and retrieval and [other] symbiotic functions." In October 1962, Licklider was appointed head of the United States Department of Defense's DARPA information processing office, and formed an informal group within DARPA to further computer research. As part of the information processing office's role, three network terminals had been installed: one for System Development Corporation in Santa Monica, one for Project Genie at the University of California, Berkeley, and one for the Multics project at the Massachusetts Institute of Technology (MIT). The need for internetworking was made evident by the problems this caused: each terminal required its own separate procedures and commands to reach its remote system.
2. Packet Switching: At the core of the internetworking problem lay the issue of connecting separate physical networks to form one logical network, with much wasted capacity inside the assorted separate networks. During the 1960s, Donald Davies (NPL), Paul Baran (RAND Corporation), and Leonard Kleinrock (MIT) developed and implemented packet switching. The notion that the Internet was developed to survive a nuclear attack has its roots in the early theories developed by RAND, but it is an urban legend, not supported by any Internet Engineering Task Force or other document. Early networks used for the command and control of nuclear forces were message switched, not packet switched, although current strategic military networks are indeed packet switched and connectionless. Baran's research had approached packet switching from studies of decentralisation to avoid combat damage compromising the entire network.
3. Networks that led to the Internet:• ARPANET:ARPANET became the technical core of what would become the Internet, and a primary tool in developing the technologies used. ARPANET development was centered around the Request for Comments (RFC) process, still used today for proposing and distributing Internet Protocols and Systems. RFC 1, entitled "Host Software", was written by Steve Crocker from the University of
California, Los Angeles, and published on April 7, 1969. These early years were documented in the 1972 film Computer Networks: The Heralds of Resource Sharing.
• X.25 and public access:Following on from ARPA's research, packet switching network standards were developed by the International Telecommunication Union (ITU) in the form of X.25 and related standards. In 1974, X.25 formed the basis for the SERCnet network between British academic and research sites, which later became JANET. The initial ITU Standard on X.25 was approved in March 1976. This standard was based on the concept of virtual circuits. Unlike ARPAnet, X.25 was also commonly available for business use. Telenet offered its Telemail electronic mail service, but this was oriented to enterprise use rather than the general email of ARPANET.
• UUCP:In 1979, two students at Duke University, Tom Truscott and Jim Ellis, came up with the idea of using simple Bourne shell scripts to transfer news and messages on a serial line with nearby University of North Carolina at Chapel Hill. Following public release of the software, the mesh of UUCP hosts forwarding on the Usenet news rapidly expanded. UUCPnet, as it would later be named, also created gateways and links between FidoNet and dial-up BBS hosts. UUCP networks spread quickly due to the lower costs involved, and ability to use existing leased lines, X.25 links or even ARPANET connections. By 1981 the number of UUCP hosts had grown to 550, nearly doubling to 940 in 1984.
4. Merging the networks and creating the Internet: • TCP/IP: Robert Kahn and Vinton Cerf's design reduced the role of the network itself to the bare minimum, making it possible to join almost any networks together, no matter what their characteristics were, thereby solving Kahn's initial problem. DARPA agreed to fund development of prototype software, and after several years of work, the first somewhat crude demonstration of a gateway between the Packet Radio network in the SF Bay area and the ARPANET was conducted. By November 1977 a three-network demonstration was conducted, including the ARPANET, the Packet Radio Network and the Atlantic Packet Satellite network, all sponsored by DARPA. Stemming from the first specifications of TCP in 1974, TCP/IP emerged in mid-to-late 1978 in nearly final form. By 1981, the associated standards were published as RFCs 791, 792 and 793 and adopted for use. DARPA sponsored or encouraged the development of TCP/IP implementations for many operating systems and then scheduled a migration of all hosts on all of its packet networks to TCP/IP. On 1 January 1983, TCP/IP became the only approved protocol suite on the ARPANET, replacing the earlier NCP protocol.
[Figure: Map of the TCP/IP test network in January 1982]
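As a small, self-contained illustration of the protocol suite described above, the Python sketch below runs a toy echo server and a client over the loopback interface: IP carries the packets, and TCP provides the reliable, ordered byte stream. The port number is an arbitrary choice.

```python
# A minimal sketch of TCP/IP in action: a toy echo server and client.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007          # loopback address, arbitrary port

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)                            # listen before the client connects

def echo_once():
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))    # echo the bytes straight back

threading.Thread(target=echo_once, daemon=True).start()

with socket.create_connection((HOST, PORT)) as cli:
    cli.sendall(b"hello over TCP/IP")
    print(cli.recv(1024).decode())       # prints: hello over TCP/IP
srv.close()
```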
• ARPANET to several federal wide area networks: MILNET, NSI, and NSFNet:
After the ARPANET had been up and running for several years, ARPA looked for another agency to hand off the network to; ARPA's primary mission was funding cutting-edge research and development, not running a communications utility. Eventually, in July 1975, the network was turned over to the Defense Communications Agency, also part of the Department of Defense. In 1983, the U.S. military portion of the ARPANET was broken off as a separate network, the MILNET. MILNET subsequently became the unclassified but military-only NIPRNET, in parallel with the SECRET-level SIPRNET and JWICS for TOP SECRET and above. NIPRNET does have controlled security gateways to the public Internet. In 1984 NSF developed CSNET exclusively based on TCP/IP. CSNET connected with ARPANET using TCP/IP, and ran TCP/IP over X.25, but it also supported departments without sophisticated network connections, using automated dial-up mail exchange. This grew into the NSFNet backbone, established in 1986, and intended to connect and provide access to a number of supercomputing centers established by the NSF.
• The transition toward an Internet: The term "Internet" was adopted in the first RFC published on the TCP protocol (RFC 675: Internet Transmission Control Protocol, December 1974). It was around the time when ARPANET was interlinked with NSFNet that the term Internet came into more general use, with "an internet" meaning any network using TCP/IP. "The Internet" came to mean a global and large network using TCP/IP. Previously, "internet" and "internetwork" had been used interchangeably, and "internet protocol" had been used to refer to other networking systems such as Xerox Network Services.
5. TCP/IP becomes worldwide:The first ARPANET connection outside the US was established to NORSAR in Norway in 1973, just ahead of the connection to Great Britain. These links were all converted to TCP/IP in 1982, at the same time as the rest of the Arpanet.
• CERN, the European internet, the link to the Pacific and beyond: Between 1984 and 1988 CERN began installation and operation of TCP/IP to interconnect its major internal computer systems, workstations, PCs and an accelerator control system. CERN continued to operate a limited self-developed system (CERNET) internally and several incompatible (typically proprietary) network protocols externally. There was considerable resistance in Europe towards more widespread use of TCP/IP, and the CERN TCP/IP intranets remained isolated from the rest of the Internet until 1989.
In 1988 Daniel Karrenberg, from CWI in Amsterdam, visited Ben Segal, CERN's TCP/IP Coordinator, looking for advice about the transition of the European side of the UUCP Usenet network (much of which ran over X.25 links) over to TCP/IP. In 1987, Ben Segal had met with Len Bosack from the then still small company Cisco about purchasing some TCP/IP routers for CERN, and was able to give Karrenberg advice and forward him on to Cisco for the appropriate hardware. This expanded the European portion of the Internet across the existing UUCP networks, and in 1989 CERN opened its first external TCP/IP connections. This coincided with the creation of Réseaux IP Européens (RIPE), initially a group of IP network administrators who met regularly to carry out coordination work together. Later, in 1992, RIPE was formally registered as a cooperative in Amsterdam.
• A digital divide: While developed countries with technological infrastructures were joining the Internet, developing countries began to experience a digital divide separating them from the Internet. On an essentially continental basis, they are building organizations for Internet resource administration and sharing operational experience, as more and more transmission facilities go into place.
• Africa: Africa is building an Internet infrastructure. AfriNIC, headquartered in Mauritius, manages IP address allocation for the continent. As in the other Internet regions, there is an operational forum, the Internet Community of Operational Networking Specialists.
• Asia and Oceania: The Asia Pacific Network Information Centre (APNIC), headquartered in Brisbane, Australia, manages IP address allocation for the Asia-Pacific region. APNIC sponsors an operational forum, the Asia-Pacific Regional Internet Conference on Operational Technologies (APRICOT).
• Latin America: As with the other regions, the Latin American and Caribbean Internet Addresses Registry (LACNIC) manages the IP address space and other resources for its area. LACNIC, headquartered in Uruguay, operates DNS root, reverse DNS, and other key services.
6. Opening the network to commerce: During the late 1980s, the first Internet service provider (ISP) companies were formed. Companies like PSINet, UUNET, Netcom, and Portal Software were formed to provide service to the regional research networks and to provide alternate network access, UUCP-based email and Usenet news to the public. The first dial-up ISP on the West Coast, Best Internet (now Verio Communications), opened in 1986. The first dial-up ISP in the East was world.std.com, opened in 1989.
• The IETF and a standard for standards:The Internet has developed a significant subculture dedicated to the idea that the Internet is not owned or controlled by any one person, company, group, or organization. Nevertheless, some standardization and control is necessary for the system to function. The liberal Request for Comments (RFC) publication procedure engendered confusion about the Internet standardization process, and led to more formalization of official accepted standards. The IETF started in January of 1985 as a quarterly meeting of U.S. government funded researchers. Representatives from non-government vendors were invited starting with the fourth IETF meeting in October of that year. In 1992, the Internet Society, a professional membership society, was formed and the IETF was transferred to operation under it as an independent international standards body.
• NIC, InterNIC, IANA and ICANN:The first central authority to coordinate the operation of the network was the Network Information Centre (NIC) at Stanford Research Institute (SRI) in Menlo Park, California. In 1972, management of these issues was given to the newly created Internet Assigned Numbers Authority (IANA). In addition to his role as the RFC Editor, Jon Postel worked as the manager of IANA until his death in 1998. In 1998 both IANA and InterNIC were reorganized under the control of ICANN, a California non-profit corporation contracted by the US Department of Commerce to manage a number of Internet-related tasks. The role of operating the DNS system was privatized and opened up to competition, while the central management of name allocations would be awarded on a contract tender basis.
7. Use and culture • Email and Usenet—The growth of the text forum E-mail is often called the killer application of the Internet. However, it actually predates the Internet and was a crucial tool in creating it. E-mail started in 1965 as a way for multiple users of a time-sharing mainframe computer to communicate. Although the history is unclear, among the first systems to have such a facility were SDC's Q32 and MIT's CTSS. The ARPANET computer network made a large contribution to the evolution of e-mail. There is one report indicating experimental intersystem e-mail transfers on it shortly after ARPANET's creation. In 1971 Ray Tomlinson created what was to become the standard Internet e-mail address format, using the @ sign to separate user names from host names. A number of protocols were developed to deliver e-mail among groups of time-sharing computers over alternative transmission systems, such as UUCP and IBM's VNET e-mail system. E-mail could be passed this way between a number of networks, including ARPANET, BITNET and NSFNet, as well as to hosts connected directly to other sites via UUCP.
The News software developed by Steve Daniel and Tom Truscott in 1979 was used to distribute news and bulletin board-like messages. This quickly grew into discussion groups, known as newsgroups, on a wide range of topics. On ARPANET and NSFNet similar discussion groups would form via mailing lists, discussing both technical issues and more culturally focused topics (such as science fiction, discussed on the sflovers mailing list).
• A world library—From Gopher to the WWW: As the Internet grew through the 1980s and early 1990s, many people realized the increasing need to be able to find and organize files and information. Projects such as Gopher, WAIS, and the FTP Archive list attempted to create ways to organize distributed data. Unfortunately, these projects fell short in being able to accommodate all the existing data types and in being able to grow without bottlenecks. In 1991, Tim Berners-Lee was the first to develop a network-based implementation of the hypertext concept. This was after Berners-Lee had repeatedly proposed his idea to the hypertext and Internet communities at various conferences to no avail; no one would implement it for him. Working at CERN, Berners-Lee wanted a way to share information about their research. By releasing his implementation to public use, he ensured the technology would become widespread. Subsequently, Gopher became the first commonly used hypertext interface to the Internet; while Gopher menu items were examples of hypertext, they were not commonly perceived in that way. One early popular web browser, modeled after HyperCard, was ViolaWWW. Scholars generally agree, however, that the turning point for the World Wide Web was the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a funding program initiated by then-Senator Al Gore's High Performance Computing and Communication Act of 1991, also known as the Gore Bill. Mosaic's graphical interface soon became more popular than Gopher, which at the time was primarily text-based, and the WWW became the preferred interface for accessing the Internet. (Gore's reference to his role in "creating the Internet", however, was later ridiculed during his presidential election campaign.) Mosaic was eventually superseded in 1994 by Andreessen's Netscape Navigator, which replaced Mosaic as the world's most popular browser. Competition from Internet Explorer and a variety of other browsers has since almost completely displaced it. Another important event, held on January 11, 1994, was the Superhighway Summit at UCLA's Royce Hall.
• Finding what you need—The search engine: Even before the World Wide Web, there were search engines that attempted to organize the Internet. The first of these was the Archie search engine from McGill University in 1990, followed in 1991 by WAIS and Gopher. All three of those systems predated the invention of the World Wide Web, but all continued to index the Web and the rest of the Internet for several years after the Web appeared. There are still Gopher servers as of 2006, although there are a great many more web servers. As the Web grew, search engines and Web directories were created to track pages on the Web and allow people to find things. The first full-text Web search engine was WebCrawler in 1994; before WebCrawler, only Web page titles were searched. Another early search engine, Lycos, was created in 1993 as a university project and was the first to achieve commercial success. During the late 1990s, both Web directories and Web search engines were popular; Yahoo! (founded 1995) and AltaVista (founded 1995) were the respective industry leaders. By August 2001, the directory model had begun to give way to search engines, tracking the rise of Google (founded 1998), which had developed new approaches to relevancy ranking. Directory features, while still commonly available, became afterthoughts to search engines.
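The move from title-only search to full-text search rests on the idea of an inverted index. The toy Python sketch below (not any real engine's code, and using invented page data) maps every word in a page's body to the pages that contain it, so a multi-word query can be answered by intersecting the sets.

```python
# A toy inverted index illustrating full-text search.
from collections import defaultdict

pages = {   # hypothetical crawled pages
    "http://example.com/history": "arpanet grew into the internet",
    "http://example.com/web":     "the web made the internet popular",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages containing every word of the query."""
    words = query.lower().split()
    return sorted(set.intersection(*(index.get(w, set()) for w in words)))

print(search("internet web"))   # ['http://example.com/web']
```

Real engines add relevancy ranking on top of this, which is what distinguished Google from its predecessors.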
• The dot-com bubble The suddenly low price of reaching millions worldwide, and the possibility of selling to or hearing from those people at the same moment when they were reached, promised to overturn established business dogma in advertising, mail-order sales, customer relationship management, and many more areas. The web was a new killer app—it could bring together unrelated buyers and sellers in seamless and low-cost ways. Visionaries around the world developed new business models, and ran to their nearest venture capitalist. Of course a proportion of the new entrepreneurs were truly talented at business administration, sales, and growth; but the majority were just people with ideas, and didn't manage the capital influx prudently. Additionally, many dot-com business plans were predicated on the assumption that by using the Internet, they would bypass the distribution channels of existing businesses and therefore not have to compete with them; when the established businesses with strong existing brands developed their own Internet presence, these hopes were shattered, and the newcomers were left attempting to break into markets dominated by larger, more established businesses. Many did not have the ability to do so.
• Worldwide Online Population Forecast: In its "Worldwide Online Population Forecast, 2006 to 2011," JupiterResearch anticipates that a 38 percent increase in the number of people with online access will mean that, by 2011, 22 percent of the Earth's population will surf the Internet regularly. JupiterResearch says the worldwide online population will increase at a compound annual growth rate of 6.6 percent during the next five years, far outpacing the 1.1 percent compound annual growth rate for the planet's population as a whole. The report says 1.1 billion people currently enjoy regular access to the Web. North America will remain on top in terms of the number of people with online access. According to JupiterResearch, online penetration rates on the continent will increase from the current 70 percent of the overall North American population to 76 percent by 2011. However, Internet adoption has "matured," and its adoption pace has slowed, in more developed countries including the United States, Canada, Japan and much of Western Europe, notes the report. While the online population of the United States and Canada grows by only about 3 percent, explosive adoption rates will take place in China and India, says JupiterResearch. The report says China should reach an online penetration rate of 17 percent by 2011 and India should hit 7 percent during the same time frame. This growth is directly related to infrastructure development and increased consumer purchasing power, notes JupiterResearch. By 2011, Asians will make up about 42 percent of the world's population with regular Internet access, 5 percent more than today, says the study. Penetration levels similar to North America's are found in Scandinavia and bigger Western European nations such as England and Germany, but JupiterResearch says that a number of Central European countries "are relative Internet laggards." For the study, JupiterResearch defined "online users" as people who regularly access the Internet by "dedicated Internet access" devices; those devices do not include cell phones.
CURRENT SNAPSHOT
By September 2007, the Internet had reached two important milestones:
WORLD INTERNET USAGE AND POPULATION STATISTICS

World Regions | Population (2007 Est.) | % of World Population | Internet Usage, Latest Data | Penetration (% Population) | Usage % of World | Usage Growth 2000-2007
Africa | 933,448,292 | 14.2 % | 43,995,700 | 4.7 % | 3.5 % | 874.6 %
Asia | 3,712,527,624 | 56.5 % | 459,476,825 | 12.4 % | 36.9 % | 302.0 %
Europe | 809,624,686 | 12.3 % | 337,878,613 | 41.7 % | 27.2 % | 221.5 %
Middle East | 193,452,727 | 2.9 % | 33,510,500 | 17.3 % | 2.7 % | 920.2 %
North America | 334,538,018 | 5.1 % | 234,788,864 | 70.2 % | 18.9 % | 117.2 %
Latin America/Caribbean | 556,606,627 | 8.5 % | 115,759,709 | 20.8 % | 9.3 % | 540.7 %
Oceania / Australia | 34,468,443 | 0.5 % | 19,039,390 | 55.2 % | 1.5 % | 149.9 %
WORLD TOTAL | 6,574,666,417 | 100.0 % | 1,244,449,601 | 18.9 % | 100.0 % | 244.7 %

NOTES: (1) Internet usage and world population statistics are for September 30, 2007. (2) Demographic (population) numbers are based on data contained in the world-gazetteer website. (3) Internet usage information comes from data published by Nielsen//NetRatings, by the International Telecommunications Union, by local NICs, and other reliable sources. Copyright © 2007, Miniwatts Marketing Group. All rights reserved worldwide.
FAR REACHING APPLICATIONS
Web 2.0 – the next "internet" revolution ~ it's happening now!
Web 2.0 is a catchphrase used to describe a way of using the Web. Exactly what that way entails is debated, but generally it is about "networking". The Internet has touched and given manifold benefits to the common man in all areas of his life. He has got a new tech-savvy outlook which has revolutionized his life completely.
Web 2.0 ~ so what is it?
Web 1.0 was about connecting computers and making technology more efficient for computers.
Web 2.0 is:
• Definitely about people.
• Applications that harness the power of the collective intelligence of users.
• About connecting people and making technology more efficient for people.
• Using social networking, podcasts, wikis, blogging etc. to disseminate information.
• Generating huge interest online, and it must be taken seriously when we are designing our web strategy.
Web 2.0 Methodologies ~ Social Networking
Web sites are now becoming a place where relationships are formed and opinions exchanged by consumers.
• Examples in tourism are IgoUgo, MySpace, YouTube, TripAdvisor and Yahoo Trip Planner; these web sites are about creating consumer-generated content, not just publishing sterile brochure content.
• Social networks allow consumers to "congregate" around subjects and to operate outside the established contact points, so that they can exchange unfiltered views and experiences, good and bad.
• Organisations can create their own social networking space on their website or work with established players like MySpace and YouTube.
Web 2.0 Methodologies ~ Blogs (weblogs)
Weblogs provide dialogue on varied subjects.
• Blogs can combine text, images, links etc.
• Blogs can be used by companies to solicit feedback and experiences and for tracking consumer opinions, and they can be formal or informal.
• Blogs are seen by consumers as a personal communication channel and as an impartial and informal way of getting factual, unbiased and accurate information.
• Companies need a "Blog Strategy" ~ the options are to control, respond or remain silent.
• Many public and private sector tourism sites today have blogs, usually editorially controlled by the site operator or supplier.
Web 2.0 Methodologies ~ Podcasting
A podcast is an audio file that can be downloaded from a website onto an iPod/MP3 player and, of course, a mobile phone.
• Used to deliver sound experiences of travel destinations and guidebooks, music, cultural experiences, and talks and speeches by well-known personalities.
• Used by operators like Lonely Planet and other travel guides, Virgin, online booking engines (Orbitz, Expedia) and national and local newspapers.
Web 2.0 Methodologies ~ RSS (Really Simple Syndication)
RSS is a file format that is used to subscribe to content on a regular basis – news, latest offers, promotions etc.
• This allows the consumer to keep track of news, blogs, events and "deals" without having to remember to check each site manually.
• Many airlines, agents, hotels and operators offer this facility, mainly focussed on deals, specials and promotions.
• RSS does not compete with email and avoids spam issues, works well with search engine positioning (inbound links from other web sites), and can syndicate and pass on content to other consumers.
• Expedia, Travelocity, Sheraton etc. are good examples of RSS use.
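Mechanically, an RSS subscription is just a small XML file that the reader downloads on a schedule. The hedged Python sketch below fetches such a feed and lists its items using only the standard library; the feed URL is a placeholder for a real airline or hotel deals feed.

```python
# A minimal sketch of reading an RSS 2.0 feed.
# The feed URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.com/deals.rss"     # hypothetical feed

with urllib.request.urlopen(FEED_URL) as response:
    tree = ET.parse(response)

# A classic RSS 2.0 document is <rss><channel><item>...</item></channel></rss>.
for item in tree.getroot().iter("item"):
    title = item.findtext("title", default="(no title)")
    link = item.findtext("link", default="")
    print(f"{title} -> {link}")
```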
Web 2.0 Methodologies ~ Wikis
A wiki is a type of web site that allows consumers to add, remove and edit the web site content themselves online.
• Wikis have little editorial control, so the value of the comments made is questionable.
• A good general example is Wikipedia, a popular education/reference web site.
• Another example is the TripAdvisor wiki, which allows visitors to edit travel guides.
Web 2.0 Methodologies ~ Online Video
Online video requires a broadband connection.
• Video clips can be professionally produced or consumer/amateur generated.
• Rich media provides the ability to engage the consumer with factual video of destinations, accommodation, virtual tours etc.
• Video combines image and sound, thus providing a "real life experience".
• YouTube and Travelistic facilitate consumer and professional tourism video clips.
Web 2.0 Methodologies ~ Tagging
Tagging is a new way of indexing information on web sites. Information can have multiple tags, and tagging is very common in user-generated content (particularly blogging).
• When a consumer sorts and finds information of interest on a site, it is tagged for future use and more content can be added to the tag. Other travellers can also access others' tags on certain sites.
• Tagging allows a consumer to find and tag relevant information clips and later view the collective information on a particular destination or hotel from everyone who has been there before.
• Tags are used for saving and sorting consumers' own content and for browsing other consumers' content.
• Another form of tagging is "geotagging": this allows an individual to add longitude and latitude to content so that the content can be shown on a map. This new technology allows computers and some newer mobile phones to geotag a picture and then upload it to a site. It is really a consumer-focussed tool. Example sites are Flickr.com and Travelbuddy.com.
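The sketch below illustrates the two ideas just described with invented data: content items carry free-form tags (and optionally latitude/longitude), and a simple index lets anyone pull together everything filed under a given tag.

```python
# A toy tag index plus geotags, using made-up content.
from collections import defaultdict

items = [
    {"title": "Sunset photo", "tags": ["goa", "beach"],
     "geotag": (15.2993, 74.1240)},                     # latitude, longitude
    {"title": "Hotel review", "tags": ["goa", "hotel"], "geotag": None},
]

tag_index = defaultdict(list)
for item in items:
    for tag in item["tags"]:
        tag_index[tag].append(item["title"])

print(tag_index["goa"])        # ['Sunset photo', 'Hotel review']

# Geotagged items can be placed straight onto a map by their coordinates.
for item in items:
    if item["geotag"]:
        lat, lon = item["geotag"]
        print(f"{item['title']} plotted at {lat}, {lon}")
```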
Web 2.0 Methodologies ~ Mash-Ups and Open APIs (Application Programming Interfaces)
A mash-up combines two different sources of information to create a new "experience". An open API is the technology used to form the mash-up.
Good examples of mash-ups are combinations of Google Maps with images, sounds, videos etc.; for example, TripAdvisor combines hotel rates with Google Maps, and Google Maps can be combined with other information such as tourist hiking trails and dive sites.
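The toy Python sketch below captures the mash-up idea in miniature: two unrelated data sources (hotel rates and map coordinates) are joined into one new view. Both data sets are invented; a real mash-up would pull them from open APIs such as a mapping service and a booking site.

```python
# A toy mash-up: join hypothetical rate data with hypothetical map data.
hotel_rates = {                       # source 1: booking data (invented)
    "Hotel Aurora": 95,
    "Sea View Inn": 120,
}
hotel_locations = {                   # source 2: map data (invented)
    "Hotel Aurora": (28.4595, 77.0266),
    "Sea View Inn": (15.4909, 73.8278),
}

# The mash-up: combine rate and location keyed by hotel name.
for name, rate in hotel_rates.items():
    lat, lon = hotel_locations[name]
    print(f"{name}: ${rate}/night, plot marker at ({lat}, {lon})")
```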
Web 2.0 Methodologies ~ AJAX (Asynchronous JavaScript and XML)
AJAX is a development technique that makes web pages more responsive to users by exchanging small amounts of data with the server, so that the whole page does not need to be reloaded.
• It can make web sites more responsive and provide information without having to refresh the browser each time.
• Fare aggregators (Kayak, SideStep, Farecast) use AJAX to create a fast user experience when sorting through large numbers of flight options.
• Google Maps uses AJAX to allow users to scroll, pan and zoom without reloading the map.
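AJAX itself runs in the browser as JavaScript, but the pattern also needs a server endpoint that answers with a small data fragment instead of a whole page. The hedged Python sketch below shows that server-side half: a tiny endpoint returning JSON that a page's script would fetch asynchronously. The route name and fare data are invented.

```python
# A minimal sketch of the server-side half of an AJAX exchange:
# an endpoint that returns a small JSON fragment.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class FareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/fares":
            body = json.dumps({"DEL-BOM": 3200, "DEL-GOI": 4100}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)   # the page merges this without reloading
        else:
            self.send_error(404)

if __name__ == "__main__":
    # A page's script would call this endpoint asynchronously.
    HTTPServer(("127.0.0.1", 8000), FareHandler).serve_forever()
```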
Is Web 2.0 shifting power from the traditional tourism supplier/intermediary to the customer?
• Web 2.0 is a multitude of different ways that companies can engage with customers to build loyalty, manage customer relationships (CRM) and disseminate information.
• These technologies put much more power in the hands of the consumer than Web 1.0 did.
• They increasingly move the power of decision from the supplier to the consumer.
Yes! But maybe the "power" in tourism has moved from the traditional industry (agents, operators etc.) to the "new information facilitators": web site owners (some of whom are online operators and agents) who can manipulate consumers via the clever use and understanding of Web 2.0 technologies.
THE INTERNET TURNING INTO A MENACE
Internet crime (cyber crime) is crime committed on the Internet, using the Internet and by means of the Internet. Such crimes include phishing, credit card fraud, bank robbery, illegal downloading, industrial espionage, child pornography, kidnapping children via chat rooms, scams, cyberterrorism, the creation and/or distribution of viruses, spam and so on. All of these are computer-related and computer-facilitated crimes.

With the evolution of the Internet came another revolution of crime, in which perpetrators commit acts of crime and wrongdoing on the World Wide Web. Internet crime takes many faces and is committed in diverse fashions. The number and diversity of users has exposed the Internet to everyone. Some criminals on the Internet have grown up understanding this superhighway of information, unlike the older generation of users. This is why Internet crime has now become a growing problem in the United States. Some crimes committed on the Internet have been exposed to the world, and some remain a mystery until they are perpetrated against someone or some company.

The different types of Internet crime vary in their design and in how easily they can be committed. Internet crimes can be separated into two categories. There are crimes that are committed only on the Internet and were created exclusively because of the World Wide Web, while the typical crimes of criminal history are being brought to a whole different level of innovation and ingenuity. New crimes devoted to the Internet include email "phishing", the hijacking of domain names, virus distribution and cyber vandalism. A couple of these crimes are activities that have been exposed and introduced into the world. People have been trying to solve virus problems by installing virus protection software and other software that can protect their computers. Other crimes, such as email "phishing", are not as well known to the public until an individual receives one of these fraudulent emails. Such emails hide behind the illusion that the message comes from your bank or another bank. When a person reads the email, he or she is informed of a problem with his or her personal account, or told that another individual wants to send the person some money and deposit it directly into their account. The email asks for your personal account information, and when a person gives this information away, they are financing the work of a criminal.
FUTURE VISION
One last thought – Web 3.0 is coming!
Web 3.0 is referred to as the semantic web. The semantic web is one where machines read web pages much as humans do today: a web where search engines and software agents peruse "the Net" and find us what we are looking for, and where the consumer becomes more integral to the web. The effect that Web 3.0 will have on our industry is still an open question.
There is a relentless move toward broadband connections.
A MESSAGE: "The Internet (and World Wide Web) has been created by some very bright, talented people who either had vision or were inspired by other talented people's visions. Though their ideas were not always popular, they pressed ahead. Their perseverance and hard work brought us to where we are today. There is a lot to be learned by studying these people, their early work, and what they had to work with. Today, we owe a great deal of the wired world we enjoy to the hard work of these people."
REFERENCES
www.walthowe.com
www.wikipedia.com
www.isoc.org
www.davesite.com
www.computerhistory.org
www.elsop.com
www.fordham.edu
www.livinginternet.com
www.let.leidenuniv.nl
www.anu.edu.au
www.google.co.in
www.emarketer.com
www.internetworldstats.com