Bill Gates

American business magnate, software executive and philanthropist William Henry “Bill” Gates III was born October 28th, 1955. Bill Gates is the former chief executive and current chairman of Microsoft, the world’s largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world’s wealthiest people and was the wealthiest overall from 1995 to 2009, excluding 2008, when he was ranked third; in 2011 he was the third wealthiest American and the second wealthiest person. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder, with 6.4 percent of the common stock. He has also authored or co-authored several books.

Gates is one of the best-known entrepreneurs of the personal computer revolution. He has been criticized for his business tactics, which have been considered anti-competitive, an opinion which has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.

Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work, and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates’s last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.

OneWebDay

OneWebDay is an annual day of Internet celebration and awareness held on September 22. The stated goal of founder Susan P. Crawford is for OneWebDay to foster and make visible a global constituency that cares about the future of the Internet. The first OneWebDay was held on September 22, 2006. The idea was created by Susan P. Crawford, who was an ICANN board member at the time, in association with other Internet figures such as Doc Searls, David Weinberger, David R. Johnson, Mary Hodder, and David Isenberg, who would all join the board of what would eventually become a 501(c)3 corporation – OneWebDay Inc.

A website was established and a global network of events promoted. The 2006 OneWebDay’s celebration featured speakers Craig Newmark, Scott Heiferman, and Drew Schutte in New York City’s Battery Park. By 2008 OneWebDay had grown to more than 30 international events. In Washington Square Park, New York City, speakers included Crawford, John Perry Barlow, Jonathan Zittrain, Craig Newmark, and Lawrence Lessig. In 2009, Mitch Kapor took over chairmanship of OneWebDay. It was also announced that funding had been granted by the Ford Foundation.

In 2010, it was announced that OneWebDay would be combined with a new Mozilla Foundation year-round initiative called Drumbeat. A number of volunteers took over organizing the 2010 event. A new website and network was established. Events took place in several cities including New York City, Melbourne, Kolkata, Chennai, London, and Pachuca. In 2011, the main event was a presentation in New York City by Bob Frankston – “Infrastructure commons – the future of connectivity”. In 2012 the theme of OneWebDay was advancing local content, while in 2013 the theme was accessibility, particularly in remembrance of web-accessibility advocate Cynthia Waddell, who died in April 2013.

In 2014 the theme for OneWebDay was Recognizing Core Internet Values, marked by three videos: a TEDx talk by Dave Moskowitz, “The Internet Belongs to Everyone” from the United States State Department, and a talk given by the Dynamic Coalition on Core Internet Values at the 2014 Internet Governance Forum. In 2015 and 2016 the theme was Connecting the Next Billion; during these events United States Under Secretary for Economic Growth, Energy and the Environment Catherine Novelli gave two speeches, ‘Connecting The World’, at the USA-IGF. In 2017 the theme was Open The Pipes, which concerned the need for connectivity for Community Networks; for this event a speech was given by Internet Society CEO/President Kathy Brown at Mobile World Congress in Shanghai.

World Wide Web

The World Wide Web (abbreviated as WWW or W3, commonly known as the web) debuted on 23 August 1991. The World Wide Web is a system of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia and navigate between them via hyperlinks. The web was developed between March 1989 and December 1990. Using concepts from his earlier hypertext systems such as ENQUIRE, British engineer Tim Berners-Lee, a computer scientist and at that time an employee of CERN, now Director of the World Wide Web Consortium (W3C), wrote a proposal in March 1989 for what would eventually become the World Wide Web. The 1989 proposal was meant for a more effective CERN communication system but Berners-Lee eventually realised the concept could be implemented throughout the world. At CERN, a European research organisation near Geneva straddling the border between France and Switzerland, Berners-Lee and Belgian computer scientist Robert Cailliau proposed in 1990 to use hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”, and Berners-Lee finished the first website in December that year. Berners-Lee posted the project on the alt.hypertext newsgroup on 6 August 1991.

In the May 1970 issue of Popular Science magazine, Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe. In March 1989, Tim Berners-Lee wrote a proposal that referenced ENQUIRE, a database and software project he had built in 1980, and described a more elaborate information management system. With help from Robert Cailliau, he published a more formal proposal (on 12 November 1990) to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, [so that] authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled after the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well); the first web server; and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August, which is why that date is considered the internaut’s day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; Gennaro has disclaimed this story, writing that media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event. The World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, a date supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, which was described in the 1945 essay “As We May Think”.

Berners-Lee’s breakthrough was to marry hypertext to the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested that a marriage between the two technologies was possible to members of both technical communities, but when no one took up his invitation, he finally assumed the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
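
To make those three pieces concrete, here is a minimal, purely illustrative sketch in Python (nothing Berners-Lee wrote, and the example.org address is only a placeholder): a URL names a resource, HTTP fetches it, and the HTML that comes back carries one-way hyperlinks to other URLs, with no cooperation needed from the pages being linked to.

```python
# Illustrative sketch only: URL (identifier) + HTTP (transfer) + HTML (publishing language).
# Uses only the Python standard library; the address below is a placeholder example.
from urllib.request import urlopen
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, i.e. the page's outgoing unidirectional links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

url = "http://example.org/"            # URL: the resource's globally unique identifier
with urlopen(url) as response:          # HTTP: the protocol that transfers the document
    html = response.read().decode("utf-8", errors="replace")

parser = LinkExtractor()                # HTML: the publishing language being parsed
parser.feed(html)
print(f"{url} links to {len(parser.links)} other resources:")
for link in parser.links:
    print(" ", link)                    # linked pages need not even know they are linked to
```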

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

After leaving the European Organization for Nuclear Research (CERN) in 1994, Tim Berners-Lee founded the World Wide Web Consortium (W3C) at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet. A year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which are the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, adding international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet: the web is a collection of documents and of both client and server software, using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

Www.internet.com

Credited with being the ‘Inventor of the World Wide Web’, Tim Berners-Lee released files describing his idea for the World Wide Web on 6 August 1991, and the WWW debuted as a publicly available service on the Internet. Born 8th June 1955, Sir Timothy John “Tim” Berners-Lee, OM, KBE, FRS, FREng, FRSA, also known as “TimBL”, is a British computer scientist, MIT professor and the inventor of the World Wide Web. He made a proposal for an information management system in March 1989 and, on 25 December 1990, with the help of Robert Cailliau and a young student at CERN, he implemented the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet. He is the director of the World Wide Web Consortium (W3C), which oversees the Web’s continued development. He is also the founder of the World Wide Web Foundation, and is a senior researcher and holder of the Founders Chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). He is a director of The Web Science Research Initiative and a member of the advisory board of the MIT Center for Collective Intelligence.

In 2004, Berners-Lee was knighted by Queen Elizabeth II for his pioneering work. In April 2009, he was elected a foreign associate of the United States National Academy of Sciences. In June 2009 then British Prime Minister Gordon Brown announced Berners-Lee would work with the UK Government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force. Berners-Lee and Professor Nigel Shadbolt are the two key figures behind data.gov.uk, a UK Government project to open up almost all data acquired for official purposes for free re-use. Commenting on the opening up of Ordnance Survey data in April 2010, Berners-Lee said that: “The changes signal a wider cultural change in Government based on an assumption that information should be in the public domain unless there is a good reason not to—not the other way around.” He went on to say: “Greater openness, accountability and transparency in Government will give people greater choice and make it easier for individuals to get more directly involved in issues that matter to them.” In November 2009, Berners-Lee launched the World Wide Web Foundation in order to “Advance the Web to empower humanity by launching transformative programs that build local capacity to leverage the Web as a medium for positive change.”

Berners-Lee is also one of the pioneer voices in favour of Net Neutrality, and has expressed the view that ISPs should supply “connectivity with no strings attached,” and should neither control nor monitor customers’ browsing activities without their expressed consent. He advocates the idea that net neutrality is a kind of human network right: “Threats to the Internet, such as companies or governments that interfere with or snoop on Internet traffic, compromise basic human network rights.” Berners-Lee is a co-director of the Open Data Institute. He was honoured as the ‘Inventor of the World Wide Web’ in a section of the 2012 Summer Olympics opening ceremony in which he also participated, working at a NeXT Computer. He tweeted: “This is for everyone”, instantly spelled out in LCD lights attached to the chairs of the 70,500 people in the audience.

Alan Turing OBE FRS

British mathematician, logician, cryptanalyst, and computer scientist Alan Turing OBE, FRS was born June 23rd, 1912 in Maida Vale, and grew up in Hastings. He displayed great individuality from a young age. At 14 he went to Sherborne School in Dorset. Turing read mathematics at Cambridge; he was a completely original thinker who shaped the modern world, and assisted in the development of the innovative Manchester computers. He was also highly influential in the development of computer science, providing a formalisation of the concepts of “algorithm” and “computation” with the Turing machine, which played a significant role in the creation of the modern computer. Turing is widely considered to be the father of computer science and artificial intelligence. He also became interested in mathematical biology and wrote a paper on the chemical basis of morphogenesis, and predicted oscillating chemical reactions such as the Belousov–Zhabotinsky reaction, which were first observed in the 1960s.

On 4 September 1939, the day after Britain declared war on Germany, Turing reported to Bletchley Park, where he worked for the Government Code and Cypher School (GCCS), the forerunner of GCHQ, Britain’s codebreaking centre. For a time he was head of Hut 8, the section responsible for German naval cryptanalysis. Turing led a team whose ingenuity and intellect were turned to the task of breaking German ciphers. He devised a number of techniques for breaking German ciphers, and one of his main contributions whilst there was to invent the Bombe, an electromechanical machine used to find the daily settings of the Enigma machine. As a result he played an absolutely vital part in the British war effort, and it is without question that his efforts helped shorten the war significantly, saving the lives of millions of people. He was a remarkable British hero who helped create the modern world. Now known as the father of computer science, his inventions contributed greatly to the groundwork for the modern computer.

After the war he worked at the National Physical Laboratory, where he created one of the first designs for a stored-program computer, the ACE. In 1948 Turing joined Max Newman’s Computing Laboratory at Manchester University, where he assisted in the development of the Manchester computers. Earlier, in his 1936 paper “On Computable Numbers”, he had invented a type of theoretical machine now called a Turing machine, which formalized what it means to compute a number. Turing’s importance extends far beyond Turing machines. His work deciphering secret codes drastically shortened World War II and pioneered early computer technology. He was also an early innovator in the field of artificial intelligence, and came up with a way to test if computers could think – now known as the Turing Test. Besides this abstract work, he was down to earth; he designed and built real machines, even making his own relays and wiring up circuits. This combination of pure math and computing machines was the foundation of computer science.
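
As a rough modern illustration of that formal idea (this is not Turing’s own notation, and the rule table below is just a toy example), a Turing machine reduces to a tape, a read/write head, and a finite table of rules mapping the current state and symbol to a new symbol, a head movement, and a next state:

```python
# Toy Turing machine simulator (illustrative only): infinite tape, read/write head,
# and a finite rule table (state, symbol) -> (new symbol, move, next state).
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run until the machine enters the 'halt' state or max_steps is reached."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # sparse "infinite" tape
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells[head]
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example rule table: flip every bit on the tape, then halt at the first blank cell.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(rules, "10110"))  # prints 01001_
```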

Despite his invaluable help during World War II and all his other achievements, he was treated badly. A burglary at his home led Turing to admit to police that he was a practicing homosexual, at a time when homosexuality was illegal in Britain. This led to his arrest and conviction in 1952 for ‘gross indecency’. He was subsequently forced to choose between imprisonment and chemical castration, and chose chemical castration (treatment with female hormones) as an alternative to prison. As a result of his conviction he lost his security clearance and was not allowed to continue his work. Sadly, Turing took his own life on 7 June 1954, just over two weeks before his 42nd birthday.

Fortunately, attitudes towards homosexuality have changed since Turing’s time, and the US-based Association for Computing Machinery has given the Turing Award annually since 1966. This is the computing world’s highest honour for technical contribution to the computing community and is considered equivalent to the Nobel Prize. On 10 September 2009, following an Internet campaign, British Prime Minister Gordon Brown made an official public apology on behalf of the British government for “the appalling way he was treated”. Despite his valuable contributions, Turing did not receive the recognition and plaudits that he deserved while alive. However, this has now been redressed: a fully functional replica of the Bombe can be found today at Bletchley Park, along with the excellent Turing exhibition. Turing has also been immortalised on film in The Imitation Game, starring Benedict Cumberbatch.

Tim Berners-Lee

English computer scientist and engineer Sir Timothy John Berners-Lee OM KBE FRS FREng FRSA FBCS was born 8 June 1955 in London, England, one of four children born to Mary Lee Woods and Conway Berners-Lee. His parents worked on the first commercially-built computer, the Ferranti Mark 1. He attended Sheen Mount Primary School, and then went on to attend south west London’s Emanuel School from 1969 to 1973, at the time a direct grant grammar school, which became an independent school in 1975. A keen trainspotter as a child, he learnt about electronics from tinkering with a model railway. He studied at The Queen’s College, Oxford from 1973 to 1976, where he received a first-class bachelor of arts degree in physics.

After graduation, Berners-Lee worked as an engineer at the telecommunications company Plessey in Poole, Dorset. In 1978, he joined D. G. Nash in Ferndown, Dorset, where he helped create type-setting software for printers. Berners-Lee worked as an independent contractor at CERN from June to December 1980. While in Geneva, he proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers. To demonstrate it, he built a prototype system named ENQUIRE. After leaving CERN in late 1980, he went to work at John Poole’s Image Computer Systems, Ltd, in Bournemouth, Dorset. He ran the company’s technical side for three years. The project he worked on was a “real-time remote procedure call” which gave him experience in computer networking. In 1984, he returned to CERN as a fellow. In 1989, CERN was the largest Internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the Internet:

“I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web. Creating the web was really an act of desperation, because the situation without it was very difficult when I was working at CERN later. Most of the technology involved in the web, like the hypertext, like the Internet, multifont text objects, had all been designed already. I just had to put them together. It was a step of generalising, going to a higher level of abstraction, thinking about all the documentation systems out there as being possibly part of a larger imaginary documentation system.” A NeXT Computer was used by Berners-Lee at CERN and became the world’s first web server. Berners-Lee wrote his proposal in March 1989 and, in 1990, redistributed it. It was then accepted by his manager, Mike Sendall. He used similar ideas to those underlying the ENQUIRE system to create the World Wide Web, for which he designed and built the first Web browser. His software also functioned as an editor (called WorldWideWeb, running on the NeXTSTEP operating system), and he built the first Web server, CERN HTTPd (short for Hypertext Transfer Protocol daemon).
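
For illustration only (a modern sketch, not CERN HTTPd itself), the job of that first hypertext daemon can be mimicked in a few lines with Python’s standard library: listen for HTTP requests and answer each one with an HTML page containing a hyperlink, which is essentially what the first server did for the pages describing the WWW project.

```python
# Toy hypertext daemon (illustrative only; not CERN HTTPd): answers every HTTP GET
# with a small HTML page that itself links onward to another resource.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<html><body>
<h1>WorldWideWeb demo</h1>
<p>A demo page. See the <a href="http://info.cern.ch/">first website</a>.</p>
</body></html>"""

class HypertextHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Serve on localhost:8000; point any browser at http://localhost:8000/
    HTTPServer(("127.0.0.1", 8000), HypertextHandler).serve_forever()
```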

World Day Against Cyber-Censorship

World Day Against Cyber-Censorship takes place annually on March 12. It aims to rally computer users in fighting repression of online speech and to celebrate the work of brave individuals who have promoted free expression on the Internet. The annual Netizen Prize is awarded to bloggers, online journalists, and cyber-dissidents who have demonstrated exceptional dedication to this cause. It was first observed on March 12, 2008 at the request of Reporters Without Borders and Amnesty International. A letter written by Jean-Francois Julliard, Secretary-General of Reporters Without Borders, and Larry Cox, Executive Director of Amnesty International, was sent to the Chief Executive Officers of Google, Yahoo! and Microsoft Corporation to request observation of the day.

The Electronic Frontier Foundation remains dedicated to reporting cases of online censorship from all regions of the world, and emphasizes the importance of online anonymity in preserving individuals’ right to free speech, with an ongoing feature, This Week in Censorship, which covers global stories of imprisoned bloggers, filtered content, blocked websites, and instances of Internet disconnection. A broad array of reasons are offered as justification for censorship. Bloggers in Thailand face imprisonment for criticizing the monarch. In Pakistan, the Telecommunications Authority has blocked websites, banned words from SMS texts, and most recently has released a request for proposals to build a national blocking and filtering system, all in the name of fighting “obscene content”. The Turkish government has implemented a so-called “democratic” opt-in filtering mechanism for content that is deemed unsuitable for children and families.

Another common trend is censorship enabled in the name of battling copyright violations. Through its Global Chokepoints project, the EFF is monitoring instances of pro-copyright laws that justify filtering of content, website blockages, or Internet disconnection to fight infringement. Censorship remains rampant in the Middle East. In Syria, Iran, and elsewhere, bloggers continue to face imprisonment, and ordinary users have limited access to content online due to state-mandated blocking and filtering programs. Another ongoing issue is the use by authoritarian states of Western-based surveillance technologies to monitor and spy on their citizens. State authorities can use the collected data to arrest, harass, or torture individuals accused of participating in political dissent.