International Internet Day

International Internet Day takes place annually on 29 October. The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide and build a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.

During the 1970s, science-fiction novelist Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe.

The origins of the Internet date back to the 1960s, when research was commissioned by the federal government of the United States to build robust, fault-tolerant communication with computer networks. Initial concepts of wide area networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s, including for the development of the ARPANET project, directed by Robert Taylor and managed by Lawrence Roberts. The first message was sent over the ARPANET on 29 October 1969 from computer science Professor Leonard Kleinrock’s laboratory at the University of California, Los Angeles (UCLA) to the second network node at Stanford Research Institute (SRI).

Donald Davies first demonstrated packet switching in 1967 at the National Physical Laboratory (NPL) in the UK, which became a testbed for UK research for almost two decades. Packet-switching networks such as the NPL network, ARPANET, Tymnet, Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET project led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks. The Internet protocol suite (TCP/IP) was developed by Robert E. Kahn and Vint Cerf in the 1970s and became the standard networking protocol on the ARPANET, incorporating concepts from the French CYCLADES project directed by Louis Pouzin. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also gave research and education organizations in the United States network access to the supercomputer sites.

The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the 1980s. Commercial Internet service providers (ISPs) began to emerge in the very late 1980s, and the ARPANET was decommissioned in 1990. Limited private connections to parts of the Internet by officially commercial entities emerged in several American cities by late 1989 and 1990. The National Science Foundation Network (NSFNET) served as the new backbone, and private funding for commercial extensions led to worldwide participation in the development of new networking technologies and the merger of many networks. The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet, and the decommissioning of the NSFNET in 1995 removed the last restrictions on the use of the Internet to carry commercial traffic, generating sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life.

In the 1980s, at CERN, a European research organisation near Geneva straddling the border between France and Switzerland, British computer scientist and engineer Tim Berners-Lee and Belgian computer scientist Robert Cailliau proposed using hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”.

Berners-Lee and Cailliau wrote a proposal in March 1989 for a more effective CERN communication system, but Berners-Lee eventually realised the concept could be implemented throughout the world, using concepts from his earlier hypertext systems such as ENQUIRE. They published a more formal proposal on 12 November 1990 to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, so that authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled after the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and to have an inappropriate licensing policy for use in the general high-energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well); the first web server; and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August; for this reason, that date is considered Internaut Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; Gennaro has disclaimed this story, writing that media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, which is supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, which was described in the 1945 essay “As We May Think”.

Berners-Lee combined hypertext with the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally assumed the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. This also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which are the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, adding international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet: the web is a collection of documents and both client and server software using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

The Internet has no centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. Only the overreaching definitions of the two principal name spaces in the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). Most traditional communications media, including telephony, radio, television, paper mail and newspapers are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper, book, and other print publishing are adapting to website technology, or are reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interactions through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and small businesses and entrepreneurs, as it enables firms to extend their “brick and mortar” presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.

The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. Since the mid-1990s, the Internet has had a revolutionary impact on culture, commerce, and technology, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as JANET in the United Kingdom and Internet2 in the United States. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1-Gbit/s, 10-Gbit/s, or more. The Internet’s takeover of the global communication landscape was almost instant in historical terms: it only communicated 1% of the information flowing through two-way telecommunications networks in the year 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking. However, the future of the global internet may be shaped by regional differences in the world. In November 2006, the Internet was included on USA Today’s list of New Seven Wonders.

Bill Gates

American business magnate, software executive and philanthropist William Henry “Bill” Gates III was born on October 28, 1955. Bill Gates is the former chief executive and current chairman of Microsoft, the world’s largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world’s wealthiest people and was the wealthiest overall from 1995 to 2009, excluding 2008, when he was ranked third; in 2011 he was the third wealthiest American and the second wealthiest person. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder, with 6.4 percent of the common stock. He has also authored or co-authored several books. Gates is one of the best-known entrepreneurs of the personal computer revolution. Gates has been criticized for his business tactics, which have been considered anti-competitive, an opinion which has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.

Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work, and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates’s last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.

National iPod Day

National iPod Day is observed annually on October 23. Apple introduced the iPod on October 23, 2001. The iPod changed the way we listened to and purchased music. The first iPod was sold on November 10, 2001, for $399. However, the price and Mac-only compatibility kept sales relatively slow until 2004. The iPod line came from Apple’s “digital hub” category, when the company began creating software for the growing market of personal digital devices. Digital cameras, camcorders and organizers had well-established mainstream markets, but the company found existing digital music players “big and clunky or small and useless” with user interfaces that were “unbelievably awful,” so Apple decided to develop its own. As ordered by CEO Steve Jobs, Apple’s hardware engineering chief Jon Rubinstein assembled a team of engineers to design the iPod line, including hardware engineers Tony Fadell and Michael Dhuey, and design engineer Sir Jonathan Ive. Rubinstein had already discovered the Toshiba hard disk drive while meeting with an Apple supplier in Japan, and purchased the rights to it for Apple; he had also already worked out how the screen, battery, and other key elements would work. The aesthetic was inspired by the 1958 Braun T3 transistor radio designed by Dieter Rams, while the wheel-based user interface was prompted by Bang & Olufsen’s BeoCom 6000 telephone. The product, dubbed “the Walkman of the twenty-first century,” was developed in less than one year and unveiled on October 23, 2001. Jobs announced it as a Mac-compatible product with a 5 GB hard drive that put “1,000 songs in your pocket.”

Apple did not develop the iPod software entirely in-house, instead using PortalPlayer’s reference platform based on two ARM cores. The platform had rudimentary software running on a commercial microkernel embedded operating system. PortalPlayer had previously been working on an IBM-branded MP3 player with Bluetooth headphones. Apple contracted another company, Pixo, to help design and implement the user interface under the direct supervision of Steve Jobs. As development progressed, Apple continued to refine the software’s look and feel. Starting with the iPod Mini, the Chicago font was replaced with Espy Sans. Later iPods switched fonts again to Podium Sans—a font similar to Apple’s corporate font, Myriad. Color display iPods then adopted some Mac OS X themes like Aqua progress bars, and brushed metal meant to evoke a combination lock. In 2007, Apple modified the iPod interface again with the introduction of the sixth-generation iPod Classic and third-generation iPod Nano by changing the font to Helvetica and, in most cases, splitting the screen in half by displaying the menus on the left and album artwork, photos, or videos on the right (whichever was appropriate for the selected item).

In 2006 Apple presented a special edition of the fifth-generation iPod for the Irish rock band U2. Like its predecessor, this iPod had the signatures of the four members of the band engraved on its back, but it was the first time the company changed the colour of the metal (black rather than silver). This iPod was only available with 30 GB of storage capacity. The special edition entitled purchasers to an exclusive video with 33 minutes of interviews and performance by U2, downloadable from the iTunes Store. In 2007, during a lawsuit with patent-holding company Burst.com, Apple drew attention to a patent for a similar device that was developed in 1979. Kane Kramer applied for a UK patent for his design of a “plastic music box” in 1981, which he called the IXI. He was unable to secure funding to renew the US$120,000 worldwide patent, so it lapsed and Kramer never profited from his idea.

The name iPod was proposed by Vinnie Chieco, a freelance copywriter, who (with others) was called by Apple to figure out how to introduce the new player to the public. After Chieco saw a prototype, he thought of the movie 2001: A Space Odyssey and the phrase “Open the pod bay door, Hal!”, which refers to the white EVA Pods of the Discovery One spaceship. Chieco saw an analogy to the relationship between the spaceship and the smaller independent pods in the relationship between a personal computer and the music player. Apple researched the trademark and found that it was already in use. Joseph N. Grasso of New Jersey had originally listed an “iPod” trademark with the U.S. Patent and Trademark Office (USPTO) in July 2000 for Internet kiosks. The first iPod kiosks had been demonstrated to the public in New Jersey in March 1998, and commercial use began in January 2000, but had apparently been discontinued by 2001. The trademark was registered by the USPTO in November 2003, and Grasso assigned it to Apple Computer, Inc. in 2005.

The earliest recorded use in commerce of an “iPod” trademark was in 1991 by Chrysalis Corp. of Sturgis, Michigan, styled “iPOD”. In mid-2015, several new color schemes for all of the current iPod models were spotted in the latest version of iTunes, 12.2. Belgian website Belgium iPhone originally found the images when plugging in an iPod for the first time, and subsequent leaked photos were found by Pierre Dandumont. In 2017, Apple removed the iPod Nano and Shuffle from its stores, marking the end of Apple producing standalone music players. Currently, the iPod Touch is the only iPod produced by Apple.

INTERNATIONAL CAPS LOCK DAY

INTERNATIONAL CAPS LOCK DAY IS CELEBRATED ANNUALLY ON 22 OCTOBER. INTERNATIONAL CAPS LOCK DAY was founded in 2000 by Derek Arnold of Iowa as a parody, after he decided that he, like so many other internet users, had simply had enough of people using all caps to emphasize themselves on the web. So he created INTERNATIONAL CAPS LOCK DAY to poke fun at those individuals who unnecessarily capitalize letters, words, and phrases, and to bring some sanity back to the web. The day became so popular with internet users that INTERNATIONAL CAPS LOCK DAY is now celebrated twice a year: on June 28 and on October 22. The second observation on June 28 was added by Arnold in memory of American pitchman Billy Mays.

Caps Lock is a button on a computer keyboard that causes all letters of Latin-based scripts to be generated in capitals. It is a toggle key: each press reverses its action. Some keyboards also implement a light, so as to give visual feedback about whether it is on or off. Exactly what Caps Lock does depends on the keyboard hardware, the operating system, the device driver, and the keyboard layout. Usually, the effect is limited to letter keys; letters of Latin-based scripts are capitalised, while letters of other scripts (e.g. Arabic, Hebrew, Hindi) and non-letter characters are generated normally.

The Caps Lock key originated as a Shift lock key on mechanical typewriters. An early innovation in typewriters was the introduction of a second character on each typebar, thereby doubling the number of characters that could be typed, using the same number of keys. The second character was positioned above the first on the face of each typebar, and the typewriter’s Shift key caused the entire type apparatus to move, physically shifting the positioning of the typebars relative to the ink ribbon. Just as in modern computer keyboards, the shifted position was used to produce capitals and secondary characters.

The Shift lock key was introduced so the shift operation could be maintained indefinitely without continuous effort. It mechanically locked the typebars in the shifted position, causing the upper character to be typed upon pressing any key. Because the two shift keys on a typewriter required more force to operate and were meant to be pressed by the little finger, it could be difficult to hold the shift down for more than two or three consecutive strokes, therefore the introduction of the Shift lock key was also meant to reduce finger muscle pain caused by repetitive typing.

Mechanical typewriter shift lock is typically set by pushing both Shift and lock at the same time, and released by pressing Shift by itself. Computer Caps Lock is set and released by the same key, and the Caps Lock behavior in most QWERTY keyboard layouts differs from the Shift lock behavior in that it capitalizes letters but does not affect other keys, such as numbers or punctuation. Some early computer keyboards, such as the Commodore 64, had a Shift lock but no Caps Lock; others, such as the BBC Micro, had both, only one of which could be enabled at a time.

Typical Caps Lock behavior is that pressing the key sets an input mode in which all typed letters are uppercase, if applicable. The keyboard remains in Caps Lock mode and generates all-caps text until the key is pressed again. Keyboards often include a small LED to indicate that Caps Lock is active, either on the key itself or in a dedicated indicators area, where the Scroll Lock and Num Lock indicators are also located. On the original IBM PC keyboard, this LED was exclusively controlled by the keyboard. Since the introduction of the IBM AT, however, it is under the control of the operating system. Small keyboards, such as netbook keyboards, forgo the indicators to conserve space, instead providing software that gives on-screen or audio feedback.

In most cases, the status of the Caps Lock key only changes the meaning of the alphabet keys, not that of any other key. Microsoft Windows enforces this behavior only when a keyboard layout for a Latin-based script is active, e.g. the “English (United States)” layout but not the “Persian” layout. However, on certain non-QWERTY keyboard layouts, such as the French AZERTY and the German QWERTZ, Caps Lock still behaves like a traditional Shift lock, i.e., the keyboard behaves as if the Shift key is held down, causing the keyboard to input the alternative values of the keys; for example, the “5” key generates a “%” when Caps Lock is on.

Depending on the keyboard layout used, the Shift key, when pressed in combination with a Latin-based letter key while Caps Lock is already on, is either ignored or nullifies the effect of Caps Lock, so that typed characters are in lowercase again. Microsoft Windows enforces the latter. While the typical locking behavior on keyboards with a Caps Lock key is that of a toggle, each press reversing the shift state, some keyboard layouts implement a combi mode, where pressing a Shift key in Caps Lock mode will also release Caps Lock mode, just as typically happens in Shift lock mode. Some keyboard drivers include a configuration option to deactivate the Caps Lock key. This allows users to decide for themselves whether they want to use the key, or to disable it to prevent accidental activation.
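The interaction between Caps Lock and Shift described above can be sketched as a tiny state model. This is a minimal illustration of the common Windows-style rule for Latin letter keys (Shift nullifies Caps Lock, non-letter keys unaffected), not a description of any particular keyboard driver; the function name is hypothetical.

```python
def type_char(ch: str, caps_lock: bool, shift: bool) -> str:
    """Model one key press under the Windows-style rule:
    Caps Lock affects only letters, and Shift nullifies Caps Lock."""
    if not ch.isalpha():
        return ch  # numbers and punctuation are left alone by Caps Lock
    # XOR: uppercase when exactly one of Caps Lock / Shift is active
    return ch.upper() if caps_lock != shift else ch.lower()

print(type_char("a", caps_lock=True, shift=False))  # A
print(type_char("a", caps_lock=True, shift=True))   # a  (Shift undoes Caps Lock)
print(type_char("5", caps_lock=True, shift=False))  # 5  (unaffected)
```

On an AZERTY-style Shift lock layout the third case would instead produce the shifted value of the key, which is exactly the difference the paragraph above describes.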

However, there are some protocols that make it appropriate to post in all caps, such as when posting as part of a weather monitoring network. In this rare and perhaps singular case, all caps are how you indicate that something is in fact important, and together with its collection of acronyms and shorthand it keeps the message clear. Elsewhere, however, overuse of CAPS LOCK is still widely read as shouting.

Information Overload Day

Information Overload Day takes place annually on 20 October. The day is about taking control of the flow of information into your life, and limiting it when and if necessary. It was established by a group of companies who wanted to create an awareness day calling attention to what happens when you overload your employees and customers with far too much information.

Research has shown that productivity is affected by the sheer amount of information flowing through our lives, with the average employee receiving no fewer than 93 emails a day. Combine that with the social media that rules our lives, the constant buzz of new text messages, and the old stand-by that is web browsing, and getting overwhelmed is pretty understandable. It is also having a severe economic impact on time spent for business purposes, not personal. A worker may be interrupted by an email just when they are deep in the middle of a project, and will have to refocus and start again after every interruption. These little pauses may not seem like much individually, but combined they add up to a $180 billion impact.

You can celebrate Information Overload Day by not checking your email so often. Log out of your email client, and log in only five times a day to help limit the number of interruptions you get. Turn your phone off, including the vibration, and use those same scheduled stops to reply to text messages and return calls. These interruptions cost more time than you think, and you’d be surprised how productive you could be if you just eliminated them from your day.

20 October is also:

  • Bridge Day
  • International Independent Video Store Day
  • Miss American Rose Day
  • National Brandied Fruit Day
  • National Call-In Day for Health Reform
  • National Suspenders Day
  • Sweetest Day
  • International Day for Air Traffic Controllers

Charles Babbage FRS

Mathematician, philosopher, inventor, mechanical engineer and English polymath Charles Babbage, FRS, died on 18 October 1871, at the age of 79. He was born on 26 December 1791. Babbage attended country school in Alphington near Exeter, then King Edward VI Grammar School in Totnes, South Devon, but his health forced him back to private tutors for a time. Babbage then joined Holmwood Academy, in Baker Street, Enfield, Middlesex, where the academy’s library kindled his love of mathematics. He studied with two more private tutors after leaving the academy, and was then brought home to study at the Totnes school. Babbage was accepted by Cambridge University and arrived at Trinity College, Cambridge, in October 1810, where he formed the Analytical Society in 1812 with John Herschel and George Peacock. Babbage was also a member of The Ghost Club, which investigated supernatural phenomena, and the Extractors Club, dedicated to liberating its members from the madhouse, should any be committed to one. In 1812 Babbage transferred to Peterhouse, Cambridge. He was the top mathematician there, but did not graduate with honours, instead receiving a degree without examination in 1814 after having defended a thesis that was considered blasphemous in the preliminary public disputation.

In 1815 Babbage lectured at the Royal Institution on astronomy, and he was elected a Fellow of the Royal Society in 1816. After graduation, Babbage and Herschel visited the Society of Arcueil in Paris, meeting leading French mathematicians and physicists. With Herschel and Michael Faraday, Babbage also worked on a basic explanation of the electrodynamics of Arago’s rotations; these are now part of the theory of eddy currents. He also worked on the unification of electromagnetics. Babbage was also interested in the Comparative View of the Various Institutions for the Assurance of Lives, and calculated actuarial tables for an insurance company using Equitable Society mortality data from 1762. Babbage helped found the Astronomical Society in 1820, whose aims were to reduce astronomical calculations to a more standard form and to publish the data. In 1824 Babbage won the Astronomical Society’s Gold Medal “for his invention of an engine for calculating mathematical and astronomical tables”, an engine intended to overcome the errors made in tables by mechanisation and to improve the Nautical Almanac after discrepancies were found in traditional calculations. Babbage also helped establish a modern postal system with his friend Thomas Frederick Colby, introducing the Uniform Fourpenny Post, later supplanted by the Uniform Penny Post. In 1816 Babbage, Herschel and Peacock published a translation from French of the lectures of Sylvestre Lacroix on calculus; formal power series affected functional equations (including the difference equations fundamental to the difference engine) and operator (D-module) methods for differential equations. He also originated the concept of a programmable computer and invented the first mechanical computer, which eventually led to more complex designs.
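The difference engine mentioned above mechanised the method of finite differences: once the table is seeded, every further value of a polynomial can be produced by additions alone, with no multiplication at all. A minimal sketch in Python (the function name is ours; the demonstration polynomial x² + x + 41 is the one Babbage is said to have used):

```python
def difference_table(f, degree, n):
    """Tabulate a polynomial of the given degree at 0, 1, 2, ...
    using only additions -- the method of finite differences that
    the Difference Engine mechanised."""
    # Seed with f(0), Δf(0), Δ²f(0), ...: the top entry of each
    # column of the difference table, computed once.
    row = [f(x) for x in range(degree + 1)]
    diffs = []
    for _ in range(degree + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    # From here on, only additions are needed (what the engine did):
    # each level is updated by adding the level above it.
    out = []
    for _ in range(n):
        out.append(diffs[0])
        for i in range(degree):
            diffs[i] += diffs[i + 1]
    return out

print(difference_table(lambda x: x * x + x + 41, 2, 5))
# → [41, 43, 47, 53, 61]
```

For a degree-2 polynomial the second difference is constant (here, 2), so two running sums reproduce the whole table, which is exactly what the engine's columns of gear wheels did.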

The analogy of difference and differential equations was notationally a matter of changing Δ to D, as a “finite” difference becomes “infinitesimal”. These symbolic directions became popular as operational calculus, and were pushed to the point of diminishing returns. Woodhouse had already founded this second “British Lagrangian School”, and Babbage worked intensively on functional equations in general, influenced by Arbogast’s ideas. From 1828 to 1839 Babbage was Lucasian Professor of Mathematics at Cambridge. Not a conventional resident don, and inattentive to teaching, he wrote three topical books during this period of his life. He was elected a Foreign Honorary Member of the American Academy of Arts and Sciences in 1832. Babbage planned to lecture in 1831 on political economy. His reforming direction aimed to make university education more inclusive, with universities doing more for research, a broader syllabus and more interest in applications, but the idea was rejected. A controversy with Richard Jones lasted for six years, and Babbage never gave another lecture. Babbage also tried to enter politics; his views included disestablishment of the Church of England, a broader political franchise, and inclusion of manufacturers as stakeholders. He twice stood for Parliament as a candidate for the borough of Finsbury: in 1832 he came in third among five candidates, missing out by some 500 votes in the two-member constituency when two other reformist candidates, Thomas Wakley and Christopher Temple, split the vote. Babbage also wrote Reflections on the Decline of Science and some of its Causes (1830), attacking the establishment and aiming to improve British science by ousting Davies Gilbert as President of the Royal Society. Babbage also wished to become the junior secretary of the Royal Society, as Herschel was the senior, but failed after antagonizing Humphry Davy. Subsequently the British Association for the Advancement of Science (BAAS) was formed in 1831.

Babbage used symbols to express the actions of his Difference and Analytical Engines in his influential book Economy of Machinery and Manufactures, which dealt with the organisation of industrial production; an essay on the general principles which regulate the application of machinery to manufactures and the mechanical arts was featured in the Encyclopædia Metropolitana. In the book Babbage developed a schematic classification of machines, whether for domestic or industrial use. The book also contained ideas on rational design in factories and on profit sharing, and described the Babbage principle, which concerns the commercial advantages available with more careful division of labour. This principle had already been mentioned in the work of Melchiorre Gioia in 1815; the term itself was introduced in 1974 by Harry Braverman. Related formulations are the “principle of multiples” of Philip Sargant Florence and the “balance of processes”. Babbage noticed that skilled workers typically spend parts of their time performing tasks that are below their skill level. If the labour process can be divided among several workers, labour costs may be cut by assigning only high-skill tasks to high-cost workers and restricting other tasks to lower-paid workers. He also noted that apprenticeship can be taken as a fixed cost, but that returns to scale favour the factory system. He also published a detailed breakdown of the cost structure of book publishing, exposing the trade’s profitability, much to the chagrin of many publishers, and named the organisers of the trade’s restrictive practices.

Babbage’s theories also influenced the 1851 Great Exhibition, his views having a strong effect on many. Karl Marx argued that the source of the productivity of the factory system was the combination of the division of labour with machinery, but mentioned that the motivation for division of labour was often profitability rather than productivity. Babbage also influenced the economic thinking of John Stuart Mill, George Holyoake, the economist Claude Lucien Bergery, William Jevons and Charles Fourier, among others.

In 1837 Babbage published On the Power, Wisdom and Goodness of God, a work of natural theology in which he favoured uniformitarianism, preferring a conception of creation in which natural law dominated and removing the need for “contrivance”. It incorporated extracts from related correspondence of Herschel with Charles Lyell. Babbage put forward the thesis that God had the omnipotence and foresight to create as a divine legislator: He could make laws which then produced species at the appropriate times, rather than continually interfering with ad hoc miracles each time a new species was required. The British Association was inspired by the Deutsche Naturforscher-Versammlung, founded in 1822. It rejected romantic science as well as metaphysics, and started to entrench the divisions of science from literature, and professionals from amateurs. Babbage also identified closely with industrialists and suggested that industrial society was the culmination of human development. In 1838 a clash with Roderick Murchison led to his withdrawal from further involvement, and he also resigned as Lucasian Professor.

His interests became more focused on computation and metrology, and on international contacts. He announced a project to tabulate all physical constants (referred to as “constants of nature”, a phrase in itself a neologism), and then to compile an encyclopaedic work of numerical information. He was a pioneer in the field of “absolute measurement”. His ideas followed on from those of Johann Christian Poggendorff, and were mentioned to Brewster in 1832. There were to be 19 categories of constants, and Ian Hacking sees these as reflecting in part Babbage’s “eccentric enthusiasms”. Babbage’s paper On Tables of the Constants of Nature and Art was reprinted by the Smithsonian Institution in 1856, with an added note that the physical tables of Arnold Henry Guyot “will form a part of the important work proposed in this article”. Exact measurement was also key to the development of machine tools; here again Babbage is considered a pioneer, alongside Henry Maudslay, William Sellers, and Joseph Whitworth.

Babbage also met the engineers Marc Brunel and Joseph Clement at the Royal Society, and introduced them to Isambard Kingdom Brunel in 1830 in connection with the proposed Bristol & Birmingham Railway. He also carried out studies, around 1838, showing the superiority of the broad gauge for railways, used by Brunel’s Great Western Railway. In 1838 he invented the pilot (also called a cow-catcher), the metal frame attached to the front of locomotives that clears the tracks of obstacles, and he also constructed a dynamometer car. His eldest son, Benjamin Herschel Babbage, also worked as an engineer for Brunel on the railways before emigrating to Australia in the 1850s. Babbage also invented an ophthalmoscope, but the optician Thomas Wharton Jones ignored it, and it only came into wide use after being independently invented by Hermann von Helmholtz.

Babbage also decoded Vigenère’s autokey cipher during the Crimean War. His discovery was kept a military secret, and he later wrote an anonymous letter to the Journal of the Society for Arts concerning “Cypher Writing”. Babbage lived and worked for over 40 years at 1 Dorset Street, Marylebone, until he died; he was buried in London’s Kensal Green Cemetery. According to Horsley, Babbage died “of renal inadequacy, secondary to cystitis”. He had declined both a knighthood and a baronetcy, and argued against hereditary peerages, favouring life peerages instead. In 1983 the autopsy report for Charles Babbage was discovered and later published by his great-great-grandson; a copy of the original is also available. Half of Babbage’s brain is preserved at the Hunterian Museum in the Royal College of Surgeons in London; the other half is on display in the Science Museum, London.
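The autokey cipher Babbage broke extends the key with the plaintext itself, so the key is never repeated; each recovered letter unlocks the next, which is what makes a single cracked fragment so damaging. A minimal sketch in Python (function names and the sample key are illustrative, not historical):

```python
def autokey_encrypt(plaintext, key):
    """Vigenère autokey: the keystream is the key followed by the
    plaintext itself, so the key never repeats."""
    pt = [c for c in plaintext.upper() if c.isalpha()]
    stream = list(key.upper()) + pt
    # Shift each plaintext letter by the matching keystream letter.
    return "".join(chr((ord(p) + ord(k) - 130) % 26 + 65)
                   for p, k in zip(pt, stream))

def autokey_decrypt(ciphertext, key):
    """Decrypt progressively: each recovered plaintext letter
    extends the keystream used for the letters that follow."""
    stream = list(key.upper())
    plain = []
    for i, c in enumerate(ciphertext.upper()):
        p = chr((ord(c) - ord(stream[i])) % 26 + 65)
        plain.append(p)
        stream.append(p)  # recovered letter feeds the keystream
    return "".join(plain)

ct = autokey_encrypt("ATTACKATDAWN", "QUEEN")
print(ct)                             # → QNXEPKTMDCGN
print(autokey_decrypt(ct, "QUEEN"))   # → ATTACKATDAWN
```

Because the keystream is plaintext from the key length onward, guessing a probable word anywhere in the message exposes more plaintext, which in turn is keystream; this chain reaction is the weakness Babbage exploited.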

International Right to Know Day

International Right to Know Day takes place annually on 28 September. The event was first proposed on 28 September 2002 at a meeting of freedom of information organisations from around the world in Sofia, Bulgaria, in order to raise awareness of people’s right to access government information while promoting freedom of information as essential to both democracy and good governance. Freedom of information organisations and advocates around the world have since marked the date with activities to celebrate and raise awareness of the right to information.

Freedom of information is an extension of freedom of speech, a fundamental human right recognized in international law, which is today understood more generally as freedom of expression in any medium, be it orally, in writing, in print, through the Internet or through art forms. This means that the protection of freedom of speech as a right includes not only the content, but also the means of expression. Freedom of information also refers to the right to privacy in the content of the Internet and information technology. As with the right to freedom of expression, the right to privacy is a recognised human right, and freedom of information acts as an extension of this right. Lastly, freedom of information can include opposition to patents, copyrights or intellectual property in general. The international and United States Pirate Parties have established political platforms based largely on freedom of information issues.

As of 2006 nearly 70 countries had freedom of information legislation applying to information held by government bodies and, in certain circumstances, to private bodies, including Antigua and Barbuda, Angola, Armenia, Colombia, the Czech Republic, the Dominican Republic, Estonia, Finland, France, Iceland, Liechtenstein, Panama, Poland, Peru, South Africa, Turkey, Trinidad and Tobago, Slovakia, and the United Kingdom. However, the degree to which private bodies are covered under freedom of information legislation varies: in Angola, Armenia and Peru the legislation only applies to private companies that perform what are considered to be public functions. In the Czech Republic, the Dominican Republic, Finland, Trinidad and Tobago, Slovakia, Poland and Iceland, private bodies that receive public funding are subject to freedom of information legislation. Freedom of information legislation in Estonia, France and the UK covers private bodies in certain sectors. In South Africa the access provisions of the Promotion of Access to Information Act have been used by individuals to establish why their loan applications were denied. The access provisions have also been used by minority shareholders in private companies and by environmental groups seeking information on the potential environmental damage caused by company projects.

Access to information has increasingly been recognized as a prerequisite for transparency and accountability of governments, as facilitating consumers’ ability to make informed choices, and as safeguarding citizens against mismanagement and corruption. This has led an increasing number of countries to enact freedom of information legislation over the past 10 years. Private bodies have also started to perform functions which were previously carried out by public bodies: privatisation and deregulation have seen banks, telecommunications companies, hospitals and universities being run by private entities, leading to demands for the extension of freedom of information legislation to cover them.

In 1983 the United Nations Commission on Transnational Corporations adopted the United Nations Guidelines for Consumer Protection stipulating eight consumer rights, including “consumer access to adequate information to enable making informed choices according to individual wishes and needs”. Access to information became regarded as a basic consumer right, and preventive disclosure, i.e. the disclosure of information on threats to human lives, health and safety, began to be emphasized.

Secretive decision-making by company directors and accountancy fraud have also been linked to a number of corporate scandals, such as Enron, WorldCom, Tyco, Adelphia and Global Crossing. This prompted the US Congress to impose new information disclosure obligations on companies through the Sarbanes-Oxley Act of 2002, leading to freedom of information legislation which benefits investors.

Freedom of information (or information freedom) also refers to the protection of the right to freedom of expression with regard to the Internet and information technology. Freedom of information may also concern censorship in an information technology context, i.e. the ability to access Web content without censorship or restrictions. The World Summit on the Information Society (WSIS) Declaration of Principles, adopted in 2003, reaffirms democracy and the universality, indivisibility and interdependence of all human rights and fundamental freedoms. The Declaration also makes specific reference to the importance of the right to freedom of expression for the “Information Society” in stating:

“We reaffirm, as an essential foundation of the Information Society, and as outlined in Article 19 of the Universal Declaration of Human Rights, that everyone has the right to freedom of opinion and expression; that this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers. Communication is a fundamental social process, a basic human need and the foundation of all social organisation. It is central to the Information Society. Everyone, everywhere should have the opportunity to participate and no one should be excluded from the benefits the Information Society offers.”

The 2003 WSIS Declaration of Principles also acknowledged that it is necessary to prevent the use of information resources and technologies for criminal and terrorist purposes, while respecting human rights. However, the WSIS Declaration contains only a number of references to human rights and does not spell out any procedures or mechanisms to ensure that human rights are considered in practice.

The digital rights group Hacktivismo, founded in 1999, argues that access to information is a basic human right. The group’s beliefs are described fully in the “Hacktivismo Declaration”, which calls for the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights (ICCPR) to be applied to the Internet. The Declaration recalls the duty of member states to the ICCPR to protect the right to freedom of expression with regard to the Internet and, in this context, freedom of information. The Hacktivismo Declaration recognises the importance of fighting human rights abuses with respect to reasonable access to information on the Internet, and calls upon the hacker community to “study ways and means of circumventing state sponsored censorship of the internet” and to “implement technologies to challenge information rights violations”. The Hacktivismo Declaration does, however, recognise that the right to freedom of expression is subject to limitations, stating “we recognised the right of governments to forbid the publication of properly categorized state secrets, child pornography, and matters related to personal privacy and privilege, among other accepted restrictions”. However, the Hacktivismo Declaration opposes the use of state power to control access to the works of critics, intellectuals, artists, or religious figures.

In 2008 the Global Network Initiative (GNI) was founded upon its “Principles on Freedom of Expression and Privacy”. The Initiative was launched in the 60th anniversary year of the Universal Declaration of Human Rights (UDHR) and is based on internationally recognized laws and standards for human rights on freedom of expression and privacy set out in the UDHR, the International Covenant on Civil and Political Rights (ICCPR) and the International Covenant on Economic, Social and Cultural Rights (ICESCR). Participants in the Initiative include the Electronic Frontier Foundation, Human Rights Watch, Google, Microsoft, Yahoo, other major companies, human rights NGOs, investors, and academics. Although Cisco Systems was invited to the initial discussions, it did not take part in the initiative. Harrington Investments, which proposed that Cisco establish a human rights board, dismissed the GNI as a voluntary code of conduct without any impact, and called for bylaws to be introduced that would force boards of directors to accept human rights responsibilities. The Internet has been a revolution for censorship as much as for free speech, and the concept of freedom of information has emerged in response to state-sponsored censorship, monitoring and surveillance of the Internet. Internet censorship includes the control or suppression of the publishing or accessing of information on the Internet.

According to the Reporters Without Borders (RSF) “internet enemy list”, the following states engage in pervasive internet censorship: Cuba, Iran, the Maldives, Myanmar/Burma, North Korea, Syria, Tunisia, Uzbekistan, Vietnam and China. China’s “Great Firewall” blocks content by preventing IP addresses from being routed through, and consists of standard firewall and proxy servers at the Internet gateways; the system also selectively engages in DNS poisoning when particular sites are requested. Internet censorship in the People’s Republic of China is conducted under a wide variety of laws and administrative regulations. In accordance with these laws, more than sixty Internet regulations have been made by the People’s Republic of China (PRC) government, and censorship systems are vigorously implemented by provincial branches of state-owned ISPs, businesses, and organizations. In 2010, U.S. Secretary of State Hillary Clinton, speaking on behalf of the United States, declared “we stand for a single internet where all of humanity has equal access to knowledge and ideas”. In her “Remarks on Internet Freedom” she also drew attention to how “even in authoritarian countries, information networks are helping people discover new facts and making governments more accountable”, while reporting President Barack Obama’s pronouncement that “the more freely information flows, the stronger societies become”.
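DNS poisoning, mentioned above, works by answering a query for a blocked name with a forged record, so the client never reaches the real server even though the name itself was typed correctly. A toy model in Python, with every name and address below hypothetical (real poisoning happens in the network, not in the resolver code):

```python
# Toy model of DNS poisoning at a censoring gateway.
# All domain names and IP addresses here are hypothetical examples.

REAL_RECORDS = {                 # answers an honest resolver returns
    "example.com": "93.184.216.34",
    "blocked.example": "198.51.100.7",
}
BLOCKLIST = {"blocked.example"}  # names the censor targets
SINKHOLE = "10.0.0.1"            # forged address injected instead

def honest_resolve(name):
    """Return the genuine A record, or None if the name is unknown."""
    return REAL_RECORDS.get(name)

def poisoned_resolve(name):
    """A censoring middlebox answers blocked names with a forged
    record, so the client connects to a dead-end address while
    allowed names resolve normally."""
    if name in BLOCKLIST:
        return SINKHOLE
    return honest_resolve(name)

print(poisoned_resolve("example.com"))      # → 93.184.216.34
print(poisoned_resolve("blocked.example"))  # → 10.0.0.1
```

The key point the model illustrates is that the client cannot tell a forged answer from a real one, which is why plain DNS offers no protection against this technique.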