World Day Against Cyber-Censorship takes place annually on March 12. It aims to rally computer users in fighting repression of online speech and to celebrate the work of brave individuals who have promoted free expression on the Internet. The annual Netizen Prize is awarded to bloggers, online journalists, and cyber-dissidents who have demonstrated exceptional dedication to this cause. The day was first observed on March 12, 2008 at the request of Reporters Without Borders and Amnesty International. A letter written by Jean-Francois Julliard, Secretary-General of Reporters Without Borders, and Larry Cox, Executive Director of Amnesty International, was sent to the Chief Executive Officers of Google, Yahoo!, and Microsoft Corporation to request observation of the day.
The Electronic Frontier Foundation remains dedicated to reporting cases of online censorship from all regions of the world and emphasizes the importance of online anonymity in preserving individuals’ right to free speech. Its ongoing feature, This Week in Censorship, covers global stories of imprisoned bloggers, filtered content, blocked websites, and instances of Internet disconnection. A broad array of reasons is offered as justification for censorship. Bloggers in Thailand face imprisonment for criticizing the monarchy. In Pakistan, the Telecommunications Authority has blocked websites, banned words from SMS texts, and most recently released a request for proposals to build a national blocking and filtering system, all in the name of fighting “obscene content.” The Turkish government has implemented a so-called “democratic” opt-in filtering mechanism for content deemed unsuitable for children and families.
Another common trend is censorship enabled in the name of battling copyright violations. Through our Global Chokepoints project, we are monitoring instances of pro-copyright laws that justify content filtering, website blocking, or Internet disconnection to fight infringement. Censorship remains rampant in the Middle East. In Syria, Iran, and elsewhere, bloggers continue to face imprisonment, and ordinary users have limited access to online content due to state-mandated blocking and filtering programs. Another ongoing issue is the use by authoritarian states of Western-based surveillance technologies to monitor and spy on their citizens. State authorities can use the collected data to arrest, harass, or torture individuals accused of participating in political dissent.
Data Privacy Day takes place annually on January 28. The purpose of Data Privacy Day (known in Europe as Data Protection Day) is to raise awareness and promote privacy and data protection best practices. It is currently celebrated in the United States, Canada, and 27 European countries. Data privacy concerns the relationship between the collection and dissemination of data, technology, the public expectation of privacy, and the legal and political issues surrounding them. Privacy concerns exist wherever personally identifiable information or other sensitive information is collected, stored, used, and finally destroyed or deleted, in digital form or otherwise. Improper or non-existent disclosure control can be the root cause of privacy issues. Data privacy issues may arise in response to information from a wide range of sources, such as:
Criminal justice investigations and proceedings
Financial institutions and transactions
Biological traits, such as genetic material
Residence and geographic records
Location-based services and geolocation
Web surfing behavior or user preferences using persistent cookies
Internet security has become a growing concern. These concerns include whether email can be stored or read by third parties without consent, and whether third parties can continue to track the websites that someone has visited. Another concern is whether visited websites can collect, store, and possibly share personally identifiable information about users. The advent of various search engines and the use of data mining have made it easy for data about individuals to be collected and combined from a wide variety of sources. The FTC has provided a set of guidelines, the Fair Information Practice Principles, that represent widely accepted concepts concerning fair information practices in an electronic marketplace.
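The website tracking described above often rests on nothing more exotic than a persistent cookie. As a minimal sketch using Python’s standard library (the cookie name, identifier, and lifetime here are invented for illustration), this is how a server can build a response header that makes a browser identify itself for a year:

```python
from http.cookies import SimpleCookie

# A server that wants to recognise a returning visitor issues a cookie
# with a far-future expiry; the browser echoes it back on every request.
cookie = SimpleCookie()
cookie["visitor_id"] = "a1b2c3d4"                      # hypothetical identifier
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365   # persist for one year
cookie["visitor_id"]["path"] = "/"                     # send for the whole site

header = cookie.output(header="Set-Cookie:")
print(header)
```

Because the identifier survives browser restarts until it expires, every page request carrying it can be linked into one browsing history, which is exactly the capability privacy regulations scrutinise.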
To avoid giving away too much personal information, emails can be encrypted. Browsing of web pages and other online activities can be done tracelessly via anonymizers or, where those are not trusted, via open-source distributed anonymizers, so-called mix networks, such as I2P or Tor, The Onion Router.
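As a toy illustration of why encryption protects message content (this is a one-time-pad sketch, not a real email encryption scheme; real mail encryption uses standards such as OpenPGP or S/MIME):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """XOR the message with a fresh random key of equal length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so applying the key again recovers the text."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"meet me at noon"
ct, key = otp_encrypt(msg)
recovered = otp_decrypt(ct, key)
assert recovered == msg   # only the key holder can read the message
```

A third party who intercepts only `ct` learns nothing about the content; that is the property, delivered by far more practical algorithms, that encrypted email relies on.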
Email isn’t the only internet content with privacy concerns. In an age where increasing amounts of information are going online, social networking sites pose additional privacy challenges. People may be tagged in photos or have valuable information exposed about themselves either by choice or unexpectedly by others. Caution should be exercised with what information is posted, as social networks vary in what they allow users to make private and what remains publicly accessible. Without strong security settings in place and careful attention to what remains public, a person can be profiled by searching for and collecting disparate pieces of information, in the worst case leading to cyberstalking or reputational damage.
The challenge of data privacy is to use data while protecting an individual’s privacy preferences and their personally identifiable information. The fields of computer security, data security, and information security design and use software, hardware, and human resources to address this issue. Since the laws and regulations related to Privacy and Data Protection are constantly changing, it is important to keep abreast of any changes in the law and to continually reassess compliance with data privacy and security regulations. Within academia, Institutional Review Boards function to assure that adequate measures are taken to ensure both the privacy and confidentiality of human subjects in research.
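One widely used technique from these fields is pseudonymisation: replacing a direct identifier with a keyed hash so that records for the same person can still be linked without exposing the raw identifier. A minimal sketch with Python’s standard library (the email address is invented; real deployments follow specific regulatory guidance on key management):

```python
import hashlib
import hmac
import secrets

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed, irreversible token."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key is stored separately from the data, so the dataset alone
# cannot be mapped back to real identities.
key = secrets.token_bytes(32)

token_a = pseudonymise("alice@example.org", key)
token_b = pseudonymise("alice@example.org", key)
assert token_a == token_b            # same person -> same pseudonym, so linkage works
assert "alice" not in token_a        # the raw identifier never appears in the data
```

Using an HMAC rather than a plain hash matters: without the secret key, an attacker could simply hash candidate email addresses and compare.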
Data Privacy Day’s educational initiative was originally focused on raising awareness among businesses as well as users about the importance of protecting the privacy of their personal information online, particularly in the context of social networking. The educational focus has expanded over the past four years to include families, consumers and businesses. In addition to its educational initiative, Data Privacy Day promotes events and activities that stimulate the development of technology tools that promote individual control over personally identifiable information; encourage compliance with privacy laws and regulations; and create dialogues among stakeholders interested in advancing data protection and privacy. The international celebration offers many opportunities for collaboration among governments, industry, academia, nonprofits, privacy professionals and educators.
The Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was opened by the Council of Europe in 1981. This convention is currently being updated to reflect new legal challenges caused by technological development. The Convention on Cybercrime also protects the integrity of data systems, and thus of privacy, in cyberspace. Privacy, including data protection, is also protected by Article 8 of the European Convention on Human Rights. The day was initiated by the Council of Europe in 2007 as the European Data Protection Day. On January 26, 2009, the United States House of Representatives passed a resolution declaring January 28 National Data Privacy Day, and on January 28, 2009, the Senate officially recognised January 28, 2009 as National Data Privacy Day. In response to increasing levels of data breaches and the global importance of privacy and data security, the Online Trust Alliance (OTA) and the National Cyber Security Alliance adopted Data Privacy Day as Data Privacy & Protection Day, emphasizing the need to look at the long-term impact on consumers of data collection, use, and protection practices; they also organise other Data Protection Day activities.
The Analyst, Metaphysician, and Founder of Scientific Computing, Augusta Ada King, Countess of Lovelace, was born on 10 December 1815. Born Augusta Ada Byron and now commonly known as Ada Lovelace, she was the daughter of Lord Byron and is remembered as a mathematician and writer chiefly known for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be processed by a machine. Because of this, she is often considered the world’s first computer programmer and left a legacy as a role model for young women entering technology careers.
Ada was the only legitimate child born during the brief marriage between the poet Lord Byron and Anne Isabella Byron. She had no relationship with her father, who separated from her mother just a month after Ada was born; four months later he left England forever, and he died in Greece in 1823, leaving her mother to raise her single-handedly. Her life was a constant struggle between emotion and reason, subjectivism and objectivism, poetics and mathematics, ill health and bursts of energy. Lady Byron wished her daughter to be unlike her poetical father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. But Ada’s complex inheritance became apparent as early as 1828, when she produced the design for a flying machine. It was mathematics that gave her life its wings.
As a young adult, she took an interest in mathematics, and in particular in the work of Charles Babbage, Lucasian Professor of Mathematics at Cambridge, whom she met in 1833, when she was just 17. Babbage, one of the gentlemanly scientists of the era, became Ada’s lifelong friend. He was known as the inventor of the Difference Engine, an elaborate calculating machine that operated by the method of finite differences, and the two began a voluminous correspondence on the topics of mathematics, logic, and ultimately all subjects. In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace. Ada had three children. The family and its fortunes were very much directed by Lady Byron, whose domineering was rarely opposed by King. In 1834, although the Difference Engine was not finished, Babbage made plans for a new kind of calculating machine, an Analytical Engine.
His Parliamentary sponsors refused to support a second machine with the first unfinished, but Babbage found sympathy for his new project abroad. In 1842, an Italian mathematician, Luigi Menabrea, published a memoir in French on the subject of the Analytical Engine. Babbage enlisted Ada as translator for the memoir, and during a nine-month period in 1842–43, she worked feverishly on the article and a set of Notes she appended to it. These Notes contain what is considered the first computer program — that is, an algorithm encoded for processing by a machine — and are important in the early history of computers. Ada also foresaw the capability of computers to go beyond mere calculating or number-crunching while others, including Babbage himself, focused only on those capabilities.
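The algorithm in Ada’s Note G showed how the engine could compute the Bernoulli numbers. A minimal modern sketch of that computation, using the standard recurrence rather than Ada’s exact operation tables:

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list[Fraction]:
    """Return the Bernoulli numbers B_0..B_n as exact fractions,
    via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]                                   # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

# First few values: B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30
print(bernoulli(4))
```

Ada’s program expressed the same kind of iterative calculation as a sequence of engine operations on stored variables, which is why it is regarded as the first published computer program.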
Ada called herself an Analyst (& Metaphysician), and the combination was put to use in the Notes. She understood the plans for the device as well as Babbage did, but was better at articulating its promise. She rightly saw it as what we would call a general-purpose computer, suited for “developing and tabulating any function whatever. . . the engine is the material expression of any indefinite function of any degree of generality and complexity.” Her Notes anticipate future developments, including computer-generated music. Sadly, Ada passed away on November 27, 1852, in Marylebone at the age of 37, from cancer, and was buried beside the father she never knew. Her contributions to science were resurrected only relatively recently, but many new biographies attest to the fascination of Babbage’s “Enchantress of Numbers.”
Computer Security Day takes place annually on 30 November. Its purpose is to educate people concerning the threats of computer hacking, phishing, and scamming, to raise awareness about computer security, and to highlight measures that can be taken to keep your computer data safe from undesirable prying eyes. In this modern age, electronic devices such as smartphones, tablets, and computers play an increasingly important role in our everyday lives. While communication has become easier and more efficient than ever before, these technological advancements have also brought with them new concerns about privacy and security.
Computer Security Day began in 1988, around the time that computers were becoming commonplace, even if they were yet to become ubiquitous in homes. The 1980s saw increased use of computers, especially in business and government, while the internet was in its early stages. Although hacking and viruses have been around virtually since the early days of modern computing, increasingly sophisticated technologies found more applications and therefore carried more security risks, for the simple reason that more data was at stake as computers found their way into banks, government offices, and businesses. As more important data was stored on computers and servers, there was more valuable information for hackers, resulting in higher-profile security breaches, and online security had become an important concern by the end of the decade.
The Analyst, Metaphysician, and Founder of Scientific Computing, Augusta Ada King, Countess of Lovelace, sadly passed away on November 27, 1852, in Marylebone at the age of 37, from cancer. Born Augusta Ada Byron on 10 December 1815, she was the daughter of Lord Byron and is remembered as a mathematician and writer chiefly known for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be processed by a machine. Because of this, she is often considered the world’s first computer programmer and left a legacy as a role model for young women entering technology careers. Ada was the only legitimate child born to the poet Lord Byron and Anne Isabella Byron. She had no relationship with her father, who separated from her mother just a month after Ada was born; four months later he left England forever, and he died in Greece in 1823, leaving her mother to raise her single-handedly. Her life was a constant struggle between emotion and reason, subjectivism and objectivism, poetics and mathematics, ill health and bursts of energy.
Lady Byron wished her daughter to be unlike her poetic father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. This produced results when, in 1828, Ada produced a design for a flying machine. As a young adult, she remained interested in mathematics, and in particular in the work of Charles Babbage, Lucasian Professor of Mathematics at Cambridge, whom she met in 1833, when she was just 17. Babbage, one of the gentlemanly scientists of the era, became Ada’s lifelong friend. He was known as the inventor of the Difference Engine, an elaborate calculating machine that operated by the method of finite differences, and the two began a voluminous correspondence on the topics of mathematics, logic, and many other subjects.
In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace. Ada had three children. The family and its fortunes were very much directed by Lady Byron, whose domineering was rarely opposed by King. In 1834, although the Difference Engine was not finished, Babbage made plans for a new kind of calculating machine, an Analytical Engine. His Parliamentary sponsors refused to support a second machine with the first unfinished, but Babbage found sympathy for his new project abroad. In 1842, an Italian mathematician, Luigi Menabrea, published a memoir in French on the subject of the Analytical Engine. Babbage enlisted Ada as translator for the memoir, and during a nine-month period in 1842–43, she worked feverishly on the article and a set of Notes she appended to it. These Notes contain what is considered the first computer program — that is, an algorithm encoded for processing by a machine — and are important in the early history of computers. Ada also foresaw the capability of computers to go beyond mere calculating or number-crunching while others, including Babbage himself, focused only on those capabilities.
Ada called herself an Analyst & Metaphysician, and the combination was put to use in the Notes. She understood the plans for the device as well as Babbage did, but was better at explaining uses for it. She rightly saw it as what we would call a general-purpose computer. It was suited for “developing and tabulating any function whatever. . . the engine is the material expression of any indefinite function of any degree of generality and complexity.” Her Notes also anticipated future developments, including computer-generated music. Her contributions to science and fascination with Babbage’s engines earned her the nickname “Enchantress of Numbers.”
International Internet Day takes place annually on 29 October. The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide and build a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.
During the 1970s, science fiction novelist Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television, and a small computer, allowing data transfer and video conferencing around the globe.
The origins of the Internet date back to the 1960s, when research was commissioned by the federal government of the United States to build robust, fault-tolerant communication with computer networks. Initial concepts of wide area networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s, including for the development of the ARPANET project, directed by Robert Taylor and managed by Lawrence Roberts. The first message was sent over the ARPANET on 29 October 1969 from computer science Professor Leonard Kleinrock’s laboratory at the University of California, Los Angeles (UCLA) to the second network node at Stanford Research Institute (SRI).
Donald Davies first demonstrated packet switching in 1967 at the National Physical Laboratory (NPL) in the UK, which became a testbed for UK research for almost two decades. Packet switching networks such as the NPL network, ARPANET, Tymnet, Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET project led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks. The Internet protocol suite (TCP/IP) was developed by Robert E. Kahn and Vint Cerf in the 1970s and became the standard networking protocol on the ARPANET, incorporating concepts from the French CYCLADES project directed by Louis Pouzin. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States for research and education organizations.
The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the 1980s. Commercial Internet service providers (ISPs) began to emerge in the very late 1980s, and the ARPANET was decommissioned in 1990. Limited private connections to parts of the Internet by officially commercial entities emerged in several American cities by late 1989 and 1990. The National Science Foundation Network took over as the backbone, and private funding supported commercial extensions; this led to worldwide participation in the development of new networking technologies and the merger of many networks. The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet, and the NSFNET was decommissioned in 1995, removing the last restrictions on the use of the Internet to carry commercial traffic and generating sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life.
In the 1980s, at CERN, a European research organisation near Geneva straddling the border between France and Switzerland, British computer scientist and engineer Tim Berners-Lee and Belgian computer scientist Robert Cailliau proposed using hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”.
Berners-Lee and Cailliau wrote a proposal in March 1989 for a more effective CERN communication system, but Berners-Lee eventually realised the concept could be implemented throughout the world, using concepts from his earlier hypertext systems such as ENQUIRE. They published a more formal proposal on 12 November 1990 to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, so that authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.
The proposal was modeled after the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well), the first web server, and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee, which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.
On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August; for this reason, 23 August is considered Internaut’s Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; Gennaro has disclaimed this story, writing that media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, a date supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, which was described in the 1945 essay “As We May Think”.
Berners-Lee combined hypertext with the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally undertook the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. This significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
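The structure that a URL encodes can be seen by decomposing one with Python’s standard library (the address below is an invented example using the reserved example.org domain):

```python
from urllib.parse import urlsplit

# Split a URL into the parts Berners-Lee's identifier scheme defines.
parts = urlsplit("https://www.example.org/path/page.html?q=hypertext#history")

print(parts.scheme)    # the protocol to use, here HTTP over TLS
print(parts.netloc)    # the host that serves the resource
print(parts.path)      # the document on that host
print(parts.query)     # extra parameters for the server
print(parts.fragment)  # a location within the document, handled by the browser
```

A browser uses exactly this decomposition: the scheme selects HTTP, the host is resolved and contacted, and the path and query are sent in the request, with the fragment applied after the HTML document arrives.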
An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.
The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which are the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, adding international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet. The web is a collection of documents and both client and server software using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.
The Internet has no centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. Only the overreaching definitions of the two principal name spaces in the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). Most traditional communications media, including telephony, radio, television, paper mail and newspapers are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper, book, and other print publishing are adapting to website technology, or are reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interactions through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and small businesses and entrepreneurs, as it enables firms to extend their “brick and mortar” presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.
The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. Since the mid-1990s, the Internet has had a revolutionary impact on culture, commerce, and technology, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as JANET in the United Kingdom and Internet2 in the United States. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1-Gbit/s, 10-Gbit/s, or more. The Internet’s takeover of the global communication landscape was almost instant in historical terms: it only communicated 1% of the information flowing through two-way telecommunications networks in the year 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking. However, the future of the global internet may be shaped by regional differences in the world. In November 2006, the Internet was included on USA Today’s list of New Seven Wonders.
American business magnate, software executive, and philanthropist William Henry “Bill” Gates III was born on October 28, 1955. Gates is the former chief executive and current chairman of Microsoft, the world’s largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world’s wealthiest people and was the wealthiest overall from 1995 to 2009, excluding 2008, when he was ranked third; in 2011 he was the third-wealthiest American and the second-wealthiest person in the world. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and he remains the largest individual shareholder, with 6.4 percent of the common stock. He has also authored or co-authored several books. Gates is one of the best-known entrepreneurs of the personal computer revolution. He has been criticized for his business tactics, which have been considered anti-competitive, an opinion that has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.
Gates stepped down as chief executive officer of Microsoft in January 2000. He remained chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates’s last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.
Gulliver’s Travels by Jonathan Swift
The novel Gulliver’s Travels by the Irish writer and clergyman Jonathan Swift was published on 28 October 1726. Gulliver’s Travels is a satire whose full title is Travels into Several Remote Nations of the World. In Four Parts. By Lemuel Gulliver, First a Surgeon, and then a Captain of Several Ships. It is Swift’s best-known full-length work and a classic of English literature. Gulliver’s Travels has been described as a children’s story, proto-science fiction, and a forerunner of the modern novel. It was published seven years after Daniel Defoe’s wildly successful Robinson Crusoe. The novel begins with a short preamble in which Lemuel Gulliver gives a brief outline of his life and history before his voyages.
Part I: A Voyage to Lilliput. During Gulliver’s first voyage, he is washed ashore after being shipwrecked and finds himself a prisoner of a race of tiny people, less than 6 inches (15 cm) tall, who are inhabitants of the island country of Lilliput. After giving assurances of his good behaviour, he is given a residence in Lilliput and becomes a favourite of the Lilliputian royal court. He is also given permission by the King of Lilliput to go around the city on the condition that he not harm its subjects. At first, the Lilliputians are hospitable to Gulliver, but they are also wary of the threat that his size poses to them. The Lilliputians reveal themselves to be a people who put great emphasis on trivial matters; for example, which end of an egg a person cracks becomes the basis of a deep political rift within the nation. They are a people who revel in displays of authority and performances of power. Gulliver assists the Lilliputians in subduing their neighbours, the Blefuscudians, by stealing their fleet. However, he refuses to reduce the island nation of Blefuscu to a province of Lilliput, displeasing the King and the royal court. Gulliver is charged with treason for, among other crimes, “making water” in the capital, even though he was putting out a fire and saving countless lives. He is convicted and sentenced to be blinded. With the assistance of a kind friend, “a considerable person at court”, he escapes to Blefuscu. There he spots and retrieves an abandoned boat and sails out to be rescued by a passing ship, which safely takes him back home.
Part II: A Voyage to Brobdingnag. Gulliver soon sets out again. When the sailing ship Adventure is blown off course by storms and forced to sail for land in search of fresh water, Gulliver is abandoned by his companions and left on a peninsula on the western coast of the North American continent. The grass of that land is as tall as a tree. He is then found by a farmer who is about 72 feet tall. The farmer brings Gulliver home, and his daughter Glumdalclitch cares for him. The giant-sized farmer treats Gulliver as a curiosity and exhibits him for money. After a while the constant shows make Gulliver sick, and the farmer sells him to the Queen of the realm. Glumdalclitch (who accompanied her father while exhibiting Gulliver) is taken into the Queen of Brobdingnag’s service to take care of the tiny man. Since Gulliver is too small to use their huge chairs, beds, knives, and forks, the Queen commissions a small house to be built for him so that he can be carried around in it; this is referred to as his “travelling box”. However, because of his diminutive size, Gulliver becomes a target for various forms of wildlife, including giant wasps, giant monkeys, and a giant eagle.
Part III: A Voyage to Laputa, Balnibarbi, Luggnagg, Glubbdubdrib and Japan. After escaping Brobdingnag, Gulliver’s ship is attacked by pirates and he is marooned close to a desolate rocky island near India. However, he is rescued by the flying island of Laputa, a kingdom devoted to the arts of music, mathematics, and astronomy. Gulliver then tours Balnibarbi, the kingdom ruled from Laputa, and also learns of the rebellion that the kingdom of Lindalino led against the flying island. Gulliver sees the ruin brought about by the blind pursuit of science without practical results, in a satire on bureaucracy and on the Royal Society and its experiments. At the Grand Academy of Lagado in Balnibarbi, great resources and manpower are employed on researching completely preposterous schemes, such as extracting sunbeams from cucumbers, softening marble for use in pillows, learning how to mix paint by smell, and uncovering political conspiracies by examining the excrement of suspicious persons (muckraking). Gulliver is then taken to Maldonada, the main port of Balnibarbi, to await a trader who can take him on to Japan. While waiting for a passage, Gulliver visits the island of Glubbdubdrib, which lies southwest of Balnibarbi. On Glubbdubdrib, he visits a magician’s dwelling and discusses history with the ghosts of historical figures, including Julius Caesar, Brutus, Homer, Aristotle, René Descartes, and Pierre Gassendi. On the island of Luggnagg, he encounters the struldbrugs, people who are immortal. They do not have the gift of eternal youth but suffer the infirmities of old age and are considered legally dead at the age of eighty.
Part IV: A Voyage to the Land of the Houyhnhnms. Gulliver returns to sea as the captain of a merchantman, having become bored with his employment as a surgeon. His crew turn against him and abandon him in a landing boat. Upon reaching land, Gulliver encounters a race of hideous, deformed, and savage humanoid creatures to which he conceives a violent antipathy. Shortly afterwards, he meets the Houyhnhnms, a race of talking horses. They are the rulers, while the deformed creatures, called Yahoos, are human beings in their base form. Gulliver becomes a member of a horse’s household and comes to both admire and emulate the Houyhnhnms and their way of life, rejecting his fellow humans as merely Yahoos endowed with some semblance of reason, which they use only to exacerbate and add to the vices Nature gave them. An Assembly of the Houyhnhnms, however, rules that Gulliver is himself a Yahoo with some semblance of reason and is therefore a danger to their civilization.
National iPod Day is observed annually on October 23 to commemorate the introduction of the Apple iPod on October 23, 2001. The iPod changed the way we listened to and purchased music. The first iPod was sold on November 10, 2001, for $399. However, the price and Mac-only compatibility caused sales to be relatively slow until 2004. The iPod line came from Apple’s “digital hub” category, when the company began creating software for the growing market of personal digital devices. Digital cameras, camcorders, and organizers had well-established mainstream markets, but the company found existing digital music players “big and clunky or small and useless” with user interfaces that were “unbelievably awful,” so Apple decided to develop its own. As ordered by CEO Steve Jobs, Apple’s hardware engineering chief Jon Rubinstein assembled a team of engineers to design the iPod line, including hardware engineers Tony Fadell and Michael Dhuey and design engineer Sir Jonathan Ive. Rubinstein had already discovered the Toshiba hard disk drive while meeting with an Apple supplier in Japan and purchased the rights to it for Apple; he had also already worked out how the screen, battery, and other key elements would work. The aesthetic was inspired by the 1958 Braun T3 transistor radio designed by Dieter Rams, while the wheel-based user interface was prompted by Bang & Olufsen’s BeoCom 6000 telephone. The product, dubbed “the Walkman of the twenty-first century,” was developed in less than one year and unveiled on October 23, 2001. Jobs announced it as a Mac-compatible product with a 5 GB hard drive that put “1,000 songs in your pocket.”
Apple did not develop the iPod software entirely in-house, instead using PortalPlayer’s reference platform based on two ARM cores. The platform had rudimentary software running on a commercial microkernel embedded operating system. PortalPlayer had previously been working on an IBM-branded MP3 player with Bluetooth headphones. Apple contracted another company, Pixo, to help design and implement the user interface under the direct supervision of Steve Jobs. As development progressed, Apple continued to refine the software’s look and feel. Starting with the iPod Mini, the Chicago font was replaced with Espy Sans. Later iPods switched fonts again to Podium Sans—a font similar to Apple’s corporate font, Myriad. Color display iPods then adopted some Mac OS X themes like Aqua progress bars, and brushed metal meant to evoke a combination lock. In 2007, Apple modified the iPod interface again with the introduction of the sixth-generation iPod Classic and third-generation iPod Nano by changing the font to Helvetica and, in most cases, splitting the screen in half by displaying the menus on the left and album artwork, photos, or videos on the right (whichever was appropriate for the selected item).
In 2006, Apple presented a special edition of the fifth-generation iPod featuring the Irish rock band U2. Like its predecessor, this iPod had the signatures of the four members of the band engraved on its back, but it was the first time the company changed the colour of the metal (black rather than silver). This iPod was only available with 30 GB of storage capacity. The special edition entitled purchasers to an exclusive video with 33 minutes of interviews and performances by U2, downloadable from the iTunes Store. In 2007, during a lawsuit with patent-holding company Burst.com, Apple drew attention to a patent for a similar device that was developed in 1979. Kane Kramer applied for a UK patent for his design of a “plastic music box” in 1981, which he called the IXI. He was unable to secure funding to renew the US$120,000 worldwide patent, so it lapsed, and Kramer never profited from his idea.
The name iPod was proposed by Vinnie Chieco, a freelance copywriter, who (with others) was called in by Apple to figure out how to introduce the new player to the public. After Chieco saw a prototype, he thought of the movie 2001: A Space Odyssey and the phrase “Open the pod bay doors, Hal!”, which refers to the white EVA Pods of the Discovery One spaceship. Chieco saw an analogy between the relationship of the spaceship to its smaller independent pods and that of a personal computer to the music player. Apple researched the trademark and found that it was already in use. Joseph N. Grasso of New Jersey had originally listed an “iPod” trademark with the U.S. Patent and Trademark Office (USPTO) in July 2000 for Internet kiosks. The first iPod kiosks had been demonstrated to the public in New Jersey in March 1998, and commercial use began in January 2000, but it had apparently been discontinued by 2001. The trademark was registered by the USPTO in November 2003, and Grasso assigned it to Apple Computer, Inc. in 2005.
The earliest recorded use in commerce of an “iPod” trademark was in 1991 by Chrysalis Corp. of Sturgis, Michigan, styled “iPOD”. In mid-2015, several new color schemes for all of the current iPod models were spotted in the latest version of iTunes, 12.2. Belgian website Belgium iPhone originally found the images when plugging in an iPod for the first time, and subsequent leaked photos were found by Pierre Dandumont. In 2017, Apple removed the iPod Nano and Shuffle from its stores, marking the end of Apple producing standalone music players.
Spreadsheet Day takes place annually on October 17 in commemoration of the first spreadsheet program, VisiCalc, which was released on October 17, 1979, for the Apple II.
A spreadsheet is an interactive computer application for the organization, analysis, and storage of data in tabular form. Spreadsheets developed as computerized analogs of paper accounting worksheets. The program operates on data entered in the cells of a table. Each cell may contain either numeric or text data, or the results of formulas that automatically calculate and display a value based on the contents of other cells. A spreadsheet may also refer to one such electronic document. The first spreadsheet program for personal computers, VisiCalc, was developed by Dan Bricklin of Software Arts and produced for distribution by Personal Software; it marked the transition of the Apple II from a machine for computer enthusiasts into a viable tool for business. The advantage of VisiCalc was that it could be used on personal computers, finally putting this valuable tool into the hands of homeowners and small-business owners alike.
Spreadsheet users can adjust any stored value and observe the effects on calculated values. This makes the spreadsheet useful for “what-if” analysis since many cases can be rapidly investigated without manual recalculation. Modern spreadsheet software can have multiple interacting sheets, and can display data either as text and numerals, or in graphical form. Besides performing basic arithmetic and mathematical functions, modern spreadsheets provide built-in functions for common financial and statistical operations. Such calculations as net present value or standard deviation can be applied to tabular data with a pre-programmed function in a formula. Spreadsheet programs also provide conditional expressions, functions to convert between text and numbers, and functions that operate on strings of text. Spreadsheets have replaced paper-based systems throughout the business world. Although they were first developed for accounting or bookkeeping tasks, they now are used extensively in any context where tabular lists are built, sorted, and shared.
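The mechanism described above, where cells hold either literal values or formulas over other cells and dependent values are recomputed when read, can be sketched in a few lines of Python. The class and cell names below are illustrative only, not taken from any real spreadsheet program:

```python
# Minimal sketch of spreadsheet-style recalculation.
# A cell holds either a literal value or a formula (a function of the
# sheet); reading a formula cell recomputes it from the current inputs,
# which is what makes "what-if" analysis work: change one input and
# re-read every value derived from it.

class Sheet:
    def __init__(self):
        self.cells = {}          # maps a reference like "A1" to its contents

    def set(self, ref, value):
        # value is either a literal (number, text) or a callable formula
        self.cells[ref] = value

    def get(self, ref):
        v = self.cells[ref]
        # formulas are evaluated on demand against the current sheet
        return v(self) if callable(v) else v

sheet = Sheet()
sheet.set("A1", 100)                                   # price
sheet.set("A2", 3)                                     # quantity
sheet.set("A3", lambda s: s.get("A1") * s.get("A2"))   # formula: =A1*A2

print(sheet.get("A3"))   # 300
sheet.set("A2", 5)       # "what-if": adjust one stored value...
print(sheet.get("A3"))   # ...and observe the effect: 500
```

Real spreadsheet engines track a dependency graph and cache results rather than recomputing on every read, but the "what-if" behaviour is the same: changing one input changes every value calculated from it.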
LANPAR, available in 1969, was the first electronic spreadsheet, running on mainframe and time-sharing computers; its name was an acronym for LANguage for Programming Arrays at Random. VisiCalc was the first electronic spreadsheet on a microcomputer, and it helped turn the Apple II into a popular and widely used system. Lotus 1-2-3 was the leading spreadsheet when DOS was the dominant operating system. Excel now has the largest market share on the Windows and Macintosh platforms. A spreadsheet program is a standard feature of an office productivity suite; since the advent of web apps, office suites now also exist in web app form. Web-based spreadsheets are a relatively new category.
Virtual Worlds Day takes place annually on August 20. A virtual world is a computer-simulated environment which may be populated by many users, who can create a personal avatar and simultaneously and independently explore the virtual world, participate in its activities, and communicate with others. These avatars can be textual or graphical representations, or live video avatars with auditory and touch sensations. In general, virtual worlds allow for multiple users, but single-player computer games, such as Skyrim, can also be considered a type of virtual world.
The user accesses a computer-simulated world which presents perceptual stimuli to the user, who in turn can manipulate elements of the modeled world and thus experience a degree of presence. Such modeled worlds and their rules may draw from reality or fantasy worlds. Example rules are gravity, topography, locomotion, real-time actions, and communication. Communication between users can take the form of text, graphical icons, visual gestures, and sound, and, more rarely, touch, voice command, and the sense of balance.
Massively multiplayer online games depict a wide range of worlds, including those based on science fiction, the real world, superheroes, sports, horror, and historical milieus. The most common form of such games is the fantasy world. Most MMORPGs have real-time actions and communication. Players create a character who travels between buildings, towns, and worlds to carry out business or leisure activities. Communication is usually textual, but real-time voice communication is also possible. The form of communication used can substantially affect the experience of players in the game.
Virtual worlds are not limited to games but, depending on the degree of immediacy presented, can encompass computer conferencing and text-based chatrooms. Sometimes, emoticons or “smilies” are available to show feeling or facial expression. Emoticons often have a keyboard shortcut. “Synthetic worlds” is another term for virtual worlds.
International Geocaching Day
United States National Lemonade Day
National Bacon Lover’s Day
National Chocolate Pecan Pie Day
National Honey Bee Day
National Radio Day (not to be confused with World Radio day)