Ada Lovelace (Enchantress of Numbers)

The Analyst, Metaphysician, and Founder of Scientific Computing, Augusta Ada King, Countess of Lovelace, was born on 10 December 1815. Born Augusta Ada Byron and now commonly known as Ada Lovelace, she was the daughter of Lord Byron and is remembered as a mathematician and writer chiefly known for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be processed by a machine. Because of this, she is often considered the world’s first computer programmer, and she left a legacy as a role model for young women entering technology careers.

Ada was the only legitimate child born during the brief marriage between the poet Lord Byron and Anne Isabella Byron. She had no relationship with her father, who separated from her mother just a month after Ada was born; four months later he left England forever, and he died in Greece in 1824, leaving her mother to raise Ada single-handedly. Her life was an apotheosis of struggle between emotion and reason, subjectivism and objectivism, poetics and mathematics, ill health and bursts of energy. Lady Byron wished her daughter to be unlike her poetical father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. But Ada’s complex inheritance became apparent as early as 1828, when she produced the design for a flying machine. It was mathematics that gave her life its wings.

As a young adult she took an interest in mathematics, and in particular in the work of Charles Babbage, Lucasian Professor of Mathematics at Cambridge, whom she met in 1833, when she was just 17. Babbage, one of the gentlemanly scientists of the era, became Ada’s lifelong friend. He was known as the inventor of the Difference Engine, an elaborate calculating machine that operated by the method of finite differences, and the two began a voluminous correspondence on the topics of mathematics, logic, and ultimately all subjects. In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace. Ada had three children. The family and its fortunes were very much directed by Lady Byron, whose domineering was rarely opposed by King. In 1834, although the Difference Engine was not finished, Babbage had made plans for a new kind of calculating machine: the Analytical Engine.

His parliamentary sponsors refused to support a second machine with the first unfinished, but Babbage found sympathy for his new project abroad. In 1842, the Italian mathematician Luigi Menabrea published a memoir in French on the subject of the Analytical Engine. Babbage enlisted Ada as translator for the memoir, and during a nine-month period in 1842–43 she worked feverishly on the article and a set of Notes she appended to it. These Notes contain what is considered the first computer program, that is, an algorithm encoded for processing by a machine, and they are important in the early history of computers. She also foresaw the capability of computers to go beyond mere calculating or number-crunching, while others, including Babbage himself, focused only on those capabilities.
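
The most detailed of these Notes, Note G, set out a step-by-step method for computing Bernoulli numbers on the Analytical Engine. As a rough modern illustration only, not a transcription of her table of operations, the short Python sketch below produces the same numbers from the standard recurrence over binomial coefficients.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n (with B_1 = -1/2), from the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1."""
    B = [Fraction(1)]                        # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))             # solve the recurrence for B_m
    return B

print(bernoulli(8))   # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, B_8 = -1/30
```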

Ada called herself an Analyst (& Metaphysician), and the combination was put to use in the Notes. She understood the plans for the device as well as Babbage did, but was better at articulating its promise. She rightly saw it as what we would call a general-purpose computer: it was suited for “developing and tabulating any function whatever… the engine is the material expression of any indefinite function of any degree of generality and complexity.” Her Notes anticipate future developments, including computer-generated music. Sadly, Ada passed away on 27 November 1852, in Marylebone, at the age of 36, from cancer, and was buried beside the father she never knew. Her contributions to science were resurrected only recently, but many new biographies attest to the fascination of Babbage’s “Enchantress of Numbers.”

Computer Security Day

Computer Security Day takes place annually on 30 November. The purpose of Computer Security Day is to educate people about the threat of computer hacking, phishing, and scamming, to raise awareness about computer security, and to highlight measures that can be taken to keep your computer data safe from undesirable prying eyes.

In this modern age, electronic devices such as smartphones, tablets, and computers play an increasingly important role in our everyday lives. While communication has become easier and more efficient than ever before, these technological advancements have also brought with them new concerns about privacy and security.

Computer Security Day began in 1988, around the time that computers were becoming commonplace, even if they were yet to become ubiquitous in homes. The 1980s saw increased usage of computers, especially in business and government, while the internet was still in its early stages. Hacking and viruses have been around virtually since the early days of modern computing, but as evolving and increasingly sophisticated technologies found more applications, the security risks grew, for the simple reason that more data was at risk as computers found their way into banks, government offices, and businesses. As more important data was stored on computers and servers, there was more valuable information for hackers, resulting in higher-profile cases of security breaches, and online security had become an important concern by the end of the decade.

International Internet Day

International Internet Day takes place annually on 29 October. The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide and build a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.
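
In practice, “linking devices” with the TCP/IP suite comes down to programs opening connections and exchanging bytes. The sketch below is a minimal, self-contained illustration using only Python’s standard library: it runs a tiny TCP listener on the loopback interface and connects to it, so the whole exchange happens on one machine.

```python
import socket
import threading

# One TCP exchange over the loopback interface: a small listener thread accepts
# a connection and echoes what it receives; the main thread connects and sends
# a message. The same socket API is what links devices across the real Internet.

def serve(listener):
    conn, _ = listener.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"echo: " + data)

listener = socket.create_server(("127.0.0.1", 0))      # OS picks a free port
port = listener.getsockname()[1]
threading.Thread(target=serve, args=(listener,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello over TCP/IP")
    print(client.recv(1024).decode())                   # -> echo: hello over TCP/IP
```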

During the 1970s, science fiction novelist Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe.

The origins of the Internet date back to the 1960s, when research was commissioned by the federal government of the United States to build robust, fault-tolerant communication via computer networks. Initial concepts of wide area networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s, including for the development of the ARPANET project, directed by Robert Taylor and managed by Lawrence Roberts. The first message was sent over the ARPANET on 29 October 1969, from computer science professor Leonard Kleinrock’s laboratory at the University of California, Los Angeles (UCLA) to the second network node at the Stanford Research Institute (SRI).

Donald Davies first demonstrated packet switching in 1967 at the National Physical Laboratory (NPL) in the UK, which became a testbed for UK research for almost two decades. Packet switching networks such as the NPL network, ARPANET, Tymnet, Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET project led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks. The Internet protocol suite (TCP/IP) was developed by Robert E. Kahn and Vint Cerf in the 1970s and became the standard networking protocol on the ARPANET, incorporating concepts from the French CYCLADES project directed by Louis Pouzin. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States for research and education organizations.
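
The core idea of packet switching described above is that a message is broken into small, independently routed packets and reassembled at the destination. The toy Python sketch below illustrates just that idea (sequence numbers, out-of-order arrival, reassembly); real protocols such as TCP of course add headers, checksums, acknowledgements, and retransmission.

```python
import random

# Cut a message into numbered packets, deliver them out of order, and put them
# back together by sequence number at the destination. Illustrative only.

def packetize(message: bytes, size: int = 8):
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return b"".join(chunk for _, chunk in sorted(packets))

packets = packetize(b"Packets may take different routes across the network.")
random.shuffle(packets)                  # simulate out-of-order arrival
print(reassemble(packets).decode())
```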

The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the 1980s, until commercial Internet service providers (ISPs) began to emerge in the very late 1980s, whereupon the ARPANET was decommissioned in 1990. Limited private connections to parts of the Internet by officially commercial entities emerged in several American cities by late 1989 and 1990. The National Science Foundation Network had acted as a new backbone in the 1980s, and private funding for other commercial extensions led to worldwide participation in the development of new networking technologies and the merger of many networks. The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet, and the NSFNET was decommissioned in 1995, removing the last restrictions on the use of the Internet to carry commercial traffic. This generated sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life.

In the 1980s, at CERN, a European research organisation near Geneva straddling the border between France and Switzerland, British computer scientist and engineer Tim Berners-Lee and Belgian computer scientist Robert Cailliau proposed using hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”.

Berners-Lee and Cailliau wrote a proposal in March 1989 for a more effective CERN communication system, but Berners-Lee eventually realised the concept could be implemented throughout the world, using concepts from his earlier hypertext systems such as ENQUIRE. They published a more formal proposal (on 12 November 1990) to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, so that authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled after the SGML reader Dynatext by Electronic Book Technologies, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well), the first web server, and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August; for this reason that date is considered Internaut’s Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; Gennaro has disclaimed this story, writing that media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, which is supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, which was described in the 1945 essay “As We May Think”.

Berners-Lee combined hypertext with the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally undertook the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
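
Those three technologies still appear together in every page fetch. The short sketch below, a minimal illustration using Python’s standard library, requests a URL over HTTP and prints the start of the returned HTML; example.org is used purely as a placeholder host.

```python
from urllib.request import urlopen

# URL + HTTP + HTML in one round trip. The <a href="..."> elements inside the
# returned markup are the one-way links discussed above: they require no action
# by the owner of the linked-to resource.
with urlopen("http://example.org/") as response:    # HTTP request to a URL
    html = response.read().decode("utf-8")           # the HTML document
    print(response.status)                           # e.g. 200
    print(html[:80], "...")                          # the start of the markup
```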

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web’s popularity was less than that of older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which were the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, adding international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format, and thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, the World Wide Web is not synonymous with the Internet: the web is a collection of documents and of client and server software using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

The Internet has no centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. Only the overreaching definitions of the two principal name spaces in the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). Most traditional communications media, including telephony, radio, television, paper mail and newspapers are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper, book, and other print publishing are adapting to website technology, or are reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interactions through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and small businesses and entrepreneurs, as it enables firms to extend their “brick and mortar” presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.
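
The two name spaces mentioned above are easy to see from any connected machine: DNS maps a human-readable host name onto IP addresses, which are what the underlying routing actually uses. A minimal Python sketch (example.com is just a placeholder host name):

```python
import socket

# Resolve a host name (DNS name space) to its IP addresses (IP address space).
infos = socket.getaddrinfo("example.com", 80, proto=socket.IPPROTO_TCP)
for family, _, _, _, sockaddr in infos:
    label = "IPv6" if family == socket.AF_INET6 else "IPv4"
    print(label, sockaddr[0])
```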

The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. Since the mid-1990s, the Internet has had a revolutionary impact on culture, commerce, and technology, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as JANET in the United Kingdom and Internet2 in the United States. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1-Gbit/s, 10-Gbit/s, or more. The Internet’s takeover of the global communication landscape was almost instant in historical terms: it only communicated 1% of the information flowing through two-way telecommunications networks in the year 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking. However, the future of the global internet may be shaped by regional differences in the world. In November 2006, the Internet was included on USA Today’s list of New Seven Wonders.

Bill Gates

American business magnate, software executive and philanthropist William Henry “Bill” Gates III was born on October 28, 1955. Bill Gates is the former chief executive and current chairman of Microsoft, the world’s largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world’s wealthiest people and was the wealthiest overall from 1995 to 2009, excluding 2008, when he was ranked third; in 2011 he was the third wealthiest American and the second wealthiest person. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder, with 6.4 percent of the common stock. He has also authored or co-authored several books. Gates is one of the best-known entrepreneurs of the personal computer revolution. He has been criticized for his business tactics, which have been considered anti-competitive, an opinion which has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.

Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work, and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates’s last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.

National iPod Day

National iPod Day is observed annually on October 23. Apple introduced the iPod on October 23, 2001. The iPod changed the way we listened to and purchased music. The first iPod was sold on November 10, 2001, for $399; however, the price and Mac-only compatibility caused sales to be relatively slow until 2004. The iPod line came from Apple’s “digital hub” category, when the company began creating software for the growing market of personal digital devices. Digital cameras, camcorders and organizers had well-established mainstream markets, but the company found existing digital music players “big and clunky or small and useless” with user interfaces that were “unbelievably awful,” so Apple decided to develop its own. As ordered by CEO Steve Jobs, Apple’s hardware engineering chief Jon Rubinstein assembled a team of engineers to design the iPod line, including hardware engineers Tony Fadell and Michael Dhuey, and design engineer Sir Jonathan Ive. Rubinstein had already discovered the Toshiba hard disk drive while meeting with an Apple supplier in Japan, and purchased the rights to it for Apple, and had also already worked out how the screen, battery, and other key elements would work. The aesthetic was inspired by the 1958 Braun T3 transistor radio designed by Dieter Rams, while the wheel-based user interface was prompted by Bang & Olufsen’s BeoCom 6000 telephone. The product, called “the Walkman of the twenty-first century”, was developed in less than one year and unveiled on October 23, 2001. Jobs announced it as a Mac-compatible product with a 5 GB hard drive that put “1,000 songs in your pocket.”

Apple did not develop the iPod software entirely in-house, instead using PortalPlayer’s reference platform based on two ARM cores. The platform had rudimentary software running on a commercial microkernel embedded operating system. PortalPlayer had previously been working on an IBM-branded MP3 player with Bluetooth headphones. Apple contracted another company, Pixo, to help design and implement the user interface under the direct supervision of Steve Jobs. As development progressed, Apple continued to refine the software’s look and feel. Starting with the iPod Mini, the Chicago font was replaced with Espy Sans. Later iPods switched fonts again to Podium Sans—a font similar to Apple’s corporate font, Myriad. Color display iPods then adopted some Mac OS X themes like Aqua progress bars, and brushed metal meant to evoke a combination lock. In 2007, Apple modified the iPod interface again with the introduction of the sixth-generation iPod Classic and third-generation iPod Nano by changing the font to Helvetica and, in most cases, splitting the screen in half by displaying the menus on the left and album artwork, photos, or videos on the right (whichever was appropriate for the selected item).

In 2006 Apple presented a special U2 edition of the fifth-generation iPod, made with the Irish rock band U2. Like its predecessor, this iPod had the signatures of the four members of the band engraved on its back, but it was the first time the company changed the colour of the metal (black rather than silver). This iPod was only available with 30 GB of storage capacity. The special edition entitled purchasers to an exclusive video with 33 minutes of interviews and performance by U2, downloadable from the iTunes Store. In 2007, during a lawsuit with patent holding company Burst.com, Apple drew attention to a patent for a similar device that was developed in 1979. Kane Kramer applied for a UK patent for his design of a “plastic music box” in 1981, which he called the IXI. He was unable to secure funding to renew the US$120,000 worldwide patent, so it lapsed and Kramer never profited from his idea.

The name iPod was proposed by Vinnie Chieco, a freelance copywriter, who (with others) was called by Apple to figure out how to introduce the new player to the public. After Chieco saw a prototype, he thought of the movie 2001: A Space Odyssey and the phrase “Open the pod bay door, Hal!”, which refers to the white EVA Pods of the Discovery One spaceship. Chieco saw an analogy to the relationship between the spaceship and the smaller independent pods in the relationship between a personal computer and the music player. Apple researched the trademark and found that it was already in use. Joseph N. Grasso of New Jersey had originally listed an “iPod” trademark with the U.S. Patent and Trademark Office (USPTO) in July 2000 for Internet kiosks. The first iPod kiosks had been demonstrated to the public in New Jersey in March 1998, and commercial use began in January 2000, but had apparently been discontinued by 2001. The trademark was registered by the USPTO in November 2003, and Grasso assigned it to Apple Computer, Inc. in 2005.

The earliest recorded use in commerce of an “iPod” trademark was in 1991 by Chrysalis Corp. of Sturgis, Michigan, styled “iPOD”. In mid-2015, several new color schemes for all of the current iPod models were spotted in the latest version of iTunes, 12.2. Belgian website Belgium iPhone originally found the images when plugging in an iPod for the first time, and subsequent leaked photos were found by Pierre Dandumont. In 2017, Apple removed the iPod Nano and Shuffle from its stores, marking the end of Apple producing standalone music players. Currently, the iPod Touch is the only iPod produced by Apple.

INTERNATIONAL CAPS LOCK DAY

INTERNATIONAL CAPS LOCK DAY IS CELEBRATED ANNUALLY ON 22 OCTOBER. INTERNATIONAL CAPS LOCK DAY was founded in 2000 by Derek Arnold of Iowa as a parody, after he decided that he, like so many other internet users, had simply had enough of people using all caps to emphasize themselves on the web. So he created INTERNATIONAL CAPS LOCK DAY to poke fun at those individuals who unnecessarily capitalize letters, words, and phrases and to bring some sanity back to the web. The day became so popular with internet users that INTERNATIONAL CAPS LOCK DAY is now celebrated twice a year, on June 28 and on October 22. The second observation on June 28 was added by Arnold in memory of American pitchman Billy Mays.

Caps Lock is a button on a computer keyboard that causes all letters of Latin-based scripts to be generated in capitals. It is a toggle key: each press reverses its action. Some keyboards also implement a light, so as to give visual feedback about whether it is on or off. Exactly what Caps Lock does depends on the keyboard hardware, the operating system, the device driver, and the keyboard layout. Usually, the effect is limited to letter keys; letters of Latin-based scripts are capitalised, while letters of other scripts (e.g. Arabic, Hebrew, Hindi) and non-letter characters are generated normally.

The Caps Lock key originated as a Shift lock key on mechanical typewriters. An early innovation in typewriters was the introduction of a second character on each typebar, thereby doubling the number of characters that could be typed, using the same number of keys. The second character was positioned above the first on the face of each typebar, and the typewriter’s Shift key caused the entire type apparatus to move, physically shifting the positioning of the typebars relative to the ink ribbon. Just as in modern computer keyboards, the shifted position was used to produce capitals and secondary characters.

The Shift lock key was introduced so the shift operation could be maintained indefinitely without continuous effort. It mechanically locked the typebars in the shifted position, causing the upper character to be typed upon pressing any key. Because the two shift keys on a typewriter required more force to operate and were meant to be pressed by the little finger, it could be difficult to hold the shift down for more than two or three consecutive strokes; the introduction of the Shift lock key was therefore also meant to reduce finger muscle pain caused by repetitive typing.

Mechanical typewriter shift lock is typically set by pushing both Shift and lock at the same time, and released by pressing Shift by itself. Computer Caps Lock is set and released by the same key, and the Caps Lock behavior in most QWERTY keyboard layouts differs from the Shift lock behavior in that it capitalizes letters but does not affect other keys, such as numbers or punctuation. Some early computer keyboards, such as the Commodore 64, had a Shift lock but no Caps Lock; others, such as the BBC Micro, had both, only one of which could be enabled at a time.

Typical Caps Lock behavior is that pressing the key sets an input mode in which all typed letters are uppercase, if applicable. The keyboard remains in Caps Lock mode, generating all-caps text, until the key is pressed again. Keyboards often include a small LED to indicate that Caps Lock is active, either on the key itself or in a dedicated indicators area, where the Scroll Lock and Num Lock indicators are also located. On the original IBM PC keyboard, this LED was exclusively controlled by the keyboard; since the introduction of the IBM AT, however, it has been under the control of the operating system. Small keyboards, such as netbook keyboards, forgo the indicators to conserve space, instead providing software that gives on-screen or audio feedback.

In most cases, the status of the Caps Lock key only changes the meaning of the alphabet keys, not that of any other key. Microsoft Windows enforces this behavior only when a keyboard layout for a Latin-based script is active, e.g. the “English (United States)” layout but not the “Persian” layout. However, on certain non-QWERTY keyboard layouts, such as the French AZERTY and the German QWERTZ, Caps Lock still behaves like a traditional Shift lock, i.e. the keyboard behaves as if the Shift key is held down, causing the keyboard to input the alternative values of the keys; for example, the “5” key generates a “%” when Caps Lock is on.

Depending on the keyboard layout used, the Shift key, when pressed in combination with a Latin-based letter key while Caps Lock is already on, is either ignored or nullifies the effect of Caps Lock, so that typed characters are in lowercase again; Microsoft Windows enforces the latter. While the typical locking behavior on keyboards with a Caps Lock key is that of a toggle, each press reversing the shift state, some keyboard layouts implement a combi mode, where pressing a Shift key in Caps Lock mode also releases Caps Lock mode, just as typically happens in Shift lock mode. Some keyboard drivers include a configuration option to deactivate the Caps Lock key, allowing users to decide for themselves whether they want to use the key or to disable it to prevent accidental activation.
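
The interaction between Caps Lock, Shift lock, and Shift described above can be summed up in a few lines of code. The Python sketch below is a simplified model, not any particular driver’s implementation: “caps lock” style affects letters only and is reversed by Shift, while “shift lock” style behaves like a held Shift for every key.

```python
# Sample non-letter shift mappings, purely illustrative.
SHIFTED = {"5": "%", "1": "!", ";": ":"}

def key_output(key, caps_lock=False, shift=False, shift_lock_style=False):
    if shift_lock_style:
        effective_shift = shift or caps_lock        # the lock acts like held Shift
        if key.isalpha():
            return key.upper() if effective_shift else key.lower()
        return SHIFTED.get(key, key) if effective_shift else key
    # Caps Lock style: letters only, and Shift reverses the locked state.
    if key.isalpha():
        upper = caps_lock != shift                   # XOR of the two states
        return key.upper() if upper else key.lower()
    return SHIFTED.get(key, key) if shift else key

print(key_output("a", caps_lock=True))                         # A
print(key_output("a", caps_lock=True, shift=True))             # a  (Shift undoes it)
print(key_output("5", caps_lock=True))                         # 5  (numbers unaffected)
print(key_output("5", caps_lock=True, shift_lock_style=True))  # %  (AZERTY-style lock)
```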

However, there are some protocols that make it appropriate to post in all caps, such as when posting as part of a weather monitoring network. In this rare and perhaps singular case, all caps is how you indicate that something is in fact important, and the network’s collection of acronyms and shorthand makes sure the message is clear. Overuse of CAPS LOCK elsewhere, though, is generally read as shouting, which is exactly what the day pokes fun at.

Information Overload Day

Information Overload Day takes place annually on 20 October. Information Overload Day is about taking control of the flow of information into your life, and limiting it when and if necessary. It was established by a group of companies who wanted to create an awareness day to call attention to what happens when you overload your employees and customers with far too much information.

Research has shown that productivity is affected by the sheer amount of information flowing through our lives, with the average employee receiving no fewer than 93 emails a day. Combine that with the social media that rules our lives, the constant buzz of new text messages, and the old stand-by that is web browsing, and getting overwhelmed is pretty understandable. It is also having a severe economic impact on time spent for business purposes, not personal ones. A worker may be interrupted by an email just when they are deep in the middle of a project and will have to refocus and start again after every interruption. These little pauses may not seem like much individually, but combined they add up to a $180 billion impact.
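
To see how small pauses can reach a figure of that size, here is a back-of-envelope sketch. Every number in it other than the 93 emails a day and the $180 billion total is an assumption made up for illustration, not data from the text.

```python
# A rough estimate of how interruptions add up. All values marked "assumed"
# are illustrative assumptions, not figures from the text above.
emails_per_day     = 93          # from the text
interrupting_share = 0.2         # assumed: 1 in 5 emails breaks concentration
refocus_minutes    = 1.5         # assumed cost of regaining focus per interruption
hourly_cost        = 35          # assumed loaded cost of an hour of work, USD
workdays_per_year  = 230         # assumed
workers            = 50_000_000  # assumed number of affected knowledge workers

lost_hours_per_day = emails_per_day * interrupting_share * refocus_minutes / 60
annual_cost = lost_hours_per_day * hourly_cost * workdays_per_year * workers
print(f"~${annual_cost / 1e9:.0f} billion per year")   # ~$187 billion, the same ballpark
```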

You can celebrate Information Overload Day by not checking your email so often. Log out of your email client, and log in only five times a day to help limit the number of interruptions you get. Turn your phone off, including the vibration, and use those same stops to reply to text messages and return calls. These interruptions cost more time than you think, and you’d be surprised how productive you could be if you just eliminated them from your day.

20 October is also:

  • Bridge Day
  • International Independent Video Store Day
  • Miss American Rose Day
  • National Brandied Fruit Day
  • National Call-in Day for Health Reform
  • National Suspenders Day
  • Sweetest Day
  • International Day for Air Traffic Controllers