Cathode Ray Tube Day

Cathode Ray Tube Day takes place annually on 20 December. The cathode-ray tube (CRT) is a vacuum tube that contains one or more electron guns and a phosphorescent screen, and is used to display images. It modulates, accelerates, and deflects electron beam(s) onto the screen to create the images. The images may represent electrical waveforms (oscilloscope), pictures (television, computer monitor), radar targets, or other phenomena. CRTs have also been used as memory devices, in which case the visible light emitted from the fluorescent material (if any) is not intended to have significant meaning to a visual observer (though the visible pattern on the tube face may cryptically represent the stored data).

In television sets and computer monitors, the entire front area of the tube is scanned repetitively and systematically in a fixed pattern called a raster. An image is produced by controlling the intensity of each of the three electron beams, one for each additive primary color (red, green, and blue) with a video signal as a reference. In all modern CRT monitors and televisions, the beams are bent by magnetic deflection, a varying magnetic field generated by coils and driven by electronic circuits around the neck of the tube, although electrostatic deflection is commonly used in oscilloscopes, a type of electronic test instrument.

A CRT is constructed from a glass envelope which is large, deep (i.e., long from front screen face to rear end), fairly heavy, and relatively fragile. The interior of a CRT is evacuated to approximately 0.01 pascals (9.9×10−8 atm) to 133 nanopascals (1.31×10−12 atm), evacuation being necessary to facilitate the free flight of electrons from the gun(s) to the tube’s face. The fact that it is evacuated makes handling an intact CRT potentially dangerous due to the risk of breaking the tube and causing a violent implosion that can hurl shards of glass at great velocity. As a matter of safety, the face is typically made of thick lead glass so as to be highly shatter-resistant and to block most X-ray emissions, particularly if the CRT is used in a consumer product.
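
As a quick sanity check on those pressure figures, here is a minimal Python sketch; the only assumption is the standard conversion factor of 101,325 pascals per atmosphere.

```python
# Sanity-check the quoted CRT vacuum pressures by converting pascals to atmospheres.
PA_PER_ATM = 101_325  # one standard atmosphere expressed in pascals

for pascals in (0.01, 133e-9):  # 0.01 Pa and 133 nanopascals
    atmospheres = pascals / PA_PER_ATM
    print(f"{pascals:.3g} Pa = {atmospheres:.3g} atm")

# Prints roughly 9.87e-08 atm and 1.31e-12 atm, matching the figures quoted above.
```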

Cathode rays were discovered by Johann Wilhelm Hittorf in 1869 in primitive Crookes tubes. He observed that some unknown rays were emitted from the cathode (negative electrode) which could cast shadows on the glowing wall of the tube, indicating the rays were traveling in straight lines. In 1890, Arthur Schuster demonstrated cathode rays could be deflected by electric fields, and William Crookes showed they could be deflected by magnetic fields. In 1897, J. J. Thomson succeeded in measuring the mass of cathode rays, showing that they consisted of negatively charged particles smaller than atoms, the first “subatomic particles”, which were later named electrons. The earliest version of the CRT was known as the “Braun tube”, invented by the German physicist Ferdinand Braun in 1897. It was a cold-cathode diode, a modification of the Crookes tube with a phosphor-coated screen.

The first cathode-ray tube to use a hot cathode was developed by John B. Johnson (who gave his name to the term Johnson noise) and Harry Weiner Weinhart of Western Electric, and became a commercial product in 1922. In 1925, Kenjiro Takayanagi demonstrated a CRT television that received images with a 40-line resolution. By 1927, he had improved the resolution to 100 lines, which was unrivaled until 1931. By 1928, he was the first to transmit human faces in half-tones on a CRT display. By 1935, he had invented an early all-electronic CRT television. In 1929, inventor Vladimir K. Zworykin, who was influenced by Takayanagi’s earlier work, gave the CRT the name “kinescope”. RCA was granted a trademark for the term (for its cathode-ray tube) in 1932; it voluntarily released the term to the public domain in 1950. The first commercially made electronic television sets with cathode-ray tubes were manufactured by Telefunken in Germany in 1934.

Color tubes use three different phosphors which emit red, green, and blue light respectively. They are packed together in stripes (as in aperture grille designs) or clusters called “triads” (as in shadow mask CRTs). Color CRTs have three electron guns, one for each primary color, arranged either in a straight line or in an equilateral triangular configuration (the guns are usually constructed as a single unit). (The triangular configuration is often called “delta-gun”, based on its relation to the shape of the Greek letter delta Δ.) A grille or mask absorbs the electrons that would otherwise hit the wrong phosphor.[26] A shadow mask tube uses a metal plate with tiny holes, placed so that the electron beam only illuminates the correct phosphors on the face of the tube;[25] the holes are tapered so that the electrons that strike the inside of any hole will be reflected back, if they are not absorbed (e.g. due to local charge accumulation), instead of bouncing through the hole to strike a random (wrong) spot on the screen. Another type of color CRT uses an aperture grille of tensioned vertical wires to achieve the same result.

In oscilloscope CRTs, electrostatic deflection is used, rather than the magnetic deflection commonly used with television and other large CRTs. The beam is deflected horizontally by applying an electric field between a pair of plates to its left and right, and vertically by applying an electric field to plates above and below. Televisions use magnetic rather than electrostatic deflection because the deflection plates obstruct the beam when the deflection angle is as large as is required for tubes that are relatively short for their size. Various phosphors are available depending upon the needs of the measurement or display application. The brightness, color, and persistence of the illumination depends upon the type of phosphor used on the CRT screen. Phosphors are available with persistences ranging from less than one microsecond to several seconds.[18] For visual observation of brief transient events, a long persistence phosphor may be desirable. For events which are fast and repetitive, or high frequency, a short-persistence phosphor is generally preferable.

When displaying fast one-shot events, the electron beam must deflect very quickly, with few electrons impinging on the screen, leading to a faint or invisible image on the display. Oscilloscope CRTs designed for very fast signals can give a brighter display by passing the electron beam through a micro-channel plate just before it reaches the screen. Through the phenomenon of secondary emission, this plate multiplies the number of electrons reaching the phosphor screen, giving a significant improvement in writing rate (brightness) and improved sensitivity and spot size as well. Most oscilloscopes have a graticule as part of the visual display, to facilitate measurements. The graticule may be permanently marked inside the face of the CRT, or it may be a transparent external plate made of glass or acrylic plastic. An internal graticule eliminates parallax error, but cannot be changed to accommodate different types of measurements. Oscilloscopes commonly provide a means for the graticule to be illuminated from the side, which improves its visibility.

The use of a long persistence phosphor in an oscilloscope may allow a single brief event to be observed after the event, but only for a few seconds at best. This limitation can be overcome by the use of a direct-view storage cathode-ray tube (storage tube). A storage tube will continue to display the event after it has occurred until such time as it is erased. A storage tube is similar to a conventional tube except that it is equipped with a metal grid coated with a dielectric layer located immediately behind the phosphor screen. An externally applied voltage to the mesh initially ensures that the whole mesh is at a constant potential. This mesh is constantly exposed to a low-velocity electron beam from a ‘flood gun’ which operates independently of the main gun. The flood gun is not deflected like the main gun but constantly ‘illuminates’ the whole of the storage mesh. The initial charge on the storage mesh is such that it repels the electrons from the flood gun, preventing them from striking the phosphor screen.

When the main electron gun writes an image to the screen, the energy in the main beam is sufficient to create a ‘potential relief’ on the storage mesh. The areas where this relief is created no longer repel the electrons from the flood gun, which now pass through the mesh and illuminate the phosphor screen. Consequently, the image that was briefly traced out by the main gun continues to be displayed after it has occurred. The image can be ‘erased’ by reapplying the external voltage to the mesh, restoring its constant potential. The time for which the image can be displayed is limited because, in practice, the flood gun slowly neutralises the charge on the storage mesh. One way of allowing the image to be retained for longer is temporarily to turn off the flood gun; it is then possible for the image to be retained for several days.

The majority of storage tubes allow a lower voltage to be applied to the storage mesh, which slowly restores the initial charge state; by varying this voltage, a variable persistence is obtained. Turning off the flood gun and the voltage supply to the storage mesh allows such a tube to operate as a conventional oscilloscope tube. During the 1940s, the Williams tube (or Williams-Kilburn tube) was used in early computers as a random-access digital storage device to electronically store binary data; however, the Williams tube was not a display device and could not be viewed, since a metal plate covered its screen.

Since the late 2000s, CRTs have been largely superseded by newer “flat panel” display technologies such as LCD, plasma display, and especially OLED displays, which in the case of LCD and OLED displays have lower manufacturing costs and power consumption, as well as significantly less weight and bulk. Flat panel displays can also be made in very large sizes; whereas 38 to 40 in (97 to 102 cm) was about the largest size of a CRT television, flat panels are available in 60 in (150 cm) and larger sizes. The last known manufacturer of (in this case, recycled) CRTs ceased production in 2015.

Passwords

Every year, the cyber security and identity protection service SplashData evaluates millions of leaked passwords to determine which are the most easily hacked. There are numerous themes across the worst-passwords list, including first names, hobbies, celebrity names, terms from pop culture and sports, and simple keyboard patterns. Of the five million leaked passwords evaluated for the 2018 list, most were held by users in North America and Western Europe.

Experts recommend people use a passphrase of twelve characters or more with mixed types of characters. They also recommend using a different password for each site. Users can also use a password manager to organise passwords, generate secure random passwords, and automatically log into websites. However, experts warn that adding a number or symbol to a common word is ineffective. Changing a password frequently may not help much either, because when most people change their password they make only minor tweaks, such as replacing the number 1 with a number 2.

If you wish to create an effective and secure new password, use a combination of numbers, symbols, and uppercase and lowercase letters; ensure that the password is at least eight characters long; consider using abbreviated phrases as passwords; change your passwords regularly; and log out of websites and devices after you have finished using them. Internet users should not choose a commonly used password such as ‘123456’, ‘password’, ‘qwerty’ or ‘111111’. They should also refrain from using a solitary word or a derivative of a family member’s name, pet’s name, phone number, address or birthday for their password. It is also inadvisable to write your password down or to answer ‘yes’ when asked to save your password to a computer browser.
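
As a rough sketch of this advice in practice, the Python example below uses the standard library’s secrets module to generate a random password containing mixed character types and a simple word-based passphrase; the length of 16 characters and the tiny word list are illustrative assumptions, not recommendations from the article.

```python
import secrets
import string

def random_password(length: int = 16) -> str:
    """Generate a password mixing upper- and lowercase letters, digits and symbols."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:  # redraw until every character class is represented
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in string.punctuation for c in candidate)):
            return candidate

def random_passphrase(words, count: int = 4) -> str:
    """Join randomly chosen words from a caller-supplied word list into a passphrase."""
    return "-".join(secrets.choice(words) for _ in range(count))

if __name__ == "__main__":
    print(random_password())
    # Tiny illustrative word list; a real passphrase should draw from a much larger one.
    print(random_passphrase(["correct", "horse", "battery", "staple", "orbit", "velvet"]))
```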

Ada Lovelace (Enchantress of Numbers)

The Analyst, Metaphysician, and Founder of Scientific Computing, Augusta Ada King, Countess of Lovelace, was born on 10 December 1815. Born Augusta Ada Byron and now commonly known as Ada Lovelace, she was the daughter of Lord Byron and is remembered as a mathematician and writer chiefly known for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be processed by a machine. Because of this, she is often considered the world’s first computer programmer, and she left a legacy as a role model for young women entering technology careers.

Ada was the only legitimate child born during the brief marriage between the poet Lord Byron and Anne Isabella Byron. She had no relationship with her father, who separated from her mother just a month after Ada was born; four months later he left England forever, and he died in Greece in 1824, leaving her mother to raise Ada single-handedly. Her life was an apotheosis of struggle between emotion and reason, subjectivism and objectivism, poetics and mathematics, ill health and bursts of energy. Lady Byron wished her daughter to be unlike her poetical father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. But Ada’s complex inheritance became apparent as early as 1828, when she produced the design for a flying machine. It was mathematics that gave her life its wings.

As a young adult, she took an interest in mathematics, and in particular in the work of Charles Babbage, Lucasian Professor of Mathematics at Cambridge, whom she met in 1833 when she was just 17. Babbage, one of the gentlemanly scientists of the era, became Ada’s lifelong friend. He was known as the inventor of the Difference Engine, an elaborate calculating machine that operated by the method of finite differences, and the two began a voluminous correspondence on mathematics, logic, and ultimately all manner of subjects. In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace. Ada had three children. The family and its fortunes were very much directed by Lady Byron, whose domineering was rarely opposed by King. In 1834, although the Difference Engine was not finished, Babbage had made plans for a new kind of calculating machine: an Analytical Engine.

His Parliamentary sponsors refused to support a second machine with the first unfinished, but Babbage found sympathy for his new project abroad. In 1842, the Italian mathematician Luigi Federico Menabrea published a memoir in French on the subject of the Analytical Engine. Babbage enlisted Ada as translator for the memoir, and during a nine-month period in 1842–43, she worked feverishly on the article and a set of Notes she appended to it. These Notes contain what is considered the first computer program, that is, an algorithm encoded for processing by a machine, and they are important in the early history of computers. She also foresaw the capability of computers to go beyond mere calculating or number-crunching, while others, including Babbage himself, focused only on those capabilities.
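
The program described in the final note, Note G, computed Bernoulli numbers on the Analytical Engine. As a modern, hedged illustration only, the Python sketch below follows the standard recurrence for Bernoulli numbers rather than Lovelace’s actual table of operations.

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n: int):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions, using the
    recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    b = [Fraction(1)]
    for m in range(1, n + 1):
        total = sum(comb(m + 1, j) * b[j] for j in range(m))
        b.append(-total / (m + 1))
    return b

if __name__ == "__main__":
    # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30; odd-indexed values beyond B_1 are zero.
    for i, value in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {value}")
```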

Ada called herself an Analyst (& Metaphysician), and the combination was put to use in the Notes. She understood the plans for the device as well as Babbage did, but was better at articulating its promise. She rightly saw it as what we would call a general-purpose computer: it was suited for “developing and tabulating any function whatever... the engine is the material expression of any indefinite function of any degree of generality and complexity.” Her Notes anticipate future developments, including computer-generated music. Sadly, Ada died on November 27, 1852, in Marylebone, at the age of 36, from cancer, and was buried beside the father she never knew. Her contributions to science were resurrected only recently, but many new biographies attest to the fascination of Babbage’s “Enchantress of Numbers.”

Computer Security Day

Computer Security Day takes place annually on 30 November. The purpose of Computer Security Day is to educate people about the threats of computer hacking, phishing and scamming, to raise awareness about computer security, and to highlight measures that can be taken to keep your computer data safe from undesirable prying eyes.

In this modern age, electronic devices such as smartphones, tablets, and computers play an increasingly important role in our everyday lives. While communication has become easier and more efficient than ever before, these technological advancements have also brought with them new concerns about privacy and security.

Computer Security Day began in 1988, around the time that computers were becoming commonplace, even if they were yet to become ubiquitous in homes. The 1980s saw increased use of computers, especially in business and government, while the internet was still in its early stages. Although hacking and viruses have been around virtually since the early days of modern computing, evolving and increasingly sophisticated technologies were finding more applications, and therefore carried more security risks, for the simple reason that more data was at risk as computers found their way into banks, government offices, and businesses. As more important data was stored on computers and servers, there was more valuable information for hackers, resulting in higher-profile security breaches, and by the end of the decade online security had become an important concern.

International Internet Day

International Internet Day takes place annually on 29 October. The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide and build a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.

During the 1970s, science fiction novelist Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe.

The origins of the Internet date back to the 1960s, when research was commissioned by the federal government of the United States to build robust, fault-tolerant communication via computer networks. Initial concepts of wide area networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s, including for the development of the ARPANET project, directed by Robert Taylor and managed by Lawrence Roberts. The first message was sent over the ARPANET on 29 October 1969 from computer science Professor Leonard Kleinrock’s laboratory at the University of California, Los Angeles (UCLA) to the second network node at Stanford Research Institute (SRI).

Donald Davies first demonstrated packet switching in 1967 at the National Physical Laboratory (NPL) in the UK, which became a testbed for UK research for almost two decades. Packet switching networks such as the NPL network, ARPANET, Tymnet, Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET project led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks. The Internet protocol suite (TCP/IP) was developed by Robert E. Kahn and Vint Cerf in the 1970s and became the standard networking protocol on the ARPANET, incorporating concepts from the French CYCLADES project directed by Louis Pouzin. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States from research and education organizations.

The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the 1980s. Commercial Internet service providers (ISPs) began to emerge in the very late 1980s, and the ARPANET was decommissioned in 1990. Limited private connections to parts of the Internet by officially commercial entities emerged in several American cities by late 1989 and 1990. The National Science Foundation Network had acted as a new backbone in the 1980s, and, together with private funding for other commercial extensions, it led to worldwide participation in the development of new networking technologies and the merger of many networks. The NSFNET was decommissioned in 1995, removing the last restrictions on the use of the Internet to carry commercial traffic. The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet, generating sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life.

In the 1980s, at CERN, a European research organisation near Geneva straddling the border between France and Switzerland, British computer scientist and engineer Tim Berners-Lee and Belgian computer scientist Robert Cailliau proposed using hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”.

Berners-Lee wrote a proposal in March 1989 for a more effective CERN communication system, and he eventually realised the concept could be implemented throughout the world, using concepts from his earlier hypertext systems such as ENQUIRE. Together with Cailliau, he published a more formal proposal on 12 November 1990 to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, so that authorship becomes universal”, as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled after the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well); the first web server; and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August; for this reason, that date is considered Internaut Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; Gennaro has disclaimed this story, writing that the media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, a date supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, which was described in the 1945 essay “As We May Think”.

Berners-Lee combined hypertext with the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally assumed the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
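
As a rough illustration of how those three pieces fit together, the Python sketch below opens a TCP connection, sends a minimal HTTP/1.1 GET request for a URL’s path, and prints the start of the response (typically an HTML document). The host example.com is a placeholder, and real applications would normally use HTTPS through a higher-level client library.

```python
import socket

# Hypothetical target: the URL http://example.com/ names a scheme (HTTP), a host and a path.
HOST = "example.com"
PORT = 80  # default port for plain HTTP

# HTTP runs on top of TCP/IP, so first open a TCP connection to the web server.
with socket.create_connection((HOST, PORT), timeout=10) as conn:
    # A minimal HTTP/1.1 GET request: request line, headers, then a blank line.
    request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    conn.sendall(request.encode("ascii"))

    # Read the full response: status line, headers, then (usually) an HTML body.
    response = b""
    while chunk := conn.recv(4096):
        response += chunk

print(response.decode("utf-8", errors="replace")[:500])  # first part of the reply
```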

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which are the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, adding international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet. The web is a collection of documents and both client and server software using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

The Internet has no centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. Only the overreaching definitions of the two principal name spaces in the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). Most traditional communications media, including telephony, radio, television, paper mail and newspapers are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper, book, and other print publishing are adapting to website technology, or are reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interactions through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and small businesses and entrepreneurs, as it enables firms to extend their “brick and mortar” presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.
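
As a small, illustrative sketch of how those two name spaces relate, the following Python snippet asks the system resolver (which ultimately consults the DNS) to translate a host name into IP addresses; example.com is simply a placeholder host.

```python
import socket

# Resolve a DNS host name (one name space) into IP addresses (the other name space).
HOSTNAME = "example.com"  # placeholder host name

# getaddrinfo consults the operating system's resolver, which ultimately uses the DNS.
results = socket.getaddrinfo(HOSTNAME, None)

# Collect the distinct IPv4/IPv6 addresses returned for this name and print them.
for address in sorted({info[4][0] for info in results}):
    print(f"{HOSTNAME} -> {address}")
```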

The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. Since the mid-1990s, the Internet has had a revolutionary impact on culture, commerce, and technology, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as JANET in the United Kingdom and Internet2 in the United States. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1-Gbit/s, 10-Gbit/s, or more. The Internet’s takeover of the global communication landscape was almost instant in historical terms: it only communicated 1% of the information flowing through two-way telecommunications networks in the year 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking. However, the future of the global internet may be shaped by regional differences in the world. In November 2006, the Internet was included on USA Today’s list of New Seven Wonders.

Bill Gates

American business magnate, software executive and philanthropist William Henry “Bill” Gates III was born on October 28, 1955. Bill Gates is the former chief executive and current chairman of Microsoft, the world’s largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world’s wealthiest people and was the wealthiest overall from 1995 to 2009, excluding 2008, when he was ranked third; in 2011 he was the third wealthiest American and the second wealthiest person. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder, with 6.4 percent of the common stock. He has also authored or co-authored several books. Gates is one of the best-known entrepreneurs of the personal computer revolution. He has been criticized for his business tactics, which have been considered anti-competitive, an opinion which has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.

Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates’s last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.

National iPod Day

National iPod Day is observed annually on October 23. Apple introduced the iPod on October 23, 2001, and it changed the way we listened to and purchased music. The first iPod was sold on November 10, 2001, for $399; however, the price and Mac-only compatibility kept sales relatively slow until 2004. The iPod line came from Apple’s “digital hub” category, when the company began creating software for the growing market of personal digital devices. Digital cameras, camcorders and organizers had well-established mainstream markets, but the company found existing digital music players “big and clunky or small and useless” with user interfaces that were “unbelievably awful,” so Apple decided to develop its own. As ordered by CEO Steve Jobs, Apple’s hardware engineering chief Jon Rubinstein assembled a team of engineers to design the iPod line, including hardware engineers Tony Fadell and Michael Dhuey and design engineer Sir Jonathan Ive. Rubinstein had already discovered the Toshiba hard disk drive while meeting with an Apple supplier in Japan, had purchased the rights to it for Apple, and had also already worked out how the screen, battery, and other key elements would work. The aesthetic was inspired by the 1958 Braun T3 transistor radio designed by Dieter Rams, while the wheel-based user interface was prompted by Bang & Olufsen’s BeoCom 6000 telephone. The product, dubbed “the Walkman of the twenty-first century”, was developed in less than one year and unveiled on October 23, 2001. Jobs announced it as a Mac-compatible product with a 5 GB hard drive that put “1,000 songs in your pocket.”

Apple did not develop the iPod software entirely in-house, instead using PortalPlayer’s reference platform based on two ARM cores. The platform had rudimentary software running on a commercial microkernel embedded operating system. PortalPlayer had previously been working on an IBM-branded MP3 player with Bluetooth headphones. Apple contracted another company, Pixo, to help design and implement the user interface under the direct supervision of Steve Jobs. As development progressed, Apple continued to refine the software’s look and feel. Starting with the iPod Mini, the Chicago font was replaced with Espy Sans. Later iPods switched fonts again to Podium Sans—a font similar to Apple’s corporate font, Myriad. Color display iPods then adopted some Mac OS X themes like Aqua progress bars, and brushed metal meant to evoke a combination lock. In 2007, Apple modified the iPod interface again with the introduction of the sixth-generation iPod Classic and third-generation iPod Nano by changing the font to Helvetica and, in most cases, splitting the screen in half by displaying the menus on the left and album artwork, photos, or videos on the right (whichever was appropriate for the selected item).

In 2006, Apple presented a special edition of the fifth-generation iPod for the Irish rock band U2. Like its predecessor, this iPod had the signatures of the four members of the band engraved on its back, but it was the first time the company changed the colour of the metal (black rather than silver). This iPod was only available with 30 GB of storage capacity. The special edition entitled purchasers to an exclusive video with 33 minutes of interviews and performance by U2, downloadable from the iTunes Store. In 2007, during a lawsuit with patent holding company Burst.com, Apple drew attention to a patent for a similar device that was developed in 1979. Kane Kramer applied for a UK patent for his design of a “plastic music box” in 1981, which he called the IXI. He was unable to secure funding to renew the US$120,000 worldwide patent, so it lapsed and Kramer never profited from his idea.

The name iPod was proposed by Vinnie Chieco, a freelance copywriter, who (with others) was called by Apple to figure out how to introduce the new player to the public. After Chieco saw a prototype, he thought of the movie 2001: A Space Odyssey and the phrase “Open the pod bay door, Hal!”, which refers to the white EVA Pods of the Discovery One spaceship. Chieco saw an analogy to the relationship between the spaceship and the smaller independent pods in the relationship between a personal computer and the music player. Apple researched the trademark and found that it was already in use. Joseph N. Grasso of New Jersey had originally listed an “iPod” trademark with the U.S. Patent and Trademark Office (USPTO) in July 2000 for Internet kiosks. The first iPod kiosks had been demonstrated to the public in New Jersey in March 1998, and commercial use began in January 2000, but had apparently been discontinued by 2001. The trademark was registered by the USPTO in November 2003, and Grasso assigned it to Apple Computer, Inc. in 2005.

The earliest recorded use in commerce of an “iPod” trademark was in 1991 by Chrysalis Corp. of Sturgis, Michigan, styled “iPOD”. In mid-2015, several new color schemes for all of the current iPod models were spotted in the latest version of iTunes, 12.2. Belgian website Belgium iPhone originally found the images when plugging in an iPod for the first time, and subsequent leaked photos were found by Pierre Dandumont. In 2017, Apple removed the iPod Nano and Shuffle from its stores, marking the end of Apple producing standalone music players. Currently, the iPod Touch is the only iPod produced by Apple.