Day of the Programmer

The Day of the Programmer is an international professional day celebrated on the 256th (hexadecimal 100th, or 2⁸th) day of each year: September 13 in common years and September 12 in leap years. The number 256 (2⁸) was chosen because it is the number of distinct values that can be represented with a byte, a value well known to programmers. 256 is also the highest power of two that is less than 365, the number of days in a common year.
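
As a quick illustration, here is a minimal Python sketch (the function name is invented for this example) that computes the date of the 256th day of any year:

```python
from datetime import date, timedelta

def day_of_the_programmer(year: int) -> date:
    """Return the 256th day of the given year (note 256 == 0x100 == 2**8)."""
    return date(year, 1, 1) + timedelta(days=255)  # 1 January is day 1

print(day_of_the_programmer(2023))  # 2023-09-13 (common year)
print(day_of_the_programmer(2024))  # 2024-09-12 (leap year)
```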

A computer programmer, more recently also called a coder (especially in informal contexts), is a person who creates computer software. The term can refer to a specialist in one area of computing or to a generalist who writes code for many kinds of software. A programmer’s most-used language (e.g., Assembly, COBOL, C, C++, C#, Java, Lisp, Python) may be prefixed to the term programmer, and some who work with web programming languages prefix their titles with web. A range of occupations that involve programming also require other, related skills, for example: (software) developer, web developer, mobile applications developer, embedded firmware developer, software engineer, computer scientist, game programmer, game developer and software analyst. Applying the term programmer to these positions is sometimes considered an insulting simplification or even derogatory.

British countess and mathematician Ada Lovelace is often considered the first computer programmer: in October 1842 she became the first to publish part of a program (specifically an algorithm) intended for implementation on Charles Babbage’s analytical engine. The algorithm was used to calculate Bernoulli numbers.[7] Because Babbage’s machine was never completed to a functioning standard in Lovelace’s time, she never had the opportunity to see the algorithm in action. The first person to execute a program on a functioning, modern, electronic computer was the computer pioneer Konrad Zuse, in 1941. The ENIAC programming team, consisting of Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Wescoff, Fran Bilas and Ruth Lichterman, were the first regularly working programmers. International Programmers’ Day is celebrated annually on 7 January. In 2009, the government of Russia decreed a professional annual holiday known as Programmers’ Day, to be celebrated on 13 September (12 September in leap years). It had already been an unofficial holiday in many countries before that.

The word software was used as early as 1953, but did not regularly appear in print until the 1960s. Before this time, computers were programmed either by customers or by the few commercial computer manufacturers of the era, such as UNIVAC and IBM. The first company founded specifically to provide software products and services was the Computer Usage Company, in 1955. The software industry expanded in the early 1960s, almost immediately after computers were first sold in mass-produced quantities. Universities, governments and businesses created a demand for software. Many of these programs were written in-house by full-time staff programmers; some were distributed freely between users of a particular machine at no charge, while others were developed on a commercial basis. Other firms, such as Computer Sciences Corporation (founded in 1959), also started to grow. Computer and hardware manufacturers soon began bundling operating systems, system software and programming environments with their machines.

During the mid-1970s, the industry expanded greatly with the rise of the personal computer (“PC”), which brought computing to the average office worker and helped create a constantly growing market for games, applications and utility software. CP/M was the first popular operating system of the era; it was later displaced by DOS, Microsoft’s first operating system product. In the early years of the 21st century, another successful business model arose for hosted software, called software-as-a-service, or SaaS; this was at least the third time this model had been attempted. From the point of view of producers of some proprietary software, SaaS reduces concerns about unauthorized copying, since the software can only be accessed through the Web and, by definition, no client software is loaded onto the end user’s PC. By 2014 the role of cloud developer had been defined, and in this context a general definition of “developer” was published.

Computer programmers write, test, debug, and maintain the detailed instructions, called computer programs, that computers must follow to perform their functions. Programmers also conceive, design, and test logical structures for solving problems by computer. Many technical innovations in programming — advanced computing technologies and sophisticated new languages and programming tools — have redefined the role of a programmer and elevated much of the programming work done today. Job titles and descriptions may vary, depending on the organization.

Programmers work in many settings, including corporate information technology (“IT”) departments, big software companies, small service firms and government entities of all sizes. Many professional programmers also work for consulting companies at client sites as contractors. Licensing is not typically required to work as a programmer, although professional certifications are commonly held by programmers. Programming is widely considered a profession (although some authorities disagree on the grounds that only careers with legal licensing requirements count as a profession).

Programmers’ work varies widely depending on the type of business for which they are writing programs. For example, the instructions involved in updating financial records are very different from those required to duplicate conditions on an aircraft for pilots training in a flight simulator. Simple programs can be written in a few hours; more complex ones may require more than a year of work; others are never considered ‘complete’ but are continuously improved for as long as they stay in use. In most cases, several programmers work together as a team under a senior programmer’s supervision.

Programmers write programs according to the specifications determined primarily by more senior programmers and by systems analysts. After the design process is complete, it is the job of the programmer to convert that design into a logical series of instructions that the computer can follow. The programmer codes these instructions in one of many programming languages. Different programming languages are used depending on the purpose of the program. COBOL, for example, is commonly used for business applications that typically run on mainframe and midrange computers, whereas Fortran is used in science and engineering. C++ is widely used for both scientific and business applications. Java, C#, VB and PHP are popular programming languages for Web and business applications. Programmers generally know more than one programming language and, because many languages are similar, they often can learn new languages relatively easily. In practice, programmers often are referred to by the language they know, e.g. as Java programmers, or by the type of function they perform or environment in which they work: for example, database programmers, mainframe programmers, or Web developers.

When modifying the source code that programs are made up of, programmers need to make other programmers aware of the task that each routine is to perform. They do this by inserting comments in the source code, so that others can understand the program more easily, and by documenting their code. To save work, programmers often use libraries of basic code that can be modified or customized for a specific application. This approach yields more reliable and consistent programs and increases programmers’ productivity by eliminating some routine steps.
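
As a small illustration, here is a hypothetical Python routine (the summarize function and its readings are invented for this example) showing both habits: documenting intent for other programmers, and reusing library code rather than rewriting it:

```python
import statistics  # reuse well-tested library code instead of re-implementing it

def summarize(readings):
    """Return the mean and standard deviation of a list of numeric readings.

    The docstring and comments tell other programmers what this routine
    is for without their having to reverse-engineer the implementation.
    """
    # statistics.stdev needs at least two data points
    return statistics.mean(readings), statistics.stdev(readings)

print(summarize([4.1, 3.9, 4.3, 4.0]))
```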

To make sure a program runs properly, programmers test it by running it and looking for bugs (errors). As bugs are identified, the programmer usually makes the appropriate corrections, then rechecks the program until an acceptably low level and severity of bugs remain. This process is called testing and debugging, and these are important parts of every programmer’s job. Programmers may continue to fix such problems throughout the life of a program. Updating, repairing, modifying, and expanding existing programs is sometimes called maintenance programming. Programmers may contribute to user guides and online help, or they may work with technical writers to do such work.
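
A minimal sketch of this test-and-fix loop in Python (the day_of_year helper is invented for this example); assertions that fail point directly at the bug to be corrected:

```python
import datetime

def day_of_year(d):
    """Return the 1-based day-of-year for a datetime.date."""
    return d.timetuple().tm_yday

# Quick checks of the kind run while testing and debugging:
assert day_of_year(datetime.date(2023, 1, 1)) == 1
assert day_of_year(datetime.date(2023, 9, 13)) == 256  # Day of the Programmer
print("all tests passed")
```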

Computer programmers often are grouped into two broad types: application programmers and systems programmers. Application programmers write programs to handle a specific job, such as a program to track inventory within an organization. They also may revise existing packaged software or customize generic applications which are frequently purchased from independent software vendors. Systems programmers, in contrast, write programs to maintain and control computer systems software, such as operating systems and database management systems. These workers make changes in the instructions that determine how the network, workstations, and CPU of the system handle the various jobs they have been given and how they communicate with peripheral equipment such as printers and disk drives.

Programmers in software development companies may work directly with experts from various fields to create software – either programs designed for specific clients or packaged software for general use – ranging from video games to educational software to programs for desktop publishing and financial planning. Programming of packaged software constitutes one of the most rapidly growing segments of the computer services industry. Some companies and organizations – even small ones – have set up their own IT teams to design and develop in-house software that meets very specific needs of their internal end users, especially when existing software is unsuitable or too expensive. This is the case, for example, in research laboratories.

In some organizations, particularly small ones, people commonly known as programmer analysts are responsible for both the systems analysis and the actual programming work. The transition from a mainframe environment to one based primarily on personal computers (PCs) has blurred the once rigid distinction between the programmer and the user. Increasingly, adept end users are taking over many of the tasks previously performed by programmers. For example, the growing use of packaged software, such as spreadsheet and database management packages, allows users to write simple programs to access data and perform calculations.

In addition, the rise of the Internet has made web development a huge part of the programming field. Today many software applications are web applications that can be used by anyone with a web browser. Examples include the Google search service, the Outlook.com e-mail service, and the Flickr photo-sharing service. Programming editors, also known as source code editors, are text editors specifically designed for programmers and developers writing the source code of an application or a program. Most of these editors include features useful for programmers, such as syntax highlighting, automatic indentation, auto-completion, bracket matching, syntax checking, and plug-in support. These features aid users during coding, debugging and testing.

Personal Computer Day

Personal Computer Day takes place annually on 12 August. It commemorates the introduction of the first personal computer, the IBM PC Model 5150, on 12 August 1981. This machine retailed at US$1,565 and had 16 kB of memory, which seems paltry compared with most of today’s tablets, which have at least 1 gigabyte of RAM and 16 GB of internal storage. (One megabyte is 1,024 kilobytes; one gigabyte is 1,024 megabytes; and one terabyte is 1,024 gigabytes.)
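
These binary units each step up by a factor of 2¹⁰ = 1,024, as a short Python sketch (the variable names are ours) makes explicit:

```python
# Binary memory units grow by factors of 2**10 = 1,024.
KB = 1024          # bytes in a kilobyte (binary)
MB = 1024 * KB
GB = 1024 * MB
TB = 1024 * GB

ibm_5150_ram = 16 * KB     # base memory of the IBM PC Model 5150
tablet_ram = 1 * GB        # typical modern tablet, per the text above
print(tablet_ram // ibm_5150_ram)  # 65536: 65,536 times as much RAM
```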

The personal computer (PC) was first developed as a multi-purpose machine whose size, capabilities, and price made it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician, unlike large, costly minicomputers and mainframes, which were time-shared by many people at once. In the 1960s, institutional and corporate computer owners had to write their own programs to do any useful work with their machines, and it was during the 1960s and 1970s that the personal computer was developed.

While personal computer users may develop their own applications, usually these systems run commercial software, free-of-charge software (“freeware”) or free and open-source software, which is provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware or operating system manufacturers. Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with mobile systems, where software is often only available through a manufacturer-supported channel, and end-user program development may be discouraged by lack of support by the manufacturer.

The advent of personal computers and the concurrent Digital Revolution have significantly affected the lives of people in all countries. Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and then with Microsoft Windows. Alternatives to Microsoft’s Windows operating systems occupy a minority share of the industry. These include Apple’s macOS and free and open-source Unix-like operating systems.

World Wide Web

Credited with being the inventor of the World Wide Web, Tim Berners-Lee released files describing his idea for the World Wide Web on 6 August 1991, the date on which the WWW debuted as a publicly available service on the Internet.

British computer scientist, MIT professor and progenitor of the World Wide Web, Sir Timothy John “Tim” Berners-Lee, OM, KBE, FRS, FREng, FRSA, was born on 8 June 1955. He made a proposal for an information management system in March 1989 and, on 25 December 1990, with the help of Robert Cailliau and a young student at CERN, implemented the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet.

In 2004, Berners-Lee was knighted by Queen Elizabeth II for his pioneering work. He is the director of the World Wide Web Consortium (W3C), which oversees the Web’s continued development, the founder of the World Wide Web Foundation, and a senior researcher and holder of the Founders Chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). He is also a director of The Web Science Research Initiative and a member of the advisory board of the MIT Center for Collective Intelligence. In April 2009, he was elected a foreign associate of the United States National Academy of Sciences. In June 2009, then British Prime Minister Gordon Brown announced that Berners-Lee would work with the UK Government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force. Berners-Lee and Professor Nigel Shadbolt are the two key figures behind data.gov.uk, a UK Government project to open up almost all data acquired for official purposes for free re-use.

Commenting on the opening up of Ordnance Survey data in April 2010, Berners-Lee said: “The changes signal a wider cultural change in Government based on an assumption that information should be in the public domain unless there is a good reason not to—not the other way around.” He went on to say: “Greater openness, accountability and transparency in Government will give people greater choice and make it easier for individuals to get more directly involved in issues that matter to them.” In November 2009, Berners-Lee launched the World Wide Web Foundation in order to “advance the Web to empower humanity by launching transformative programs that build local capacity to leverage the Web as a medium for positive change.”

Berners-Lee is also one of the pioneering voices in favour of net neutrality, and has expressed the view that ISPs should supply “connectivity with no strings attached” and should neither control nor monitor customers’ browsing activities without their express consent. He advocates the idea that net neutrality is a kind of human network right: “Threats to the Internet, such as companies or governments that interfere with or snoop on Internet traffic, compromise basic human network rights.” Berners-Lee is also a co-director of the Open Data Institute. He was honoured as the ‘Inventor of the World Wide Web’ in a section of the 2012 Summer Olympics opening ceremony, in which he participated working at a NeXT Computer. He tweeted “This is for everyone”, which was instantly spelled out in LCD lights attached to the chairs of the 70,500 people in the audience.

International and national events happening on 1 August

  • World Middle Finger Day
  • Raspberry Cream Pie Day
  • National Girlfriends Day
  • National Minority Donor Awareness Day
  • National Raspberry Cream Pie Day
  • Play Ball Day
  • Respect for Parents Day
  • Rounds Resounding Day
  • Spiderman Day
  • US Air Force Day
  • Woman Astronomers Day

Lammas Day / Lughnasadh

Lammas Day (Anglo-Saxon hlaf-mas, “loaf-mass”), also known as Lughnasadh, is a holiday celebrated in some English-speaking countries in the Northern Hemisphere on 1 August. It is a festival to mark the annual wheat harvest, and is the first harvest festival of the year. On this day it was customary to bring to church a loaf made from the new crop, which began to be harvested at Lammastide, which falls at the halfway point between the summer solstice and the autumn equinox in September. Lammas is also known as Lambess. During Lammas many people took part in handfasting and funeral games. Loaves were also made from the grain collected at harvest and blessed, and may have been employed afterwards in protective rituals in Anglo-Saxon England: a book of Anglo-Saxon charms directed that the lammas bread be broken into four bits, which were to be placed at the four corners of the barn to protect the garnered grain.


World Wide Web Day

World Wide Web Day takes place annually on 1 August. It is a global celebration dedicated to web browsing, the online activity that puts the world at your fingertips and a wealth of knowledge within reach. The World Wide Web was conceived by Tim Berners-Lee in 1989 at the CERN centre in Geneva, Switzerland, as a way for him to communicate with co-workers via hyperlinks. The World Wide Web became publicly available to new users on 23 August 1991 as a system of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia, and navigate between them via hyperlinks.

The web was developed between March 1989 and December 1990. Using concepts from his earlier hypertext systems such as ENQUIRE, British engineer and computer scientist Tim Berners-Lee, then an employee of CERN and now Director of the World Wide Web Consortium (W3C), wrote a proposal in March 1989 for what would eventually become the World Wide Web. The 1989 proposal was meant for a more effective CERN communication system, but Berners-Lee eventually realised the concept could be implemented throughout the world. At CERN, a European research organisation near Geneva straddling the border between France and Switzerland, Berners-Lee and Belgian computer scientist Robert Cailliau proposed in 1990 to use hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”, and Berners-Lee finished the first website in December that year. Berners-Lee posted the project on the alt.hypertext newsgroup on 6 August 1991.

In the May 1970 issue of Popular Science magazine, Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe. In March 1989, Tim Berners-Lee wrote a proposal that referenced ENQUIRE, a database and software project he had built in 1980, and described a more elaborate information management system. With help from Robert Cailliau, he published a more formal proposal (on 12 November 1990) to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, [so that] authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled after the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well), the first web server, and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August; for this reason, 23 August is considered Internaut Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992: an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro. Gennaro has disclaimed this story, writing that the media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, a date supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, described in the 1945 essay “As We May Think”.

Berners-Lee’s breakthrough was to marry hypertext to the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally assumed the project himself. In the process, he developed three essential technologies:

  • a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI);
  • the publishing language HyperText Markup Language (HTML);
  • the Hypertext Transfer Protocol (HTTP).

The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
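
To make the three technologies concrete, here is a minimal Python sketch using only the standard library (example.com stands in for any web server): the URL names the resource, HTTP transfers it, and the response body is HTML:

```python
from urllib.request import urlopen

# The URL identifies the resource; urlopen issues an HTTP GET request.
with urlopen("http://example.com/") as response:
    print(response.status)                     # e.g. 200 (OK)
    html = response.read().decode("utf-8")     # the HTML document itself

print(html[:60])  # the first characters of the markup
```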

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

After leaving the European Organization for Nuclear Research (CERN) in 1994, Tim Berners-Lee founded the World Wide Web Consortium (W3C) at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet. A year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo, and in 1996 a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared with present standards, quite a number of notable websites were already active, many of them precursors of, or inspiration for, today’s most popular services. Connected by the existing Internet, other websites were created around the world, along with international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format, and thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, World Wide Web is not synonymous with Internet: the web is a collection of documents, and of client and server software, using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

Since being unveiled, the World Wide Web has become the main means of interaction, transaction and communication among humans, opening the door of opportunity for people in ways that would have been unimaginable to previous generations.

Sir Clive Sinclair

English entrepreneur and inventor Sir Clive Marles Sinclair was born on 30 July 1940. He is most commonly known for his work in consumer electronics in the late 1970s and early 1980s. Sinclair’s Micro Kit was formalised in an exercise book dated 19 June 1958, three weeks before his A-levels. Sinclair drew a radio circuit, Model Mark I, with a components list: cost per set 9/11 (49½p), plus coloured wire and solder, nuts and bolts, plus a drilled celluloid chassis for nine shillings (45p). Also in the book are advertisement rates for Radio Constructor (9d (3¾p) per word, minimum 6/- (30p)) and Practical Wireless (5/6 (27½p) per line or part line). Sinclair estimated producing 1,000 a month, placing orders with suppliers for 10,000 of each component to be delivered. Sinclair wrote a book for Bernard’s Publishing, Practical Transistor Receivers Book 1, which appeared in January 1959. His Practical Stereo Handbook was published in June 1959. In total he produced 13 constructors’ books; the last book Sinclair wrote as an employee of Bernard’s was Modern Transistor Circuits for Beginners, in 1962.

After spending several years as assistant editor of Practical Wireless and Instrument Practice, Sinclair founded Sinclair Radionics in 1961. His original choice of name, Sinclair Electronics, was taken; Sinclair Radio was available but did not sound right. Sinclair Radionics was formed on 25 July 1961. Sinclair made two attempts to raise startup capital to advertise his inventions and buy components. He designed PCB kits and licensed some technology, then took his design for a miniature transistor pocket radio and sought a backer for its production in kit form. Eventually he found someone who agreed to buy 55% of his company for £3,000, but the deal did not go through. Sinclair, unable to find capital, joined United Trade Press (UTP) as technical editor of Instrument Practice, appearing in the publication as an assistant editor from March 1962. He described making silicon planar transistors, their properties and applications, and hoped they might be available by the end of 1962. Sinclair’s obsession with miniaturisation became more obvious as his career progressed. He undertook a survey for Instrument Practice of semiconductor devices, which appeared in four sections between September 1962 and January 1963; his last appearance as assistant editor was in April 1969. Through UTP, Sinclair had access to thousands of devices from 36 manufacturers. He contacted Semiconductors Ltd (who at that time sold semiconductors made by Plessey) and ordered rejects to repair. He produced a design for a miniature radio powered by a couple of hearing aid cells and made a deal with Semiconductors to buy its micro-alloy transistors at 6d (2½p) each in boxes of 10,000. He then carried out his own quality control tests and marketed the renamed MAT 100 and 120 at 7s 9d (38¾p) and the 101 and 121 at 8s 6d (42½p). He also produced the first slim-line electronic pocket calculator, the Sinclair Executive, in 1972.

In 1973 Sinclair moved into the production of home computers and formed another company, initially called Ablesdeal Ltd. This changed name several times, eventually becoming Science of Cambridge Ltd, and in 1978 it launched a microcomputer kit, the MK14, based on the National SC/MP chip. By July 1978, a personal computer project was under way. When Sinclair learned the NewBrain could not be sold below the £100 price he envisaged, he turned to a simpler computer. In May 1979 Jim Westwood started the ZX80 project at Science of Cambridge; it was launched in February 1980 as the UK’s first mass-market home computer for less than £100, at £79.95 in kit form and £99.95 ready-built. In November, Science of Cambridge was renamed Sinclair Computers Ltd. In March 1981, Sinclair Computers was renamed again as Sinclair Research Ltd, and the Sinclair ZX81 was launched at £49.95 in kit form and £69.95 ready-built, by mail order; it is widely recognised for its importance in the early days of the British home computer industry. In February 1982 Timex obtained a license to manufacture and market Sinclair’s computers in the United States under the name Timex Sinclair. In April the ZX Spectrum was launched at £125 for the 16 kB RAM version and £175 for the 48 kB version. In March 1982 the company made an £8.55 million profit on turnover of £27.17 million, including £383,000 in government grants for the TV80 flat-screen portable television. In 1982 Sinclair converted the Barker & Wadsworth mineral water bottling factory into the company’s headquarters. (This was sold to Cambridgeshire County Council in December 1985 owing to Sinclair’s financial troubles.)

The following year, he received his knighthood and formed Sinclair Vehicles Ltd. to develop electric vehicles. This resulted in the 1985 Sinclair C5. In 1984, Sinclair launched the Sinclair QL computer, intended for professional users. Development of the ZX Spectrum continued with the enhanced ZX Spectrum 128 in 1985. In April 1986, Sinclair Research sold the Sinclair trademark and computer business to Amstrad for £5 million. Sinclair Research Ltd. was reduced to an R&D business and holding company, with shareholdings in several spin-off companies formed to exploit technologies developed by the company. These included Anamartic Ltd. (wafer-scale integration), Shaye Communications Ltd. (CT2 mobile telephony) and Cambridge Computer Ltd. (Z88 portable computer and satellite TV receivers). By 1990, Sinclair Research consisted of Sinclair and two other employees, and its activities have since concentrated on personal transport, including the Zike electric bicycle, the Zeta bicycle motor and the A-bike folding bicycle for commuters, which weighs 5.5 kilograms (12 lb) and folds down small enough to be carried on public transport.

World Emoji Day

World Emoji Day is celebrated annually on July 17. The day is deemed a “global celebration of emoji” and is primarily celebrated online. World Emoji Day is “the brainchild of Jeremy Burge” according to CNBC, which stated that the “London-based founder of Emojipedia created it” in 2014. The New York Times reported that Burge created it on July 17 “based on the way the calendar emoji is shown on iPhones”. For the first World Emoji Day, Burge told The Independent “there were no formal plans put in place” other than choosing the date.

Emoji (Japanese: 絵文字, emoji) are ideograms and smileys used in electronic messages and web pages. Emoji exist in various genres, including facial expressions, common objects, places and types of weather, and animals. They are much like emoticons, but emoji are actual pictures rather than typographic renderings. Originally meaning pictograph, the word emoji comes from Japanese e (絵, “picture”) + moji (文字, “character”). The resemblance to the English words emotion and emoticon is purely coincidental.[6] The ISO 15924 script code for emoji is Zsye. Originating on Japanese mobile phones in 1999, emoji became increasingly popular worldwide in the 2010s after being added to several mobile operating systems. They are now considered to be a large part of popular culture in the West. In 2015, Oxford Dictionaries named the Face with Tears of Joy emoji the Word of the Year.
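
Under the hood, emoji are ordinary Unicode characters identified by code point. A minimal Python sketch printing two of them, the calendar emoji referenced below and the Face with Tears of Joy:

```python
import unicodedata

# Each emoji is a Unicode character with a code point and an official name.
for ch in ("\U0001F4C5", "\U0001F602"):   # CALENDAR, FACE WITH TEARS OF JOY
    print(f"U+{ord(ch):04X}", unicodedata.name(ch), ch)
```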

Google changed the appearance of Unicode character U+1F4C5 CALENDAR to display July 17 on its Android, Gmail and Hangouts products in 2016. On World Emoji Day 2015, Pepsi launched PepsiMoji, which included an emoji keyboard and custom World Emoji Day Pepsi cans and bottles; these were originally released in Canada and expanded to 100 markets in 2016. Sony Pictures Animation used World Emoji Day 2016 to announce T. J. Miller as the first cast member for The Emoji Movie. Google released “a series of new emoji that are more inclusive of women from diverse backgrounds and all walks of life”, and Emojipedia used July 17 to launch the first World Emoji Awards. Other companies that made emoji-related announcements on World Emoji Day 2016 included Google, Disney, General Electric, Twitter, and Coca-Cola.

Tim Berners-Lee OM KBE FRS FREng FRSA FBCS

English computer scientist and engineer Sir Timothy John Berners-Lee OM KBE FRS FREng FRSA FBCS was born on 8 June 1955 in London, England. His parents, Mary Lee Woods and Conway Berners-Lee, worked on the first commercially built computer, the Ferranti Mark 1. He attended Sheen Mount Primary School and then south-west London’s Emanuel School from 1969 to 1973, at the time a direct grant grammar school, which became an independent school in 1975. A keen trainspotter as a child, he learnt about electronics from tinkering with a model railway. He studied at The Queen’s College, Oxford, from 1973 to 1976, where he received a first-class bachelor of arts degree in physics.

After graduation, Berners-Lee worked as an engineer at the telecommunications company Plessey in Poole, Dorset. In 1978, he joined D. G. Nash in Ferndown, Dorset, where he helped create type-setting software for printers. Berners-Lee worked as an independent contractor at CERN from June to December 1980. While in Geneva, he proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers. To demonstrate it, he built a prototype system named ENQUIRE. After leaving CERN in late 1980, he went to work at John Poole’s Image Computer Systems, Ltd, in Bournemouth, Dorset. He ran the company’s technical side for three years. The project he worked on was a “real-time remote procedure call” which gave him experience in computer networking. In 1984, he returned to CERN as a fellow. In 1989, CERN was the largest Internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the Internet:

“I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web. Creating the web was really an act of desperation, because the situation without it was very difficult when I was working at CERN. Most of the technology involved in the web, like the hypertext, like the Internet, multifont text objects, had all been designed already. I just had to put them together. It was a step of generalising, going to a higher level of abstraction, thinking about all the documentation systems out there as being possibly part of a larger imaginary documentation system.”

Berners-Lee wrote his proposal in March 1989 and, in 1990, redistributed it. He used similar ideas to those underlying the ENQUIRE system to create the World Wide Web, for which he designed and built the first web browser. His software also functioned as an editor (called WorldWideWeb, running on the NeXTSTEP operating system), and he built the first web server, CERN HTTPd (short for Hypertext Transfer Protocol daemon). The NeXT Computer he used at CERN became the world’s first web server.

He is commonly credited with inventing the World Wide Web (abbreviated as WWW or W3, and commonly known as the web): a system of interlinked hypertext documents accessed via the Internet. The story of its development, from the March 1989 proposal through the first website in December 1990 and CERN’s April 1993 announcement that the web would be free to anyone, is told in the World Wide Web Day section above. After leaving CERN in October 1994, Berners-Lee founded the World Wide Web Consortium (W3C) at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS), and he was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.