Credited as the inventor of the World Wide Web, Tim Berners-Lee released the files describing his idea for the Web on 6 August 1991, the day the WWW made its debut as a publicly available service on the Internet. Born 8 June 1955, Sir Timothy John “Tim” Berners-Lee, OM, KBE, FRS, FREng, FRSA, also known as “TimBL”, is a British computer scientist, MIT professor and the inventor of the World Wide Web. He made a proposal for an information management system in March 1989 and, on 25 December 1990, with the help of Robert Cailliau and a young student at CERN, he implemented the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet. He is the director of the World Wide Web Consortium (W3C), which oversees the Web’s continued development, the founder of the World Wide Web Foundation, and a senior researcher and holder of the Founders Chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). He is also a director of the Web Science Research Initiative and a member of the advisory board of the MIT Center for Collective Intelligence.

In 2004, Berners-Lee was knighted by Queen Elizabeth II for his pioneering work. In April 2009, he was elected a foreign associate of the United States National Academy of Sciences. In June 2009, the then British Prime Minister Gordon Brown announced that Berners-Lee would work with the UK Government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force. Berners-Lee and Professor Nigel Shadbolt are the two key figures behind data.gov.uk, a UK Government project to open up almost all data acquired for official purposes for free re-use. Commenting on the opening up of Ordnance Survey data in April 2010, Berners-Lee said: “The changes signal a wider cultural change in Government based on an assumption that information should be in the public domain unless there is a good reason not to—not the other way around.” He went on to say: “Greater openness, accountability and transparency in Government will give people greater choice and make it easier for individuals to get more directly involved in issues that matter to them.” In November 2009, Berners-Lee launched the World Wide Web Foundation in order to “advance the Web to empower humanity by launching transformative programs that build local capacity to leverage the Web as a medium for positive change.”

Berners-Lee is also one of the pioneering voices in favour of net neutrality, and has expressed the view that ISPs should supply “connectivity with no strings attached” and should neither control nor monitor customers’ browsing activities without their express consent. He advocates the idea that net neutrality is a kind of human network right: “Threats to the Internet, such as companies or governments that interfere with or snoop on Internet traffic, compromise basic human network rights.” Berners-Lee is also a co-director of the Open Data Institute. He was honoured as the ‘Inventor of the World Wide Web’ during a section of the 2012 Summer Olympics opening ceremony, in which he participated working at a NeXT Computer. He tweeted “This is for everyone”, which was instantly spelled out in LCD lights attached to the chairs of the 70,500 people in the audience.

Alan Turing OBE FRS

British mathematician, logician, cryptanalyst, and computer scientist Alan Turing OBE, FRS was born 23 June 1912 in Maida Vale, London, and grew up in Hastings. He displayed great individuality from a young age. At 14 he went to Sherborne School in Dorset, and later read mathematics at Cambridge. A completely original thinker who shaped the modern world, he assisted in the development of the innovative Manchester computers and was highly influential in the development of computer science, providing a formalisation of the concepts of “algorithm” and “computation” with the Turing machine, which played a significant role in the creation of the modern computer. Turing is widely considered to be the father of computer science and artificial intelligence. He also became interested in mathematical biology, wrote a paper on the chemical basis of morphogenesis, and predicted oscillating chemical reactions such as the Belousov–Zhabotinsky reaction, which were first observed in the 1960s.

On 4 September 1939, the day after Britain declared war on Germany, Turing reported to Bletchley Park, where he worked for the Government Code and Cypher School (GC&CS), the forerunner of GCHQ, Britain’s codebreaking centre. For a time he was head of Hut 8, the section responsible for German naval cryptanalysis, leading a team whose ingenuity and intellect were turned to the task of breaking German ciphers. He devised a number of techniques for doing so, and one of his main contributions was to invent the Bombe, an electromechanical machine used to find the daily settings of the Enigma machine. As a result he played a vital part in the British war effort; his work helped shorten the war significantly, saving the lives of millions of people. A remarkable British hero who helped create the modern world, he is now known as the father of computer science, and his inventions laid much of the groundwork for the modern computer.

After the war he worked at the National Physical Laboratory, where he created one of the first designs for a stored-program computer, the ACE. In 1948 Turing joined Max Newman’s Computing Laboratory at Manchester University, where he assisted in the development of the Manchester computers. Years earlier, in his 1936 paper “On Computable Numbers”, he had invented the type of theoretical machine now called a Turing machine, which formalised what it means to compute a number. Turing’s importance extends far beyond Turing machines, however. His work deciphering secret codes drastically shortened World War II and pioneered early computer technology. He was also an early innovator in the field of artificial intelligence, and came up with a way to test whether computers could think, now known as the Turing Test. Besides this abstract work, he was down to earth; he designed and built real machines, even making his own relays and wiring up circuits. This combination of pure mathematics and computing machines was the foundation of computer science.
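A Turing machine is nothing more than a finite table of rules driving a read/write head over a tape. As a rough illustration (the machine, its state names, and the blank-symbol convention below are invented for this sketch, not Turing’s own notation), here is a tiny simulator that increments a binary number:

```python
def run_turing_machine(transitions, tape, state="start", halt="halt"):
    """Run a single-tape Turing machine until it reaches the halt state.

    transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (step left) or +1 (step right).
    """
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    while state != halt:
        symbol = tape.get(head, "_")             # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    # Read the tape back as a string, left to right, trimming blanks
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape[i] for i in cells).strip("_")

# Increment a binary number: walk to the rightmost digit, then carry.
inc = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),   # fell off the right end
    ("carry", "1"): ("carry", "0", -1),   # 1 + carry = 0, keep carrying
    ("carry", "0"): ("halt",  "1", -1),   # 0 + carry = 1, done
    ("carry", "_"): ("halt",  "1", -1),   # carried past the left end
}

print(run_turing_machine(inc, "1011"))  # 1011 + 1 = 1100
```

Despite its simplicity, a rule table like this one is all the mechanism the model needs; Turing showed that a single “universal” machine of this kind can simulate any other.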

Despite his invaluable help during World War II and all his other achievements, he was treated badly. A burglary at his home led Turing to admit to police that he was a practising homosexual, at a time when homosexual acts were illegal in Britain. This led to his arrest and conviction in 1952 for ‘gross indecency’. Forced to choose between imprisonment and chemical castration (treatment with female hormones), he chose the latter. As a result of his conviction he lost his security clearance and was not allowed to continue his work. On 7 June 1954, Turing took his own life, just over two weeks before his 42nd birthday.

Since Turing’s death, attitudes towards homosexuality have changed, and his legacy is now widely honoured. The US-based Association for Computing Machinery has given the Turing Award annually since 1966; it is the computing world’s highest honour for technical contribution to the computing community, and is considered equivalent to the Nobel Prize. On 10 September 2009, following an Internet campaign, British Prime Minister Gordon Brown made an official public apology on behalf of the British government for “the appalling way he was treated”. Turing did not receive the recognition and plaudits he deserved while alive, but this has now been redressed: a fully functional replica of the Bombe can be found today at Bletchley Park, along with an excellent Turing exhibition. Turing has also been immortalised on film in The Imitation Game, starring Benedict Cumberbatch.

Tim Berners-Lee

English computer scientist and engineer Sir Timothy John Berners-Lee OM KBE FRS FREng FRSA FBCS was born on 8 June 1955 in London, England, one of four children born to Mary Lee Woods and Conway Berners-Lee. His parents worked on the first commercially built computer, the Ferranti Mark 1. He attended Sheen Mount Primary School, and then went on to attend south-west London’s Emanuel School from 1969 to 1973, at the time a direct grant grammar school, which became an independent school in 1975. A keen trainspotter as a child, he learnt about electronics from tinkering with a model railway. He studied at The Queen’s College, Oxford from 1973 to 1976, where he received a first-class Bachelor of Arts degree in physics.

After graduation, Berners-Lee worked as an engineer at the telecommunications company Plessey in Poole, Dorset. In 1978, he joined D. G. Nash in Ferndown, Dorset, where he helped create type-setting software for printers. Berners-Lee worked as an independent contractor at CERN from June to December 1980. While in Geneva, he proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers, and to demonstrate it he built a prototype system named ENQUIRE. After leaving CERN in late 1980, he went to work at John Poole’s Image Computer Systems, Ltd, in Bournemouth, Dorset, where he ran the company’s technical side for three years. The project he worked on was a “real-time remote procedure call”, which gave him experience in computer networking. In 1984, he returned to CERN as a fellow. In 1989, CERN was the largest Internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the Internet:

“I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web. Creating the web was really an act of desperation, because the situation without it was very difficult when I was working at CERN later. Most of the technology involved in the web, like the hypertext, like the Internet, multifont text objects, had all been designed already. I just had to put them together. It was a step of generalising, going to a higher level of abstraction, thinking about all the documentation systems out there as being possibly part of a larger imaginary documentation system.” Berners-Lee wrote his proposal in March 1989 and, in 1990, redistributed it, whereupon it was accepted by his manager, Mike Sendall. He used ideas similar to those underlying the ENQUIRE system to create the World Wide Web, for which he designed and built the first web browser. His software also functioned as an editor (called WorldWideWeb, running on the NeXTSTEP operating system), and he wrote the first web server, CERN httpd (short for Hypertext Transfer Protocol daemon). The NeXT Computer he used at CERN became the world’s first web server.
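The client/server exchange at the heart of the design, a browser asking a server for a hypertext document over HTTP, can be sketched in a few lines of modern Python. This is only an illustrative toy built from the standard library, not CERN httpd or the WorldWideWeb browser, and the page content is invented:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same small hypertext document for any requested path
        body = b"<html><body><p>Hello, Web</p></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging for the demo

# Port 0 asks the OS for any free port; run the server in the background
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser": fetch the document over HTTP and read its HTML
url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as response:
    page = response.read().decode()
server.shutdown()

print(page)
```

The server waits for GET requests and answers each with a status line, headers, and an HTML body; the client resolves the URL, sends the request, and renders (here, just prints) what comes back. That request/response loop is essentially what the 1990 software did.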

He is commonly credited with inventing the World Wide Web (abbreviated WWW or W3, and commonly known as the web), a series of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia, and navigate between them via hyperlinks. The web was developed between March 1989 and December 1990. Using concepts from his earlier hypertext systems such as ENQUIRE, Berners-Lee, a computer scientist then employed by CERN and now Director of the World Wide Web Consortium (W3C), wrote a proposal in March 1989 for what would eventually become the World Wide Web. The 1989 proposal was meant for a more effective CERN communication system, but Berners-Lee eventually realised the concept could be implemented throughout the world. At CERN, a European research organisation near Geneva straddling the border between France and Switzerland, Berners-Lee and Belgian computer scientist Robert Cailliau proposed in 1990 to use hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”, and Berners-Lee finished the first website in December that year. He posted the project on the alt.hypertext newsgroup on 6 August 1991.

In a 1970 issue of Popular Science magazine, Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe. In March 1989, Tim Berners-Lee wrote a proposal that referenced ENQUIRE, a database and software project he had built in 1980, and described a more elaborate information management system. With help from Robert Cailliau, he published a more formal proposal (on 12 November 1990) to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, [so that] authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modelled after the SGML reader Dynatext by Electronic Book Technologies, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high-energy physics community: a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server, and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working web: the first web browser (which was a web editor as well); the first web server; and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee, which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

In 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it from 23 August 1991; for this reason, 23 August is considered Internaut’s Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; de Gennaro has disclaimed this story, writing that the media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, which is supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, described in the 1945 essay “As We May Think”.

Berners-Lee’s breakthrough was to marry hypertext to the Internet. In his book Weaving the Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally undertook the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web differed from other hypertext systems available at the time in several ways. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. This also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
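The three technologies can be seen together in miniature. The sketch below is illustrative only (the sample page and its link target are invented): it parses a URL into its parts, extracts a unidirectional hyperlink from a scrap of HTML, and builds the HTTP request line a browser would send for that URL.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# 1. URL: a global address naming a resource
url = urlparse("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(url.scheme, url.hostname, url.path)

# 2. HTML: a page whose <a href="..."> link points one way at another
#    resource; the target needs no back-link and never knows about it
page = '<p>See <a href="http://example.org/next.html">the next page</a>.</p>'

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag encountered in the document."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

collector = LinkCollector()
collector.feed(page)
print(collector.links)

# 3. HTTP: the plain-text request a browser would send for the URL above
request = f"GET {url.path} HTTP/1.0\r\nHost: {url.hostname}\r\n\r\n"
print(request)
```

Note that the link in step 2 is purely one-way, which is exactly what made link rot possible: nothing obliges the target to keep existing.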

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which were the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, adding international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format, and thus played an important role in popularising use of the Internet. The World Wide Web is distinct from the Internet: the web is a collection of documents stored online, while the Internet is the means of accessing them, using protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

Ole Kirk Christiansen (LEGO™)

Danish businessman and LEGO™ creator Ole Kirk Christiansen was born on 7 April 1891 in Filskov, Denmark. He trained as a carpenter and started making wooden toys in 1932 to make a living, after losing his job during the Depression. Sadly, shortly afterwards Christiansen’s wife died, leaving him to raise his four sons by himself. Christiansen knew the value of hard-wearing toys, so to make ends meet he constructed a small wooden duck toy for his children. When he found that his sons loved the new toy, he decided to put the ducks into production using the leftover wood from his old business. He then went on to make miniature versions of houses and furniture, which also became quite successful.

Sadly, in 1942 a fire broke out at the factory, destroying Ole’s life’s work and forcing him to rebuild from scratch. In 1947 he invested in a revolutionary injection-moulding machine, imported from Britain for 30,000 Danish kroner (£3,200). After building a new factory, Ole set about re-making his lost designs and moved on to manufacturing plastic items rather than wooden ones; these originally consisted of small plastic bears and rattles. By 1949 he had produced over 200 plastic and wooden toys. Then, two years after buying the injection-moulding machine, he produced the first Lego bricks, called Automatic Binding Bricks; they looked similar to today’s blocks but had a slit in the sides and were completely hollow. Ole Kirk Christiansen came up with the name Lego from the Danish words leg godt, meaning “play well”, and the company grew to become the Lego Group.

In 1954, Ole’s son Godtfred, the firm’s junior managing director, returned from a UK toy fair with the idea of creating a toy system in which every element could connect together to build things, and by 1958 the firm had patented the colourful bricks with hollow tubes on the underside so they could be locked together; the story of the Lego brick had begun. Sadly, on 11 March 1958 Christiansen died from a heart attack, aged 66. His third son Godtfred took over the company and developed his idea of interconnecting bricks, culminating in the first Lego set, Town Plan No. 1, which had everything a child needed to make their own model town centre and became a huge success.

Since then Lego™ has grown to become a household name, selling many millions of sets worldwide each year. In 1968 the company opened a theme park at its HQ in Billund, Denmark, the first of six worldwide. A year later came Lego Duplo for under-fives, and in 1978 the “minifigure” people. Since then, all manner of themed Lego sets have hit the shelves, from Pirates, Outer Space, Lord of the Rings and Ninjago to Harry Potter, and today eight Lego sets are sold every second worldwide. The UK even has its own Legoland, which opened in Windsor in 1996, and there are now Lego-only stores, Lego computer games (including Lego Batman), two rather entertaining Lego movies, and even a clothing range.

World Day Against Cyber-Censorship

World Day Against Cyber-Censorship takes place annually on March 12. It aims to rally computer users in fighting repression of online speech and to celebrate the work of brave individuals who have promoted free expression on the Internet. On this day, Reporters Without Borders awards the annual Netizen Prize to bloggers, online journalists, and cyber-dissidents who have demonstrated exceptional dedication to this cause. The day was first observed on March 12, 2008 at the request of Reporters Without Borders and Amnesty International. A letter written by Jean-François Julliard, Secretary-General of Reporters Without Borders, and Larry Cox, Executive Director of Amnesty International, was sent to the Chief Executive Officers of Google, Yahoo! and Microsoft Corporation to request observation of the day.

The Electronic Frontier Foundation remains dedicated to reporting cases of online censorship from all regions of the world, and emphasizes the importance of online anonymity in preserving individuals’ right to free speech, with an ongoing feature, This Week in Censorship, which covers global stories of imprisoned bloggers, filtered content, blocked websites, and instances of Internet disconnection. A broad array of reasons is offered as justification for censorship. Bloggers in Thailand face imprisonment for criticising the monarch. In Pakistan, the Telecommunications Authority has blocked websites, banned words from SMS texts and, most recently, released a request for proposals to build a national blocking and filtering system, all in the name of fighting “obscene content”. The Turkish government has implemented a so-called “democratic” opt-in filtering mechanism for content deemed unsuitable for children and families.

Another common trend is censorship enabled in the name of battling copyright violations. Through its Global Chokepoints project, the EFF monitors instances of pro-copyright laws that justify the filtering of content, the blocking of websites, or Internet disconnection to fight infringement. Censorship remains rampant in the Middle East. In Syria, Iran, and elsewhere, bloggers continue to face imprisonment, and ordinary users have limited access to content online due to state-mandated blocking and filtering programs. Another ongoing issue is authoritarian states using Western-based surveillance technologies to monitor and spy on their citizens; state authorities can use the collected data to arrest, harass, or torture individuals accused of participating in political dissent.

Data Privacy Day

Data Privacy Day takes place annually on January 28. The purpose of Data Privacy Day (known in Europe as Data Protection Day) is to raise awareness and promote privacy and data protection best practices. It is currently ‘celebrated’ in the United States, Canada, and 27 European countries.

Data Privacy Day’s educational initiative originally focused on raising awareness among businesses as well as users about the importance of protecting the privacy of their personal information online, particularly in the context of social networking. The educational focus has expanded over the past four years to include families, consumers and businesses. In addition to its educational initiative, Data Privacy Day promotes events and activities that stimulate the development of technology tools that promote individual control over personally identifiable information; encourage compliance with privacy laws and regulations; and create dialogues among stakeholders interested in advancing data protection and privacy. The international celebration offers many opportunities for collaboration among governments, industry, academia, nonprofits, privacy professionals and educators.

The Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was opened by the Council of Europe in 1981. This convention is currently being updated to reflect new legal challenges posed by technological development. The Convention on Cybercrime also protects the integrity of data systems, and thus of privacy, in cyberspace. Privacy, including data protection, is also protected by Article 8 of the European Convention on Human Rights.

The day was initiated by the Council of Europe in 2007 as European Data Protection Day. On January 26, 2009, the United States House of Representatives passed a resolution declaring January 28 National Data Privacy Day, and on January 28, 2009, the Senate officially recognised it. In response to increasing levels of data breaches and the global importance of privacy and data security, the Online Trust Alliance (OTA) and the National Cyber Security Alliance adopted Data Privacy Day as Data Privacy & Protection Day, emphasising the need to look at the long-term impact on consumers of data collection, use and protection practices; they also organise other Data Protection Day activities.

Ada Lovelace – Enchantress of Numbers

The analyst, metaphysician, and founder of scientific computing, Augusta Ada King, Countess of Lovelace, was born on 10 December 1815. Born Augusta Ada Byron and now commonly known as Ada Lovelace, she was the daughter of Lord Byron and is remembered as a mathematician and writer chiefly known for her work on Charles Babbage’s early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be processed by a machine. Because of this, she is often considered the world’s first computer programmer, and she left a legacy as a role model for young women entering technology careers.

Ada was the only legitimate child born of the brief marriage between the poet Lord Byron and Anne Isabella Byron. She had no relationship with her father, who separated from her mother just a month after Ada was born; four months later he left England forever, and he died in Greece in 1823, leaving her mother to raise her single-handedly. Her life was an apotheosis of the struggle between emotion and reason, subjectivism and objectivism, poetics and mathematics, ill health and bursts of energy. Lady Byron wished her daughter to be unlike her poetical father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. But Ada’s complex inheritance became apparent as early as 1828, when she produced the design for a flying machine. It was mathematics that gave her life its wings.

As a young adult she took an interest in mathematics, and in particular in the work of Charles Babbage, Lucasian Professor of Mathematics at Cambridge, whom she met in 1833 when she was just 17. One of the gentlemanly scientists of the era, Babbage became Ada’s lifelong friend. He was known as the inventor of the Difference Engine, an elaborate calculating machine that operated by the method of finite differences, and the two began a voluminous correspondence on the topics of mathematics, logic, and ultimately all subjects. In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace. Ada had three children. The family and its fortunes were very much directed by Lady Byron, whose domineering was rarely opposed by King. In 1834, although the Difference Engine was not finished, Babbage had made plans for a new kind of calculating machine: an Analytical Engine.

His parliamentary sponsors refused to support a second machine with the first unfinished, but Babbage found sympathy for his new project abroad. In 1842, the Italian mathematician Luigi Menabrea published a memoir in French on the subject of the Analytical Engine. Babbage enlisted Ada to translate the memoir, and during a nine-month period in 1842–43 she worked feverishly on the article and a set of Notes she appended to it. These Notes contain what is considered the first computer program, that is, an algorithm encoded for processing by a machine, and they are important in the early history of computers. She also foresaw the capability of computers to go beyond mere calculating or number-crunching, while others, including Babbage himself, focused only on those capabilities.
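The algorithm in Note G describes how the Analytical Engine could compute the Bernoulli numbers. As a modern sketch of the same mathematics (this uses a standard recurrence and is not a transcription of Ada’s actual table of operations), the numbers can be computed with exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n (B_1 = -1/2 convention)."""
    B = []
    for m in range(n + 1):
        # For m >= 1 the identity sum_{k=0}^{m} C(m+1, k) * B_k = 0
        # lets us solve for B_m from the numbers already computed.
        acc = sum((comb(m + 1, k) * B[k] for k in range(m)), Fraction(0))
        B.append(Fraction(1) if m == 0 else -acc / (m + 1))
    return B

print(bernoulli(6))  # includes B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```

The Engine, of course, had no recursion or symbolic fractions; Ada’s table expressed the same dependency of each number on its predecessors as an explicit sequence of arithmetic operations on the machine’s columns.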

Ada called herself an analyst (and metaphysician), and the combination was put to use in the Notes. She understood the plans for the device as well as Babbage did, but was better at articulating its promise. She rightly saw it as what we would call a general-purpose computer, suited for “developing and tabulating any function whatever . . . the engine [is] the material expression of any indefinite function of any degree of generality and complexity.” Her Notes anticipate future developments, including computer-generated music. Sadly, Ada died on November 27, 1852, in Marylebone, at the age of 36, from cancer, and was buried beside the father she never knew. Her contributions to science were resurrected only recently, but many new biographies attest to the fascination of Babbage’s “Enchantress of Numbers”.