Carl Sagan

American astronomer, cosmologist, astrophysicist, astrobiologist, author, and science communicator Carl Edward Sagan was born November 9, 1934. Sagan first became interested in science and astronomy when his parents took him to the 1939 New York World’s Fair when he was four years old. The exhibits became a turning point in his life. He later recalled the moving map of the America of Tomorrow exhibit, which showed beautiful highways and cloverleaves and little General Motors cars all carrying people to skyscrapers, buildings with lovely spires and flying buttresses. At other exhibits, he remembered how a flashlight that shone on a photoelectric cell created a crackling sound, and how the sound from a tuning fork became a wave on an oscilloscope. He also witnessed the future media technology that would replace radio: television.

Soon after entering elementary school he began to express a strong inquisitiveness about nature. Sagan recalled taking his first trips to the public library alone, at the age of five, when his mother got him a library card. He wanted to learn what stars were, since nobody else could give him a clear answer. He and a close friend took trips to the American Museum of Natural History across the East River in Manhattan. While there, they went to the Hayden Planetarium and walked around the museum’s exhibits of space objects, such as meteorites, and displays of dinosaurs and animals in natural settings. His parents bought him chemistry sets and reading materials. His interest in space, however, was his primary focus, especially after reading science fiction stories by writers such as H. G. Wells and Edgar Rice Burroughs, which stirred his imagination about life on other planets such as Mars. In 1947 he discovered Astounding Science Fiction magazine, which introduced him to more hard science fiction speculations than those in Burroughs’s novels. That same year saw the beginning of the “flying saucer” mass hysteria, with the young Carl suspecting the “discs” might be alien spaceships.

Sagan lived in Bensonhurst, where he attended David A. Boody Junior High School and had his bar mitzvah when he turned 13. In 1948, his family moved to the nearby town of Rahway, New Jersey for his father’s work, and Sagan entered Rahway High School, graduating in 1951. He was made president of the school’s chemistry club and set up his own laboratory at home, teaching himself about molecules by making cardboard cutouts to help him visualize how they were formed, while also remaining interested in astronomy.

Sagan attended the University of Chicago, whose Chancellor, Robert Hutchins, structured the school as an “ideal meritocracy,” with no age requirement. The school also employed a number of the nation’s leading scientists, including Enrico Fermi and Edward Teller, and operated the famous Yerkes Observatory. Sagan worked in the laboratory of the geneticist H. J. Muller and wrote a thesis on the origins of life with physical chemist Harold Urey. He joined the Ryerson Astronomical Society, received a B.A. degree in self-proclaimed “nothing” with general and special honors in 1954, and a B.S. degree in physics in 1955. He went on to earn an M.S. degree in physics in 1956, before earning a Ph.D. degree in 1960 with the dissertation “Physical Studies of Planets,” submitted to the Department of Astronomy and Astrophysics. From 1960 to 1962 Sagan was a Miller Fellow at the University of California, Berkeley. He also published an article in 1961 in the journal Science on the atmosphere of Venus, while working with NASA’s Mariner 2 team, and served as a “Planetary Sciences Consultant” to the RAND Corporation.

After the publication of Sagan’s Science article, in 1961 Harvard University astronomers Fred Whipple and Donald Menzel offered Sagan the opportunity to give a colloquium at Harvard, and they subsequently offered him a lecturer position at the institution. Sagan instead asked to be made an assistant professor. He lectured, performed research, and advised graduate students at the institution from 1963 until 1968, as well as working at the Smithsonian Astrophysical Observatory, both located in Cambridge, Massachusetts. Cornell University astronomer Thomas Gold then asked Sagan to move to Ithaca, New York and join the faculty at Cornell; Sagan did, and remained a faculty member there for nearly 30 years until his death in 1996. Following two years as an associate professor, Sagan became a full professor at Cornell in 1970, and directed the Laboratory for Planetary Studies there. From 1972 to 1981, he was associate director of the Center for Radiophysics and Space Research (CRSR) at Cornell. In 1976, he became the David Duncan Professor of Astronomy and Space Sciences.

Sagan was associated with the U.S. space program from its inception. From the 1950s onward, he worked as an advisor to NASA, where one of his duties included briefing the Apollo astronauts before their flights to the Moon. Sagan contributed to many of the robotic spacecraft missions that explored the Solar System, arranging experiments on many of the expeditions. Sagan assembled the first physical message that was sent into space: a gold-anodized plaque, attached to the space probe Pioneer 10, launched in 1972. Pioneer 11, carrying another copy of the plaque, was launched in 1973. He continued to refine his designs; the most elaborate message he helped to develop and assemble was the Voyager Golden Record, sent out with the Voyager space probes in 1977. Sagan often challenged the decisions to fund the Space Shuttle and the International Space Station at the expense of further robotic missions.

He became known for his work as a science popularizer and communicator. His best known scientific contribution is research on extraterrestrial life, including experimental demonstration of the production of amino acids from basic chemicals by radiation. Sagan assembled the first physical messages sent into space: the Pioneer plaque and the Voyager Golden Record, universal messages that could potentially be understood by any extraterrestrial intelligence that might find them. Sagan argued for the now-accepted hypothesis that the high surface temperature of Venus can be attributed to, and calculated using, the greenhouse effect.
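As a rough illustration of the greenhouse argument, one can compare Venus’s airless radiative-equilibrium temperature with its observed surface temperature. The sketch below uses standard textbook approximations for the solar flux at Venus and its Bond albedo; it is illustrative only, not Sagan’s actual calculation.

```python
# Radiative-equilibrium sketch for Venus; constants are textbook
# approximations, not Sagan's original numbers.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_VENUS = 2601.0   # approximate solar constant at Venus, W m^-2
ALBEDO = 0.77      # approximate Bond albedo of Venus

# Energy balance for an airless body: (1 - A) * S / 4 = sigma * T^4
T_eq = ((1 - ALBEDO) * S_VENUS / (4 * SIGMA)) ** 0.25
print(f"Equilibrium temperature without an atmosphere: ~{T_eq:.0f} K")
print("Observed surface temperature: ~735 K")
# The roughly 500 K gap is what the greenhouse effect must explain.
```

The point of the comparison is that sunlight alone cannot account for Venus’s furnace-like surface; the thick CO2 atmosphere traps outgoing infrared radiation.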

Sagan published more than 600 scientific papers and articles and was author, co-author or editor of more than 20 books. He wrote many popular science books, such as The Dragons of Eden, Broca’s Brain and Pale Blue Dot, and narrated and co-wrote the award-winning 1980 television series Cosmos: A Personal Voyage. The most widely watched series in the history of American public television, Cosmos has been seen by at least 500 million people across 60 different countries. The book Cosmos was published to accompany the series. He also wrote the science fiction novel Contact, the basis for a 1997 film of the same name. His papers, containing 595,000 items, are archived at The Library of Congress.

Sagan advocated scientific skeptical inquiry and the scientific method, pioneered exobiology and promoted the Search for Extra-Terrestrial Intelligence (SETI). He spent most of his career as a professor of astronomy at Cornell University, where he directed the Laboratory for Planetary Studies. Sagan and his works received numerous awards and honors, including the NASA Distinguished Public Service Medal, the National Academy of Sciences Public Welfare Medal, the Pulitzer Prize for General Non-Fiction for his book The Dragons of Eden, and, regarding Cosmos: A Personal Voyage, two Emmy Awards, the Peabody Award and the Hugo Award. He married three times and had five children. After suffering from myelodysplasia, Sagan died of pneumonia at the age of 62, on December 20, 1996.


International Day of Radiology/X-Ray Day

The International Day of Radiology (IDoR) is celebrated annually on November 8 to promote the role of medical imaging in modern healthcare and to mark the anniversary of the discovery of X-rays on November 8, 1895 by Wilhelm Conrad Röntgen, who effectively laid the foundation for the new medical discipline of radiology.

Radiology is the medical specialty that uses medical imaging to diagnose and treat diseases within the body. A variety of imaging techniques such as X-ray radiography, ultrasound, computed tomography (CT), nuclear medicine including positron emission tomography (PET), and magnetic resonance imaging (MRI) are used to diagnose or treat diseases. Interventional radiology is the performance of (usually minimally invasive) medical procedures with the guidance of imaging technologies.

The modern practice of radiology involves several different healthcare professions working as a team. The radiologist is a medical doctor who has completed the appropriate post-graduate training and interprets medical images, communicates these findings to other physicians by means of a report or verbally, and uses imaging to perform minimally invasive medical procedures. The nurse is involved in the care of patients before and after imaging or procedures, including administration of medications, monitoring of vital signs and monitoring of sedated patients. The radiographer, also known as a “radiologic technologist” in some countries, is a specially trained healthcare professional who uses sophisticated technology and positioning techniques to acquire medical images. Depending on the individual’s training and country of practice, the radiographer may specialize in one of the above-mentioned imaging modalities or have expanded roles in image reporting.

The International Day of Radiology was first introduced in 2012, as a joint initiative, by the European Society of Radiology (ESR), the Radiological Society of North America (RSNA), and the American College of Radiology (ACR). The International Day of Radiology is a successor to the European Day of Radiology which was launched in 2011. The first and only European Day of Radiology was held on February 10, 2011 to commemorate the anniversary of Röntgen’s death. The European day was organised by the ESR, who later entered into cooperation with the RSNA and the ACR to establish the International Day of Radiology.

The first International Day of Radiology, held in 2012, marked the anniversary of Röntgen’s discovery of X-rays, and its main theme was medical imaging in oncology. The day was celebrated with events in many countries, mostly organised by the national professional societies which represent radiologists. Many public lectures on the role of imaging in oncology took place across Europe. In the UK, the Royal College of Radiologists organised a free public lecture at the Wellcome Collection by Dr. Phil O’Connor, who served as head of musculoskeletal imaging at the London 2012 Olympics. The ESR also published two booklets to mark the occasion: ‘The Story of Radiology’, created in cooperation with the International Society for the History of Radiology, and ‘Making Cancer Visible: The Role of Imaging in Oncology’.

World Radiography Day also takes place to mark the anniversary of the discovery of X-rays in 1895. The purpose of this day is to raise public awareness of radiographic imaging and therapy, which play a crucial role in the diagnosis and treatment of patients and, most importantly, in ensuring radiation is kept to the minimum required, hence improving the quality of patient care. The day is celebrated worldwide by various national radiographers’ associations and societies, including the Association of Radiographers of Nigeria and the United Kingdom’s Society of Radiographers (SoR), among others. The International Society of Radiographers and Radiological Technologists has celebrated 8 November as World Radiography Day since 2007.

German mechanical engineer and physicist Wilhelm Conrad Röntgen was born 27 March 1845. He attended high school in Utrecht, Netherlands, but in 1865 he was expelled, and without a high school diploma Röntgen could only attend university in the Netherlands as a visitor. That year he tried to attend Utrecht University without the credentials required of a regular student. Upon hearing that he could enter the Federal Polytechnic Institute in Zurich (today known as ETH Zurich), he passed its examinations and began studies there as a student of mechanical engineering. In 1869, he graduated with a Ph.D. from the University of Zurich, where he became a favorite student of Professor August Kundt, whom he followed to the University of Strassburg.

In 1874, Röntgen became a lecturer at the University of Strassburg. In 1875, he became a professor at the Academy of Agriculture at Hohenheim, Württemberg. He returned to Strassburg as a professor of physics in 1876, and in 1879, he was appointed to the chair of physics at the University of Giessen. In 1888, he obtained the physics chair at the University of Würzburg, and in 1900 at the University of Munich, by special request of the Bavarian government. Although Röntgen accepted an appointment at Columbia University in New York City, the outbreak of World War I changed his plans and he remained in Munich for the rest of his career.

During 1895, Röntgen was investigating the external effects produced by various types of vacuum tube equipment — apparatuses from Heinrich Hertz, Johann Hittorf, William Crookes, Nikola Tesla and Philipp von Lenard — when an electrical discharge was passed through them. In early November, he was repeating an experiment with one of Lenard’s tubes, in which a thin aluminium window had been added to permit the cathode rays to exit the tube, and a cardboard covering had been added to protect the aluminium from damage by the strong electrostatic field that produces the cathode rays. He knew the cardboard covering prevented light from escaping, yet Röntgen observed that the invisible cathode rays caused a fluorescent effect on a small cardboard screen painted with barium platinocyanide when it was placed close to the aluminium window. It occurred to Röntgen that the Crookes–Hittorf tube, which had a much thicker glass wall than the Lenard tube, might also cause this fluorescent effect.

On 8 November 1895, Röntgen decided to test his idea. He carefully constructed a black cardboard covering similar to the one he had used on the Lenard tube. He covered the Crookes–Hittorf tube with the cardboard and attached electrodes to a Ruhmkorff coil to generate an electrostatic charge. Before setting up the barium platinocyanide screen to test his idea, Röntgen darkened the room to test the opacity of his cardboard cover. As he passed the Ruhmkorff coil charge through the tube, he determined that the cover was light-tight and turned to prepare the next step of the experiment. It was at this point that Röntgen noticed a faint shimmering from a bench a few feet away from the tube. To be sure, he tried several more discharges and saw the same shimmering each time. Striking a match, he discovered the shimmering had come from the location of the barium platinocyanide screen he had been intending to use next.

Röntgen speculated that a new kind of ray might be responsible. 8 November was a Friday, so he took advantage of the weekend to repeat his experiments and made his first notes. In the following weeks he ate and slept in his laboratory as he investigated many properties of the new rays he temporarily termed “X-rays”, using the mathematical designation (“X”) for something unknown. The new rays came to bear his name in many languages as “Röntgen rays” (and the associated X-ray radiograms as “Röntgenograms”). At one point while he was investigating the ability of various materials to stop the rays, Röntgen brought a small piece of lead into position while a discharge was occurring. Röntgen thus saw the first radiographic image, his own flickering ghostly skeleton on the barium platinocyanide screen. He later reported that it was at this point that he determined to continue his experiments in secrecy, because he feared for his professional reputation if his observations were in error.
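Röntgen’s test of which materials stop the rays is the physical basis of radiography, and can be sketched with the Beer–Lambert attenuation law, I = I0 · exp(−μx). The attenuation coefficients below are rough, order-of-magnitude illustrations chosen for this sketch, not measured values.

```python
import math

# Beer-Lambert sketch: transmitted intensity I = I0 * exp(-mu * x).
# The mu values are rough, order-of-magnitude illustrations for a
# diagnostic-energy beam, not measured coefficients.
def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
    """Fraction of incident X-ray intensity that passes through a slab."""
    return math.exp(-mu_per_cm * thickness_cm)

# Soft tissue barely attenuates the beam, bone attenuates more, and
# lead (whose opacity Röntgen observed) blocks it even in thin sheets.
for material, mu, x in [("soft tissue", 0.2, 1.0),
                        ("bone", 0.5, 1.0),
                        ("lead", 60.0, 0.1)]:
    print(f"{material:11s} ({x} cm): {transmitted_fraction(mu, x):.4f} transmitted")
```

The contrast between these transmitted fractions is exactly what makes a skeleton visible on a fluorescent screen or photographic plate.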

Nearly two weeks after his discovery, he took the very first picture using X-rays of his wife Anna Bertha’s hand. When she saw her skeleton she exclaimed “I have seen my death!” Röntgen’s original paper, “On A New Kind Of Rays” (Ueber eine neue Art von Strahlen), was published on 28 December 1895. On 5 January 1896, an Austrian newspaper reported Röntgen’s discovery of a new type of radiation. Röntgen was awarded an honorary Doctor of Medicine degree from the University of Würzburg after his discovery. He published a total of three papers on X-rays between 1895 and 1897. Today, Röntgen is considered the father of diagnostic radiology, the medical speciality which uses imaging to diagnose disease. A collection of his papers is held at the National Library of Medicine in Bethesda, Maryland.


More Holidays and National Days taking place on November 8

  • Abet and Aid Punsters Day
  • Cook Something Bold Day
  • National Ample Time Day
  • National Cappuccino Day
  • National Parents as Teachers Day
  • World Usability Day
  • X-ray Day
  • National Dunce Day
  • National Harvey Wallbanger Day

Marie Curie

Best known for her pioneering research in the field of radioactivity, the world-famous Polish–French physicist and chemist Marie Skłodowska Curie was born 7 November 1867 in Warsaw, Poland. Maria’s paternal grandfather, Józef Skłodowski, had been a respected teacher in Lublin, where he taught the young Bolesław Prus, who became a leading figure in Polish literature. Her father, Władysław Skłodowski, taught mathematics and physics, subjects that Maria was to pursue, and was also director of two Warsaw gymnasia for boys. After Russian authorities eliminated laboratory instruction from the Polish schools, he brought much of the laboratory equipment home and instructed his children in its use.

Her father was eventually fired by his Russian supervisors for pro-Polish sentiments and forced to take lower-paying posts. The family also lost money on a bad investment, and eventually chose to supplement their income by lodging boys in the house. Maria’s mother, Bronisława, operated a prestigious Warsaw boarding school for girls; she resigned from the position after Maria was born. She died of tuberculosis in May 1878, when Maria was ten years old. Less than three years earlier, Maria’s oldest sibling, Zofia, had died of typhus contracted from a boarder.

When she was ten years old, Maria began attending the boarding school of J. Sikorska; next she attended a gymnasium for girls, from which she graduated on 12 June 1883 with a gold medal. After an illness she spent the following year in the countryside with relatives of her father, and the next year with her father in Warsaw, where she did some tutoring. Unable to enroll in a regular institution of higher education because she was a woman, she and her sister Bronisława became involved with the clandestine Flying University, a Polish patriotic institution of higher learning that admitted women students.

At a Warsaw laboratory, in 1890–91, Maria Skłodowska did her first scientific work and made an agreement with her sister Bronisława: she would give Bronisława financial assistance during her medical studies in Paris, in exchange for similar assistance two years later. Maria took a position as governess: first as a home tutor in Warsaw, then for two years as a governess in Szczuki with a landed family, the Żorawskis, who were relatives of her father. She fell in love with their son, Kazimierz Żorawski, a future eminent mathematician who would soon earn a doctorate and pursue an academic career, becoming a professor and rector of Kraków University. Sadly, his parents rejected his relationship with Maria.

She lived in Warsaw until the age of 24, when she followed her older sister Bronisława to study in Paris, where she earned her higher degrees and conducted her subsequent scientific work. She was the first person honored with two Nobel Prizes, in both physics and chemistry: in 1903 she shared the Nobel Prize in Physics with her husband Pierre Curie and with Henri Becquerel, and in 1911 she became the sole winner of the Nobel Prize in Chemistry. She remains the only woman to win in two fields and the only person to win in multiple sciences. (Her daughter Irène Joliot-Curie and son-in-law Frédéric Joliot-Curie would later share the 1935 Nobel Prize in Chemistry.)

Among her many achievements are the theory of radioactivity (a term she coined), techniques for isolating radioactive isotopes, and the discovery of two radioactive elements, polonium (named after her native country) and radium. She was also the first female professor at the University of Paris, and under her direction the world’s first studies were conducted into the treatment of neoplasms using radioactive isotopes. In 1932, she founded a Radium Institute (now the Maria Skłodowska–Curie Institute of Oncology) in her home town, Warsaw, headed by her physician sister Bronisława.
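Radioactive decay, the phenomenon Curie named, follows a simple exponential law, N(t) = N0 · 2^(−t/T½). A minimal sketch using the roughly 1600-year half-life of radium-226, a standard reference value used here purely for illustration:

```python
# Exponential decay sketch: N(t) = N0 * 2 ** (-t / T_half).
# Radium-226's half-life of ~1600 years is a standard approximation.
def remaining_fraction(t_years: float, half_life_years: float = 1600.0) -> float:
    """Fraction of the original radioactive sample left after t_years."""
    return 0.5 ** (t_years / half_life_years)

print(f"After 1600 years: {remaining_fraction(1600):.3f} of the radium remains")
print(f"After 4800 years: {remaining_fraction(4800):.3f} remains")
```

Radium’s long half-life is why samples the Curies isolated over a century ago remain measurably radioactive today.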

Marie Curie died on 4 July 1934 of aplastic anemia, a condition undoubtedly brought on by her lifelong exposure to radiation. Her pioneering research, however, led the way for many improvements in the fields of science, chemistry and medicine, and in 1995 she became the first woman to be entombed on her own merits in the Paris Panthéon.

Léon Theremin

Pioneering Russian inventor Léon Theremin sadly died 3 November 1993. He was born Lev Sergeyevich Termen in Saint Petersburg, Russian Empire on 15 August 1896, into a family of French and German ancestry. He had a sister named Helena. He became interested in electricity at the age of 7, and by 13 he was experimenting with high frequency circuits. In the seventh class of his high school, before an audience of students and parents, he demonstrated various optical effects using electricity. By the age of 17 he was in his last year of high school, and at home he had his own laboratory for experimenting with high frequency circuits, optics and magnetic fields.

His cousin, Kirill Fedorovich Nesturkh, then a young physicist, and a singer named Wagz invited him to attend the defense of the dissertation of professor Abram Fedorovich Ioffe. Physics lecturer Vladimir Konstantinovich Lebedinskiy had explained to Theremin the then-interesting dispute over Ioffe’s work on the electron. On 9 May 1913 Theremin and his cousin attended Ioffe’s dissertation defense. Ioffe’s subject was the elementary photoelectric effect, the magnetic field of cathode rays and related investigations. In 1917 Theremin wrote that Ioffe talked of electrons, the photoelectric effect and magnetic fields as parts of an objective reality that surrounds us every day, unlike others who talked more of somewhat abstract formulas and symbols. Theremin wrote that he found this explanation revelatory and that it fit a scientific – not abstract – view of the world, different scales of magnitude, and matter. From then on Theremin endeavoured to study the microcosm, in the same way he had studied the macrocosm with his hand-built telescope. Later, Kirill introduced Theremin to Ioffe as a young experimenter and physicist, and future student of the university.

Theremin recalled that while still in his last year of school, he had built a million-volt Tesla coil and noticed a strong glow associated with his attempts to ionise the air. He then wished to further investigate the effects using university resources. A chance meeting with Abram Fedorovich Ioffe led to a recommendation to see Karl Karlovich Baumgart, who was in charge of the physics laboratory equipment. Karl then reserved a room and equipment for Theremin’s experiments. Abram Fedorovich suggested Theremin also look at methods of creating gas fluorescence under different conditions and of examining the resulting light’s spectra. However, during these investigations Theremin was called up for World War I military service. Despite Theremin being only in his second academic year, the deanery of the Faculty of Physics and Astronomy recommended him to go to the Nikolayevska Military Engineering School in Petrograd (renamed from Saint Petersburg), which usually only accepted students in their fourth year. Theremin recalled Ioffe reassured him that the war would not last long and that military experience would be useful for scientific applications.

Beginning his military service in 1916, Theremin finished the Military Engineering School in six months, progressed through the Graduate Electronic School for Officers, and attained the military radio-engineer diploma in the same year. In the course of the next three and a half years he oversaw the construction of a radio station in Saratov to connect the Volga area with Moscow, graduated from Petrograd University, became deputy leader of the new Military Radiotechnical Laboratory in Moscow, and finished as the broadcast supervisor of the radio transmitter at Tsarskoye Selo near Petrograd (then renamed Detskoye Selo).

During the Russian civil war, in October 1919, White Army commander Nikolai Nikolayevich Yudenich advanced on Petrograd from the side of Detskoye Selo, apparently intending to capture the radio station to announce a victory over the Bolsheviks. Theremin and others evacuated the station, sending equipment east on rail cars. Theremin then detonated explosives to destroy the 120 meter-high antenna mast before traveling to Petrograd to set up an international listening station. There he also trained radio specialists, but reported difficulties obtaining food and working with foreign experts whom he described as narrow-minded pessimists. Theremin recalled that on an evening when his hopes of overcoming these obstructing experts reached a low ebb, Abram Fedorovich Ioffe telephoned him. Ioffe asked Theremin to come to his newly founded Physical Technical Institute in Petrograd, and the next day he invited him to start work on developing measuring methods for high frequency electrical oscillations.

The day after Ioffe’s invitation, Theremin started at the institute. He worked in diverse fields: applying the Laue effect to the new field of X-ray analysis of crystals; using hypnosis to improve measurement-reading accuracy; working with Ivan Pavlov’s laboratory; and using gas-filled lamps as measuring devices. He built a high frequency oscillator to measure the dielectric constant of gases with high precision; Ioffe then urged him to look for other applications of this method, and Theremin shortly made the first motion detector, for use as a “radio watchman”. While adapting the dielectric device by adding circuitry to generate an audio tone, Theremin noticed that the pitch changed when his hand moved around.

In October 1920 he first demonstrated this to Ioffe, who called in other professors and students to hear. Theremin recalled trying to find the notes for tunes he remembered from when he played the cello, such as The Swan by Saint-Saëns. By November 1920 Theremin had given his first public concert with the instrument, now modified with a horizontal volume antenna replacing the earlier foot-operated volume control. He named it the “etherphone”; it came to be known as the Терменвокс (Termenvox) in the Soviet Union, as the Thereminvox in Germany, and later as the “theremin” in the United States. Theremin went to Germany in 1925 to sell both the radio watchman and Termenvox patents to the German firm Goldberg and Sons. According to Glinsky, this was the Soviets’ “decoy for capitalists”, intended to obtain both Western profits from sales and technical knowledge.
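The pitch change Theremin noticed is a consequence of the heterodyne principle on which the instrument is based: the player’s hand capacitance detunes one of two radio-frequency oscillators, and the audible note is the difference (beat) between them. A minimal numerical sketch, with oscillator frequencies invented for illustration:

```python
# Heterodyne sketch: the audible pitch is the beat (difference) between
# a fixed RF oscillator and a variable one detuned by hand capacitance.
# All frequencies here are invented for illustration.
FIXED_HZ = 170_000.0

def audible_pitch(variable_hz: float) -> float:
    """Beat frequency heard by the listener."""
    return abs(FIXED_HZ - variable_hz)

# Bringing the hand nearer the pitch antenna adds capacitance, lowering
# the variable oscillator's frequency and raising the beat note.
for f_var in (170_000.0, 169_560.0, 169_120.0):
    print(f"variable oscillator {f_var:,.0f} Hz -> audible {audible_pitch(f_var):.0f} Hz")
```

A tiny fractional shift in a radio-frequency oscillator thus produces a large musical change in the beat note, which is what makes the instrument so sensitive to hand position.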

During this time Theremin was also working on a wireless television with 16 scan lines in 1925, improving to 32 scan lines and then 64 using interlacing in 1926, and he demonstrated moving, if blurry, images on 7 June 1927. After being sent on a lengthy tour of Europe starting in 1927 – including London, Paris and towns in Germany – during which he demonstrated his invention to full audiences, Theremin found his way to the United States, arriving on 30 December 1927 with his first wife, Katia. He performed the theremin with the New York Philharmonic in 1928. He patented his invention in the United States in 1928 and subsequently granted commercial production rights to RCA. Theremin set up a laboratory in New York in the 1930s, where he developed the theremin and experimented with other electronic musical instruments and other inventions. These included the Rhythmicon, commissioned by the American composer and theorist Henry Cowell. In 1930, ten thereminists performed on stage at Carnegie Hall. Two years later, Theremin conducted the first-ever electronic orchestra, featuring the theremin and other electronic instruments, including a “fingerboard” theremin which resembled a cello in use. Theremin’s mentors during this time were some of society’s foremost scientists, composers, and musical theorists, including composer Joseph Schillinger and physicist (and amateur violinist) Albert Einstein. At this time, Theremin worked closely with fellow Russian émigré and theremin virtuoso Clara Rockmore.

Theremin was interested in a role for the theremin in dance music. He developed performance locations that could automatically react to dancers’ movements with varied patterns of sound and light. Theremin abruptly returned to the Soviet Union in 1938. At the time, the reasons for his return were unclear; some claimed that he was simply homesick, while others believed that he had been kidnapped by Soviet officials. Beryl Campbell, one of Theremin’s dancers, said his wife Lavinia “called to say that he had been kidnapped from his studio”, that “some Russians had come in”, and that she felt he was going to be spirited out of the country. Many years later, it was revealed that Theremin had returned to his native land due to tax and financial difficulties in the United States. However, Theremin himself once told Bulat Galeyev that he decided to leave because he was anxious about the approaching war. Shortly after he returned he was imprisoned in the Butyrka prison and later sent to work in the Kolyma gold mines. Although rumors of his execution were widely circulated and published, Theremin was, in fact, put to work in a sharashka (a secret laboratory in the Gulag camp system), together with Andrei Tupolev, Sergei Korolev, and other well-known scientists and engineers. The Soviet Union rehabilitated him in 1956.

During his work at the sharashka, where he was put in charge of other workers, Theremin created the Buran eavesdropping system. A precursor to the modern laser microphone, it worked by using a low-power infrared beam from a distance to detect the sound vibrations in glass windows. Lavrentiy Beria, the head of the secret police organization NKVD (the predecessor of the KGB), used the Buran device to spy on the British, French and US embassies in Moscow. According to Galeyev, Beria also spied on Stalin; Theremin kept some of the tapes in his flat. In 1947, Theremin was awarded the Stalin Prize for inventing this advance in Soviet espionage technology. Theremin invented another listening device called The Thing. Disguised in a replica of the Great Seal of the United States carved in wood, in 1945 Soviet school children presented the concealed bug to the U.S. Ambassador as a “gesture of friendship” to the USSR’s World War II ally. It hung in the ambassador’s residential office in Moscow and intercepted confidential conversations there during the first seven years of the Cold War, until it was accidentally discovered in 1952. After his “release” from the sharashka in 1947, Theremin volunteered to remain working with the KGB until 1966. By 1947 Theremin had remarried, to Maria Guschina, his third wife, and they had two children: Lena and Natalia.

After working for the KGB, Theremin worked at the Moscow Conservatory of Music for 10 years, where he taught and built theremins, electronic cellos and some terpsitones (another of his inventions). There he was discovered by Harold Schonberg, the chief music critic of The New York Times, who was visiting the Conservatory. But when Schonberg’s article appeared, the Conservatory’s Managing Director declared that “electricity is not good for music; electricity is to be used for electrocution” and had his instruments removed from the Conservatory. Further electronic music projects were banned, and Theremin was summarily dismissed. In the 1970s, Léon Theremin was a Professor of Physics at Moscow State University (Department of Acoustics), developing his inventions and supervising graduate students. After 51 years in the Soviet Union Theremin started travelling, first visiting France in June 1989 and then the United States in 1991, each time accompanied by his daughter Natalia. Theremin was brought to New York by filmmaker Steven M. Martin, where he was reunited with Clara Rockmore. He also gave a demonstration concert at the Royal Conservatory of The Hague in early 1993 before dying in Moscow in 1993.

International Internet Day

International Internet Day takes place annually on 29 October. The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide and build a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.
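The reliable, ordered byte streams that TCP/IP provides are easy to see in practice. The sketch below (an illustrative aside, not part of the original article) uses only Python’s standard `socket` module to run a tiny client/server exchange over the loopback interface:

```python
# Minimal illustration of the TCP/IP suite in action: a tiny echo server and
# client talking over the loopback interface, standard library only.
import socket
import threading

def run_server(server_sock):
    conn, _addr = server_sock.accept()      # wait for one client connection
    data = conn.recv(1024)                  # read the request bytes
    conn.sendall(b"echo: " + data)          # reply over the same TCP stream
    conn.close()

# Bind a listening socket on an ephemeral port (port 0 lets the OS choose).
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

threading.Thread(target=run_server, args=(server,), daemon=True).start()

# The client side: TCP guarantees these bytes arrive intact and in order.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"LO")                       # a nod to the first (truncated) ARPANET message
reply = client.recv(1024)
client.close()
print(reply.decode())                       # prints "echo: LO"
```

The same handful of calls, with a remote hostname instead of 127.0.0.1, underlies email, the web, and most of the services the article goes on to describe.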

During the 1970s, science fiction novelist Arthur C. Clarke predicted that satellites would someday “bring the accumulated knowledge of the world to your fingertips” using a console that would combine the functionality of the photocopier, telephone, television and a small computer, allowing data transfer and video conferencing around the globe.

The origins of the Internet date back to the 1960s, when research was commissioned by the federal government of the United States to build robust, fault-tolerant communication with computer networks. Initial concepts of wide area networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s, including for the development of the ARPANET project, directed by Robert Taylor and managed by Lawrence Roberts. The first message was sent over the ARPANET on 29 October 1969 from computer science professor Leonard Kleinrock’s laboratory at the University of California, Los Angeles (UCLA) to the second network node at the Stanford Research Institute (SRI).

Donald Davies first demonstrated packet switching in 1967 at the National Physical Laboratory (NPL) in the UK, which became a testbed for UK research for almost two decades. Packet-switching networks such as the NPL network, ARPANET, Tymnet, Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET project led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks. The Internet protocol suite (TCP/IP) was developed by Robert E. Kahn and Vint Cerf in the 1970s and became the standard networking protocol on the ARPANET, incorporating concepts from the French CYCLADES project directed by Louis Pouzin. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also gave research and education organizations network access to the supercomputer sites in the United States.
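The core idea of packet switching, cutting a message into small, individually addressed pieces that may travel different routes and arrive out of order, can be modelled in a few lines. This is a deliberately simplified sketch; real protocols such as TCP add acknowledgements, retransmission, and checksums on top of it:

```python
# Toy model of packet switching: a message is split into (sequence, payload)
# packets, "delivered" in arbitrary order, and reassembled at the destination.
import random

def packetize(message: bytes, size: int = 4):
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Restore the original message regardless of arrival order."""
    return b"".join(payload for _seq, payload in sorted(packets))

msg = b"This message crosses the network in pieces."
packets = packetize(msg)
random.shuffle(packets)          # simulate packets taking different routes
assert reassemble(packets) == msg
```

Because each packet carries its own addressing and sequencing information, no single dedicated circuit is needed, which is exactly what made these networks robust and fault-tolerant.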

The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the 1980s. Limited private connections to parts of the Internet by officially commercial entities emerged in several American cities by late 1989 and 1990, and as commercial Internet service providers (ISPs) began to emerge in the very late 1980s, the ARPANET was decommissioned in 1990. The National Science Foundation Network then acted as a new backbone, with private funding supporting other commercial extensions; this led to worldwide participation in the development of new networking technologies and the merger of many networks. The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet, and the decommissioning of the NSFNET in 1995 removed the last restrictions on the use of the Internet to carry commercial traffic, generating sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life.

In the 1980s, at CERN, a European research organisation near Geneva straddling the border between France and Switzerland, British computer scientist and engineer Tim Berners-Lee and Belgian computer scientist Robert Cailliau proposed using hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”.

Berners-Lee and Cailliau wrote a proposal in March 1989 for a more effective CERN communication system, but Berners-Lee eventually realised the concept could be implemented throughout the world, using ideas from his earlier hypertext systems such as ENQUIRE. They published a more formal proposal (on 12 November 1990) to build a “Hypertext project” called “WorldWideWeb” (one word, also “W3”) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, so that authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available.” While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled after the SGML reader DynaText by Electronic Book Technologies, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The DynaText system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to hypermedia within HyTime, but it was considered too expensive and to have an inappropriate licensing policy for use in the general high-energy-physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world’s first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was a web editor as well); the first web server; and the first web pages, which described the project itself. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him by Berners-Lee which is the oldest known web page. Jones stored it on a floppy disk and on his NeXT computer.

On 6 August 1991, Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup. This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after 23 August; for this reason 23 August is considered Internaut’s Day. Many news media have reported that the first photo on the web was uploaded by Berners-Lee in 1992: an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro. Gennaro has disclaimed this story, writing that the media were “totally distorting our words for the sake of cheap sensationalism.” The first server outside Europe was set up at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event: the World Wide Web Consortium says December 1992, whereas SLAC itself claims 1991, which is supported by a W3C document titled A Little History of the World Wide Web. The crucial underlying concept of hypertext originated with older projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson’s Project Xanadu, and Douglas Engelbart’s oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush’s microfilm-based “memex”, described in the 1945 essay “As We May Think”.

Berners-Lee combined hypertext with the Internet. In his book Weaving the Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally took on the project himself. In the process, he developed three essential technologies: a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the uniform resource locator (URL) and uniform resource identifier (URI); the publishing language HyperText Markup Language (HTML); and the Hypertext Transfer Protocol (HTTP). The World Wide Web had a number of differences from other hypertext systems available at the time. The web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On 30 April 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web.
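The three technologies fit together neatly: a URL names a resource, HTTP asks a server for it, and HTML marks up the hypertext that comes back. The sketch below (an illustrative aside using Python’s standard `urllib.parse`; the address of the first web page serves only as sample text) takes a URL apart and builds the request a browser would send:

```python
# Dissecting a URL and constructing the corresponding HTTP request by hand.
from urllib.parse import urlsplit

url = urlsplit("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(url.scheme)   # prints "http": which protocol to speak
print(url.netloc)   # prints "info.cern.ch": which server to contact
print(url.path)     # prints "/hypertext/WWW/TheProject.html": which document to fetch

# The HTTP request a browser would send for that URL (HTTP/1.1 framing):
request = (
    f"GET {url.path} HTTP/1.1\r\n"
    f"Host: {url.netloc}\r\n"
    "Connection: close\r\n\r\n"
)

# And a minimal HTML document with the web's defining feature, the hyperlink;
# note the link needs no cooperation from the page it points to:
page = '<html><body><a href="http://info.cern.ch/">The project</a></body></html>'
```

The unidirectional `href` in the last line is the design choice discussed above: anyone can link to anything, at the cost of link rot when the target disappears.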

An early popular web browser was ViolaWWW for Unix and the X Window System. Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the U.S. High-Performance Computing and Communications Initiative and the High Performance Computing and Communication Act of 1991, one of several computing developments initiated by U.S. Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text in web pages, and the web was less popular than older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic’s graphical user interface allowed the Web to become, by far, the most popular Internet protocol.

The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research lab) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, while the total number of websites was still minute compared to present standards, quite a number of notable websites were already active, many of which were the precursors or inspiration for today’s most popular services. Connected by the existing Internet, other websites were created around the world, along with international standards for domain names and HTML. Since then, Berners-Lee has played an active role in guiding the development of web standards (such as the markup languages in which web pages are composed), and has advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format, and thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, the World Wide Web is not synonymous with the Internet: the web is a collection of documents and of client and server software, using Internet protocols such as TCP/IP and HTTP. Tim Berners-Lee was knighted in 2004 by Queen Elizabeth II for his contribution to the World Wide Web.

The Internet has no centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. Only the overarching definitions of the two principal name spaces in the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). Most traditional communications media, including telephony, radio, television, paper mail and newspapers, are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper, book, and other print publishing are adapting to website technology, or are reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interactions through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and small businesses and entrepreneurs, as it enables firms to extend their “brick and mortar” presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.
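The IP address namespace that ICANN ultimately stewards is structured enough to compute with. As a hedged aside, Python’s standard `ipaddress` module models addresses and the network blocks they are allocated from (192.0.2.0/24 below is a reserved documentation-only range, chosen so the example touches no real allocation):

```python
# Modeling the IP address namespace: addresses, network blocks, membership.
import ipaddress

net = ipaddress.ip_network("192.0.2.0/24")     # a block of 256 addresses
addr = ipaddress.ip_address("192.0.2.17")      # one address within it

print(addr in net)            # prints True: the address falls inside the /24 block
print(net.num_addresses)      # prints 256: a /24 leaves 8 host bits, 2**8 addresses
print(net.broadcast_address)  # prints 192.0.2.255: the block's last address
```

Registries delegate ever-smaller blocks like this one downward, which is how a single global namespace can be administered without any central control of the networks themselves.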

The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. Since the mid-1990s, the Internet has had a revolutionary impact on culture, commerce, and technology, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as JANET in the United Kingdom and Internet2 in the United States. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1-Gbit/s, 10-Gbit/s, or more. The Internet’s takeover of the global communication landscape was almost instant in historical terms: it only communicated 1% of the information flowing through two-way telecommunications networks in the year 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking. However, the future of the global internet may be shaped by regional differences in the world. In November 2006, the Internet was included on USA Today’s list of New Seven Wonders.

Bill Gates

American business magnate, software executive and philanthropist William Henry “Bill” Gates III was born on October 28, 1955. Gates is the former chief executive and current chairman of Microsoft, the world’s largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world’s wealthiest people and was the wealthiest overall from 1995 to 2009, excluding 2008, when he was ranked third; in 2011 he was the third wealthiest American and the second wealthiest person. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder, with 6.4 percent of the common stock. He has also authored or co-authored several books. Gates is one of the best-known entrepreneurs of the personal computer revolution. He has been criticized for his business tactics, which have been considered anti-competitive, an opinion which has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.

Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work, and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates’s last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.

Occupational Therapy Day

Occupational Therapy Day is celebrated annually on October 27. The purpose of the day is to educate the public and offer insight into the valuable work done by occupational therapists.

Occupational therapy (OT) is the use of assessment and intervention to develop, recover, or maintain the meaningful activities, or occupations, of individuals, groups, or communities. It is an allied health profession performed by occupational therapists and occupational therapy assistants. OTs often work with people with mental health problems, disabilities, injuries, or impairments. The American Occupational Therapy Association defines an occupational therapist as someone who “helps people across the lifespan participate in the things they want and need to do through the therapeutic use of everyday activities (occupations). Common occupational therapy interventions include helping children with disabilities to participate fully in school and social situations, injury rehabilitation, and providing supports for older adults experiencing physical and cognitive changes.” Occupational therapists are university-educated professionals and must pass a licensing exam to practice. They often work closely with professionals in physical therapy, speech therapy, audiology, nursing, social work, clinical psychology, and medicine.

The earliest evidence of using occupations as a method of therapy can be found in ancient times. In c. 100 BCE, the Greek physician Asclepiades treated patients with mental illness humanely, using therapeutic baths, massage, exercise, and music. Later, the Roman Celsus prescribed music, travel, conversation and exercise to his patients. However, by medieval times the use of these interventions with people with mental illness was rare, if not nonexistent. In 18th-century Europe, revolutionaries such as Philippe Pinel and Johann Christian Reil reformed the hospital system; instead of metal chains and restraints, their institutions used rigorous work and leisure activities in the late 18th century. This was the Moral Treatment era, developed in Europe during the Age of Enlightenment, where the roots of occupational therapy lie. Although the movement thrived in Europe, interest in it fluctuated in the United States throughout the 19th century; during the 20th century it developed into occupational therapy. The Arts and Crafts movement of 1860 to 1910 also influenced occupational therapy. In the US, a recently industrialized country, arts and crafts societies emerged against the monotony and lost autonomy of factory work. Arts and crafts were used as a way of promoting learning through doing, provided a creative outlet, and served as a way to avoid boredom during long hospital stays.

Eleanor Clarke Slagle (1870–1942) is considered the “mother” of occupational therapy. Slagle, one of the founding members of the National Society for the Promotion of Occupational Therapy (NSPOT), proposed habit training as a primary occupational therapy model of treatment. Based on the philosophy that engagement in meaningful routines shapes a person’s wellbeing, habit training focused on creating structure and balance between work, rest and leisure. Although habit training was initially developed to treat individuals with mental health conditions, its basic tenets are apparent in modern treatment models that are used across a wide scope of client populations. In 1915 Slagle opened the first occupational therapy training program, the Henry B. Favill School of Occupations, at Hull House in Chicago. Slagle went on to serve as both AOTA president and secretary. In 1954, AOTA created the Eleanor Clarke Slagle Lectureship Award in her honor. Each year, this award recognizes a member of AOTA “who has creatively contributed to the development of the body of knowledge of the profession through research, education, and/or clinical practice.”

The health profession of occupational therapy was conceived in the early 1910s as a reflection of the Progressive Era. Early professionals merged highly valued ideals, such as a strong work ethic and the importance of crafting with one’s own hands, with scientific and medical principles. The National Society for the Promotion of Occupational Therapy (NSPOT), now called the American Occupational Therapy Association (AOTA), was founded in 1917, and the profession of occupational therapy was officially named in 1921. William Rush Dunton, one of the founders of NSPOT and a visionary figure in the first decades of the profession, struggled with “the cumbersomeness of the term occupational therapy”, which lacked the “exactness of meaning which is possessed by scientific terms”. Other titles such as “work-cure”, “ergotherapy” (ergo being the Greek root for “work”), and “creative occupations” were discussed as substitutes, but ultimately none possessed the broad meaning that the practice of occupational therapy demanded in order to capture the many forms of treatment that existed from the beginning.

The emergence of occupational therapy challenged the views of mainstream scientific medicine. Instead of focusing purely on the medical model, occupational therapists argued that a complex combination of social, economic, and biological reasons cause dysfunction. Principles and techniques were borrowed from many disciplines—including but not limited to physical therapy, nursing, psychiatry, rehabilitation, self-help, orthopedics, and social work—to enrich the profession’s scope. Between 1900 and 1930, the founders defined the realm of practice and developed supporting theories. By the early 1930s, AOTA had established educational guidelines and accreditation procedures.

The early twentieth century was a time in which the rising incidence of disability related to industrial accidents, tuberculosis, World War I, and mental illness brought about an increasing social awareness of the issues involved. The entry of the United States into World War I was also a crucial event in the history of the profession. Up until this time, occupational therapy had been concerned primarily with the treatment of people with mental illness. However, U.S. involvement in the Great War and the escalating numbers of injured and disabled soldiers presented a daunting challenge to those in command. The military enlisted the assistance of NSPOT to recruit and train over 1,200 “reconstruction aides” to help with the rehabilitation of those wounded in the war. With entry into World War II and the ensuing skyrocketing demand for occupational therapists to treat those injured in the war, the field of occupational therapy underwent dramatic growth and change. Occupational therapists needed to be skilled not only in the use of constructive activities such as crafts, but also increasingly in the use of activities of daily living.

There was a struggle to keep people in the profession during the post-war years. Emphasis shifted from the altruistic wartime mentality to the financial, professional, and personal satisfaction that comes with being a therapist. To make the profession more appealing, practice was standardized, as was the curriculum. Entry and exit criteria were established, and the American Occupational Therapy Association advocated for steady employment, decent wages, and fair working conditions. Via these methods, occupational therapy sought and obtained medical legitimacy in the 1920s. The 1920s and 1930s were a time of establishing standards of education and laying the foundation of the profession and its organization. Eleanor Clarke Slagle proposed a 12-month course of training in 1922, and these standards were adopted in 1923. Educational standards were expanded to a total training time of 18 months in 1930 to place the requirements for professional entry on par with those of other professions. The first occupational therapy textbook in the United States was published in 1947, edited by Helen S. Willard and Clare S. Spackman. The profession continued to grow and redefine itself in the 1950s. It also began to assess the potential for trained assistants in an attempt to address the ongoing shortage of qualified therapists, and educational standards for occupational therapy assistants were implemented in 1960.

During the 1960s and 1970s occupational therapy changed and grew as new knowledge was incorporated, and developments in neurobehavioral research led to new treatments, including the sensory integrative approach developed by A. Jean Ayres. As technology has advanced, the profession has also grown and expanded its scope and settings of practice. Occupational science, the study of occupation, was created in 1989 as a tool for providing evidence-based research to support and advance the practice of occupational therapy, as well as to offer a basic science for studying topics surrounding “occupation”.

Mental health can also be improved with the use of occupational therapy. According to the World Health Organization, mental illness is one of the fastest growing forms of disability. OTs focus on prevention and treatment of mental illness in all populations. In the U.S., military personnel and veterans are populations that can benefit from occupational therapy, but this is currently an underserved practice area. Occupational therapy practitioners’ roles have expanded to include political advocacy (from the grassroots to higher legislation); the Patient Protection and Affordable Care Act had a habilitation clause that was passed in large part due to AOTA’s political efforts. Many occupational therapy practitioners have been striving personally and professionally toward concepts of occupational justice and other human rights issues that have both local and global impacts. The World Federation of Occupational Therapists’ Resource Centre has many position statements on occupational therapy’s role in human rights issues at http://www.wfot.org/ResourceCentre.aspx.


Other Holidays and National Days for 27 October are listed below.
• American Beer Day.
• Boxer Shorts Day.
• Cranky Co-Workers Day.
• National Forgiveness Day.
• National Potato Day.
• Navy Day.