Tuesday, June 2, 2009
Emo
History
Origins: 1980s
Emo emerged from the hardcore punk scene of early-1980s Washington, D.C., both as a reaction to the increased violence within the scene and as an extension of the personal politics espoused by Ian MacKaye of Minor Threat, who had turned the focus of the music from the community back towards the individual. Minor Threat fan Guy Picciotto formed Rites of Spring in 1984, breaking free of hardcore's self-imposed boundaries in favor of melodic guitars, varied rhythms, and deeply personal, impassioned lyrics. Many of the band's themes would become familiar tropes in later generations of emo music, including nostalgia, romantic bitterness, and poetic desperation. Their performances became public emotional purges where audience members would sometimes weep. MacKaye became a huge Rites of Spring fan, recording their only album and serving as their roadie on tour, and soon formed a new band of his own called Embrace which explored similar themes of self-searching and emotional release. Similar bands soon followed in connection with the "Revolution Summer" of 1985, a deliberate attempt by members of the Washington, D.C. scene to break from the rigid constraints of hardcore in favor of a renewed spirit of creativity. Bands such as Gray Matter, Beefeater, Fire Party, Dag Nasty, Lunchmeat, and Kingface were connected to this movement.
The exact origins of the term "emo" are uncertain, but date back to at least 1985. According to Andy Greenwald, author of Nothing Feels Good: Punk Rock, Teenagers, and Emo, "The origins of the term 'emo' are shrouded in mystery [...] but it first came into common practice in 1985. If Minor Threat was hardcore, then Rites of Spring, with its altered focus, was emotional hardcore or emocore." Michael Azerrad, author of Our Band Could Be Your Life, also traces the word's origins to this time: "The style was soon dubbed 'emo-core,' a term everyone involved bitterly detested, although the term and the approach thrived for at least another fifteen years, spawning countless bands." MacKaye also traces it to 1985, attributing it to an article in Thrasher magazine referring to Embrace and other Washington, D.C. bands as "emo-core", which he called "the stupidest fucking thing I’ve ever heard in my entire life." Other accounts attribute the term to an audience member at an Embrace show, who yelled that the band was "emocore" as an insult. Others contend that MacKaye coined the term when he used it self-mockingly in a magazine, or that it originated with Rites of Spring. The Oxford English Dictionary, however, dates the earliest usage of "emo-core" to 1992 and "emo" to 1993, with "emo" first appearing in print media in New Musical Express in 1995.
The "emocore" label quickly spread around the Washington, D.C. punk scene and became attached to many of the bands associated with MacKaye's Dischord Records label. Although many of these bands simultaneously rejected the term, it stuck nonetheless. Scene veteran Jenny Toomey has recalled that "The only people who used it at first were the ones that were jealous over how big and fanatical a scene it was. [Rites of Spring] existed well before the term did and they hated it. But there was this weird moment, like when people started calling music 'grunge,' where you were using the term even though you hated it.
The Washington, D.C. emo scene lasted only a few years. By 1986 most of the major bands of the movement—including Rites of Spring, Embrace, Gray Matter, and Beefeater—had broken up. Even so, the ideas and aesthetics originating from the scene spread quickly across the country via a network of homemade zines, vinyl records, and hearsay. According to Greenwald, the Washington, D.C. scene laid the groundwork for all subsequent incarnations of emo:
What had happened in D.C. in the mid-eighties—the shift from anger to action, from extroverted rage to internal turmoil, from an individualized mass to a mass of individuals—was in many ways a test case for the transformation of the national punk scene over the next two decades. The imagery, the power of the music, the way people responded to it, and the way the bands burned out instead of fading away—all have their origins in those first few performances by Rites of Spring. The roots of emo were laid, however unintentionally, by fifty or so people in the nation's capital. And in some ways, it was never as good and surely never as pure again. Certainly, the Washington scene was the only time "emocore" had any consensus definition as a genre. MacKaye and Picciotto, along with Rites of Spring drummer Brendan Canty, went on to form the highly influential Fugazi who, despite sometimes being connected with the term "emo", are not commonly recognized as an emo band.
Source: http://en.wikipedia.org/wiki/Emo
Monday, June 1, 2009
The Story Of Jazz
Recently, as I departed from Philadelphia to Europe, a friend gave me a book to read on the plane, a paperback which evidently had been on her bookshelves for a long time. It had enough wear and tear to suggest it had been read all the way through by my friend, a teacher and singer with a taste for jazz.
On the overnight flight from Philadelphia to Paris, I opened the book with some doubt and hesitation, since what I had read in the way of jazz criticism and scholarship in the past had varied in quality, and was sometimes out and out boring, or, when interesting, had been egotistical, overly casual, factually inaccurate and/or tendentious. (There are notable exceptions, of course, including Count Basie’s fascinating autobiography, Good Morning Blues, written with Albert Murray, and Lewis Porter’s meticulous and powerful biography, John Coltrane: His Life and Music).
However, after reading a few pages of the book my friend gave to me, Marshall Stearns’ The Story Of Jazz, I recognised a document of historical significance. First published by Oxford University Press in 1956, its excellent scholarship is combined with articulate prose that, like jazz itself, tells a deep, rich and absorbing story. The book also offers a snapshot of what jazz looked like and felt like to an avid listener reflecting on its developments up to the early post-bop and cool jazz era of the mid-1950s.
As Stearns' story progresses from the sixteenth century (yes, back that far) to the twentieth, names like Buddy Bolden, Jelly Roll Morton, Bix Beiderbecke, Louis Armstrong and Lester Young pop up, and then Charlie Parker, Stan Kenton, Dizzy Gillespie, Miles Davis, Thelonious Monk and Lennie Tristano. But, alas, no John Coltrane, no Ornette Coleman, no Bill Evans, no Herbie Hancock. In other words, the book was written shortly before these musicians reached the limelight. Yet the writing is so insightful that the author anticipated many developments to come.
For me, reading this book a full fifty years after its first publication was like turning the light on in a room. Suddenly, I knew where everything was located. I’ve been listening to jazz since my college days in the early 1960s, and have written about it for the past six or seven years. My interest is profound. Jazz excites me, and I listen to it all the time. I think about it a lot. What is jazz about? What does it all mean, musically and in terms of the human spirit? How did Lester Young influence Dexter Gordon? And all that.
But until I read this book, I had no idea how it all fit together. Of course, I knew the clichés: that jazz originated in New Orleans with Creole funerals, marching bands, and the hot trumpet of Buddy Bolden. That it migrated to the north, the southwest, and the east and west coasts. That it went through the so-called jazz age, Kansas City style, the swing era, be-bop, cool and so on.
But this didn’t explain how jazz, a unique product of Afro-American history and modern society, truly evolved at its deeper levels of musical expression. Marshall Stearns, an avid listener who collected research data, recordings, and interviews with musicians of all periods, puts together an illuminating picture that involves historical facts and musical analysis, in a common sense style, yet with great attention to detail.
The most important part of this book, in my opinion, is the first half, where Stearns traces the origins of jazz back to the rhythms, inflections, and rituals of West Africa, from whence the slaves of the Americas were mercilessly captured, dehumanized and bewitched through the dissemination of these influences in South America, the West Indies and the American South.
Stearns shows how the diverse musical styles that evolved gradually combined with European music, especially its harmonies, leading to gospel music, marching bands, and work chants and songs. Voodoo dancing, called vodun, based on spirit possession of the dancer, was a very important influence. (In the most intense and expressive jazz performances and recordings of any time period, one can still feel this sense of being possessed. It also influenced the feeling behind the blues, that singular combination of joy and sorrow that almost in itself defines jazz.)
Stearns also discusses the origin and development of blue notes, the flattened third and seventh notes of the diatonic scale. He shows that these notes are actually ranges of tones that can be altered to express emotions. There is no precedent for this in the tempered scale of European music. It is one of the things that makes jazz unique. Stearns also points out that jazz, with roots in West African drumming and Afro-Cuban music, allows for a much greater variation of rhythmic complexities than European music. However, the latter is what has given jazz its complex harmonies, which rose to new dimensions with bebop and post-bop. From the beginning, jazz was in its own way world music, the music of many nations.
The book explains how West Africa influenced jazz through gospel music and African American work songs, as sung in the cotton fields and on prison chain gangs. These gave jazz its various inflections and shifts in voice, as well as the riff, which derives from the call-and-response pattern of gospel and work singing. The author then traces the impact of these forms on boogie-woogie and ragtime, and tells us how these differ from the true jazz idiom, which was only later fully developed by Louis Armstrong.
Stearns goes on to discuss in detail the early big bands, such as that of Fletcher Henderson, the development of swing, and the emergence of bebop and cool jazz. His discussion of the hipster persona of the cool jazz period is quite humorous. Stearns is fair to the newer music of that time, but he seems to prefer hot to cool jazz, so his portrayal of the music of Chet Baker, Gerry Mulligan and others of the West Coast cool school seems negatively biased.
For me, the most important contribution of this profound work is how clearly it shows that jazz was not an out-of-nowhere American invention, but developed from a rich tapestry of music from diverse sources, each of which has a complexity and beauty of its own. One can see then how even the far-out experiments of free jazz, and music like Coltrane’s “Meditations,” which may sound strange to some listeners, actually derive from the early roots that led to jazz as a distinct form. The entire panoply of jazz music begins to fit together and make sense in light of Stearns’ insight into the early sources.
I am sure there are jazz authorities who would question the accuracy of some of Stearns’ research and analysis. The book could also be sharply criticized for its repeated use of the word "Negro" and its lack of clear opposition to the oppression of African Americans.
On the other hand, the book was written before the civil rights movement, when the word was still acceptable to many people of color, and Stearns is certainly critical of social oppression. His stance is consistent with that of Martin Luther King in that he is not blaming or pointing fingers, but simply describing the reality and advocating change. Much to his credit, he is kind, compassionate and balanced even when he adopts the role of critic.
Though it may reveal my lack of knowledge of the literature on jazz music, I am grateful to my friend for turning me on to Stearns’ work. This book definitely bears re-reading and re-consideration by jazz scholars, critics, and historians on the occasion of its fiftieth anniversary - and all serious jazz fans will find it highly readable, fascinating and illuminating.
History and development
Top view: 1 ride cymbal, 3 crash cymbals, 1 splash cymbal, 1 china cymbal, 2 bass drums, 2 mounted toms, 2 floor toms, 1 snare drum, 1 hi-hat, 1 throne
Drum sets were first developed due to financial and space considerations in theaters where drummers were encouraged to cover as many percussion parts as possible. Up until then, drums and cymbals were played separately in military and orchestral music settings. Initially, drummers played the bass and snare drums by hand, then in the 1890s they started experimenting with foot pedals to play the bass drum. William F. Ludwig made the bass drum pedal system workable in 1909, paving the way for the modern drum kit.
By World War I, drum kits were characterized by very large marching bass drums with many percussion items suspended on and around them, and they became a central part of jazz music. At first a kit consisted of only a bass and snare drum, with a hi-hat appearing only occasionally; cymbals and a floor tom were added later, and finally mounted toms completed the set as we know it today. Hi-hat stands appeared around 1926. Metal consoles were developed to hold rack toms, with swing-out stands for snare drums and cymbals. On top of the console was a "contraptions" tray (shortened to "trap") used to hold whistles, klaxons, and cowbells; thus drum kits were dubbed "trap kits."
By the 1930s, Gene Krupa and others popularized streamlined trap kits, leading to a basic four-piece drum set standard: bass, snare, rack tom, and floor tom. In time, legs were fitted to larger floor toms, and "consolettes" were devised to hold smaller tom-toms on the bass drum. In the 1940s, Louie Bellson pioneered the use of two bass drums, the double bass drum kit. With the ascendancy of rock and roll, the role of the drum kit player became more visible, accessible, and visceral. The watershed moment occurred in 1964, when Ringo Starr of The Beatles played his Ludwig kit on American television, an event that motivated legions to take up the drums.
The trend toward bigger drum kits in rock music began in the 1960s and gained momentum in the 1970s. By the 1980s, widely popular drummers like Neil Peart, Billy Cobham, Carl Palmer, Bill Bruford, and Terry Bozzio were using large numbers of drums and cymbals[1] and had also begun using electronic drums. John Bonham of Led Zeppelin also helped revolutionize the drum kit, mastering previously unheard-of beats. Double bass pedals (often used in heavy metal) were developed to play a single bass drum, eliminating the need for a second bass drum. In the 1990s and 2000s, many drummers in popular and indie music reverted to the basic four-piece standard.[2]
In the present, it is not uncommon for drummers to use a variety of auxiliary percussion instruments, found objects, and electronics as part of their "drum" kits. Popular electronics include: electronic sound modules; laptop computers used to activate loops, sequences and samples; metronomes and tempo meters; recording devices; and personal sound reinforcement equipment.
Source: http://en.wikipedia.org/wiki/Drum_kit
Wednesday, May 27, 2009
Hard disk drive
A hard disk drive (often shortened to "hard disk" or "hard drive") is a non-volatile storage device which stores digitally encoded data on rapidly rotating platters with magnetic surfaces. Strictly speaking, "drive" refers to a device distinct from its medium, such as a tape drive and its tape, or a floppy disk drive and its floppy disk. Early HDDs had removable media; however, an HDD today is typically a sealed unit (except for a filtered vent hole to equalize air pressure) with fixed media.
History
Main article: History of hard disk drives
HDDs (introduced in 1956 as data storage for an IBM accounting computer) were originally developed for use with general purpose computers. During the 1990s, the need for large-scale, reliable storage, independent of a particular device, led to the introduction of embedded systems such as RAID arrays, network attached storage (NAS) systems, and storage area network (SAN) systems that provide efficient and reliable access to large volumes of data. In the 21st century, HDD usage expanded into consumer applications such as camcorders, cellphones (e.g. the Nokia N91), digital audio players, digital video players (e.g. the iPod Classic), digital video recorders, personal digital assistants and video game consoles.
Technology
HDDs record data by magnetizing ferromagnetic material directionally, to represent either a 0 or a 1 binary digit. They read the data back by detecting the magnetization of the material. A typical HDD design consists of a spindle which holds one or more flat circular disks called platters, onto which the data are recorded. The platters are made from a non-magnetic material, usually aluminum alloy or glass, and are coated with a thin layer of magnetic material. Older disks used iron(III) oxide as the magnetic material, but current disks use a cobalt-based alloy.[citation needed]
A cross section of the magnetic surface in action. In this case the binary data is encoded using frequency modulation.
The platters are spun at very high speeds. Information is written to a platter as it rotates past devices called read-and-write heads that fly very close to the magnetic surface (within tens of nanometers in new drives). The read-and-write head is used to detect and modify the magnetization of the material immediately under it. There is one head for each magnetic platter surface on the spindle, mounted on a common arm. An actuator arm (or access arm) moves the heads on an arc (roughly radially) across the platters as they spin, allowing each head to access almost the entire surface of its platter. The arm is moved using a voice coil actuator or, in some older designs, a stepper motor.
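Because a head must wait for the spinning platter to bring the desired sector underneath it, this mechanical geometry dominates access time. As a rough back-of-the-envelope sketch (the spindle speed and seek time below are assumed illustrative figures, not taken from this article), the average rotational latency is simply the time for half a revolution:

# Back-of-the-envelope HDD access time. The 7200 RPM spindle speed
# and 9 ms average seek time are assumed, illustrative figures.
rpm = 7200
avg_seek_ms = 9.0

full_rotation_ms = 60_000 / rpm                    # about 8.33 ms per revolution
avg_rotational_latency_ms = full_rotation_ms / 2   # on average, wait half a turn

print(f"Average rotational latency: {avg_rotational_latency_ms:.2f} ms")
print(f"Average access time: {avg_seek_ms + avg_rotational_latency_ms:.2f} ms")

For a 7200 RPM drive this works out to about 4.2 ms of rotational latency on top of the seek time, which is one reason sequential reads are so much faster than random ones.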
The magnetic recording media are CoCrPt-based magnetic thin films about 10-20 nm thick, normally deposited on a glass, ceramic, or metal substrate and covered by a thin carbon layer for protection. The Co-based alloy films are polycrystalline, with grain sizes on the order of 10 nm. Because each grain is so small, the grains are typically single-domain magnets. The media are magnetically hard (coercivity is about 0.3 T), so a stable remanent magnetization can be achieved.
The grain boundaries turn out to be very important. Because the grains are very small and close to each other, the coupling between grains is very strong: when one grain is magnetized, adjacent grains tend to align parallel to it or become demagnetized, degrading both the stability of the data and the signal-to-noise ratio. A clear grain boundary weakens this coupling and so increases the signal-to-noise ratio. Ideally, one grain would store one bit, but current technology cannot yet reach that limit; in practice, a group of about 100 grains is magnetized as one bit. Increasing the data density therefore requires smaller grains.
From a microstructural point of view, longitudinal and perpendicular recording are the same, and similar Co-based thin films are used in both. However, the fabrication processes differ in order to obtain different crystal structures and magnetic properties. In longitudinal recording, the single-domain grains have uniaxial anisotropy with easy axes lying in the film plane; adjacent magnets consequently repel each other, and the resulting magnetostatic energy is so large that it is difficult to increase areal density. In perpendicular recording, the easy axes of the grains are oriented perpendicular to the disk plane, so adjacent magnets attract each other and the magnetostatic energy is much lower, allowing much higher areal densities. Another unique feature of perpendicular recording is a soft magnetic underlayer incorporated into the recording disk. This underlayer conducts the writing magnetic flux so that writing is more efficient, which in turn allows a higher-anisotropy medium film, such as L10-FePt or rare-earth magnets, to be used.
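Those figures permit a rough order-of-magnitude estimate of areal density. The sketch below treats each bit as a square patch of roughly 100 grains of 10 nm each; the square-packing assumption is a simplification for illustration, not a claim from the text:

# Order-of-magnitude areal density implied by the figures above:
# ~10 nm grains, roughly 100 grains magnetized per bit. Square
# packing is assumed purely for illustration.
grain_diameter_nm = 10.0
grains_per_bit = 100

bit_area_nm2 = grains_per_bit * grain_diameter_nm ** 2   # ~1e4 nm^2 per bit
nm_per_inch = 2.54e7
bits_per_sq_inch = nm_per_inch ** 2 / bit_area_nm2

print(f"~{bits_per_sq_inch / 1e9:.0f} Gbit per square inch")  # roughly 65 Gbit/in^2

That order of magnitude (tens of gigabits per square inch) is consistent with drives of the mid-2000s, and shows why shrinking the grains is the direct route to higher capacity.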
Older drives read the data on the platter by sensing the rate of change of the magnetism in the head; these heads had small coils, and worked (in principle) much like magnetic-tape playback heads, although not in contact with the recording surface. As data density increased, read heads using magnetoresistance (MR) came into use; the electrical resistance of the head changed according to the strength of the magnetism from the platter. Later development made use of spintronics; in these heads, the magnetoresistive effect was much greater than in earlier types, and was dubbed "giant" magnetoresistance (GMR). This refers to the degree of effect, not the physical size, of the head — the heads themselves are extremely tiny, and are too small to be seen without a microscope. GMR read heads are now commonplace.[citation needed]
Hard disk heads are kept from contacting the platter surface by a cushion of air dragged along by the spinning platter, moving at or close to the platter speed.[citation needed] The read-and-write head is mounted on a block called a slider, and the surface next to the platter is shaped to keep the head just barely out of contact; in effect, it rides on a type of air bearing.
The magnetic surface of each platter is conceptually divided into many small sub-micrometre-sized magnetic regions, each of which is used to encode a single binary unit of information. In today's HDDs, each of these magnetic regions is composed of a few hundred magnetic grains. Each magnetic region forms a magnetic dipole which generates a highly localized magnetic field nearby. The write head magnetizes a region by generating a strong local magnetic field. Early HDDs used an electromagnet both to generate this field and to read the data by using electromagnetic induction. Later versions of inductive heads included metal-in-gap (MIG) heads and thin-film heads. In today's heads, the read and write elements are separate, but in close proximity, on the head portion of an actuator arm. The read element is typically magneto-resistive while the write element is typically thin-film inductive.
In modern drives, the small size of the magnetic regions creates the danger that their magnetic state might be lost because of thermal effects. To counter this, the platters are coated with two parallel magnetic layers, separated by a three-atom-thick layer of the non-magnetic element ruthenium, and the two layers are magnetized in opposite orientation, thus reinforcing each other. Another technology used to overcome thermal effects and allow greater recording densities is perpendicular recording, first shipped in 2005; as of 2007 the technology was used in many HDDs.[9][10][11]
Modern drives also make extensive use of error-correcting codes (ECCs), particularly Reed–Solomon error correction. These techniques store extra bits for each block of data, determined by mathematical formulas. The extra bits allow many errors to be detected and fixed. While these extra bits take up space on the hard drive, they allow higher recording densities to be employed, resulting in much larger storage capacity for user data.
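A full Reed–Solomon implementation is beyond a short example, but the much simpler Hamming(7,4) code below illustrates the same principle at toy scale: three parity bits protect four data bits, and the decoder can locate and flip any single corrupted bit. This is a minimal sketch of the ECC idea, not the code hard drives actually use:

# Hamming(7,4): 4 data bits + 3 parity bits. Any single flipped bit
# can be located and corrected. (Drives use the stronger Reed-Solomon
# codes; this toy code only demonstrates the redundancy principle.)

def hamming74_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                     # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                     # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                     # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def hamming74_decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 0 = clean; otherwise the bad position
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1              # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]       # recover the data bits

codeword = hamming74_encode([1, 0, 1, 1])
codeword[4] ^= 1                          # simulate a single-bit read error
assert hamming74_decode(codeword) == [1, 0, 1, 1]

The 3 extra bits per 4 data bits here are far more overhead than a real drive pays, but the trade is the same: a little redundancy in exchange for tolerating media defects, which is what makes aggressive recording densities practical.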
Source: http://en.wikipedia.org/wiki/Hard_disk
Execution and storage
Typically, computer programs are stored in non-volatile memory until requested, either directly or indirectly, for execution by the computer user. Upon such a request, the program is loaded into random access memory by a computer program called an operating system, where it can be accessed directly by the central processor. The central processor then executes ("runs") the program, instruction by instruction, until termination. A program in execution is called a process. Termination occurs either through normal self-termination or through a software or hardware error.
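As a concrete illustration of that load-run-terminate cycle, the Python sketch below asks the operating system to load and execute a child program, then inspects its exit status. The child command here is just an illustrative stand-in:

# The OS loads a program from storage, runs it as a process, and the
# process terminates either normally (exit code 0) or with an error.
import subprocess
import sys

result = subprocess.run(
    [sys.executable, "-c", "print('hello from a child process')"],
    capture_output=True, text=True,
)
print("child stdout:", result.stdout.strip())
print("exit code:", result.returncode)    # 0 indicates normal self-termination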
Embedded programs
The microcontroller on the right of this USB flash drive is controlled with embedded firmware.
Some computer programs are embedded into hardware. A stored-program computer requires an initial computer program stored in its read-only memory to boot. The boot process identifies and initializes all aspects of the system, from CPU registers to device controllers to memory contents. Following the initialization process, this initial computer program loads the operating system and sets the program counter to begin normal operations. Independent of the host computer, a hardware device might have embedded firmware to control its operation. Firmware is used when the computer program is rarely or never expected to change, or when the program must not be lost when the power is off.
Manual programming
Switches for manual input on a Data General Nova 3
Computer programs historically were manually input to the central processor via switches. An instruction was represented by a configuration of on/off settings. After setting the configuration, an execute button was pressed. This process was then repeated. Computer programs also historically were manually input via paper tape or punched cards. After the medium was loaded, the starting address was set via switches and the execute button pressed.
Automatic program generation
Generative programming is a style of computer programming that creates source code through generic classes, prototypes, templates, aspects, and code generators to improve programmer productivity. Source code is generated with programming tools such as a template processor or an Integrated Development Environment. The simplest form of source code generator is a macro processor, such as the C preprocessor, which replaces patterns in source code according to relatively simple rules.
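As a toy illustration in that spirit (using Python's standard string.Template rather than the C preprocessor itself; the field names are invented for the example), a generic template plus a simple substitution rule can generate repetitive source code:

# A minimal source-code generator: one template is expanded once per
# field name, much as a macro processor expands patterns by rule.
from string import Template

accessor_template = Template(
    "def get_$field(record):\n"
    "    return record['$field']\n"
)

for field in ("name", "email", "age"):
    print(accessor_template.substitute(field=field))  # emits three functions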
Software engines output source code or markup code that simultaneously becomes the input to another computer process. The analogy is that of one process driving another process, with the computer code being burned as fuel. Application servers are software engines that deliver applications to client computers. For example, a wiki is an application server that allows users to build dynamic content assembled from articles. Wikis generate HTML, CSS, and JavaScript, which are then interpreted by a web browser.
Simultaneous execution
See also: Process (computing) and Multiprocessing
Many operating systems support multitasking, which enables many computer programs to appear to run simultaneously on a single computer. Operating systems may run multiple programs through process scheduling, a software mechanism that switches the CPU among processes frequently so that users can interact with each program while it runs. In hardware, modern multiprocessor computers and computers with multicore processors can run multiple programs at once.
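A minimal sketch of this with Python's standard library: each Process below is a separate OS-level process, and the operating system's scheduler decides how they share one core or spread across several:

# Three workers run as separate OS processes; the scheduler interleaves
# them on one core or runs them in parallel on a multicore machine.
from multiprocessing import Process
import os

def worker(label):
    print(f"{label} running in process {os.getpid()}")

if __name__ == "__main__":
    workers = [Process(target=worker, args=(f"task-{i}",)) for i in range(3)]
    for p in workers:
        p.start()    # ask the OS to create and schedule a new process
    for p in workers:
        p.join()     # wait for each worker to terminate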
Functional categories
Computer programs may be categorized along functional lines. The two main functional categories are system software and application software. System software includes the operating system, which couples the computer's hardware with the application software. The purpose of the operating system is to provide an environment in which application software executes in a convenient and efficient manner. In addition to the operating system, system software includes utility programs that help manage and tune the computer. If a computer program is not system software, then it is application software. Application software includes middleware, which couples the system software with the user interface. Application software also includes utility programs that help users solve application problems, such as the need for sorting.
Source: http://en.wikipedia.org/wiki/Computer_program