Friday, February 25, 2011

If I Had Shoes I Would Walk All Over You.....2

The history of shoes is a surprisingly exciting one; shoes first came on the scene around 1600-1200 BC in Mesopotamia, worn by the mountain people who lived on the border of Iran (inventors.about.com/od/sstinventions/aShoes_2.com). In other countries, such as Egypt and the Middle East, shoes took the shape of a sandal made from palm leaves (factshistoryofshoes.com). Back in Mesopotamia, shoemakers started with soft material that wrapped around the foot to help protect it from damage. There was only one style, with no difference between a right and a left shoe; the soft material suited everyone's feet. In 1850 shoes took their first big step forward with leather, a heavier material wrapped around the soft material to protect the foot while mountain climbing. One good word to describe how this typical shoe looked is "moccasin." Moccasins were worn for daily travel and hunting for food, and it was not until moccasins crossed over to America that Americans found their utopia.

Shoes were first introduced to the United States in 1628 by one of the Mayflower pilgrims, Thomas Beard, who nailed together the first pair of shoes in America (factshistoryofshoes.com). This was a trade Beard took up from the Indians, and it rapidly became popular in America. The book Culture and Technology describes the convenience of technology as forced by Fordism, which speeds things up for the convenience of the material things that surround us. This is exactly what happened in 1858, when Lyman Reed Blake, an American, invented a sewing machine for stitching the soles of shoes. Americans took shoes to another level, using animals not only for food but also for making shoes. Shoes were soon given standard sizes; size thirteen was defined through barleycorns, an English unit of measure (three barleycorns to an inch) attributed to the English king Edward II in 1324. This term was brought to America by the Indians. Materials were being added to shoes almost every year. The book Culture and Technology talks about progress as moving forward, and America found its own evolution in being able to walk around without glass or rocks touching one's feet. By 1900 shoes were being produced in factories all over the world. The average pair of shoes cost eleven dollars to make and could take up to weeks to finish. By this time color, Velcro, and shoestrings also played a major role in making a shoe. A new era began in 1913, when shoes started to get recognized by particular names.
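The barleycorn arithmetic behind that size thirteen is easy to sketch out. As an illustration only (three barleycorns to an inch is the traditional standard, but the "3 × length − 25" offset for adult UK sizes is a common modern convention, not something from Edward II's time), the conversion looks roughly like this:

```python
# Illustrative sketch of barleycorn-based shoe sizing.
# Assumptions: 3 barleycorns = 1 inch (the traditional English standard),
# and the common modern UK convention that adult size = 3 * last length
# in inches - 25. These formulas are a sketch, not historical fact.

BARLEYCORNS_PER_INCH = 3

def length_in_barleycorns(inches: float) -> float:
    """Convert a last (foot-form) length in inches to barleycorns."""
    return inches * BARLEYCORNS_PER_INCH

def uk_adult_size(last_inches: float) -> float:
    """Adult UK shoe size under the common '3L - 25' convention."""
    return BARLEYCORNS_PER_INCH * last_inches - 25

# A 12-inch last measures 36 barleycorns and works out to a UK size 11;
# each additional barleycorn of length adds one full size.
print(length_in_barleycorns(12))  # 36
print(uk_adult_size(12))          # 11
```

The key point the arithmetic makes clear is that a "size" is simply one barleycorn (a third of an inch) of length, which is why whole sizes step in such small increments.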

Friday, February 11, 2011

History of Cosmetic Makeup

“A woman without paint is like food without salt.”- Plautus

Early Uses

The first known uses of makeup appear in Egypt's first dynasty around 3000 B.C. Archaeologists and historians discovered jars of makeup in ancient Egyptian tombs. In particular, the use of kohl appears most prevalent. Kohl, essentially a product of soot, was used to darken the area around the eyes. Both men and women played a part in the history and evolution of makeup in ancient times, as Egyptian men also used kohl to accentuate the eye.

Like Egypt, ancient Roman civilization also developed and consumed cosmetic products and remedies. For example, Plautus, a Roman philosopher living around 200 B.C., famously professed, “A woman without paint is like food without salt.” This statement represents Roman ideals regarding beauty and somewhat accurately predicts the ideology makeup would come to represent globally. Specifically, Romans also used kohl to darken the eyes, making them a predominant feature of the face. In addition to kohl, Romans used chalk to lighten the skin and rouge to redden the cheeks.

Later, during the European Middle Ages, makeup became a sign of affluence. Again, as with the Romans, pale complexions represented beauty. Here, however, pale skin also indicated wealth: the rich did not perform manual labor outdoors, while the lower social classes could not avoid the natural elements in their work. Despite the availability of “lightening” makeup to the affluent, many women achieved this look by literally bleeding themselves. Similarly, Victorian notions of beauty emphasized the genteel appearance of a fair complexion. However, Victorian women approached skin care differently than in previous eras. While excessive makeup was largely discouraged, women drank vinegar and avoided going outdoors to retain their pale complexions. http://www.fashion-era.com/make_up.htm#Victorian%20Delicacy

During the Regency Era in the United Kingdom, the era which bridges the Georgian and Victorian eras, beauty became a central feature of femininity. During this period, beginning in the early 1800s, the concept of lightening the skin became widespread; light skin represented a life of leisure. Like the Middle Ages, this era favored pale complexions; however, new technologies became available. Even though many of the chemicals used in these products were lethal, women still used them. Many of these cosmetics contained white lead and mercury, which could prove fatal. During the 1800s many women used “belladonna,” a poisonous plant, to make their eyes appear more “luminous,” and it continued to sell despite its fatal risk. Although often made by pharmacists in English apothecaries, these products contained mercury and nitric acid, both of which are poisonous. In addition to women’s use of cosmetic makeup, men also adorned their faces with a variety of products; until the mid-1800s, men used makeup widely. For example, King George IV used makeup regularly, as it was a socially acceptable mode of dress. http://www.authorsden.com/visit/viewArticle.asp?id=15438

Twentieth Century

The beginning of the twentieth century marks a critical turning point for the cosmetic industry. Companies began to emerge, resulting in mass production and subsequent mass consumption of cosmetics; Maybelline and Max Factor, for example, introduced products to the consumer market. The 1920s in particular mark the main ideological shift toward makeup and cosmetics: through makeup, women displayed a sense of independence and freedom of expression. Sun-tanning became particularly notable in the 1920s as women began to favor tanner complexions, famously celebrated by Coco Chanel. This trend led to the development of lotions and creams designed to protect the skin from the sun and the natural elements.

In “Cosmetics,” an article published in 1938, H. H. Hazen, an American physician, provides an overview of cosmetic use from both a medical and a social standpoint. Rouge, according to Hazen, consists of talcum, zinc oxide, starch, chalk, dyes, wax, and fat; undesirable elements found in rouge include barium and lead. Lipsticks, made of waxes, fat, and dyes, act to dye the lips. Eyelash enhancers, originally made from copper or lead ore, were still producing fatal consequences as late as 1938, when Dr. Hazen published his article. He notes that mascara consists of lampblack, petrolatum, and a soap component; numerous accounts of blindness and severe poisoning still recurred. http://www.jstor.org/pss/3413149

Theoretically, the inclusion of makeup as a societal norm raises two questions. First, if makeup is in fact a technology, what is it for? That is to say, what is society progressing toward? Secondly, one must ask for whom the technology is intended: who benefits from the technology, and who falls victim to its products, remains critical. Inequalities of power result from the introduction of technologies, affecting seminal patterns of social existence. The evolving technological advancements in cosmetics and makeup represent evolving cultural ideologies, including increasing globalization and ethnocentric outlooks. Technologies like cosmetic makeup ultimately act to overcome the human body.

In today’s society of mass consumption, the cosmetic industry has evolved through cultural influences that precede the appearance of new technologies. For example, cosmetic companies today use clinical tests, emphasizing protection of consumers under the guise of “safety.” Although these products are no longer considered a luxury, the ideal of female beauty rests firmly on the cosmetic makeup industry.

(Enter Witty Title)

The KINECT!!!

It all started on October 18th, 1958, when a physicist debuted his newest invention: an electronic tennis game with two separate controllers connected to an analog computer. Ok, I guess I could fast forward a few years and get to the real inspiration for the Xbox Kinect.

It all started on November 19th, 2006, when Nintendo launched the Wii. The Wii is a gaming console that uses wireless controllers to track your movements so they can be relayed back to the console to move your avatar in the game. This was the first gaming console that actually made people get up off of their butts inside their own homes to play a game. It literally took the world by storm; in the U.S. it broke records for the most consoles sold in a single month. With that being said, it is only natural that its competitors would try to match it or up the ante. In 2007 Don Mattrick, leader of the Xbox business, challenged his Xbox team to expand gaming to a whole new set of customers by getting rid of the controller. A small team was put together to research this possibility. After a short while the team realized that the hardest challenge was going to be creating a device that tracked the user's body as it moved.

Microsoft bought several 3D companies to try to solve this problem. Eventually, with their vast technology, they figured out through mathematics how to let a depth sensor track body parts without getting confused by rapid movements. This was the breakthrough they needed. The next obstacle to tackle was voice recognition. This would have been a very complicated process if Microsoft had started from scratch, but they had been working on voice recognition for nearly a decade.

With all of the technological questions answered, all that was lacking was the first game. They knew that Wii Sports had made the Wii popular, so they needed something similar, and they started working on Kinect Sports. Microsoft got to work, and four years after the first idea they released the Kinect: inspired by the Wii but built with many other enhancements, and without the Wii's biggest flaw, a controller.

But What is The Kinect?

The Kinect itself is a bar that you place below or above your TV, with a motorized pivot that actually tilts up and down to get the best angle. (Every time I start up my Xbox and it moves, it still amazes me.) Inside, the Kinect has three main components: a microphone, an RGB camera, and a depth sensor. These allow the Kinect to do voice recognition, full-body 3D motion capture, and even face recognition. With these three things, it became a new multi-billion-dollar business.




I'm Bug Free with my DDT!

Hey farmer, farmer put away that DDT now,
give me spots on my apples, 
but leave me the birds and the bees please!
"Big Yellow Taxi"- Joni Mitchell






DDT
Dichlorodiphenyltrichloroethane, or DDT as it is commonly known, is an organochlorine (carbon and hydrogen sharing an electron bond with one or more chlorine atoms) used in insect control. It was first synthesized way back in 1874, but it was not until 1940 that its insecticidal properties were discovered by the Swiss chemist Paul Müller. He was awarded the Nobel Prize in Physiology or Medicine, as DDT proved highly proficient at sticking to the shells of arthropods. It was marketed in 1942 as a pesticide that was especially good against the housefly, the louse, the Colorado beetle, and the deadly mosquito.
Reserves of the insecticides used before the war were slowly running low, so the introduction of DDT was welcome. Typhus is spread by the louse, which thrives in crowded wartime conditions, and DDT killed the louse very efficiently. Malaria is spread by the mosquito, and DDT was used throughout the world to quell that epidemic. The Colorado beetle was a menace to crops, and DDT was seen as a probable answer. During these years the chemical was sprayed everywhere: on crops, through neighborhoods (like our modern-day mosquito trucks), and apparently down soldiers' shirt fronts.
DDT has saved many lives; supposedly 25 million soldiers were saved from malaria and typhus thanks to its extensive use in disease-prone areas of the world (click the photo caption for more info). How DDT kills is not really known, but it acts as a sort of nerve poison. What I have been able to gather from my sources is that the pesticide sticks to the exoskeletons of arthropods and eventually makes its way into the body, where it begins to force open sodium ion channels in neurons. Because of these rapidly firing neurons, the insect starts to spasm, and these spasms are a precursor to death.
Things were looking good for this first synthetic pesticide of the industrial era, but its disastrous effects on the environment would soon become known. In 1947 tests were done on insects' resistance to DDT, but the results were not widely published. By the eighties, over 200 insect species had been shown to be resistant to DDT, mainly because of overuse in agriculture.
Not only were insects building up a resistance to the pesticide; it was also revealed that the stuff sits in the soil for two to fifteen years, and in water it can last up to 150 years! It is also said to "bioaccumulate," which means that ingested DDT is not expelled naturally from the body. It sits in the body's fatty stores, and when a person begins to lose weight, the toxin is released and is detrimental to the liver. It is highly toxic to fish, and in turn it can be passed on to birds. Birds' eggshells were found to be thinning, and the American bald eagle almost went extinct, possibly due to DDT.
Even if a place never sprayed DDT, the chemical can spread to it by water currents and wind, so it is a global risk and not a localized one. When DDT was first introduced it was seen as continued human dominion over the animal kingdom, another of our manifest destinies, but it made us kings of perfect crops and quiet fields. DDT was banned in 1972, but the U.S. still produces and exports it to countries that still use it. It is also stockpiled in case of a medical emergency, but one would think, given its track record, that it would cause more damage than it fixes.




The history of the Xbox 360 is a long and amazing trip that has led us to one of the most amazing console systems the world has ever seen. The system was designed to compete with the Sony PlayStation, which was quickly taking over the video game world. Upon seeing this, Bill Gates, the founder of Microsoft, the largest company in the world, chose to jump into the field of console gaming. Gates was very experienced in the gaming world, not through consoles but through the PC (otherwise known as the personal computer). He knew there was a large market in the gaming field, and that with his software background Microsoft could compete with Sony better than any other company, including Japan's Nintendo.

The original Xbox was released in November of 2001 and was revolutionary to the gaming world. When the unit was announced, everyone was shocked at the graphical interface, and the lines at Walmart grew fast. Microsoft realized with this release that the gaming market was worth the heavy investment. After seeing the way the American public flocked to the original Xbox, Gates chose to go ahead and invest four billion dollars into the Xbox 360, the console that would help change gaming as we know it.

Sony was to release its PlayStation 3 around the same time as the Xbox 360. Upon hearing this, Gates pushed to have the 360 released before the PlayStation 3 hit the market. This proved to be a hit as well as a miss for the mogul Microsoft. From a financial standpoint, the release of the Xbox 360 was a massive success in its first year. It was destroying the competition from Sony as well as Nintendo, largely because it was released earlier and Microsoft had mastered the online market of "Xbox Live" long before the Wii (Nintendo) or the Sony PlayStation 3 could. This quick success would come with a price, however.
Microsoft lost several thousand members of the Xbox Live community due to faulty products that were released before all testing was done. This created a huge market of "Red Ring of Death" Xboxes.

This market was SO vast that retailers have made small fortunes just repairing and reselling these millions of broken units. As Microsoft scrambled to get its products back in line, Sony was upgrading and updating all of its hardware. Though Sony had its own hardware issues, they were nothing like what the Xbox market had to deal with. While Xbox was busy doing damage control, Sony was developing games and moving forward. Though this "era" of the Xbox did hurt its productivity, Gates would make many "comebacks." Microsoft would announce many very innovative products, games, and online options that Sony and its Blu-ray couldn't match. The battle still continues to this day over who will run the console gaming market, or who will be leader of the nerds, so to speak. By the end of this blog you will understand why the craze not only for the 360 but for gaming in general is taking over the nation. As you can see below... it is taking over the entire world... Just ask super kitty...

It's PURRRRRRFECT!!!

-Whizzle Out...

Taking It to the Streets: A History of Portable Music

It has been over 50 years since music first went portable. It caught me by surprise, when investigating the history of the iPod, that its predecessors are vast and date back much further than I originally expected. In Van Morrison's song "Brown Eyed Girl," the lyric "Goin' down the old mine with a transistor radio" gives us the first real glimpse of music portability. In March of 1957 the Sony Corporation launched the TR-63 pocketable transistor radio. This audio device was literally small enough to fit into your pocket. It was the first transistor radio to be commercially shipped to the U.S., and it sold for $39.95. The TR-63 was so popular that supply never met demand in the U.S., and it was sold out for virtually its entire lifetime. Mary Bellis of About.com credits the cassette tape as the first real predecessor of the iPod, but I contend that portability in a music device had been very present even from the late 1950s, with the first portable transistor radio.
In 1964 Sony initiated a movement toward "reel-to-reel" technology, which used a magnetic strip to create frequencies that were then amplified by a power source. This was our first glimpse of live recording technology, in the form of what today is called analog. The company worked to perfect this technology, moving from magnetic particles on a paper strip toward digital technology focused on positive and negative charges to create the frequency. While experimentation and research were being done on this technology, much was also being done to make the new cassette tape technology portable. One of the serious drawbacks of the pocketable transistor radio was that headphones could not be used with it, so you had to be in a quiet place to hear what was being broadcast. As cassette tape technology began to skyrocket, so too did sales for what the 1980s coined the "boom box." The boom box was a personalized pocketable transistor radio on steroids. For the first time, DJs at house parties could record their music and take it to the streets for other people to listen to. People wanted the boom boxes louder and louder, and they became a huge trend in urban cultures. One could see that a portable and personal device was much needed, and the commercialized Walkman was born.
In 1979 Sony introduced the Walkman, a lighter, more portable cassette player that ran on only two AA batteries and came with 50-gram headphones, as opposed to 400-gram ones, with the same audio quality as the much larger headphones. As soon as the technology for the portable cassette player hit the commercial market in the U.S., Sony dropped another bomb: the compact disc, which offered soaring audio quality with the use of a laser and a diode, plus a playback time of more than one hour. Less than three years later that technology had been made portable, and in 1984 the D-50, the world's first portable CD player, was made public. Sony, true to form, would try to improve on CD technology by creating the MiniDisc in 1992, but CDs would dominate the market until the introduction of the first solid-state portable mp3 player.
The birth of the mp3 player was only made possible in the wake of the vast explosion of computer technology in the '80s, '90s, and '00s. The mp3 player held a chip inside that could store and save far more data than any tape or CD player could. The mp3 player began to move forward, but it wasn't until Apple released its iPod that sales of mp3 players soared. Apple had tapped into the market and gotten in touch with its target. The iPod had a personalized feel, ease of use, and a screen that made looking for any particular song enjoyable for the user. I believe the iPod pulled ahead of the pack in demand because of its willingness to take this technology to the people, instead of constantly expecting the people to find the technology.

No More Pain: History of Advil (Ibuprofen)



In 1984, Pfizer Inc. introduced a non-steroidal anti-inflammatory drug (ibuprofen) to the United States under the name Advil. Advil is used to reduce fever and relieve pain. Pfizer, the world's largest pharmaceutical company, was founded by Charles Pfizer in 1849. Ibuprofen, a non-steroidal anti-inflammatory drug (NSAID), is a derivative of propionic acid discovered by Dr. Stewart Adams and his colleagues, known as the Boots Group, in the United Kingdom in 1961. The Boots Group had identified carboxylic acid as what gave aspirin its anti-inflammatory property, and they found propionic acid to be twice as strong as aspirin. The Boots Group officially began selling ibuprofen in 1964. In 1969 ibuprofen was first used to treat patients with rheumatoid arthritis in the UK, and it was made available in the US in 1974.

Ibuprofen causes fewer side effects than aspirin and is more effective for dental pain and soft-tissue injury. Rheumatoid arthritis (RA) is a disease that causes chronic inflammation in the joints, body tissue, and organs.

http://www.youtube.com/watch?v=dJTJ9qw1CkA

The Boots Group initially licensed ibuprofen to two pharmaceutical companies. Wyeth Whitehall Laboratories was the first, selling ibuprofen under the name Advil as the United States distributor. Bristol-Myers was the second company licensed, marketing ibuprofen as Nuprin. In 1983, the Food and Drug Administration approved over-the-counter status for ibuprofen in the United States. By 1985 ibuprofen was in high demand: over 100 million people had been treated with it in over 110 countries. The Boots Group held the patent to ibuprofen until 1985; when it expired, a number of new manufacturers marketed their own forms of ibuprofen. Wyeth Whitehall Laboratories understood that Boots could not keep up with the high demand, as the Boots Group was still making all of its products by manual hand operation. Because of this demand, Michael Dryden of Whitehall R&D formed a team to create an automated manufacturing process, and the first production units rolled out in 1987.


By 1988, Wyeth Whitehall Laboratories had built two massive manufacturing facilities, the first in Hammonton, New Jersey and the second in Guayama, Puerto Rico. The two facilities produced approximately 32,400,000 tablets per day. Wyeth closed the Hammonton facility and moved all production to Guayama, Puerto Rico in 2004, making it the only U.S. facility to manufacture Advil. In 2009, Pfizer acquired Wyeth in a cash-and-stock merger, gaining the rights to and control of all Wyeth products, including Advil. Today, Pfizer/Wyeth produces several kinds of Advil pain relievers: Advil PM, Advil Cold and Sinus, Advil Liqui-Gels, Advil Migraine, and Advil Allergy Sinus.

Supermarkets: Productivity or Just a Product?

Shopping for groceries today seems like a mindless chore, but how did we come to select these choices? How did the supermarket come to be a part of our everyday life? By reexamining the history of the technologies within the supermarket, and the supermarket as an assemblage, we can begin to see the ways in which they have shaped our current articulations through consumption practices.




In 1916, Memphis, Tennessee became home to one of the premier "self-service" grocery stores, zanily named Piggly Wiggly, brought there by the innovator Clarence Saunders. The self-service grocery store radically changed the way people obtained their food and other miscellaneous items. Instead of having a mediator or clerk select and gather the goods for the consumer, as in the earlier A&P-style grocery store, consumers had a myriad of goods displayed to choose from for themselves. Consumers were given baskets to select their own items from the shelves, having to traverse every aisle and view each product to pick out the ones they wanted. This gave consumers the affect, or emotion, of having the freedom to choose their own products, thus equating freedom with shopping. But were they free? Mike Freeman, in an article on the opening of the store, said, "Then forced by the construction of the shelves, they turned back to the front entrance. They could only move in one direction inside the Piggly Wiggly. They had to pass every item Clarence Saunders had for sale."




Although the shelf construction seems like an architectural ploy, it can be viewed as a harbinger of modern-day advertising via the arrangement of products on the shelves, forcing the customers to view each and every product.




Turnstile gates, directing traffic flow







The delivery method had also changed. After purchasing the products at the register, customers carried their own groceries home rather than the clerks doing so, which was what most people were accustomed to at the usual grocery stores. The added "so-called" inconveniences of shopping for and delivering their own goods did not deter shoppers from the self-service grocery store. For one thing, the prices were cheaper than at rival grocery stores, and for another, the grocery store became an organized social gathering where consumers could shop relatively quickly and visit with their friends. Convenience is not an inherently bad trait, but as Thomas F. Tierney discusses in our book, "convenience becomes a problem when the value of convenience and the desire to achieve convenience come to dominate technological culture." Taking the modern dictionary definitions of convenience, "being comfortable for use," and comfort, "satisfaction of bodily needs," these two become main attributes of buying and consuming to fulfill those bodily needs.

The newly arranged products spoke for themselves, with labels and price tags on each item, and there was no need for proprietors to hire extra help at the self-service grocery because it was replete with "weights and scales to weigh their own goods." The self-service stores were more efficient, needing only a limited number of employees to stock shelves and ring up customers, allowing for more business and foot traffic with less effort. Efficiency, as our book describes, is another way in which we qualify progress: the more that is produced in less time and with added profit, the more efficient the operation. And for whom? The business owners benefit, but not the workers. On the contrary, the workers will feel alienated, forced into repetitious movements like a machine.






The self-service grocery was a hit, and by 1930 the world's first warehouse supermarket had opened in New York City under the name King Kullen. With this larger-than-life construct came larger-than-life product offerings focused on selling in volume and at lower prices. The supermarket organized its use of space with parking lots, allowing for easy accessibility. It didn't take long before the one-stop shop became recognized as a profitable way to have larger stores that could produce the revenue of multiple smaller stores with fewer employees.

By the '50s and '60s, people had moved to suburban areas where they could conveniently shop and consume at supermarkets, which gave the appearance of a better life. As our book explains, people have a propensity to amalgamate an idea of progress with something new, "a moving forward, with progress as material and moral betterment, a movement towards utopia" (p. 6). These tacit assumptions become part of our everyday experience and breed a culture organized around purchasing new and improved lives.

History of Technology: The Washing Machine

“The story goes like this: Technologies make life better because they make life more convenient; that is, they save time, conquer space, and create comfort. Technologies perform tasks we might otherwise have to perform for ourselves. They relieve us from drudgery, labor, and physical exertion. They make it easier to go more places faster. They minimize the everyday struggles that were commonplace for our ancestors. In all, they make life easier.” (Slack & Wise, p. 28) The washing machine fits the ideal mold of this definition of convenience. Before the first washing machine was invented in 1797, getting clothing clean was a chore considered a hardship. In early history, before the 1700s, women would rinse and dry clothing to rid it of odors they felt were unbearable. Different regions developed their own ways to get soiled clothing back to a clean state. Many crew members at sea would tie their clothes up in sacks and throw them overboard, letting the ocean wash their wares. Later, other practices were developed to get clothing clean, such as using ashes from animals or the chemical called lye. (http://ezinearticles.com/?History-of-Washing-Machines---Who-Invented-the-Washing-Machine?&id=1935786)
A culture’s beliefs have determined the technology of the washing machine. Washing clothes is culture’s retaliation against nature’s perspiration and the world’s dirt. To clean laundry is to fight against all that is considered unclean and to construct the techniques that make this possible. The ideas have progressed from beating laundry against a rock, to the scrub board, to the hand-powered drum machine, and much later to the rotary machine. The rotary machine, which was seen as the model of convenience, has transformed into much more than its first debut. Many of the first washing machines were made of heavy steel and had wooden tubs. (http://ezinearticles.com/?History-of-Washing-Machines---Who-Invented-the-Washing-Machine?&id=1935786)
Like many of the early inventions the culture deemed necessary, these machines predated electricity, so most were operated by hand. People so longed for assistance washing their clothes that even the tub machines proved helpful. Basically, the machines were made up of a stick with four fingers that moved clothes around in a bucket or tub. The alternative was much more laborious, so the early machines fit the mold of helpfulness. (http://www.gizmohighway.com/history/washer.htm)
The washing machine, called the Thor in the beginning, was later made with a galvanized tub. The Hurley Machine Company was the first to put this machine on the market, but many more companies soon followed. (http://inventors.about.com/od/wstartinventions/a/washingmachines.htm) The washing machine advanced with the people of its time: it gained new parts, new shapes, and more features as people demanded more from it. The modifications of the machine have met the standards of the culture and the time around it. By 1937, General Electric had expanded on James King's 1851 version of the washing machine to introduce the first modern washing machine to the world. (http://simplywash.com/washing-machines-history/)
When looking at the invention of the washing machine, and all its progression to better serve its people, it is valuable to revisit the idea of convenience. In her book on household technology, Ruth Schwartz Cowan assesses household conveniences and their relationship to the nature of work in the home. Cowan argues that while household technologies such as the washing machine do indeed reduce the drudgery of washing clothes, they do not take away the labor of the task. The shift, Cowan argues, lies in the requirement of a full-time housewife in the household to operate these technologies. At the same time, rising standards of cleanliness and health become a newfound problem alongside the rise of the household technologies. (Slack & Wise, p. 34-35) “However, the network of connections that constitute this technological system do not, in the end, reduce labor and save time; instead, the network of connections is part of a shifting burden in which the demands to collapse time and space become, in a sense, an inconvenience.” (Slack & Wise, p. 35) Cowan’s view can be summed up this way: once there is a washing machine, one must now wash clothes more often.
Ironically, in light of Cowan’s argument, it was an inventive husband who produced the idea of a washing machine for his wife. His device was simple, yet it would later prove profound. He called this machine a gift for her. (http://www.ehow.com/facts_5031534_history-automatic-washing-machine.html)
Commercial showing the different faces of the washing machine.

A Sweep Through History...Or...a Suction??

Imagine if this were the only thing you had to clean your home with. According to the broomshop, before 1797 brooms in America were homemade by hand. Brooms were used to sweep castles and caves, and even to clean ashes from fireplaces. Brooms have evolved tremendously in this ever-changing world, but the world around us progresses a lot faster. People evolved from going barefoot to wearing shoes. We evolved from living in caves to larger homes with more living space and more bedrooms, essentially leaving more space to clean.

As time went on, technology advanced. We no longer had to walk or ride horseback because we invented cars. Technological advancements also occurred within the home: the light bulb, mops, feather dusters, and carpet. The carpet industry began in America in the late 1700s, when William Sprague opened the first woven carpet mill in Philadelphia. Along with carpets came carpet beaters, sweepers, and other technologies to clean them. In an ever-changing world, we needed something more convenient to clean our larger, now-carpeted homes. According to our textbook, technology cannot be explained by progress alone but by convenience as well; the two go hand in hand. Around 1868, the first technologies that led to the development of the vacuum cleaner were underway.


Ives W. McGaffey is responsible for some of the first vacuum cleaner technologies. His first idea was a manually powered machine that needed to be cranked. This design was very inconvenient because the operator had to push and crank at the same time. Over the years several inventors, such as John Thurman and Melville Bissell, developed vacuum cleaners, but Hubert Cecil Booth was a trailblazer in the development of the electric vacuum cleaner.

Booth came up with the idea for his vacuum cleaner after watching a railway carriage being cleaned with compressed air. Booth's vacuum consisted of a tube connected to an air pump; the other end of the tube had a nozzle, which was pushed over the surface being cleaned. Because the machine was so large, it took two people to operate it. At the time most homes did not have electricity, and many people could not afford to buy a vacuum, so Booth started a cleaning service instead. Because of its size, this vacuum was treated more like a car: it was powered by an engine and carried on a four-wheel horse-drawn carriage, with long tubes stretched through the windows to clean the homes.
As vacuum cleaners evolved, they became smaller and more convenient. The upright vacuum cleaner came along in the early 1900s, when an asthmatic man by the name of Murray Spangler designed a dust-collecting machine from a broom handle, an electric motor with a rotating brush, and a pillowcase to catch the dust. Hoover, now a leading name in home appliances, caught wind of Spangler's idea and made it a success.
Today vacuum cleaners come in all sizes, from handheld models to extremely compact designs. They are convenient and cut the time it takes to clean a home or office in half. They have hoses and can function in several different ways; you can even find self-operating floor cleaners! In a steadily evolving world, the vacuum cleaner has come a long way, and yet it has even further to go.















Thursday, February 10, 2011

Infuse Bone Graft rhBMP-2: The new future of bone growth




Infuse Bone Graft rhBMP-2 is the latest in bone fusion technology and represents a revolutionary new approach to spinal fusion surgery. Infuse bone graft contains a genetically engineered version of a protein that occurs naturally. This protein has been isolated in the laboratory and then purified and reproduced using recombinant DNA technology. The resulting recombinant human protein is known as rhBMP-2, and when combined with an absorbable collagen sponge, is marketed by Medtronic Sofamor Danek under the trade name Infuse Bone Graft.


More than 40 years ago, orthopedic surgeons determined that the proteins required for bone to heal, or regenerate, are contained within the bone itself. So the question is: if our bones already regenerate themselves, why not just use an autograft and leave it be? For the reader's information, an autograft is the transplantation of organs, tissues, or even proteins from one part of the body to another in the same individual. For many years that is exactly what many neurosurgeons did, until 1979, when Dr. Marshall Urist, a professor in the Department of Orthopedic Surgery at the University of California at Los Angeles School of Medicine, coined the term “bone morphogenetic protein” (BMP) to describe these proteins. Dr. Urist and other scientists found that for BMP to work, they would have to isolate one protein (BMP-2) from bone tissue and use recombinant DNA technology to create genetically engineered cells; the result was called recombinant human BMP-2 (rhBMP-2). This was brilliant, because the engineered cells could now produce pure BMP-2 protein, which is exactly what our bones need in order to grow.

This new quasi-synthetic product, Infuse Bone Graft rhBMP-2, became the “gold standard” (meaning it is equal to or better than autograft). Infuse became the only product ever shown to be equal to or better than the patient's own bone at remodeling itself; in fact, it is one hundred times stronger than your own bone at doing this. Now people who develop degenerative disc disease (DDD) have a more comfortable way of dealing with their pain. Patients no longer have to endure two separate surgeries, one to harvest bone from the hip (the autograft) and a second to place the bone in the spine. It can be done with one simple surgery.



Would someone call this technology genius or revolutionary? It has received the Prix Galien award for the past two years. The Prix Galien is an award created to promote advances in pharmaceutical research, so one might say it is genius. How about revolutionary? Chapter three of our text, Culture + Technology, discusses two hypotheses that Langdon Winner used to explain technological determinism. The second hypothesis is “the belief that technological change is the single most important source of change in society.” In other words, the development of Infuse is revolutionary to a technological determinist. I guess the real answer to the question, though, lies beneath the blade of the surgeon.