Tuesday, October 27, 2009

Canola Oil, the "Better" Oil

Beginning in the early 1990s, canola oil became the poster product of what was known as the low-fat food craze.

Canola oil, its name a contraction of the words Canada, oil, low and acid, comes from the rapeseed plant. Rapeseed oil was originally considered inedible due to its high erucic acid content and was used mainly in lamps and for lubrication, but the plant was eventually bred to the point where the erucic acid was reduced to a very low, non-harmful level.

Beginning in the 1980s and culminating in the early 1990s, the health benefits of canola oil became widely publicized, and the product proved very popular at a time when losing weight and staying in shape were key concerns for consumers. Canola oil boasts very low saturated fat and high monounsaturated fat content, as well as omega-3 fatty acids, which promote heart health.
Conversely, some believe canola oil to be harmful to the heart rather than helpful, on the grounds that its remaining erucic acid content damages heart tissue. However, these allegations have been discredited by numerous health institutions and studies.

In 1995 the first canola crop genetically engineered to be herbicide-resistant was introduced, followed in 1998 by canola genetically engineered to be disease- and drought-resistant. This has caused controversy in some parts of the world for various ethical and economic reasons.

Regardless of such controversy, canola oil remains one of the biggest-selling oils today. In 2000, it was the world's third-leading source of vegetable oil. Canola is now among the most widely grown genetically modified crops, with roughly 80% of the crop genetically modified. These numbers cement canola oil's status as a staple of the health-conscious food market.


Posted By: Rob Goodman

Sources:

Barthet, V. (2009). Canola. The Canadian Encyclopedia. Retrieved October 25, 2009 from
http://www.thecanadianencyclopedia.com/index.cfm?PgNm=TCE&Params=A1ARTA0001356

No Author. (2007). What is Canola? A problem with weeds—the canola story. Retrieved October 25, 2009 from http://www.biotechnologyonline.gov.au/foodag/weeds.html

No Author. (2007). Canola Facts: Why Growers Choose GM Canola. Canola Council of Canada. Retrieved October 25, 2009 from http://www.canola-council.org/facts_gmo.aspx.

Tamper-Evident Packaging Saves the 80's

Tamper-evident packaging uses a closure or seal that, once opened, shows clear, visible evidence of having been breached.

The need for and importance of such packaging came to the forefront in the early 1980s, when certain groups, whether for blackmail or for unknowable reasons, claimed they had laced jars of baby food with poisonous substances. Because there was no way of knowing which jars had been tampered with, entire supplies had to be thrown out. Companies responded by making jars whose lids include a safety button that stays depressed while the vacuum seal is intact and pops up with an audible click when the jar is first opened, indicating the package had been untouched since bottling. Actions such as this were vital, since product tampering became not only an issue of social importance but of economic significance as well: companies suffered losses both from discarded stock and from their tarnished image.

One brand that received massive backlash was Tylenol. In 1982, tampering with the over-the-counter product led to the deaths of seven people, when authorities discovered that the Tylenol each victim had taken was laced with potassium cyanide. The case came to be known as the TYMURS (a blend of Tylenol and murders), a nationwide recall of Tylenol bottles was issued, and Tylenol's market share fell from 35% to 8%.

Incidents such as this pushed the Food and Drug Administration (FDA) to create stricter tamper-evident packaging laws, and Johnson & Johnson added triple-layer tamper-evident packaging to Tylenol products to help salvage the brand's image. The strategy worked, and ever since the incident the FDA has progressively added regulations to protect marketable products, creating demand for ever-better tamper-evident closures and seals.

Posted By: Rob Goodman

Sources:

Wolnik, K. A., Fricke, F. L., Bonnin, E., Gaston, C. M., & Satzger, R. D. (March 1984). The Tylenol tampering incident - tracing the source. Analytical Chemistry. 56 (3), pp. 466A–470A.

Image retrieved from
http://www.medidose.com/15mlpolyethylenevial-1.aspx

Monday, October 26, 2009

Architecture Goes Green

A new trend in 21st-century architecture is "green architecture," born of this century's "green movement" and society's growing push to be environmentally friendly. Green building is environmentally friendly, economically conscious, energy-saving, and cost-effective. It focuses on using the earth's resources efficiently while reducing a building's impact on the environment around it.

Older trends have been coupled with new technology to create home environments that are environmentally friendly and green while still upholding the popular trends of the home industry.

There are many aspects to these new green buildings. New cooling and heating systems have been developed to be more efficient and eco-friendly. Daylight is key to these buildings: new "smart" windows have been invented to control daylight, with glass that turns from clear to opaque for control of light and privacy.

These buildings use all of the latest technology, from voice-activated controllers for every aspect of the house to exterior weather sensors that control the opening and closing of windows. These houses are essentially computer controlled, cutting down on the resources older housing methods require.
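To make the idea concrete, here is a toy sketch in Python of the kind of rule a weather-sensor-driven home might follow when deciding whether to open its windows. Every name, threshold, and sensor reading below is hypothetical; real home-automation systems run on dedicated hardware with far richer logic.

```python
# Hypothetical sketch of weather-driven window automation.
# Thresholds and sensor values are invented for illustration.

def decide_window_state(rain_detected: bool, wind_kmh: float,
                        indoor_c: float, outdoor_c: float) -> str:
    """Return 'open' or 'closed' from simple comfort/safety rules."""
    if rain_detected or wind_kmh > 40:
        return "closed"          # protect the interior first
    if indoor_c > 24 and outdoor_c < indoor_c:
        return "open"            # free cooling instead of running AC
    return "closed"

# Example: a warm afternoon with a mild breeze and no rain.
print(decide_window_state(rain_detected=False, wind_kmh=12,
                          indoor_c=26, outdoor_c=19))  # -> "open"
```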

Another aspect of this new architecture is the shift towards eco-friendly resources made from natural substances in contrast to the usual completely man-made and chemically created resources. This is coupled with wind and solar energy which will essentially be running every aspect of the house.

New advancements and innovations are still being refined and discovered, and much is to be expected in the years to come as researchers and scientists develop even newer and better ways to create green buildings.

Posted By: Kylie Graham


Sources:

Green Architecture in the 21st Century (2009). Retrieved October 26, 2009 from
http://www.reddawn.com/featart11.html

Image Retrieved October 26, 2009 from
kodokloncatz.co.cc/anggitadwiyani/?p=23

Air Travel Like Never Before

Innovations in the airline industry in the 21st century have focused on cutting costs and making the travel experience more convenient and personalized for the client. The economic downturn this century, in addition to the terrorist attacks of September 11, 2001, has greatly impacted the airline industry. With people unable to spend as much on travel, and with flying itself less desirable, the industry has been pushed to rethink how to attract customers in this new setting. Airlines face increasing fuel costs, government regulatory and security issues, major changes in the competitive architecture of the industry, and shifts in consumer travel behavior.

The industry has moved from focusing on the "process" of traveling, such as reservations, ticketing, and payment management, to focusing on "the people." Airlines now cater to the passenger, responding to huge shifts in the mood of society and the world around them.

The first attempt at creating an easier and more convenient traveling experience began with JetBlue in 2001. Even newer developments have taken form with Design Q, a British design company that in 2009 proposed an airplane seating layout in which the seats are staggered and face each other, running lengthwise down the plane's interior. The design is a cost-cutting option for airlines that wish to fit more seats while creating a more efficient flying experience.

This design has been created for shorter, more frequently used flights in which the flight cost can be cut dramatically while still increasing the number of passengers.

Airlines are facing huge pressures and are attempting to combat many new issues raised in this century, resulting in a shift in focus across the industry.


Posted By: Kylie Graham


Sources:

Chaudhuri, Saabira. "The Future of Air Travel." Fast Company (2007). September 2007. Retrieved October 25, 2009 from
http://www.fastcompany.com/articles/2007/09/buckman.html

Active Packaging: It's Alive!

For the past few decades, scientists have been working to counteract the fact that foods are living, biological substances that age and eventually die. The idea of preserving food and extending its shelf life has been revisited many times over, and only now, with the recent introduction of "active packaging," has the technology matured enough to address the issue.

The technology began with controlled and modified storage of food and perishables: storage rooms, transport containers, and retail food packages with a controlled atmosphere. A "controlled atmosphere" refers to monitoring and adjusting the gaseous composition surrounding the food.

Active packaging is made of packaging material that can control the environment the food resides in by interacting beneficially with the gases the food gives off. Gases are added to or removed from the atmosphere of the package depending on its contents. This preserves food and perishables and extends their shelf life.
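As a rough illustration of the concept, the toy model below simulates one common active-packaging element: an oxygen scavenger that steadily removes O2 from a package's headspace. The starting concentration matches ordinary air, but the removal rate is invented purely for illustration; real scavenger kinetics are more complex.

```python
# Toy model of an oxygen-scavenging package insert drawing down
# headspace O2. First-order decay; the rate is illustrative only.

o2_percent = 20.9        # ordinary air at the moment of sealing
hourly_retention = 0.85  # hypothetical: 15% of remaining O2 removed/hour

for hour in range(0, 49, 12):
    print(f"hour {hour:2d}: O2 = {o2_percent:6.3f}%")
    o2_percent *= hourly_retention ** 12   # advance 12 hours of decay
```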

Reportedly, active-packaging technology was used in 2.7 billion packages in 2001 in the U.S. alone. In 2002, it was expected to reach 7 billion packages by 2006, and in 2007, demand was projected to climb 13% by 2011. If these numbers hold, active packaging is revolutionizing the packaging industry as we know it.

Posted by: Kylie Graham

Sources:
CSIRO Food and Nutritional Sciences (2009). Retrieved from
http://www.foodscience.afisc.csiro.au/actpac.htm

No Author. (2007). U.S. Demand for Active Packaging on the Rise. Label and Narrow Web. Retrieved from
http://www.highbeam.com/doc/1G1-171029968.html


Image retrieved October 26, 2009 from http://www.acpk.com/

Surgery...from a Robot

The 21st century ushered in a new era that has revolutionized the world of medicine and surgery. It began with robotic surgeries, performed by robots while a surgeon controlled them from a console across the room.

Not only is the tremor of the human hand erased by the steady ease of the machine, but surgeons also tire less easily because they can remain seated while operating. The robots allow surgeons to perform complex maneuvers in awkward, hard-to-reach places, allowing smaller margins of error and better precision.
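That tremor suppression is, at its core, a filtering problem: sample the surgeon's hand position, smooth away the high-frequency jitter, and pass only the smoothed motion to the instrument. The Python sketch below shows the idea with a simple exponential moving average; actual surgical robots use far more sophisticated filtering and motion scaling, and all the numbers here are made up.

```python
# Sketch: damping hand tremor with an exponential moving average.
# Real surgical robots use far more sophisticated filters.
import random

def smooth(samples, alpha=0.1):
    """Low-pass filter: keep slow intentional motion, damp fast jitter."""
    out, state = [], samples[0]
    for x in samples:
        state = alpha * x + (1 - alpha) * state
        out.append(state)
    return out

# Simulated input: a surgeon holding the instrument steady at 0 mm,
# with +/-0.2 mm of random tremor on top.
hand = [random.uniform(-0.2, 0.2) for _ in range(200)]
instrument = smooth(hand)
print(f"raw tremor span:      {max(hand) - min(hand):.3f} mm")
print(f"filtered tremor span: {max(instrument) - min(instrument):.3f} mm")
```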

Advancements in robotic surgery have led to telesurgery, in which operations can be performed from across the world. The first was carried out in 2001 by a doctor physically in New York who removed the gallbladder of a woman in Strasbourg, France. This development will revolutionize the world as we know it.

Since then, several prototypes have been put to work in the operating room, such as "Penelope," which assisted in its first human surgery in 2005. Patients can be shielded from harmful pathogens by having surgeries in which only robots are present in the room, and patients can be operated on from across the world, allowing them to recover close to home.

Posted by: Kylie Graham

Sources:
Military.com (2009). Retrieved October 21, 2009 from
http://www.military.com/features/0,15240,97023,00.html

Encyclopedia of Surgery (2009). Retrieved October 21, 2009 from http://www.surgeryencyclopedia.com/St-Wr/Telesurgery.html

Image retrieved October 26, 2009 from http://www.vpr.net/news_detail/79286/

Sunday, October 25, 2009

High-Definition Television Available Everywhere

The 1970s marked the first stirrings of HDTV, or High-Definition Television.

First developed in Japan as a way to sell more TVs by improving picture quality, it soon gained popularity. The technology hit the market with the first HDTV system, dubbed MUSE, which was meant to offer consumers the best picture and sound quality ever available.

1989 was an important year for the new technology, as Japan became the first country to broadcast HDTV regularly. In the U.S., however, HDTV was first met with resistance. Terrestrial television broadcasters opposed such broadcasts because they feared being pushed out of the HDTV market: HDTV requires higher bandwidth than standard TV, and the channels they controlled could not handle such an increase. The United States Congress was also opposed, essentially objecting to the U.S. adopting another technology owned by another country. As time went on, the technology improved and the opposition softened, and in 1995 the U.S. Federal Communications Commission set the standard for completely digital HDTV.
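The bandwidth concern is easy to appreciate with rough numbers. Assuming a typical 1920×1080 HD frame against a 720×480 standard-definition frame (common figures, though the post does not specify formats), the raw pixel count alone grows about sixfold:

```python
# Back-of-the-envelope: why HDTV needs more bandwidth than SDTV.
# Resolutions are typical values assumed for illustration.
hd_pixels = 1920 * 1080    # common HD frame
sd_pixels = 720 * 480      # NTSC-like SD frame

print(f"HD frame: {hd_pixels:,} pixels")
print(f"SD frame: {sd_pixels:,} pixels")
print(f"ratio: {hd_pixels / sd_pixels:.1f}x more raw data per frame")
# -> roughly 6x, before audio, color depth, or compression differences
```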

In 2006 the FCC mandated that all channels would have the capacity to broadcast HDTV. Now such technology is standard in the U.S.

Posted By: Jessica Lantz



Sources:

No Author. (2009). HDTV History. In RepairHome. Retrieved October 25, 2009, from http://www.repair-home.com/HDTV_History.html.

Image Retrieved October 25, 2009 from

Ford's "Kinetic" Design

In 2004, Ford Motor Company began to experiment with a new design concept that became known as “kinetic design.”


The main principle of kinetic design is to give the car lines that make it feel "athletic, dynamic and, in short, make it look like it's moving even when it's standing still." Kinetic is literally defined as "energy in motion," and Ford very much wanted its cars to look as if they were moving forward and in a constant state of evolution. Martin Smith, Ford of Europe's executive design director, stated that Europeans wanted a Ford that showed emotion and a sense of refinement, rather than something that looked lifeless and cartoonish.


With this in mind, designers formulated the SAV in 2005, the first concept car to display the kinetic design aesthetic. Since then they've added kinetic features to most of their European models, such as the Ka and the Focus.


At this point, the question Ford faces most is whether kinetic design will translate to American car-buyers. In 2005, Peter Horbury, the Executive Director of Ford North American Design, stated that "kinetic design DNA doesn't mean the same thing to buyers in this market, so it won't resonate the same way." Horbury further acknowledged that while Americans come from varied cultures and hold different values, they tend to share the same universal outlook on life: they don't enjoy the quiet, but rather prefer directness and action. Therefore, he believes Americans would prefer a bolder, more structured design, something loud and flashy rather than sleek and refined.


Whether Americans will embrace or reject kinetic design remains to be seen, since U.S. audiences got their first look only recently, at the 2008 Detroit Auto Show, at the Ford Verve, a kinetic-designed sedan planned to reach market in Europe and Asia by 2009 and in America by 2010.


Ford currently believes that Americans are looking for “technology, design and fuel efficiency.” The smaller, economical, European kinetic design is their answer.



Posted By: Rob Goodman

Sources:

No Author. (2009). Ford Kinetic Design. Ford of Britain. Retrieved from
http://www.ford.co.uk/Cars/KineticDesign/mdp=v1204941954768

Abuelsamid, Sam. (2008) Detroit 2008: Ford Verve Sedan concept brings "Kinetic design" to the U.S. for the first time. Auto Blog Green. Retrieved from http://green.autoblog.com/2008/01/13/detroit-2008-ford-verve-sedan-concept/#.

Image retrieved from http://green.autoblog.com/2008/01/13/detroit-2008-ford-verve-sedan-concept/#.

CD: The Compact Disc

James Russell first had the idea for optical disc storage in 1965 while working for Pacific Northwest National Laboratory.

He called his invention Optical Digital Recording (ODR); it was originally made to store information digitally so a laser could read it back. By 1980 he had made the first disc player and was given the R&D 100 Award for his work. However, no company was interested in the technology until Sony and Philips licensed a version of ODR for audio, which they called the Compact Disc (CD).

The CD first became commercially available in 1982, closely followed by the CD-ROM for data in 1985. Although newer formats for audio and data storage are on the market, both the CD and CD-ROM are still widely used. The two technologies paved the way for such inventions as the MP3, DVD, and Blu-ray. The invention of the MP3, however, begged the question: will CDs go extinct? So far, the answer is no.


Posted By: Jessica Lantz


Source:

No Author. (July 28, 2000). Who Invented the CD and the CD-ROM? In Scrupuli: Essays with Sharp Points. Retrieved October 25, 2009, from http://metaed.blogspot.com/2000/07/who-invented-cd-and-cd-rom.html.

Image Retrieved October 25, 2009 from http://govia.osef.org/.

Guggenheim Museum: "The Building of the Century"


In the center of Bilbao, Spain, the Guggenheim Museum opened in 1997, gaining more publicity in the year it opened than any other building that year. The extreme exposure had much to do with the incredibly abstract, contemporary, and remarkable design of the building, which did not conform to the traditional architecture standards of the time.

The building's designer, Frank Gehry, began to experiment with his unique style in the late 1970s, using "non-traditional forms and using found objects to create collage-like models." His style has come to be characterized as an example of "Deconstructivism," a development of postmodern architecture that borrows greatly from the idea of fragmentation and is defined by:

-An unfinished or broken-looking appearance
-A combination of formal composition standards with a disjointed aesthetic
-The avoidance of rectangular forms and surfaces to create a distorted look
-Shapes and forms grouped incongruously together

The building's lack of uniformity raised its profile tremendously worldwide. Because the building is located in the Basque region of Spain, long recognized as a region of political strife, it gave a sense of pride and calm to the people of the region; in 2002, tourism in Bilbao reportedly increased five-fold, with 80% of all tourists coming mainly to see the museum.

The ultimate success of the Guggenheim Museum was due mostly to the boldness of Gehry's design in taking standards and completely turning them on their head. It proved people's readiness to embrace something so different, and ultimately brought on a sense of change. Shortly after its opening, Time Magazine declared the Guggenheim Museum "The Building of the Century."

Posted By: Rob Goodman

Sources:

Friedman, Mildred ed. (1999). Gehry Talks: Architecture + Process. New York, NY: Rizzoli International.

Image retrieved from
http://www.worldenough.net/picture/English/lab/Lab_street/3.cypriot_abroad.htm

Art Goes Online

With the rise of the internet in the early 1990s, information became more and more accessible online and the possibilities seemed endless. The same reasoning carried over to the art world, especially in the 1994-95 art seasons, when art that would likely never be seen outside a gallery in one locale could suddenly be viewed by anyone in the world, at any time, through online galleries. The concept is not limited to individual works of art; it extends to museums and virtual tours as well.

One primary benefit of presenting art online is that an unknown artist can gain exposure they might otherwise have had a hard time finding. The downside is the quality of the image online compared with seeing a piece in person. In a way, it undermines the physical viewing experience, and it can even elicit emotions the artist never intended the work to bring forth.

Overall, the availability of art online is just another prime example of the fluidity of information the internet has instigated since its inception. It also gives rise to the notion that art can be made strictly for viewing on the internet, and that the internet can be a great tool for art education.

Posted By: Rob Goodman

Sources:

Atkins, Robert. (1995). The Art World and I Go Online. Art in America. 83 (12), pp. 58-59, 63.

Image retrieved from
http://seattlejew.blogspot.com/2007_11_01_archive.html.

1980 Hybrid Concept Car

In 1980, the Briggs & Stratton Corp. introduced its plug-in hybrid concept car, stating, "we are all seeing our personal mobility threatened by rising petroleum prices and dwindling resources.

The fundamental appeal of electric cars is that they allow us to use energy sources other than petroleum on the road." The B&S car used a parallel hybrid system that allowed the owner to choose between gas, electric power, or both, a concept that has been borrowed by the hybrid cars of today.
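The defining trait of a parallel hybrid is that either power source, or both together, can drive the wheels. The Python sketch below illustrates that mode-selection idea in miniature; the function, thresholds, and logic are hypothetical and are not drawn from B&S's actual control system.

```python
# Hypothetical mode selector for a parallel hybrid drivetrain.
# The B&S car let the driver choose; thresholds here are invented.

def select_mode(demand_kw: float, battery_pct: float) -> str:
    """Pick a drive mode from power demand and battery charge."""
    if battery_pct > 30 and demand_kw <= 15:
        return "electric"              # light load: run on batteries
    if demand_kw > 15 and battery_pct > 10:
        return "gas + electric"        # heavy load: combine both sources
    return "gas"                       # fall back to the engine alone

print(select_mode(demand_kw=10, battery_pct=80))   # electric
print(select_mode(demand_kw=40, battery_pct=50))   # gas + electric
print(select_mode(demand_kw=20, battery_pct=5))    # gas
```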

The 1980 March edition of Motor Trend Magazine found "the car's plug-in, electric-only range was between 30 and 60 miles, enough for many people to commute to and from work without using the gas motor. In hybrid mode, the car's range was about 200 miles."

The car looked strange, though, due to its six-wheeled design: the weight of the batteries required extra support, so two extra wheels were added to address the problem. The car was also built with little consideration for safety; the batteries were so heavy they could come hurtling toward the driver in the event of a crash. The B&S concept car had other problems as well: it was disturbingly loud due to its internal mechanical processes.

Still, the B&S concept car was an important project for the hybrid car industry. Its design concepts were improved upon to make more efficient hybrid models. Although the "green" movement is a prominent facet of today's culture, its stirrings can be seen in projects like this.

Posted By: Jessica Lantz


Sources:

Carney, Dan. (June 1, 2007). Lawnmower-Engine Maker Once Had the Lead in Hybrids. In The Wall Street Journal Online. Retrieved October 25, 2009, from
http://online.wsj.com/public/article_print/SB117932740689804953.html.

Image Retrieved October 25, 2009 from
http://auto.howstuffworks.com/1980-briggs-and-stratton-hybrid-concept-car.htm

Ford's "New Edge" Design

Starting in the mid-1990s and continuing into the first half of the 2000s, the Ford Motor Company, looking to create something new and exciting that would also suit its production needs, developed an innovative styling technique known as "new edge."

The name came from the use of sharp angled lines, arcs, and curves on a vehicle's body surface that collide with one another to create highlights and shadows. Not only was the technique aesthetically pleasing, but because the body was assembled piece by piece to create those lines, it was also easier and more cost-effective to produce. Ford initially used the styling in several concept cars that never fully came to fruition, such as the GT90 in 1995. The design was first publicly featured and produced in Europe on models such as the Ford Ka in 1996 and, most notably, the Ford Focus in 1998. In 1999, the fourth-generation Ford Mustang became the first North American vehicle to feature the new edge design, followed shortly by the introduction of the Ford Focus into the North American market in 2000.

Although other popular Ford vehicles followed suit with particular new edge design elements, Ford has yet to produce another fully new edge vehicle since switching gears to the newer "kinetic" styling technique.


Posted By: Rob Goodman

Sources:

Krebs, Michelle. (1997). New Designer To Take a Seat at Ford’s Drawing Board. The New York Times. Retrieved from
http://www.nytimes.com/1997/09/28/automobiles/new-designer-to-take-a-seat-at-ford-s-drawing-board.html.

Winter, Drew. (1998). Living on the Edge: New Ford styling trend isn’t too tough, stampers say. Ward’s Auto World. Retrieved from http://wardsautoworld.com/ar/auto_living_edge_new/.

Image retrieved from http://www.carprices.co.uk/models/Ford/fordfocus

DVD Technology

DVD, which alternately stands for Digital Video Disc or Digital Versatile Disc, was introduced in 1995 as a successor to the Video Home System (VHS) and, in some respects, the Compact Disc (CD). In 1993, two competing formats, the Multimedia Compact Disc (MMCD) and the Super Density (SD) disc, were being positioned against each other to succeed VHS. With companies such as Sony and Toshiba split over which format to support and produce, a war similar to the VHS-Betamax battle of years prior was beginning to develop, and computer companies such as Apple and IBM took notice. In an effort to keep a similar dispute from recurring, computer software companies joined together to boycott the industry, stating they would use neither format unless the companies could agree on a single one. Both sides agreed to create one format, and from this the DVD was born. In 1997, DVD video was finally mass-marketed in the United States as a high-quality alternative to VHS.

The DVD, in comparison to the VHS tape or the CD, is an optical format able to hold extensive amounts of data, such as a feature-length film, because it uses a 650 nm laser diode as opposed to the CD's 780 nm, allowing for greater capacity. The DVD can also record on dual layers, and some discs carry information on both sides. DVD, especially in comparison to VHS, also provides better sound and picture quality that doesn't fade. Other options include language selection, jumping directly to scenes, bonus features, and multi-angle selection, none of which VHS was ever able to provide in one single package.
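The wavelength advantage can be roughly quantified. A focused laser spot scales with the wavelength divided by the lens's numerical aperture, so a shorter-wavelength laser can read smaller, more densely packed pits. The figures below (780 nm and NA 0.45 for CD, 650 nm and NA 0.60 for DVD) are commonly cited values, but the calculation is only a back-of-the-envelope sketch; the DVD's full capacity gain also comes from tighter track pitch and more efficient error correction.

```python
# Rough estimate: areal density gain from DVD's shorter-wavelength laser.
# Spot size ~ wavelength / numerical aperture (diffraction limit).
cd_spot = 780 / 0.45    # CD: 780 nm laser, NA 0.45 objective
dvd_spot = 650 / 0.60   # DVD: 650 nm laser, NA 0.60 objective

density_gain = (cd_spot / dvd_spot) ** 2   # density scales with spot area
print(f"areal density gain: {density_gain:.1f}x")
# -> about 2.6x from optics alone; track pitch and coding do the rest
```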

Over the years, as DVD players have become smaller and cheaper to produce thanks to their growing popularity, DVDs began overtaking VHS in rentals and sales as early as 2003. The DVD format has achieved the status of a standard, yet its staying power remains to be seen as more advanced high-definition formats, such as the Blu-ray Disc introduced in 2006, gain momentum in the market.


Posted By: Rob Goodman

Sources:

Bakalis, Anna. (2003). It's Unreel: DVD Rentals Overtake Videocassettes. The Washington Times. Retrieved from
http://washingtontimes.com/news/2003/jun/20/20030620-113258-1104r/?page=2.

Wempen, Faithe. (2001). Build Your Skills: A Comparison Between DVD and CD-Rom. Tech Republic. Retrieved from
http://articles.techrepublic.com.com/5100-10878_11-1047035.html.

Markoff, John. (1995). Business Technology; a Battle for Influence Over Insatiable Disks. The New York Times. Retrieved from
http://www.nytimes.com/1995/01/11/business/business-technology-a-battle-for-influence-over-insatiable-disks.html.

Image retrieved from http://www.vidcam.com.au/copy/index.html.



Technology. (n.d.). DVD Forum. Retrieved from
http://www.dvdforum.org/forum.shtml

"Fly-By-Wire": Analog vs. Digital


NASA revolutionized the business of commercial and military airplanes by developing the F-8 digital "fly-by-wire" (FBW) system, first used in a test research aircraft flown from 1972 to 1985.

In an FBW system, the pilot's controls are connected by electrical wiring to a flight computer rather than to mechanical linkages; the computer interprets the pilot's inputs, monitors the stability of the plane, and commands the hydraulic actuators that move the control surfaces. The pilot thus flies through a computer rather than by the older, traditional method of direct manual control. The main benefit of the digital system over the manual one is increased aerodynamic efficiency: the aircraft can be maneuvered more accurately without sacrificing natural stability. Stability is of utmost importance, because if the aircraft is disturbed by an outside force such as a gust of wind, the flight path can diverge drastically, increasing the g-forces pressed upon the aircraft, which can cause the structure to fail, break apart, and crash.
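In essence, the automatic stabilization an FBW computer performs is a fast feedback loop: sense the aircraft's state, compare it with what the pilot commanded, and drive the actuators to close the gap many times per second. Below is a minimal Python sketch of that idea as a proportional-derivative pitch loop; every gain, name, and piece of dynamics is invented for illustration, and real flight control laws are vastly more elaborate and redundant.

```python
# Toy fly-by-wire pitch loop: proportional-derivative feedback.
# Gains and dynamics are invented; real control laws are far more
# elaborate, redundant, and rigorously certified.

def fbw_step(pitch, pitch_rate, commanded_pitch, dt=0.02,
             kp=2.0, kd=0.8):
    """One 50 Hz control cycle: compute an elevator command and
    advance a crude one-degree-of-freedom pitch model."""
    error = commanded_pitch - pitch
    elevator = kp * error - kd * pitch_rate          # PD control law
    pitch_accel = 5.0 * elevator - 0.5 * pitch_rate  # toy dynamics
    pitch_rate += pitch_accel * dt
    pitch += pitch_rate * dt
    return pitch, pitch_rate

# A 2-degree gust upset decays back toward level flight (0 degrees).
pitch, rate = 2.0, 0.0
for _ in range(250):                  # five simulated seconds
    pitch, rate = fbw_step(pitch, rate, commanded_pitch=0.0)
print(f"pitch after 5 s: {pitch:.3f} degrees")
```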

What FBW control provides is "high-integrity automatic stabilization of the aircraft." This kind of control gives aircraft designers more freedom to create aircraft that are lighter and have better aerodynamic structure. Overall efficiency is the key word. FBW controls were first introduced into military aircraft with the redesign of McDonnell Douglas' (now Boeing) YF-17 into the F/A-18, flying first in 1980.

In 1988, Airbus became the first commercial manufacturer to incorporate FBW into its airplanes with the introduction of the A320. Other companies followed suit, such as Boeing, which first used FBW technology in its 777 model in 1993 and delivered the first of the airplanes to United Airlines in 1995. It should be said that while FBW offers many modern conveniences, it is also another case of technological advancement making manual human work obsolete, since the system limits the autonomous actions of the pilot. Still, the efficiency of FBW is undeniably a great step toward making modern transportation faster and more convenient.

Posted By: Rob Goodman


Sources:

Collinson, R.P.G. (1999). Fly-by-wire Flight Control. Computing & Control Engineering Journal. 10 (4), pp. 141-152.

Sabbagh, Karl. (1996). 21st Century Jet. New York, NY: Scribner.

Curry, Marty ed. (2009). Past Projects: F-8 Digital Fly-By-Wire. NASA. Retrieved October 18, 2009 from
http://www.nasa.gov/centers/dryden/history/pastprojects/F8/index.html.

Image retrieved October 26, 2009 from
http://www.nasa.gov/centers/dryden/history/pastprojects/F8/index.html