Friday, March 30, 2007

1868:
first department store in the US

Zion's Co-operative Mercantile Institution, or ZCMI, was founded in 1868 and may have been the first department store in the United States. R.H. Macy & Co. started earlier, but did not become a department store until 1877.

Based in Salt Lake City, Utah, ZCMI quickly became a household name in the community. The (Mormon) Church of Jesus Christ of Latter-day Saints was a significant influence on the company, retaining a majority interest in ZCMI until its eventual sale.

In December 1999, as a result of losses for two consecutive years along with mounting economic and social pressures, ZCMI was sold to the May Department Stores Company. ZCMI operated under its original name as a part of May's Oregon-based Meier & Frank division until April 2002, when the stores adopted the Meier & Frank name.

In addition to the name change, the stores in Logan, St. George, Idaho Falls, and Pocatello were sold to Dillard's. By August 2002, the stores were further consolidated within May into the company's Los Angeles division, Robinsons-May, though retaining the Meier & Frank nameplate. May Department Stores was acquired by Federated Department Stores in September 2005, with most stores converted to Macy's by the end of 2006. In 2007, Federated announced that it was changing its corporate name to Macy's.

Thursday, March 29, 2007

1807:
Britain outlaws the slave business

In 1805 the British House of Commons passed a bill that made it unlawful for any British subject to capture and transport slaves, but the measure was blocked by the House of Lords.

In 1806, Lord Grenville formed a Whig administration. Grenville and his Foreign Secretary, Charles James Fox, were strong opponents of the slave trade. Fox and William Wilberforce led the campaign in the House of Commons, while Grenville had the task of persuading the House of Lords to accept the measure.

Grenville made a passionate speech in which he argued that the trade was "contrary to the principles of justice, humanity and sound policy" and criticised fellow members for "not having abolished the trade long ago". When the vote was taken, the Abolition of the Slave Trade bill was passed in the House of Lords by 41 votes to 20. In the House of Commons it was carried by 114 to 15, and it became law on March 25, 1807.

British captains who were caught continuing the trade were fined £100 for every slave found on board. However, this law did not stop the British slave trade. If slave-ships were in danger of being captured by the British navy, captains often reduced the fines they had to pay by ordering the slaves to be thrown into the sea.

Some people involved in the anti-slave trade campaign, such as Thomas Clarkson and Thomas Fowell Buxton, argued that the only way to end the suffering of the slaves was to make slavery itself illegal. However, it was not until 1833 that Parliament passed the Slavery Abolition Act.

In the US, Abraham Lincoln is credited with "freeing the slaves" through his Emancipation Proclamation of 1863, but the Proclamation was limited in many ways. It applied only to states that had seceded from the Union, leaving slavery untouched in the loyal border states. It also expressly exempted parts of the Confederacy that had already come under Northern control. Most important, the freedom it promised depended upon Union military victory.

The Proclamation was not a law passed by Congress but a presidential order empowered by Lincoln's position as Commander in Chief of the Army and Navy. As the Union armies conquered the South, thousands of slaves were freed each day until nearly all (estimated at 4 million) were free by July of 1865. Some slavery continued to exist in the border states until the entire institution was finally wiped out by the ratification of the Thirteenth Amendment in 1865.

Japan abolished slavery in 1588, Saudi Arabia in 1962, and Mauritania in 1981. Illegal slavery still continues.

Wednesday, March 28, 2007

2007:
British cars made in China

Nanjing Automobile Group unveiled its made-in-China MG sports cars and sedans Tuesday, the first step in its plans to use the iconic British brand as a platform for global expansion.

China's oldest car maker introduced the MG TF roadster and the MG 7295 and 7275 sedans at its $450 million plant near Nanjing city. It has renamed the brand "Ming Jue" or "Modern Gentleman" to represent grace and style for the Chinese consumer. Ten thousand tons of assembly line equipment were shipped from England to China.

MG's new owner plans to invest $2 billion in the brand, including opening British and US plants. "MG, as an established brand worldwide, will help Nanjing Auto attract customers' attention," said Huang Zherui, an analyst who advises carmakers in China.

Companies in China, a country with more than $1 trillion in foreign reserves, have bought overseas rivals to gain technologies and brands that will help them expand globally. Lenovo Group bought IBM's PC unit in 2005. TCL bought the German television maker Schneider Electronics in 2002.

Nanjing Auto bought the MG brand and other assets in 2005, after the collapse of the British automaker MG Rover Group. SAIC Motor, a General Motors and Volkswagen partner in China, bought the design rights for two MG Rover models and K-series engines.

Nanjing plans to build MG TF roadster convertibles at a former MG factory in England for sale in Europe this year. The company had planned to make MG TF coupes in Oklahoma next year (but that now seems unlikely) and to sell the roadster, coupe and sedan in the US.

"Japanese automakers took 30 years" to begin overseas sales, while "Korean automakers took 20 years," Wang Haoliang, Nanjing Auto's chairman said at a ceremony Tuesday. "We are much faster."

Nanjing Auto was founded in 1947. Its main products are trucks and buses, as well as Fiat cars. Its new MG plant has an annual capacity of 200,000 cars, 250,000 engines and 100,000 transmissions.

The first MG car was built at Morris Garages in England in the 1920s. The automaker built a series of two-seater roadsters including the T-Series, the MGA and the MGB, an icon of the 1960s and 1970s. The company was absorbed into the British Motor Corporation in 1952, the start of a series of mergers involving the British car industry. The brand ended up as part of MG Rover, which collapsed in 2005.

China's economic growth has averaged about 10 percent for the past four years, making cars affordable to more people. Vehicle sales rose 25 percent last year to 7.22 million, as China surpassed Japan to become the world's second-biggest auto market, behind the United States. (info from The International Herald Tribune and NPR, photo from MotorAuthority)

Tuesday, March 27, 2007

2006:
downloaded music singles outsell CD albums

To the regret of music labels, fans are buying fewer and fewer full albums. In the shift from CDs to digital music, buyers can now pick the individual songs they like without having to pay $10 or more for an album.

In 2006, digital singles outsold plastic CDs for the first time. So far this year, sales of digital songs have risen 54 percent, to roughly 189 million units, according to data from Nielsen SoundScan. Digital album sales are rising at a slightly faster pace, but buyers of digital music are purchasing singles over albums by a margin of 19 to 1. Sales of albums, in either disc or digital form, have dropped more than 16 percent so far this year, a slide that music executives attribute to an unusually weak release schedule and shrinking retail floor space for music. Even though sales of individual songs, sold principally through iTunes, are rising, that has not been nearly enough to compensate.

Because of this shift in listener preferences, record labels are facing the loss of the album as their main product and moneymaker. They are re-examining everything from marketing practices to their contracts. Some new performers get contracts to record only ring tones or a few singles.

At the same time, the industry is straining to shore up the album as long as possible, in part by prodding listeners who buy one song to purchase the rest of a collection. Apple may offer iTunes users credit for songs they have already purchased if they buy the associated album. (Under Apple’s current practice, customers who buy a song and then the album pay for the same song twice).

But some analysts say they doubt that such promotions can reverse the trend.

“I think the album is going to die,” said Aram Sinnreich, managing partner at Radar Research, a media consulting firm based in Los Angeles. “Consumers are listening to playlists,” or mixes of single songs from an assortment of different artists.

All this comes as the industry’s long sales slide has been accelerating. Many music executives dispute the idea that the album will disappear. In particular, they say, fans of jazz, classical, opera and certain rock bands (like Radiohead and Tool) will demand album-length listening experiences for many years to come. But for other genres, including some strains of pop music, rap, R&B and much of country, where sales success is closely tied to radio air play of singles, the album may be entering its twilight.

“For some genres and some artists, having an album-centric plan will be a thing of the past,” said Jeff Kempler, chief operating officer of EMI’s Capitol Music Group. While the traditional album provides value to fans, he said, “perpetuating a business model that fixates on a particular packaged product configuration is inimical to what the Internet enables, and it’s inimical to what many consumers have clearly voted for.”

A decade ago, the music industry had all but stopped selling music in individual units. Cassette singles and 3-inch mini CDs that were intended to emulate the earlier success of the 45-rpm single failed to catch on with music buyers. But now, four years after Apple introduced its iTunes service, individual songs account for roughly two-thirds of all music sales volume in the United States. And that does not count purchases of music in other, bite-size forms like ring tones, which have sold more than 54 million units so far this year.

One of the biggest reasons for the shift is that PC-empowered consumers are forgoing album purchases after years of paying for complete CDs with too few songs they like.

In some ways, the current climate recalls the 1950s and, to some extent, the ’60s, when many popular acts sold more singles than albums. It took hugely influential works like The Beatles’ Sgt. Pepper’s Lonely Hearts Club Band and the Beach Boys’ Pet Sounds to turn the album into pop music’s dominant medium. Most earlier albums were “greatest hits” collections of previously released singles, but “Sgt. Pepper” helped make the album a new art form with a unified structure: one of the first “concept” albums.

With today’s production costs and flexible purchasing, an album like “Sgt. Pepper” might not be made. It is often cited as the Beatles’ best work and the most influential album of all time, and was ranked #1 on Rolling Stone's The 500 Greatest Albums of All Time. Now people buy music one 99-cent track at a time, and perhaps "With a Little Help from My Friends" or "Lovely Rita" would be downloaded many more times than "Being for the Benefit of Mr. Kite!" (info from The New York Times and other sources)

Monday, March 26, 2007

1965:
miniskirt

While women’s hemlines have gone up and down through history, the invention of what we know as the miniskirt is generally credited to British fashion designer Mary Quant in the mid-1960s. Quant was inspired by the Mini automobile. French designer André Courrèges is also often cited as its inventor, and some give the credit to Helen Rose, who made some miniskirts for actress Anne Francis in the 1956 science fiction movie, Forbidden Planet. There is also a claim that designer John Bates was responsible for the miniskirt. Bates designed costumes for Diana Rigg, who played Emma Peel in the TV show The Avengers, and helped define "Mod" style.

Mary Quant ran a popular clothes shop in London called Bazaar, where she sold her own designs. In the late 1950s she began experimenting with shorter skirts, which resulted in the miniskirt in 1965, one of the defining fashions of the decade. Due to Quant's position in the heart of fashionable "Swinging London", the miniskirt was able to spread beyond a simple street fashion into a major international trend.

Minis were popularized by skinny models Twiggy and Jean Shrimpton, and even the Queen of England caught a little of the mood, sporting a noticeably shorter hemline at the royal garden party one year.

Shrimpton wore a short white shift dress made by Colin Rolfe in 1965 at the Melbourne Cup Carnival in Australia, where it caused a sensation. Shrimpton claimed that the brevity of the skirt was due mainly to Rolfe running out of fabric, and that the controversy was also caused by her not wearing a hat and gloves, vital accessories in the conservative environment.

The miniskirt was further popularized by André Courrèges, who developed it separately and incorporated it into his Mod look, for spring/summer 1965. His miniskirts were less body-hugging, and worn with the white "Courrèges boots" that became a trademark. By introducing the miniskirt into the haute couture of the fashion industry, Courrèges gave it a greater degree of respectability than might otherwise have been expected of a street fashion.

The miniskirt was followed in the late 1960s by the even shorter microskirt. (info from Wikipedia and other sources)

Friday, March 23, 2007

1920:
beginning of the Band-Aid

Back in 1920, newlywed Josephine Dickson was living in New Brunswick, New Jersey, with her husband Earle. Married life agreed with her, but housekeeping did not. Josephine was a klutz.

When Earle came home from his job as a cotton buyer at Johnson & Johnson, Josephine would have cuts or burns on her fingers. Josephine had no easy way of covering and protecting her injuries, so Earle had to cut pieces of adhesive tape and cotton gauze and make a bandage for each wound. This happened day after day. Josephine was bleeding all over the kitchen. Dinner was delayed. Earle was getting pissed off.

Finally, after several weeks of kitchen accidents, Earle hit upon an idea. (Luckily for Johnson & Johnson, his idea was not to hire a cook.) Earle sat down and prepared some ready-made bandages by placing squares of cotton gauze at intervals along an adhesive strip and covering them with removable crinoline fabric. Now all Josephine had to do was cut off a length of the strip and wrap it over her cut. This was the prototype of the Band-Aid bandage.

Earle told his boss at work about his new invention, and soon the first adhesive bandages were being produced and sold under the soon-to-be-world-famous Band-Aid trademark.

The first ones were three inches wide and 18 inches long. They were not an immediate success. Only $3,000 worth were sold the first year, but eventually they caught on.

Earle was eventually rewarded with a position as Vice President in the company, where he stayed until his retirement. As for Josephine, history does not record whether she ever mastered the art of accident-free cooking. But we do know she had plenty of Band-Aids available just in case.

Later product advances included machine manufacturing, a shift from cloth to plastic tape, sterile packages with the little red thread for opening, lots of different sizes, clear strips, medicated strips, rugged strips, waterproof strips and decorated Band-Aids for kids.

Brooke Shields, Teri Garr and John Travolta were in Band-Aid commercials before they became famous. (info from Johnson & Johnson)

Thursday, March 22, 2007

2006:
average US home gets over 100 TV channels

Last year, for the first time, the number of television channels available in the average American home reached the 100 mark, according to figures released by Nielsen Media Research, which provides the TV viewership statistics that are used to help determine whether a show is a hit or a dud.

The channel average jumped to 104.2, an increase of nearly eight from the previous year. As of 2006, 47 percent of U.S. homes received more than 100 channels, a jump of five percentage points from 2005.

As digital cable and satellite TV have become more widely available, the number of channels available to U.S. homes has skyrocketed. In 2000, for instance, the average home had 61.4 channels, and in 1995, the number was 41.1.

As the number of channels available to a household increased, so did the number of channels tuned, although the percentage of available channels actually viewed decreased. In 2006, the average household tuned to 15.7 (or 15.1%) of the 104.2 channels available. This compares to 2000, when the average home viewed 22.1% of the available channels (13.6 channels viewed out of 61.4 available channels).
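Those shares follow directly from the channel counts; here is a quick back-of-the-envelope check (a minimal Python sketch, using only the Nielsen figures quoted above):

    # Share of available channels that the average home actually tuned
    for year, tuned, available in [(2006, 15.7, 104.2), (2000, 13.6, 61.4)]:
        print(f"{year}: tuned {tuned} of {available} channels = {tuned / available:.1%}")
    # prints:
    # 2006: tuned 15.7 of 104.2 channels = 15.1%
    # 2000: tuned 13.6 of 61.4 channels = 22.1%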

General dramas still dominate the broadcast networks' program lineups, comprising 50% (67 of 134) of the primetime programs, an increase of four programs from 2005 to 2006.

Wednesday, March 21, 2007

1533:
first high-heeled shoes

While high heels today are mostly associated with women's shoes, many shoe designs worn by both genders have elevated heels, including cowboy boots and Cuban heels.

Raised heels may have been a response to the problem of a horse rider's foot slipping forward in stirrups. The "rider's heel," about 1-1/2" high, appeared around 1500. The leading edge was canted forward to help grip the stirrup, and the trailing edge was canted forward to prevent the elongated heel from catching on underbrush or rock while backing up, such as in on-foot combat. These features are evident today in riding boots, notably cowboy boots.

In 1533, Catherine de Medici, the diminutive wife of the Duke of Orleans, commissioned a cobbler to fashion her a pair of heels, both for fashion and to increase her stature. They were an adaptation of chopines (elevated wooden soles with both heel and toe raised, not unlike modern platform shoes), but unlike chopines the heel was higher than the toe, and the "platform" was made to bend in the middle with the foot.

The simple riding heel gave way to a more stylized heel over its first three decades. Beginning with the French, heel heights among men crept up, often becoming higher and thinner, until they were no longer useful while riding, but were relegated to "court-only" wear. By the late 1600s men's heels were commonly between three and four inches in height.

France's King Louis XIV (1638-1715) was only five feet, three inches tall until he grew five inches wearing shoes with curved heels constructed of cork and covered with red-dyed leather symbolizing nobility. On special occasions, his high heels were ornamented with hand-painted scenes of his military victories. Today, curved heels preserve his legacy and are known as Louis or French heels. Other heel-wearers used their footwear to boast of their wealth; the heels were so high that servants had to break them in, so to wear high heels also proved one could afford servants.

High-heeled shoes quickly caught on with the fashion-conscious men and women of the French court, and spread to pockets of nobility in other countries. The term "well-heeled" became synonymous with opulent wealth. Both men and women continued wearing heels as a matter of noble fashion throughout the seventeenth and eighteenth centuries. When the French Revolution drew near, in the late 1700s, the practice of wearing heels fell into decline in France due to its associations with wealth and aristocracy. Throughout most of the 1800s, flat shoes and sandals were usual for both sexes, but the heel resurfaced in fashion during the late 1800s, almost exclusively among women. (Photo from Wikipedia) (Info from Wikipedia and Answers.com)

Tuesday, March 20, 2007

1982:
Commodore 64 computer beats IBM PC

Many people think that Radio Shack, IBM or even Apple started the personal computer business, but Commodore had a big part in it.

Commodore, the commonly used name for Commodore International, was an American electronics company which was a vital player in the personal computer field in the 1980s. Commodore developed and marketed the Commodore 64, an extremely popular desktop computer, in 1982. The model number refers to its 64K (not meg or gig) of RAM, double the 32K then supplied in the much more expensive IBM PC.

The company that would become Commodore International was started in Toronto by Jack Tramiel in 1954. He had already run a small typewriter-repair business for a few years while living in New York and driving a cab, and he managed to sign a deal with a Czechoslovakian company to manufacture its designs in Canada, moving to Toronto to start production. By the late 1950s a wave of Japanese machines forced most North American typewriter companies out of business, and Tramiel turned to adding machines.

In 1962 the company was formally incorporated as Commodore Business Machines. In the late 1960s history repeated itself when Japanese firms started producing and exporting adding machines. The company's main investor and chairman, Irving Gould, suggested that Tramiel travel to Japan to understand how to compete. Instead, he returned with the new idea to produce electronic calculators, which were just coming on the market.

Commodore soon had a profitable calculator line and was one of the more popular brands in the early 1970s, producing both consumer and scientific calculators. However, in 1975 Texas Instruments, the main supplier of calculator parts, entered the market directly and put out a line of machines priced at less than Commodore's cost for the parts alone. Commodore had to be rescued once again by an infusion of cash from Gould, which Tramiel used beginning in 1976 to purchase several second-source chip suppliers, including MOS Technology, Inc., in order to assure his supply.

He agreed to buy MOS, which was having troubles of its own, only on the condition that its chip designer, Chuck Peddle, join Commodore as head of engineering. Once Peddle had taken over engineering at Commodore, he convinced Tramiel that calculators were already a dead end and that the company should turn its attention to home computers. Peddle packaged his existing KIM-1 single-board computer design in a metal case, along with a full-travel QWERTY keyboard, monochrome monitor, and tape recorder for program and data storage, to produce the Commodore PET. From its 1977 debut, Commodore would be a computer company.

The PET computer line was used primarily in schools, due to its tough all-metal construction (some models were labeled "Teacher's PET"), but did not compete well in the home setting, where graphics and sound were important. This was addressed with the VIC-20, introduced in 1981 at $299 and sold in retail stores. Commodore took out aggressive ads featuring William Shatner asking consumers, "Why buy just a video game?" The strategy worked, and the VIC-20 became the first computer to ship more than one million units. A total of 2.5 million units were sold over the machine's lifetime.

CBM introduced the Commodore 64 in 1982 as the successor to the VIC-20. The C64 possessed remarkably capable sound and graphics for its time. Its $595 price was high compared to the VIC-20, but it was still much less expensive than any other 64K computer on the market. Early C64 ads boasted, "You can't buy a better computer at twice the price."

In 1983 Tramiel decided to focus on market share and cut the prices of the VIC-20 and C64 dramatically. TI responded by cutting prices on its TI-99/4A, which had been introduced in 1981. Soon there was an all-out price war involving Commodore, TI, Atari and practically every vendor other than Apple. This price war likely contributed to the video game crash of 1983. By the end of the conflict, Commodore had shipped somewhere around 22 million C64s, making the C64 the best-selling computer of all time; in the process it drove TI out of the home-computer market, almost destroyed Atari, bankrupted most smaller companies, and wiped out its own savings. Tramiel's motto, "Business is war," had taken its toll.

With market share eroding, Commodore embarked on a series of decisions that were heavily questioned by shareholders and the press, who sometimes accused management of only being interested in removing as much value from the company as possible before it finally disappeared. By 1994, only its operations in Germany and the United Kingdom were still profitable.

Commodore's computer systems, especially the C64 and Amiga series, retain a cult following among their users years after the company's demise. (info from Wikipedia)

Monday, March 19, 2007

1967:
last state repeals margarine color ban

In the late-19th century, when margarine began to emerge as a cheap substitute for natural butter, manufacturers added a yellow dye to make it look more like butter so it would be more acceptable to consumers.

Dairy farmers objected, and lobbied Congress to pass the Oleomargarine Act of 1886, which imposed a penalty tax on any margarine that was colored to look like butter. The law also gave the dairy farmers the explicit right to use their own color additives. At the beginning of the 20th century, these discriminatory color laws were widespread: 32 states had banned yellow margarine; some had even forced manufacturers to dye their product an unappetizing pink.

Restrictive Federal legislation was repealed in 1950 after margarine gained in popularity as it found political allies, and as nutritionists, advertisers and home economists began to portray it differently. Margarine was no longer eaten only by poor people. In 1967, Wisconsin became the last state to repeal its anti-color law. (info from Slate and Jstor)

Friday, March 16, 2007

1250 (approx.):
first eyeglasses

No one knows for certain when eyeglasses were invented, although documents from the 13th century prove the existence of eyeglasses at that time. Several sources quote a manuscript written in Rome in 1289 by a member of the Popozo family that says, "I am so debilitated by age that without the glasses known as spectacles, I would no longer be able to read or write." A painting done by Tommaso da Modena in 1352 includes the first known artistic representation of eyeglasses.

Historians credit the Chinese with carving the first frames more than 2,000 years ago, but apparently those frames did not contain lenses and were used to protect their eyes from "evil forces." The frames were carved from tortoise shell, a sacred material.

The use of a magnifying glass was first recorded in about 1000 A.D. It was called a reading stone and was placed on top of reading material to magnify letters. Monks used it to copy manuscripts. Later, Venetian glassblowers constructed lenses that could be held in a frame in front of the eyes. Glasses for distance vision first appeared around the middle of the 15th century, and there are various references in literature of that time to spectacles for "distant vision."

In the 15th century, the printing press was invented, making reading materials more available to the public and increasing the need for glasses. Early eyeglasses were held by hand in front of the eyes or designed to "perch" on the nose. It wasn't until the 17th century that a London optician perfected the use of side pieces that rested on the ears.

In 1784, Benjamin Franklin invented a bifocal lens with the top half for viewing at distance and the bottom half for reading. (from VisionRX)

Thursday, March 15, 2007

1692:
last execution for witchcraft in the US

The majority of witch trials in the US took place in New England, mostly in Massachusetts. The most famous trials were in Salem, MA in 1692.

The last witchcraft trial in Massachusetts was in 1693. The defendant was found not guilty. There were witchcraft accusations in the South until 1709.

The last execution for witchcraft in the United States was in 1692. Witches were hanged, stoned, or crushed to death, not burned at the stake.

The last witch executions in European countries were:
Holland 1610
England 1684
Scotland 1727
France 1745
Germany 1775
Switzerland 1782
Poland 1793

(Info from The Encyclopedia of Witchcraft and Demonology by Rossell Robbins)

Wednesday, March 14, 2007

1901:
first vacuum cleaner

In 1876, Melville Bissell revolutionized home care by making the need for beating carpets less frequent. He owned a china shop in Grand Rapids, Michigan, and made the first popular and successful carpet sweeper by putting rotary brushes in a small canister with a push handle. Bissell's invention was spurred by his own need: bits of packing-crate straw became embedded in his carpet. The Bissell carpet sweeper picked up both straw and dust and contained them in the canister for later disposal.

On the other side of the Atlantic a British company called Ewbank dominated the market. By 1880, Ewbank sweepers were found in many homes including the palaces of Britain's royal family.

Unfortunately, carpet sweepers lacked vacuum suction. They were effective to a certain point, but could not pull dust and dirt from deep within carpet pile. Inventor Hubert Cecil Booth saw a demonstration in London of an American machine that blew compressed air through carpeting; this produced a cloud of dust (proving how much was trapped inside the carpet), but the same dust only settled back into the carpet.

Americans had also experimented with suction devices since about 1859, but only a few factory cleaners reached the marketplace. Booth saw the future in suction. He proved this to friends in two startling demonstrations. In one, he placed a handkerchief on the carpet and sucked on the handkerchief with his mouth. The underside of the cloth was filled with dirt. Even more startling, Booth was so eager to prove his thinking to friends that he knelt in front of a chair in a restaurant and sucked on the chair covering. Coughing and spluttering, he spat the extracted dirt into a hankie.

Booth’s first vacuum cleaner, called "Puffing Billy," was built around a piston pump. It did not contain any brushes; all the cleaning was done by suction through long tubes with nozzles on the ends. It was a large machine, mounted in a horse-drawn van that was pulled through the streets. The vans of the British Vacuum Cleaning Company (BVCC) were bright red, and uniformed operators would haul hose off the van and route it through the windows of a building to reach the rooms inside. Booth was harassed by complaints about the noise of his machines and was fined for frightening horses. The BVCC's most prestigious engagement was cleaning the carpets in Westminster Abbey in London before the 1902 coronation of King Edward VII and Queen Alexandra.

The coronation cleaning led to a demonstration at Buckingham Palace, which had a system installed after the royal family saw the dirt Booth was able to suction out of the palace. Booth's vacuum system, however, was not suitable for individual homeowners. Some large buildings had Booth's machine installed in the basement with a network of tubes fitted into the walls of the rooms with sockets in the walls. Short lengths of tubing with nozzles were connected to the sockets, and this central cleaning system sucked the dust into a container in the basement.

Efforts to make smaller vacuum cleaners were slow to develop. Booth made a smaller version called the Trolley Vac in 1906, but it was very expensive and still weighed 100 lb. Other cleaners included the Griffith (also debuting in 1906) and the Davies device, patented in 1909, which required a two-man operating crew; fine for wealthy households but not the average home.

James Spangler, like Bissell, suffered from dust allergy and asthma. In 1907, he built an electric-powered vacuum cleaner in Ohio. Spangler made a box of wood and tin with a broom handle to push it and a pillow case to hold the dust. Spangler's innovation was to connect the motor to a fan disc and a rotating brush, combining the best of Bissell's brush sweeper with the suction of a powered vacuum cleaner to pull more dust out of carpets.

Spangler himself did not have the money to promote the cleaner, but his relative, William H. "Boss" Hoover, a maker of leather goods, quickly saw the advantages of Spangler's machine. The first Hoover vacuum was made in 1908 and weighed only 40 lb. The machines sold very well door-to-door because housekeepers could see the action on their own carpeting. Hoover quickly built a large retailing operation that spread to Britain by 1913; to this day, vacuum cleaning in England is called "hoovering." (photo from Ciclomatic, info from About.com)

Tuesday, March 13, 2007

2032:
Last payment made by check in the US

On December 31, 2032, Garrison P. Weinberg (94) of Delray Beach, FL, was the last known American to write a paper check.

Weinberg's $83.78 check was used to pay for a dozen eggs and a quart of orange juice in a Piggly Wiggly supermarket. Weinberg said, "Like most shoppers, I usually pay with a finger scan, but I still had a checkbook, and wanted to have a chance to do something that might get me in the history books."

Since the 1980s, credit cards, debit cards and payments by phone and online gradually replaced paper checks. The Federal Reserve estimated that fewer than one million checks were written in the US in 2030, compared with about 10 billion in 2015, 37 billion in 2003, and 50 billion in 1995.

By 2010, credit card and debit card retail transactions started being replaced by Radio Frequency Identification (RFID) implants. In 2025, "plastic" credit was made illegal, and was replaced by more secure Point of Purchase Biological Credit Identification ("POP-BCID"), including finger scans, eye scans, and Instant Follicle DNA Analysis. By that time, computers with voiceprint analyzers had become widespread, for access to secure documents, adult websites, and for online purchases.

Old credit cards became collectors' items. One BankAmericard issued in 1962 recently sold for over $35,000 on eBay.

Paper money and coins were removed from circulation in 2030, except for non-spendable gold-colored Federal Commemorative Medallions that could be used as Christmas stocking stuffers, and as gifts for children who lost teeth. The Medallions are sold by the Federal government, to help finance the war in Iraq. Wallets largely disappeared in the early 2030s, when Americans started using Multi-Function IPhones ("MFIPs") that took the place of driver's license, insurance card, Social Security card, draft card, etc.

According to The Associated Press, the decline in check writing prompted the Federal Reserve to dramatically reduce the size of its check-processing department. From 2003 to 2007, the Fed closed more than half of its 45 check-processing centers. By the end of 2008, only 18 centers remained, and by 2015, there was just one left, in Bangalore, India.

Check writing survived in the US for some specialized tasks, such as real estate transactions, and paying kids to mow the lawn; but became illegal on the first day of 2033, under the Federal Paperwork Reduction Act of 2025. Most other industrialized countries stopped using checks by the mid 2030s, and at this time only Cuba and North Korea still use checks.

Monday, March 12, 2007

1870:
first flat-bottom paper bag

Before Margaret Knight got involved, paper bags were like giant envelopes. Knight was an employee in a paper bag factory when she invented a device that would automatically fold and glue paper bags with square bottoms, so they'd hold more and stand up by themselves.

Male co-workers reportedly refused her advice when installing the equipment because they thought a woman couldn't know anything about machines. Knight can be considered the mother of the grocery bag, and founded the Eastern Paper Bag Company in 1870.

Knight was born in 1838. She received her first patent at the age of 30, but inventing was always part of her life. She made sleds and kites for her brothers while growing up in Maine. She went to work in the Amoskeag cotton mills when she was nine years old. When, at the age of twelve, she saw a fellow worker badly injured, she invented a device to quickly stop the machinery, and the owner put it to use.

Knight is considered a "female Edison," and received some 26 patents for such diverse items as a window frame and sash, machinery for cutting shoe soles, and improvements to internal combustion engines. She is believed to have made twice as many other inventions that were never patented. Margaret Knight's paper bag machine design is still in use, and her original machine is in the Smithsonian. (Info from About.com and the University of Houston)

Friday, March 9, 2007

1979:
death of last American former slave

Former slave Charlie Smith died of natural causes in 1979 in a nursing home in Bartow, Florida. He was believed to be 137 years old, making him the oldest person in the United States.

Smith was born with the name Mitchell Watkins in Liberia, West Africa, in 1842. He came to America as a child slave, claiming he was lured aboard a slave ship by promises of "fritter trees on board with lots of syrup."

He arrived in New Orleans in 1854 and was given the name of his owner, a Texas rancher, as well as a new birth date, July 4th. Smith gained his freedom when President Lincoln signed the Emancipation Proclamation on January 1, 1863.

(Information from The African-American Book of Days: Inspirational History and Thoughts for Every Day of the Year, by Julia Stewart.) (Photo shows Smith on his 134th birthday, 1976. Photograph by Peggy Kehoe, courtesy of The Polk County Democrat, Bartow, Florida.)

Thursday, March 8, 2007

1976:
last Good Humor ice cream truck
(but not quite)

In 1920, Harry Burt, a Youngstown, Ohio candy maker, created a lollypop called the Jolly Boy Sucker. That same year, while working in his ice cream parlor, Burt created a smooth chocolate coating that was compatible with ice cream. It tasted great, but the new combination was too messy to eat. Burt’s son, Harry Jr., suggested freezing wooden lollypop sticks, used for Jolly Boy Suckers, into the ice cream, and it worked.

Burt called his creation the Good Humor bar, capitalizing on the then widely held belief that a person’s humor (temperament) was related to the humor of the palate (sense of taste). Convinced that he had something big on his hands, he filed for a patent at 3 a.m. on January 30. The patent officials didn't share his sense of urgency. It took three years and a personal trip to Washington, D.C., with a five-gallon pail of Good Humor bars before Burt was finally granted exclusive rights to "ice cream on a stick."

To market his new product, Burt sent out a fleet of 12 chauffeur-driven trucks, all with bells. The Good Humor bar was an immediate success in Youngstown. Customers liked that the ice cream was on a stick, and the Good Humor men in their white uniforms promoted a clean, wholesome, and trustworthy image.

In 1926, after the death of her husband, Cora Burt took the company public with franchises costing just $100. During the next few years as Good Humor expanded into other parts of the Midwest, the Good Humor Corporation of America was formed. It acquired the patents and consolidated the operations of some of the franchised companies.

In 1930, M.J. Meehan, a New York businessman, acquired the national rights to the company by buying 75 percent of the shares. The Meehan family owned the company until 1961 when it was sold to Unilever’s U.S. subsidiary, the Thomas J. Lipton Company.

Unilever’s Lipton Foods unit continued to manufacture and market Good Humor products for the next 12 years. In 1976, when the company's direct-selling business was disbanded in favor of grocery stores and free-standing freezer cabinets, the trucks were parked for the last time. Some of the trucks were purchased by ice cream distributors while others were sold to private individuals.

In 1989, Unilever purchased Gold Bond Ice Cream, located in Green Bay, Wis., and grouped its U.S. ice cream and frozen novelty businesses under the name Gold Bond-Good Humor Ice Cream. With its acquisition of Breyers Ice Cream in 1993, the company name was changed to Good Humor-Breyers Ice Cream. The company also makes the Popsicle and Klondike Bar.

Photo is from www.GoodHumorTrucks.com. The operator of that website provides trucks and uniformed drivers for corporate events, social affairs and movies in the Chicago area.

Wednesday, March 7, 2007

1812:
first White House wedding

In 1812, Dolley Madison, wife of fourth president James Madison, arranged the first marriage ceremony to be held at the White House: the wedding of her widowed sister, Lucy Payne Washington, to a Supreme Court Justice, Thomas Todd.

Dolley herself was a widow who remarried. Her first husband was not the president, but John Todd, Jr., a lawyer who died of yellow fever after just three years of marriage, leaving Dolley with a young son.

For half a century she was the most important woman in the social circles of America. To this day she remains one of the best known and best loved first ladies, and apparently the only one who inspired an ice cream brand. Dolley served ice cream at her husband's Inaugural Ball in 1813. The ice cream brand spells her first name without the "e."

Tuesday, March 6, 2007

1960:
first electronic wrist watch

"Accutron" tuning fork watches, first sold by Bulova in 1960, use a 360-Hertz tuning fork to regulate a mechanical watch movement. It's inventor was Max Hetzel, who joined the Bulova Watch Company in 1948. Hetzel was the first engineer to use an electronic device, a transistor, in a wrist watch, and his Accutron was the first watch that truly deserved the adjective electronic. More than 4 million were sold until production stopped in 1977.

Few developments in timekeeping technology created a stir like the introduction of a watch that used a tuning fork as a timing standard rather than a rotating balance wheel. Eight years in development, the Accutron had only 12 moving parts and 27 parts total, compared with 26 moving parts and 130 parts total in a typical self-winding mechanical watch.

Accutrons are supposed to neither gain nor lose more than one minute per month. Prior to the Accutron, it was unusual to find a mechanical watch of this accuracy, even a certified chronometer.
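That spec is easier to appreciate on a per-day basis; here is a minimal Python sketch of the conversion (assuming a 30-day month, an assumption of mine rather than part of Bulova's published spec):

    # Accutron spec: no more than one minute of drift per month
    seconds_per_month = 30 * 86_400            # assuming a 30-day month
    drift_per_day = 60 / 30                    # seconds of drift per day
    accuracy_ppm = 60 / seconds_per_month * 1e6
    print(drift_per_day, round(accuracy_ppm))  # prints: 2.0 23

In other words, about two seconds of drift per day, or roughly 23 parts per million.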

Before Accutron, the method of keeping time mechanically had not changed much in over 300 years. Suddenly in 1960, a timepiece went on the market which was inherently accurate and made the use of escapements and balance wheels obsolete.

The original Accutron 214 is an American icon, born at a time when America felt threatened by Russian advances in space technology. It was brought into existence by Bulova under the leadership of retired general Omar N. Bradley, the WW2 hero for whom the Bradley Fighting Vehicle was named.

During the 1960s it was worn by most of the pilots of the X-15 rocket plane, and Accutron played a part in every US space mission during the '60s and '70s, including the Moon landings. There are several Accutron 214 timing devices sitting on the Moon's "Sea of Tranquility," placed there by astronauts.

The Omega Speedmaster Professional chronograph wristwatch (known as the "Moon watch") was designated by NASA for use by the astronauts in all manned space missions, becoming the first watch on the Moon, on the wrist of Edwin 'Buzz' Aldrin.

However, all the instrument panel clocks and time-keeping mechanisms in the spacecraft on those space missions were Bulova Accutrons with tuning fork movements, because at the time, NASA did not know how well a mechanical movement would work in zero-gravity conditions.

The Accutron 214 was declared the American "Gift of State" by President Lyndon Johnson and given to hundreds of visiting dignitaries. The 214 was made into panel-mount clocks and installed in the instrument panels of thousands of military ships and aircraft, including "Air Force One". The 214 can reasonably be considered the prototype for all modern quartz watches. No other timepiece has had a greater impact on the way we keep time today. (info from Finer Times, Bulova, Wikipedia, Accutron214)

Monday, March 5, 2007

1964:
Debut of GI JOE, first boys' action figure

In the early 1960s, toymaker Hasbro wanted to develop a boys' toy that would match the success of Mattel's Barbie doll, with millions of kids pestering parents for profitable clothing and accessories.

In February 1964 at the Toy Fair in New York, America was introduced to G.I. JOE: "AMERICA'S MOVABLE FIGHTING MAN." The name G.I. JOE was taken from the movie "The Story Of G.I. JOE" which featured an American army unit in World War II.

G.I. JOE was an incredibly ambitious product release, with 75 different products to support the four basic branches of the military: Soldier, Sailor, Marine, and Pilot. Each figure was 11 ½ inches tall, had 21 moving parts, and came in a fatigue uniform with boots, work cap, and a dog tag. The original G.I. JOE was a slightly more muscular version of Ken, the boyfriend of Barbie. Wary of trying to sell a doll to boys, the company coined the phrase "action figure."

Sales the first three years were enormous, and Hasbro aggressively rolled out new products that evolved the line, like the Five Star Jeep, Mercury Space Capsule, Deep Sea Diver, footlocker, Green Beret, and Soldiers of the World. Buoyed by their success, Hasbro even tried a few new products, like the G.I. Nurse. She failed spectacularly, and today remains one of the most sought-after G.I. Joe toys.

In many incarnations across four decades, G.I. JOE has become the single greatest brand in the history of boys' toys, and ushered in a new play pattern that forever changed the scope of the toy world.

G.I. JOE retained his military theme from 1964 until 1968. By the end of the decade, sales were faltering, as they were for all military-themed toys, at a time of increased protest against the war in Vietnam.

In 1969 G.I. JOE retired from military service and became an adventurer. He then explored outer space, the far-flung African deserts and jungles, and the undersea world. An adventure-filled comic book was packed with each set, illustrating several different ways that boys could play with the sets.

Hasbro has kept G.I. JOE on the market nearly continuously since 1964, with the exception of a few years between the “retirement” of the classic “vintage” G.I. Joe in 1976 and the introduction of the new “A Real American Hero” in 1982.

G.I. JOE has assumed many different faces over the years - from military hero to adventurer to ADVENTURE TEAM member, to SUPER JOE, to the smaller but no less powerful REAL AMERICAN HERO (which itself changed over time), to Sgt. Savage, to G.I. JOE: Extreme, and back to 12” G.I. JOE figures with the Hall of Fame, to 8” Sigma 6, then to the 2 ½” Mission Scale.

There has never been a toy that has had the complex history of G.I. JOE, and there hasn’t been a boys’ toy, aside perhaps from trains, that has lasted as long. In 2004, G.I. JOE was honored with induction into the National Toy Hall of Fame. The collection has proven its longevity, and the adventures of G.I. JOE continue to this day. (Info from Hasbro and CBS, photo from Hasbro.)

Friday, March 2, 2007

1920s:
end of the bathing machine

In the 19th century, especially in Britain, men and women who swam in the ocean were usually segregated into separate areas, so that no one of the opposite sex might catch sight of them in their bathing suits, which (although extremely modest by more modern standards) were not considered proper clothing to be seen in by the general public.

The bathing machine was a device, popular in the 19th century, which was intended to allow people to wade in the ocean at beaches without violating Victorian notions of modesty. Bathing machines were roofed and walled wooden carts which would be rolled into the sea.

The bathing machine was part of a sea-bathing etiquette that was enforced more rigorously upon women than upon men, but was expected to be observed by people of both sexes among those who wished to be considered "proper".

People would enter the bathing machine while it was on the dry beach, wearing street clothes. In the privacy of the machine they would then change into their bathing suits, placing their street clothes into a compartment where they would remain dry.

The bathing machine would then be wheeled or slid down into the water. The most common forms of bathing machines had large wide wheels and were propelled in and out of the surf by horse or human power. Some resorts had wooden rails put out into the water for the wheels to roll on. A few had bathing machines pulled in and out by attached cables propelled by a steam engine.

Once in the water, the occupants would use steps to exit the machine out the sea side. It was considered essential that the machine block any view of the bather from the shore.

Some resorts employed a person called a "dipper," a large strong person of the same sex as the bather who would assist the bather getting into and out of the sea. Some dippers were said to roughly push the bathers into the water, then yank them out; but this was considered part of the ocean bathing experience. Bathing machines would often be equipped with a flag which could be raised by the bather as a signal that she was ready to return to shore.

According to some sources, the bathing machine was first developed about 1750 by Benjamin Beale at the resort town of Margate. However, in the Scarborough Public Library there is an engraving dated 1736 which shows people bathing and appears to be the first recorded evidence for the use of bathing machines.

Bathing machines were most common in the United Kingdom and parts of the British Empire with a sizable British population, but were also used at beaches in some other nations, including the US. Legal segregation of bathing areas in Britain ended in 1901, and use of the bathing machine declined fairly rapidly thereafter. By the start of the 1920s bathing machines were almost extinct. (Info from Wikipedia)

Thursday, March 1, 2007

1890:
first execution with electric chair

Alfred P. Southwick developed the idea of using electric current as a method of execution after witnessing an intoxicated man die from touching an exposed terminal on a live generator.

The first practical electric chair was made by Harold P. Brown, an employee of Thomas Edison, hired to research electrocution and develop the electric chair.

Brown's design was based on George Westinghouse's alternating current (AC), which was then just emerging as the rival to Edison's less transport-efficient direct current (DC), which was further along in commercial development. The decision to use AC was entirely driven by Edison's attempt to claim that AC was more lethal than DC.

In 1886, New York State established a committee to determine a new, more humane system of execution to replace hanging. Neither Edison nor Westinghouse wanted their electrical system to be chosen, because they feared that consumers would not want in their homes the same type of electricity used to kill criminals.

In order to prove that AC electricity was dangerous and therefore better for executions, Brown and Edison, who promoted DC electricity, publicly killed many animals with AC, including a circus elephant. They held executions of animals for the press in order to ensure that AC current was associated with electrocution; it was at these events that the term "electrocution" was coined. Edison introduced the verb "to westinghouse" to denote executing someone with AC current. Most of their experiments were conducted at Edison's West Orange, New Jersey, laboratory in 1888.

The demonstrations apparently had their intended effect, and the AC electric chair was adopted by the committee in 1889. The first person to be executed via the electric chair was William Kemmler, in New York's Auburn Prison in 1890; the 'state electrician' was Edwin Davis. The Westinghouse company refused to sell an AC generator for the purpose of execution, so Edison and Brown acquired one by subterfuge, pretending that it was for use in a university.

In 1900, Charles Justice was a prison inmate at the Ohio State Penitentiary in Columbus. While performing cleaning duties in the death chamber, he devised an idea to improve the efficiency of the restraints on the electric chair. Justice designed metal clamps to replace the leather straps, allowing the inmate to be secured more tautly and minimizing the problem of burnt flesh. These revisions were incorporated into the chair, and Justice was subsequently paroled. Ironically, he was later convicted of a robbery/murder and returned to prison 13 years later under a death sentence. On November 9, 1911, he died in the same electric chair that he had helped to improve.

A record was set on July 13, 1928 when seven men were executed, one after another, in the electric chair at Kentucky State Penitentiary in Eddyville. In 1942, six Germans convicted of espionage in the Quirin Case were put to death in the District of Columbia jail electric chair.

The last person to be involuntarily executed via the electric chair was Lynda Lyon Block, on May 10, 2002, in Alabama.

A number of states still allow the condemned person to choose between electrocution and lethal injection. In all, seven inmates nationwide (four in Virginia, two in South Carolina and one in Arkansas) have opted for electrocution over lethal injection. The last use of the chair (as of 2006) was on July 20, 2006, when Brandon Hedrick was electrocuted in Virginia; he elected this method. Before that, it had not been used since May 2004, when James Neil Tucker was electrocuted in South Carolina; he refused to choose his execution method. (Info from Wikipedia, photo from New Jersey State Police Museum)