dd: Barcodes Sweep the World
History of the Bar Code
By Tony Seideman
Supermarkets are a perilous business. They must stock thousands of products in scores of brands and sizes to sell at painfully small markups. Keeping close track of them all, and maintaining inventories neither too large nor too small is critical. Yet for most of this century, as stores got bigger and the profusion on the shelves multiplied, the only way to find out what was on hand was by shutting the place down and counting every can, bag, and parcel. This expensive and cumbersome job was usually done no more than once a month. Store managers had to base most of their decisions on hunches or crude estimates.
Long before bar codes and scanners were actually invented, grocers knew they desperately needed something like them. Punch cards, first developed for the 1890 U.S. Census, seemed to offer some early hope. In 1932, a business student named Wallace Flint wrote a master's thesis in which he envisioned a supermarket where customers would perforate cards to mark their selections; at the checkout counter they would insert them into a reader, which would activate machinery to bring the purchases to them on conveyor belts. Store management would have a record of what was being bought.
The problem was, of course, that the card reading equipment of the day was bulky, utterly unwieldy, and hopelessly expensive. Even if the country had not been in the middle of the Great Depression, Flint's scheme would have been unrealistic for all but the most distant future. Still, it foreshadowed what was to come.
The first step toward today's bar codes came in 1948, when Bernard Silver, a graduate student, overheard a conversation in the halls of Philadelphia's Drexel Institute of Technology. The president of a food chain was pleading with one of the deans to undertake research on capturing product information automatically at checkout. The dean turned down the request, but Silver mentioned the conversation to his friend Norman Joseph Woodland, a twenty-seven-year-old graduate student and teacher at Drexel. The problem fascinated Woodland.
His first idea was to use patterns of ink that would glow under ultraviolet light, and the two men built a device to test the concept. It worked, but they encountered problems ranging from ink instability to printing costs. Nonetheless, Woodland was convinced he had a workable idea. He took some stock market earnings, quit Drexel, and moved to his grandfather's Florida apartment to seek solutions. After several months of work he came up with the linear bar code, using elements from two established technologies: movie soundtracks and Morse code.
Woodland, now retired, remembers that after starting with Morse code, "I just extended the dots and dashes downwards and made narrow lines and wide lines out of them." To read the data, he made use of Lee de Forest's movie sound system from the 1920s. De Forest had printed a pattern of varying degrees of transparency on the edge of the film, then shone a light through it as the picture ran. A sensitive tube on the other side translated the shifts in brightness into electric waveforms, which were in turn converted to sound by loudspeakers. Woodland planned to adapt this system by reflecting light off his wide and narrow lines and using a similar tube to interpret the results.
Woodland took his idea back to Drexel, where he began putting together a patent application. He decided to replace his wide and narrow lines with concentric circles, so that they could be scanned from any direction. This became known as the bull's-eye code. Meanwhile, Silver investigated what form the codes should ultimately take. The two filed a patent application on October 20, 1949.
In 1951 Woodland got a job at IBM, where he hoped his scheme would flourish. The following year he and Silver set out to build the first actual bar code reader, in the living room of Woodland's house in Binghamton, New York. The device was the size of a desk and had to be wrapped in black oilcloth to keep out ambient light. It relied on two key elements: a five-hundred-watt incandescent bulb as the light source and an RCA 935 photomultiplier tube, designed for movie sound systems, as the reader.
Woodland hooked the 935 tube up to an oscilloscope. Then he moved a piece of paper marked with lines across a thin beam emanating from the light source. The reflected beam was aimed at the tube. At one point the heat from the powerful bulb set the paper smoldering. Nonetheless, Woodland got what he wanted. As the paper moved, the signal on the oscilloscope jumped. He and Silver had created a device that could electronically read printed material.
It was not immediately clear how to transform this crude electronic response into a useful form. The primitive computers of the day were cumbersome to operate, could only perform simple calculations, and in any case were the size of a typical frozen-food section. The idea of installing thousands of them in supermarkets from coast to coast would have been pure fantasy. Yet without a cheap and convenient way to record data from Woodland and Silver's codes, their idea would have been no more than a curiosity.
Then there was that five-hundred-watt bulb. It created an enormous amount of light, only a tiny fraction of which was read by the 935 tube. The rest was released as expensive, uncomfortable waste heat. "That bulb was an awful thing to look at," Woodland recalls. "It could cause eye damage." The inventors needed a source that could focus a large amount of light into a small space. Today that sounds like a prescription for a laser, but in 1952 lasers did not exist. In retrospect, bar codes were clearly a technology whose time had nowhere near come.
But Woodland and Silver, sensing the potential, pressed on. In October 1952 their patent was granted. Woodland stayed with IBM and in the late 1950's persuaded the company to hire a consultant to evaluate bar codes. The consultant agreed that they had great possibilities but said they would require a technology that lay at least five years off. By now almost half the seventeen-year life of Woodland and Silver's patent had expired.
IBM offered a couple of times to buy the patent, but for much less than they thought it was worth. In 1962 Philco met their price, and they sold. (The following year Silver died at age thirty-eight.) Philco later sold the patent to RCA. In 1971 RCA would jolt several industries into action; before then, the next advances in information handling would come out of the railroad industry.
Freight cars are nomads, wandering all across the country and often being lent from one line to another. Keeping track of them is one of the most complex tasks the railroad industry faces, and in the early 1960's it attracted the attention of David J. Collins. Collins got his master's degree from MIT in 1959 and immediately went to work for the Sylvania Corporation, which was trying to find military applications for a computer it had built. During his undergraduate days Collins had worked for the Pennsylvania Railroad and he knew that the railroads needed a way to identify cars automatically and then to handle the information gathered. Sylvania's computer could do the latter; all Collins needed was a means to retrieve the former. Some sort of coded label seemed to be the easiest and cheapest approach.
Strictly speaking, the labels Collins came up with were not bar codes. Instead of relying on black bars or rings, they used groups of orange and blue stripes made of a reflective material, which could be arranged to represent the digits 0 through 9. Each car was given a four-digit number to identify the railroad that owned it and a six-digit number to identify the car itself. When cars went into a yard, readers would flash a beam of colored light onto the codes and interpret the reflections. The Boston & Maine conducted the first test of the system on its gravel cars in 1961. By 1967 most of the kinks had been worked out, and a nationwide standard for a coding system was adopted. All that remained was for railroads to buy and install the equipment.
Collins foresaw applications for automatic coding far beyond the railroads, and in 1967 he pitched the idea to his bosses at Sylvania. "I said what we'd like to do now is develop the little black-and-white-line equivalent for conveyer control and for everything else that moves," he remembers. In a classic case of corporate shortsightedness, the company refused to fund him. "They said, 'We don't want to invest further. We've got this big market, and let's go and make money out of it.'" Collins quit and co-founded Computer Identics Corporation.
Sylvania never even saw profits from serving the railroad industry. Carriers started installing scanners in 1970, and the system worked as expected, but it was simply too expensive. Although computers had been getting a lot smaller, faster, and cheaper, they still cost too much to be economical in the quantities required. The recession in the mid-1970's killed the system as a flurry of railroad bankruptcies gutted industry budgets. Sylvania was left with a white elephant.
Meanwhile, Computer Identics prospered. Its system used lasers, which in the late 1960's were just becoming affordable. A milliwatt helium-neon laser beam could easily match the job done by Woodland's unwieldy five-hundred-watt bulb. A thin stripe moving over a bar code would be absorbed by the black stripes and reflected by the white ones, giving scanner sensors a clear on/off signal. Lasers could read bar codes anywhere from three inches to several feet away, and they could sweep back and forth like a searchlight hundreds of times a second, giving the reader many looks at a single code from many different angles. That would prove to be a great help in deciphering scratched or torn labels.
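The on/off signal described above is the heart of every scanner: the reflectance trace from one laser sweep is thresholded into alternating dark and light runs, and the run widths are what the decoder actually interprets. As a minimal sketch (not any vendor's actual firmware, and the `threshold` value and normalized sample range are assumptions for illustration):

```python
def scan_to_runs(samples, threshold=0.5):
    """Convert one sweep's reflectance trace into (is_bar, width) runs.

    samples: reflected-light intensities along the sweep, normalized
    to 0.0 (dark bar, light absorbed) .. 1.0 (white space, reflected).
    Returns a run-length encoding: True runs are bars, False are spaces.
    """
    runs = []
    current = samples[0] < threshold  # True while over a dark bar
    width = 0
    for s in samples:
        is_bar = s < threshold
        if is_bar == current:
            width += 1
        else:
            runs.append((current, width))
            current, width = is_bar, 1
    runs.append((current, width))
    return runs

# Two white samples, three dark, one white:
trace = [0.9, 0.9, 0.1, 0.1, 0.1, 0.9]
print(scan_to_runs(trace))  # [(False, 2), (True, 3), (False, 1)]
```

Because the decoder works from relative run widths rather than absolute positions, the same logic tolerates sweeps at different distances and angles, which is why the fast back-and-forth laser sweep gives so many usable reads per code.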
In the spring of 1969 Computer Identics quietly installed its first two systems, probably the first true bar code systems anywhere. One went into a General Motors plant in Pontiac, Michigan, where it was used to monitor the production and distribution of automobile axle units. The other went into a distribution facility run by General Trading Company in Carlsbad, New Jersey, to help direct shipments to the proper loading-bay doors. At this point the components were still being built by hand; Collins made the enclosures for the scanners by turning a wastebasket upside down and molding fiberglass around it. Both systems relied on extremely simple bar codes bearing only two digits' worth of information. But that was all they needed; the Pontiac plant made only eighteen types of axle, and the General Trading facility had fewer than a hundred doors.
Computer Identics' triumph proved the potential for bar codes in industrial settings, but it was the grocery industry that would once again provide the impetus to push the technology forward. In the early 1970's, the industry set out to propel to full commercial maturity the technology that Woodland and Silver had dreamed up and Computer Identics had proved feasible.
Already RCA was moving to assist the industry. RCA executives had attended a 1966 grocery industry meeting where bar-code development had been urged, and they smelled new business. A special group went to work at an RCA laboratory in Princeton, New Jersey, and the Kroger grocery chain volunteered itself as a guinea pig. Then, in mid-1970, an industry consortium established an ad hoc committee to look into bar codes. The committee set guidelines for bar-code development and created a symbol-selection subcommittee to help standardize the approach.
This would be the grocery industry's Manhattan Project, and Alan Haberman, who headed the subcommittee as president of First National Stores, recalls proudly, "We showed that it could be done on a massive scale, that cooperation without antitrust implications was possible for the common good, and that business didn't need the government to shove them in the right direction."
At the heart of the guidelines were a few basic principles. To make life easier for the cashier, not harder, bar codes would have to be readable from almost any angle and at a wide range of distances. Because they would be reproduced by the millions, the labels would have to be cheap and easy to print. And to be affordable, automated checkout systems would have to pay for themselves in two and a half years. This last goal turned out to be quite plausible; a 1970 study by McKinsey & Company predicted that the industry would save $150 million a year by adopting the systems.
"It turns out there were massive savings that we called hard savings-out-of-pocket savings in labor and other areas," Haberman says. "And there were gigantic savings available in the use of information and the ability to deal with it more easily than we had before, but we never quantified that." Hard, quantifiable savings were what would draw retailers. These included checking out items at twice the speed of cashiers using traditional equipment, which would mean shorter lines without staff increases.
Still, while early bar-code systems would automate the checkout, they would not be useful for monitoring inventory, because at first too few items would come labeled with codes. Savings from using the collected information, instead of simply from cutting labor costs, would have to wait until most items bore codes. After that happened, management at every level would have to transform the way it operated.
In the spring of 1971 RCA demonstrated a bull's-eye bar-code system at a grocery industry meeting. Visitors got a round piece of tin; if the code on top contained the right number, they won a prize. IBM executives at that meeting noticed the crowds RCA was drawing and worried that they were losing out on a huge potential market. Then Alec Jablonover, a marketing specialist at IBM, remembered that his company had the bar code's inventor on staff. Soon Woodland, whose patent had expired in 1969, was transferred to IBM's facilities in North Carolina, where he played a prominent role in developing the most popular and important version of the technology: the Universal Product Code (UPC).
RCA continued to push its bull's-eye code. In July 1972 it began an eighteen-month test in a Kroger store in Cincinnati. It turned out that the printing problems and scanning difficulties limited the bull's-eye's usefulness. Printing presses sometimes smear ink in the direction the paper is running. When this happened to bull's-eye symbols, they did not scan properly. With the UPC, on the other hand, any extra ink simply flows out the top or bottom and no information is lost.
For a time such exotica as starburst-shaped codes and computer readable characters were considered, but eventually the technically elegant IBM-born UPC won the battle to be chosen by the industry. No event in the history of modern logistics was more important. The adoption of the Universal Product Code, on April 3, 1973, transformed bar codes from a technological curiosity into a business juggernaut.
Before the UPC, various systems had begun to come into use around the world in stores, libraries, factories, and the like, each with its own proprietary code. Afterward a bar code on any product could be read and understood in every suitably equipped store in the country. Standardization made it worth the expense for manufacturers to put the symbol on their packages and for printers to develop the new types of ink, plates, and other technology to reproduce the code with the exact tolerances it requires.
Budgets for the bar-code revolution were on a scale to make the Pentagon blanch. Each of the nation's tens of thousands of grocery outlets would have to spend at least $200,000 on new equipment. Chains would have to install new data processing centers and retrain their employees. Manufacturers would potentially spend $200 million a year on the labels. Yet tests showed that the system would pay for itself in a few years.
Standardization of the code meant the need for a standardized system of numbers to go on it. "Before we had bar codes, every company had its own way of designating its products," Haberman says. Some used letters, some used numbers, some used both, and a few had no codes at all. When the UPC took over, these companies had to give up their individual methods and register with a new Uniform Code Council (UCC).
The code is split into two halves of six digits each. The first one is always zero, except for products like meat and produce that have variable weight, and a few other special types of items. The next five are the manufacturer's code; the next five are the product code; and the last is a "check digit" used to verify that the preceding digits have been scanned properly. Hidden cues in the structure of the code tell the scanner which end is which, so it can be scanned in any direction. Manufacturers register with the UCC to get an identifier code for their company, then register each of their products. Thus each package that passes over a checkout stand has its own unique identification number.
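The check-digit rule described above can be made concrete. In UPC-A, the digits in the odd positions (first, third, and so on through the eleventh) are summed and tripled, the even-position digits are added, and the check digit is whatever brings the total to a multiple of 10. A minimal sketch:

```python
def upc_check_digit(digits11):
    """Compute the UPC-A check digit for an 11-digit string.

    Odd positions (1st, 3rd, ..., 11th) carry weight 3; even
    positions carry weight 1. The check digit is chosen so the
    weighted total is a multiple of 10.
    """
    odd = sum(int(d) for d in digits11[0::2])   # positions 1, 3, ..., 11
    even = sum(int(d) for d in digits11[1::2])  # positions 2, 4, ..., 10
    return (10 - (3 * odd + even) % 10) % 10

# Example: a product whose first 11 digits are 0-36000-29145
# gets check digit 2, so the full printed code is 036000291452.
print(upc_check_digit("03600029145"))  # 2
```

A scanner performs the same arithmetic on what it reads; if the recomputed digit disagrees with the printed twelfth digit, the read is rejected and the laser's next sweep gets another try.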
Two technological developments of the 1960s finally made scanners simple and affordable enough. Cheap lasers were one. The other was integrated circuits. When Woodland and Silver first came up with their idea, they would have needed a wall full of switches and relays to handle the information a scanner picked up; now it's all done by a microchip.
On June 26, 1974, all the tests were done, all the proposals were complete, all the standards were set, and at a Marsh supermarket in Troy, Ohio, a single pack of chewing gum became the first retail product sold with the help of a scanner. Decades of schemes and billions of dollars in investment now became a practical reality. The use of scanners grew slowly at first. A minimum of 85 percent of all products would have to carry the codes before the system could pay off, and when suppliers reached that level, in the late 1970s, sales of the systems started to take off. In 1978 less than one percent of grocery stores nationwide had scanners. By mid-1981 the figure was 10 percent; three years later it was 33 percent, and today more than 60 percent are so equipped.
Meanwhile, the technology has been creeping into other industries and organizations. Researchers have mounted tiny bar codes on bees to track the insects' mating habits. The U.S. Army has used two-foot-long bar codes to label fifty-foot boats in storage at West Point. Hospital patients wear bar-code ID bracelets. The codes appear on truck parts, business documents, shipping cartons, marathon runners, and even logs in lumberyards. Federal Express, the package-shipping giant, is probably the world's biggest single user of the technology; its shipping labels bear a code called Codabar. Along the way refinements of the basic UPC have been developed, including the European Article Numbering System (EAN), developed by Joe Woodland, which has an extra pair of digits and is on its way to becoming the world's most widely used system. Other codes, which are given such fanciful names as Code 39, Code 16K, and Interleaved 2 of 5, can sometimes contain letters as well as numbers.
Woodland never got rich from bar codes, though he was awarded the 1992 National Medal of Technology by President Bush. But all those involved in the early days speak of the rewards of having brought a new way of doing business into the world. "This thing is a success story on the American way of doing things," Haberman says. "Our own initiative-take it on ourselves, inviting the world to join in. It has something to say about the little guys with lots of vision."
---Tony Seideman is a freelance writer who lives in New York City
http://www.barcoding.com/Information/barcode_history.shtml
dd: Understanding Barcodes 101
http://educ.queensu.ca/~compsci/units/encoding/barcodes/undrstnd.html
http://educ.queensu.ca/~compsci/units/encoding/barcodes/history.html
This was used many times fwiw in training type sessions for aggregated retail groups in the past
Good basic drilled down core info...lol
dd: Ecologizing Mobile Media
By Howard Rheingold, Thu Sep 09 08:45:00 GMT 2004
The mobile telephone has quickly, profoundly, and unexpectedly altered many aspects of human life -- social, economic, cultural and political.
Although social scientists have looked into several of these areas of change, little is understood about the whole system of changes: exactly what the late Neil Postman would call a problem of "media ecology."
I propose that we -- you, the readers, and I -- apply Postman's "Ten Principles of Technology" (from The End of Education) as probes into this complex storm of forces most people in the world find ourselves experiencing in our daily lives.
Last November, we had a lot of fun, and perhaps sparked a few insights, applying McLuhan's "Laws of Media" as conceptual probes in "McLuhanizing Mobile Media." In a similar manner, I will lay out here the basic questions as Postman proposed and offer my own takes on answers. But the rest of the exercise depends on reader participation -- I know that readers of my articles will know more than I do about most of what follows. And perhaps we'll discover that we know more together than any of us knows alone.
1. All technological change is a Faustian bargain. For every advantage a new technology offers, there is always a corresponding disadvantage.
I'll go for the low-hanging fruit on this one: a great advantage of the mobile telephone is that people are always in touch, always reachable; and a great disadvantage is that people are always in touch, always reachable. The same capability that grants freedom can also enslave; untethered from a desk doesn't mean untethered from the boss. I'm sure there are others, but this is the most obvious.
2. The advantages and disadvantages of new technologies are never distributed evenly among the population. This means that every new technology benefits some and harms others.
With close to half a billion mobile phones sold just this year, I suspect the great divide is not going to remain the one between those who can afford access to phones and those who can't. Increasingly, the advantages are available differentially to those who know what those advantages are and how to make use of them -- the divide between the "know- how" and "don't-know-how" populations. It's a matter of literacy.
3. Embedded in every technology there is a powerful idea, sometimes two or three powerful ideas. Like language itself, a technology predisposes us to favor and value certain perspectives and accomplishments and to subordinate others.
Texting favors terseness and is often about staying in touch more than transmitting meaningful content -- the meaning is in the continuous communication and coordination of activities, not the text of the messages themselves. If email reawakened writing, will texting put it back to sleep, or favor poetry over prose?
4. A new technology usually makes war against an old technology. It competes with it for time, attention, money, prestige and a "worldview".
Again, I will take advantage of my position as first commenter by making the obvious observation that untethering communications from the desktop means spending more time on the move, in the park or at Starbucks, and less time at home or the office. Communication addiction no longer dictates agoraphobic behavior patterns.
5. Technological change is not additive; it is ecological. A new technology does not merely add something; it changes everything.
More people can organize collective action with people they weren't able to organize before, at times and in places they weren't able to organize before. The ways cities are used, political demonstrations are organized, entertainment is scheduled and daily life is coordinated are already changing.
6. Because of the symbolic forms in which information is encoded, different technologies have different intellectual and emotional biases.
Mobile voice communication is "hotter" in the McLuhan sense, conveys nuance, and leaves less for the recipient to fill in. Texting is "cooler" and leaves more interpretation of nuance up to the receiver of the message.
7. Because of the accessibility and speed with which information is encoded, different technologies have different political biases.
In Seattle, Manila, Seoul and Madrid, we've seen regimes toppled and Presidents elected because texting enables spontaneously self-organized demonstrations and get-out-the-vote. If broadcast media is biased toward centralized control, mobile media are biased toward decentralized out-of-control.
8. Because of their physical form, different technologies have different sensory biases.
This is an interesting one, and I'm a bit baffled. Mobile voice communication has a completely different sensory bias from SMS, and picturephoning adds yet another dimension. Readers?
9. Because of the conditions in which we attend to them, different technologies have different social biases.
Here is the source of the social collisions we see on trains when people loudly tell invisible others that they are on the train -- the mobile communication device enables every individual to escape to, dwell in, and impose their personal social space on whatever space they are in, private or public.
10. Because of their technical and economic structure, different technologies have different content biases.
I can see the audio capabilities of mobile devices improving dramatically, and high-resolution video is a matter of bandwidth, but no content beyond voice and music is going to be widely popular unless it works well with a tiny screen.
Those are the ten principles and my brief responses. What are your takes on these?
http://www.thefeature.com/article?articleid=101022&ref=7112963
dd:
Siemens researchers recently demonstrated a cell phone outfitted with a tiny laser that could someday turn the world into a display.
"Mobile phone designers are faced with a paradox," says Siemens user interface researcher Alexander Jarczyk. On one hand, users clamor for smaller, thinner devices. But in the same breath, they scream for bigger screens. With the proliferation of mobile Internet applications, the problem is only going to get worse. Screen real estate is at a premium and designers are hustling to develop new territory, from flexible displays that roll up to zoomable user interfaces. For Jarczyk and his colleagues, though, the world itself is a mobile display. All you need is a projector that fits in your pocket.
http://www.thefeature.com/article?articleid=101550
Cell phone with onboard video projector
My latest article for TheFeature is about the development of a mobile phone with an integrated laser projector:
http://www.boingboing.net/2005/04/14/cell_phone_with_onbo.html
ot, alexandre: Touché to that Siemens post on Virtual "Post Its": http://www.tagandscan.com/info.htm
http://www.tagandscan.com/demo/demo.htm
ot: I think Guys and Gals, THis is even much bigger than we might have even imagined last week, lol...jmho...OK, Not Sure What context this DD "fits into" but I have flagged it as "ot" for those purists out there, lol, but seriously, this is Interesting
The Rise of the Participatory Panopticon
Plausibly Surreal – Scenarios and Anticipations
http://www.worldchanging.com/archives/002651.html
bonus dd:
http://www.worldchanging.com/archives/002567.html
http://www.worldchanging.com/archives/002641.html
dd:As cell phones bulk up, how much is too much?
By Steve Lohr
http://news.com.com/As+cell+phones+bulk+up%2C+how+much+is+too+much/2100-1041_3-5694846.html
Story last modified Wed May 04 07:26:00 PDT 2005
Even when it's turned off, the cool cell phone of the moment--the slender, silver Moto Razr by Motorola--makes a statement: It knows what it is. The Razr has a collection of nice features, like a sizable screen, a decent camera, good battery life and Bluetooth for cordlessly communicating with a personal computer or a headset.
Yet the Razr is largely defined by what it is not. It stands apart from the industry drift toward making the cell phone into a digital Swiss Army knife, a supermarket of technology squeezed into a palm-size device that includes a typewriter's worth of keys, and enough storage and software to accommodate everything from family photo albums to corporate presentations.
The Razr is designed for fashion, simplicity and ease of use. Its success suggests that a well-edited boutique of technology, instead of a supermarket, may be a promising path for cell phone makers and wireless carriers, a $100 billion industry being shaken up by new technology, opportunity and uncertainty.
There is a digital land rush going on, driven by rapid advances in technology that make it possible to put more and more tools of higher and higher quality into phones. The recognition that talk is only part of the cellphone's future--that the handset is becoming a personal window into an evolving blend of communications, computing and media--has the existing players in the cell phone market scrambling, and new entrants looking for a way in.
Wireless carriers like Cingular, Verizon Wireless and Sprint are rolling out high-speed networks that can handle television, movie clips and music, in addition to all kinds of information, from e-mail to news. They are searching desperately to find a money-making future beyond talk, which is destined to become a mature business, highly competitive and less profitable.
Handset makers like Nokia, Motorola and Samsung are introducing the next generation: multimedia phones. The latest entrants, announced last week by Nokia, include a model that can hold up to 3,000 songs, and another phone that doubles as a high-quality camera and video recorder that can shoot and store an hour of video. Media companies--from Time Warner and Viacom to Google and Yahoo--are looking to the cell phone as a new market for their entertainment, news and search products, and software makers, led by Microsoft, have also entered the fray.
The ascendant computer-media hybrid, Apple Computer, plans to test the market in a few months with a music cell phone, designed in partnership with Motorola, hoping to extend its music business beyond the iPod. Apple's first step into the market, according to industry executives familiar with the company's plans, will be a modest one--a phone designed to hold a day's playlist of music, about 25 songs, which can be loaded from a personal computer or purchased from a wireless music store. The phone is being carefully positioned as an enhancement to the iPod instead of a potential alternative.
Technologists and industry analysts talk of three screens--the television, the personal computer and the cell phone--each with its strengths, and weaknesses. The television screen is best for a group entertainment experience, the PC screen for individual work and browsing the Internet.
But the cell phone screen, if small, has one huge advantage. It is all but ever-present. People may be annoyed by the cell phone habits of others, but they are hooked themselves. There are 172 million mobile phone subscribers in America, or 59 percent of the population. By 2009, cell phone ownership will rise to 69 percent, JupiterResearch, a technology research company, projects.
In surveys of cell phone users, respondents say there are three things they always take with them when they leave home: wallet, keys and cell phone. A recent survey conducted for BBDO Worldwide, the advertising agency, found that 75 percent of cell phone owners in the United States kept their phones turned on and within reach 16 or more hours a day. And when asked if they had ever answered their mobile phones during sex, 15 percent said yes. Go figure.
In the battle of the screens, "the cell phone has already won," said Kevin Burden, an analyst at IDC, a research company. "It is the one piece of technology you never have to convince anyone they should carry."
The new technology and services are coming in force, but so far the effort to push beyond voice has gone slowly in the United States compared with other nations. Data services, from short text messages to sending photos, range from 3 percent to 8 percent of total revenues for American carriers. In Europe, Japan and South Korea, data revenues range up to 35 percent.
Camera phones are common today, and many people routinely take pictures with them. But only 15 percent of people with camera phones send pictures over wireless networks, said Julie Ask, an analyst for JupiterResearch.
Many industry analysts blame the wireless carriers for clinging to confusing, and often costly, metered payment plans on their networks, including some by-the-megabyte pricing plans for pictures. "Consumers don't know megabytes from dog bites," said Delly Tamer, chief executive of LetsTalk.com, an online cell phone store. "Complexity is a huge bottleneck."
The carriers have made efforts to simplify things. Sprint, for example, has adopted flat pricing but with an additional charge for each service beyond basic data services (e-mail, instant messages and Web access), like its menu of television programming or its picture-messaging service. The carriers cringe at the notion of Internet-style, all-you-can-eat pricing--perhaps one monthly price for voice, anywhere and any time, and a higher monthly price to add data of all kinds.
"We believe in the model where consumers pay for the applications they use," said Jeffrey Hallock, vice president of product strategy and marketing at Sprint.
The carriers are betting a lot that they can entice customers to move beyond voice, and the first hook is an appealing and powerful phone. "The phone is incredibly important," said David Christopher, vice president of handset product management for Cingular, the nation's largest wireless carrier. "It's the embodiment of our services."
Cingular, for example, will offer six phones with higher-resolution cameras of a megapixel or more in the first half of this year, while it had none a year earlier. Nine phones have Bluetooth wireless technology, compared with one Bluetooth phone a year ago. "We've got wildly more capability in our handsets this year, and we're going to keep going," Christopher said. "We're beginning to put very powerful computers in the palm of your hand."
A worthy goal, but one that should be pursued selectively, according to Nicholas Negroponte, a professor of media technology at the Massachusetts Institute of Technology and a director of Motorola. The industry is mistaken, he said, if it seeks to "make one device into the all-doing, all-singing, general-purpose everything." Instead, Negroponte added, people would be better served by a "family of devices that work like a society of machines, not a sacred object."
At Motorola, the future of the cell phone is a matter of continuing research, experimentation and investment. Edward J. Zander, who joined Motorola as chief executive a year and a half ago, after a career in Silicon Valley, says the cell phone business reminds him of the computer industry 20 years ago, unsettled and up for grabs.
"This industry is in a period of incredible flux," Zander said recently in his office in suburban Chicago. "The big challenge for every company over the next five years is to figure out what you are and how you make money."
Motorola, the world's second-largest cell phone maker after Nokia, is working on an array of ideas. One option is to make the cell phone a remote control for the home. The user at work or traveling could set room temperatures, program the TiVo, turn on the oven to start the roast, or watch the kids doing their homework by viewing the images transmitted from small, inexpensive cameras in each room. Smart chips can make the cell phone a credit card, and biometric sensors on the keys can provide fingerprint identification for security. Another prototype phone flips open sideways with a larger screen, camera and keyboard, opening the door to spontaneous "video blogging."
"We haven't figured out ourselves where it's headed," said Geoffrey Frost, Motorola's chief marketing officer.
But the direction does seem to be moving toward Negroponte's "family of devices," cell phones tailored for specific uses. A person might have three or four phones, all with the same number, which can be used one at a time, a service now available in some European markets. A person might use a music-centered cell phone with headphones while jogging or shopping, an office-productivity phone for e-mail and taking notes on business trips, and a very stylish phone for evenings.
The Razr was the result of that sort of focused design. Work started on the cell phone nearly two years ago, when Motorola was struggling. It needed a signature phone. "We needed to recapture our heritage," said James Wicks, the design chief at Motorola. "The goal was to be the 'king of thin.'"
The other mandate for the Razr, Wicks added, was that the slim design would not mean the user experience was compromised. It had to have a roomy 12-key keypad, a sizable screen and good battery life. The twin requirements, Wicks said, drove a series of innovations in the use of advanced materials like aircraft-grade aluminum and molded magnesium, chipset layouts and industrial design. The sleek phone has some advanced features, but design, not loading in all the most advanced technology, was its guiding principle.
The Razr has been a huge hit since it was introduced last year, far exceeding Motorola's internal sales estimates. It is, after all, a pricey phone--$450 with a two-year contract from Cingular, which struck an exclusive deal with Motorola for the Razr.
Even the most technologically adept have sometimes found the appeal of the Razr irresistible and the price worth it. Brian Bershad, a 39-year-old computer scientist at the University of Washington in Seattle, has tried all the geeky, computerlike cell phones, including the BlackBerry and Palm Treo. But earlier this year, he bought a Razr. "It looks really cool," Bershad said. "It feels right."
Entire contents, Copyright © 2005 The New York Times. All rights reserved.
http://news.com.com/2102-1041_3-5694846.html?tag=st.util.print
Power Lunch dd: & ot comment (ps, just picked up one last block of 1000 shares, can never time anything, lol, could have saved a few pennies if I waited another 5 mins...I'm sure I didn't even put a dent in anything, lol, fwiw, I'm tapped...also, I keep looking at that nice green "rise" of the bars in the CNN article below on ad revenue projected growth, and it has a calming effect while I think about knowing that we may have "the bridge")
Google: biting the hand that feeds it?
Everybody loves Google, right? Not so fast.
May 3, 2005: 3:48 PM EDT
Krysten Crawford, CNN/Money staff writer
NEW YORK (CNN/Money) - In honor of National Teacher Day, Google featured on its home page Tuesday a graphic of a chalkboard with an apple at its base. Quirky tributes like this are meant to engender goodwill among the Google masses.
Not everyone, however, is feeling warm and fuzzy toward Google.
A new study of national advertisers and interviews with a handful of marketing agencies indicate that the Internet giant could have a customer service problem.
"Google has always been bad -- worse than bad even," said Dana Todd, the president of the Search Engine Marketing Professional Organization (SEMPO), a 300-member trade group founded in 2002.
Independent media analyst Jack Myers, in his fourth annual "customer satisfaction" survey of online sales groups, also found that advertisers aren't entirely happy with Google's service.
Google's core issue appears to be one of customer interaction. Google developed the world's most popular Internet search engine. But it makes most of its money selling what's known as paid-search advertising, or ads that are based on search results or the content that appears on another site that has partnered with Google.
Paid-search advertising is going gangbusters. eMarketer, a technology research firm, estimates that search advertising will grow 40 percent, to $5.4 billion, in 2005. Overall, online advertising revenues are expected to reach $12.9 billion this year, making the Internet the fastest-growing form of advertising.
Google has been the major beneficiary of the online ad boom -- revenues for the quarter that ended in March nearly doubled, to $1.26 billion, and its stock price is up 164 percent since its initial public offering. The bonanza is also drawing fierce competitors, including Yahoo!, Microsoft's MSN.com and America Online's Advertising.com. Time Warner is the parent of AOL and CNN/Money.
Room for improvement
But as Google looks to broaden its services beyond paid search, Myers suggests the company first has some work to do with advertisers.
In his survey, Myers asked close to 200 national advertising executives who spend the majority of their time handling online sales to rank 60 Web sites based on several categories.
Google finished a solid No. 5 overall, better than its seventh-place finish in 2004. Yahoo! was No. 2. Of the nine categories measured, Google's sales team edged out Yahoo!'s crew in one category: product knowledge. Google finished third. Yahoo! ranked fourth.
What caught Myers's attention was Google's ranking in one of the most important categories: "responsiveness and accessibility." Google was No. 18, down from 11th place the year before.
"Google is performing well against the industry," noted Myers. But when it comes to customer interaction, Myers said Google "has dropped considerably" in recent years.
Myers attributed Google's falloff to the Mountain View, Ca.-based company's rapid growth and its dominance, which means advertiser expectations are high. At the same time, Myers says Google's rivals are more accessible and more effective at dealing with advertisers.
But interviews with ad agency reps suggest that Google's reputation among advertisers is worse than Myers's study indicates.
A matter of identity
Todd, the SEMPO president, attributes Google's poor image to the fact that the company has viewed itself as and acted like a technology company. Google has tried to automate as many processes as possible and that, according to Todd, doesn't work in the advertising world.
Another advertising taboo that Todd says Google has broken: the company has gone around the agencies it deals with and tried to sign deals directly with Fortune 1000 advertisers. That's alienated media buyers and, at the same time, fueled the perception that Google is giving deep-pocketed advertisers special treatment, to the detriment of smaller advertisers.
"They haven't perceived themselves, until recently, as a media company," said Todd, who is also co-founder and vice president of SiteLab International, a La Jolla, Ca.-based agency specializing in online advertising.
Google isn't the only one that's apparently breached ad industry etiquette. Todd says Yahoo! once tried a similar end-run around agencies, but has since stopped the practice.
Google and Yahoo! are founding members of SEMPO.
Todd said Google officials have told her they're aware of advertiser concerns and plan to address them.
Tim Armstrong, the vice president of sales at Google, said the company is committed to building a top-notch customer service operation and that the response from agencies and advertisers has been very positive.
"In general I think we're very focused on all of our customers," said Armstrong. Asked whether Google is bypassing agencies to sell directly to advertisers, Armstrong said: "I want to be very, very clear on this point. That is not the culture that we have at Google. We work very closely with agencies and clients in exactly the manner they want us to."
Does Google really need to worry? After all, there are no signs that Google is about to cede its lead as the dominant Internet advertising site anytime soon. And given the rosy projections on future ad spending, Google might not need to win any popularity contests in the advertising world.
And yet, history might offer a lesson in hubris.
The last time Internet advertising flourished -- in the late 1990s -- AOL was king and acted like it. Time Warner officials today openly acknowledge that AOL alienated advertisers with its arrogance and paid a steep price when the dot-com bubble burst, giving advertisers the upper hand.
It's taken years for AOL to rebuild those relationships. And while AOL's rankings in the Myers survey have improved over time, it lags both Google and Yahoo!.
Jessie Stricchiola, president of search-engine marketing firm Alchemist Media, says the AOL lesson shows that advertisers and their agencies can't be ignored.
"It's still a pretty small industry in terms of the people involved," she said.
http://money.cnn.com/2005/05/03/technology/google_adsurvey/index.htm
http://finance.lycos.com/qc/livecharts/ot_timedout.html
Hit refresh after it times out after a few mins...
dd:Getting From Point A to B With Search
by Gord Hotchkiss, Thursday, Apr 14, 2005 3:15 PM EST
IN PREPARING FOR A PRESENTATION I'm going to do in a month or so to a group of catalogue publishers, I decided to do some research to see how search worked to bring traffic to some well known online catalogues. What searches translated into traffic for Lands End, L.L. Bean, or Victoria's Secret? The more I dug, with the help of Hitwise, the more surprised I got. In each of these cases, variations of the site's name accounted for roughly half of all search traffic. With Lands End, these variations totaled a little over 48 percent of all its search referrals. Just over 3 percent of all search referrals were for "www.landsend.com", the exact URL users could have just typed in their address bar.
With L.L. Bean, the total was about 42 percent and Victoria's Secret was about 63.5 percent. So, about one out of every two searches that ended up delivering traffic to these sites appears to be someone who was unsure of the actual URL and thought it would be quicker just to search for it.
And that got the mental wheels in motion.
Search as a Navigation Shortcut
We've always known that this behavior takes place. It's one of the reasons why "google.com" and "google" perennially show up among the most searched-for terms on Google. I think I heard a fellow columnist refer to it as the "people are stupid" factor. But I don't think that's it at all. I think it's the "people are in a hurry" and "people are lazy" factor, and I put myself squarely in both camps.
Yes, we could go up to the address bar and type in the URL. But toolbars put search just a little closer to our cursor. And, if we type the address slightly wrong, the search engine will helpfully ask us "Did you mean...?" It's just quicker and easier to let a search engine eliminate the frustration of getting the right URL typed into that little box.
The time savings get even more significant when we're interested in a short cut to a specific section beyond the home page. For example, a significant percentage of Lands End traffic searched for "Lands End Overstocks." Yes, you could type in www.LandsEnd.com and then navigate through the site to find the overstock section, but you could also just launch a split-second search (Google's average response time is less than a quarter second) and click right to it. Increasingly, we're using search engines to take us exactly where we want to go.
Implications for Marketing
If we're using search for a short cut, there are a few obvious implications for the search marketer. First of all, the better known the site and its corresponding brand, the more likely this will occur. Again turning to Hitwise, we find the top 10 referring terms for the appliance and electronics industry contained only one non-brand-name search (cell phones). The rest of the search terms were for the vendors you'd expect to dominate this industry.
So, well known brands better have their prime real estate secured in the search results. If you're not No. 1 for the major variations of your brand in the organic listings, you're potentially losing a lot of traffic to the competition. Even worse, if an attack site has somehow gained top spot for your brand name, you're exceptionally vulnerable. I'll give you all a minute to go check this right now on your favorite search engine.
What if you're No. 4 or 5 for your brand? Our eye tracking research shows that visibility and click-throughs drop dramatically as you move from No. 1 to No. 2, 3 or even worse, 7 or 9. Not holding the No. 1 organic spot in this instance is like letting your competitor put their sign over yours in front of your store.
Secondly, it's important to make sure search engines are indexing your entire site. If your customers are using search as a short cut to land deep in your site and your site isn't fully indexed, you're stranding them high and dry.
A Continuing Trend
Let's face it, trying to remember the right URL, with the right extension, and spelling it correctly is a lot of effort when we can launch a search and see the results in a second or two. The easier search is to use and the more tightly integrated it becomes, the more we'll use it as our primary way of navigating the Web. It's like our own online transporter, picking us up and delivering us to exactly the online destination we wanted, without the messy navigation in between. No longer is online search just a way to find what we didn't know existed. Now it's the fastest way to get to even our most familiar online destinations, making a comprehensive search strategy even more important for every online business.
Gord Hotchkiss is the president of Enquiro, a search engine marketing firm. He loves to explore the strategic side of search and is a frequent speaker at Search Engine Strategies and Ad:Tech.
http://publications.mediapost.com/index.cfm?fuseaction=Articles.showArticle&art_aid=29229
dd: I'd Love to Search But Words Get in the Way
by Gord Hotchkiss, Thursday, Apr 28, 2005 2:30 PM EST
THE PERFECT SEARCH ENGINE WOULD be a small microchip implanted in our brain. It would act as an instantaneous connection between the vast complexity of our brain and the vast complexity of the Web. To find something, we would just have to think about it and the chip would match that concept with the most relevant destination online. Unfortunately, such a development hasn't rolled out of the Google Labs yet. So for now, we have to shoehorn our thoughts into a small quarter-inch by three-inch box on the search engine's home page. We have to distill our thoughts into a few choice words and hope this provides the search engine with enough to go by. And there lies the ultimate vulnerability point of search. Often, our ideas are too big to capture in one or two words.
Small Words, Big Searches; Big Words, Small Searches
We all have different intentions when we go to search. As I've mentioned in previous columns, many of us turn to a general search engine when we're mapping out unfamiliar territory online. When we define the boundaries of our concept, we often leave them vague and inclusive, because we don't want to rule anything out. So, perhaps I'm at the beginning stages of considering a trip to New Orleans. I haven't done any research yet, so I'm looking for options and alternatives. My mind is open. This particular canvas hasn't been painted on yet. So my search is likely to be broad, i.e. "New Orleans." By keeping it broad, I know the results should include everything on New Orleans.
We also use search as a navigation short cut to get to the most appropriate page on the Internet. We want to go directly from point A to B (again, the topic of a previous column) without a lot of detours getting in the way. Often, these types of searches happen well into the research phase. For example, let's say I had done a lot of research into New Orleans and in a previous session I remember seeing a page on upcoming events on the New Orleans Chamber of Commerce Web site. I don't have the URL and I didn't bookmark it. So I go to the search engine and type in "New Orleans Chamber of Commerce Events." It's a very specific search that should take me right where I want to go. I don't want to see everything on New Orleans. I just want to see this one page.
Mapping Our Thoughts to Words
The challenge comes in the search engine trying to interpret my intentions based on my key phrases. Let's go back to the first example. Although I've kept the search broad ("New Orleans"), I obviously have a concept of the type of sites I'm looking for. They could be restaurant directories, accommodation guides, lists of things to do, official visitor sites, or other rich research sources. This is my concept, unstated to the search engine but residing in my mind.
So, when the search results come up, I'm looking at them through a "semantic map" that contains many words that flesh out my concept and might catch my attention. I'm trying to match the ideas in my mind with the results I see on the page. While I searched for "New Orleans," I'm actually looking for anything that might give me valuable and trusted information on how to make my trip to New Orleans more enjoyable.
The Eyes Have It
We've just recently completed two studies that show the impact of semantic mapping in the search process. One was an eye tracking study and one was an analysis of the importance of different factors in precipitating a click-through. Based on these two studies, here's what seems to happen. The eye looks for a visual cue, generally the phrase we just searched for, in the title. Starting at the top of the page on the left-hand side, we scan down the page in an "F" pattern. While we're focused on the visual cue, our peripheral vision is open to the appearance of words that might match our semantic map. Even though we didn't search for any of these words explicitly, their appearance in the title and description has a strong implicit impact on which link we start reading. When there seems to be a match, based on a quick scan that includes both where our eyes are fixated and the extra detail picked up by our peripheral vision, we switch to more traditional reading behavior, reading first the title and then the description from left to right. This lateral activity creates the horizontal arms of the "F".
As an example, we saw that people searching for digital cameras were presented with two listings from the same site, with almost identical titles. The listings were first and second in the organic results. Both listings promised "unbiased consumer reviews" in the title, after the query string "digital cameras." We saw fixation points on both of these visual cues. The difference came in what was shown in the description. In the second listing, there were recognized brands mentioned, including Kodak and Nikon. The vast majority of searchers quickly scanned past the first listing and started active reading of the second. It was a better match for their semantic map.
So, what does this mean? Well, it means that it's not enough to be No. 1. It's not even enough to make sure you have the query string in your title. To maximize the potential for click through, you have to understand what might be in your target customer's semantic map and match this through careful crafting of both title and description text. Bidding and organic optimization can put you in the right place, but you'd better have the right message too.
Gord Hotchkiss is the president of Enquiro, a search engine marketing firm. He loves to explore the strategic side of search and is a frequent speaker at Search Engine Strategies and Ad:Tech.
http://publications.mediapost.com/index.cfm?fuseaction=Articles.showArticle&art_aid=29650
Can Search Help Customers Be Heard?
by Gord Hotchkiss, Thursday, Mar 31, 2005 1:30 PM EST
I'M ON VACATION RIGHT NOW with my family. In fact, as most of you are reading this, I'll be flying back from Orlando. While here, I saw a television ad that got me to thinking. The ad was for a real estate company, and the premise was this: Wouldn't it be nice if every company we did business with had a customer satisfaction rating posted prominently? Right up front, you could see if the business you were dealing with rated a 97 percent or a 43 percent. While the ad's message was that this particular real estate company did post their approval rating for every potential customer to see, the thoughts this stirred up in me were a little deeper and more fundamental.
We all know that the Internet is transferring power from the marketer to the consumer. In fact, the use of the label consumer is probably no longer valid. Ray Podder, a brand strategist, hates the use of the term. It conjures up images of a vast mindless herd of Pavlovian dogs eagerly consuming whatever marketers shovel our way through advertising. Ray recommends using the term "empowered customers" instead. So, in this column, I'll follow Ray's lead and use his wording.
The Internet and the proliferation of self-publishing options give us the power to build or dismantle brands instantly. Suddenly, the intended market is sharing the straight scoop on products, without corporate filters or advertising spin getting in the way. We share our real-life experiences from our perspective, not from a Madison Avenue idealized one.
But to get back to the commercial I saw, so far no one with enough market traction has taken up the task of aggregating this information into an easy-to-digest rating system. There is no "seal of approval" that comes from customers. But for the first time, the potential is there.
There have been a few players who have attempted to do this. Trip Advisor is one that shares real-world ratings of hotels and other travel-related services. And Epinions.com has also offered readers the opportunity to post reviews on a number of products. But neither service has tapped into the online market to any great extent. According to Alexa, Epinions.com is ranked around 1,000 for site popularity. It hasn't gained the critical mass needed to turn it into a hot online property. And considering that it's been around for some time, it may never get there.
This, by a long and circuitous route, leads me to the topic of this column. How about search engines? Can they provide customers with a podium to be heard from? They're already the most popular sites online, so critical mass and traffic certainly won't be a problem.
Search engines rank sites by their own criteria of what makes a good site or a substandard one. They're already in the business of aggregating information and using it to rank alternatives for the user. They are generally considered objective and non-partisan. And they've already drawn a line between their advertising and the editorial section of their page that is recognized by most users. And as they continue to become more vertical (Ask Jeeves' recent acquisition will certainly heat up this race) it seems they'll be looking for a competitive advantage to offer their users. This seems to be a compelling one.
We are at the nexus of the switch to the customer-controlled marketing model. At this point, most empowered customers are totally unaware they wield this much power. Only the adventurous few who have staked their territory online have learned how the Internet gives each of us a powerful voice that can reach millions. In a few spectacular and oft-quoted examples, online buzz has synergized to the point where new product introductions took off. Online takes word-of-mouth to a whole new dimension. Like many things in our fragile society, the relationship between marketers and customers is on the verge of a fundamental and earth-shaking shift. Advertisers, don't tell us how we're supposed to feel about your products. We'll tell you, and you'd better listen!
As a relevant aside, we're starting to hear more and more from companies fighting customer-launched attack sites that have achieved higher rankings on search engines than the official site for the brand. In this case, the balance of power has swung from the advertiser to the customer. This is unfamiliar territory for the corporate world.
But to this point, there's no online destination with enough market penetration and critical mass that is dedicated to acting as the focal point for customer opinion. In fact, most customers turn to search engines when looking for published information on a product and sift through blog and forum postings. If they're already turning to search, why not close the loop and help aggregate the information they're looking for? Why not find a way to measure online buzz, both good and bad, and present it to us in an easily understood way?
This makes even more sense when you consider that search will aggressively try to place itself at the intersection of all online customer behavior. The areas they're currently looking to control include shopping search and local search. Both represent huge potential revenue wins. If customers could also find an easily digested capsule of popular opinion to help in making their decision, I believe it would present a compelling package.
And that places Ask Jeeves in a unique situation. As a recent acquisition of IAC, they join the corporate family of Citysearch, Expedia, and Match.com. Suddenly, Ask Jeeves is in the ideal position to pursue a vertical strategy. And a vertical search destination would be a great place to start a customer rating system. In fact, Citysearch already has both reader and editorial ratings for restaurants and other tourist destinations. After gaining a foothold here, it could be expanded to all the Ask Jeeves search properties.
There's no doubt that customers will speak, and speak loudly online. But will search engines provide them the forum to be heard?
Gord Hotchkiss is the president of Enquiro, a search engine marketing firm. He loves to explore the strategic side of search and is a frequent speaker at Search Engine Strategies and Ad:Tech.
http://publications.mediapost.com/index.cfm?fuseaction=Articles.showArticle&art_aid=28697
ot dd: Credit/Link To Previous:
(ps, is it just me or are the dots starting to be connected???) 16th street jetty, here we come....
http://www.cio-today.com/news/Chips-Sales-Better-Than-Expected/story.xhtml?story_id=003000002726
Chips Sales Better Than Expected
By Jason Lopez
May 3, 2005 7:10PM
"The unexpected strength of semiconductor sales, with 13 percent growth over a very strong period a year ago, is a good sign for the industry," said George Scalise, president of the Semiconductor Industry Association.
Many analysts had expected little revenue growth in 2005 for the semiconductor industry. As it turns out, there has been more than a glimmer of expanding sales globally, according to the Semiconductor Industry Association.
The nonprofit industry group reports chip revenue increased 13.2 percent in the January through March quarter of 2005. That translates into sales of US$55.3 billion in the first quarter compared to $48.9 billion in the same period last year.
Sales even slightly beat the robust fourth quarter of 2004.
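A quick back-of-the-envelope check of the figures quoted above (dollar amounts in billions, as reported; the SIA's exact 13.2 percent presumably comes from unrounded quarterly totals):

```python
# Sanity-check the SIA's reported year-over-year growth using the
# rounded quarterly totals quoted in the article (billions of USD).
q1_2005 = 55.3
q1_2004 = 48.9

growth_pct = (q1_2005 - q1_2004) / q1_2004 * 100
print(f"Q1 2005 vs. Q1 2004 growth: {growth_pct:.1f}%")
```

The rounded inputs give about 13.1 percent, in line with the reported 13.2 percent once rounding of the quarterly totals is taken into account.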
Good Omen
SIA's data comes from contributions of more than 40 chip vendors that belong to the association. The vendors include some of the biggest names in the industry, such as Intel, Texas Instruments, Xilinx, Transmeta and TSMC.
"The unexpected strength of semiconductor sales, with 13 percent growth over a very strong period a year ago, is a good sign for the industry," said George Scalise, president of the Association.
But the industry group was not entirely caught off guard. When it called for flat 2005 revenue growth, SIA analysts knew they were being cautious.
The SIA said it is preparing to revise its 2005 forecast with more growth in mind. But in the background, analysts are changing how they address the huge chip market.
Flying Toasters
The easiest way to categorize chips is by the products that use them. There are flash chips that go in cameras, cell phones and music players, as well as microprocessors that run PCs. But the SIA said it is starting to consider semiconductors on the basis of who buys them.
"We've started looking at it in terms of whose pocket the money comes from," said the SIA's John Greenagel. He outlined three kinds of buyers: government, business and consumer.
"We found that about half the spending on products with semiconductors came from consumers," he noted.
It suggests that markets cannot be defined easily as a set of products. Cell phones, for example, are purchased by government entities, businesses and consumers. More importantly, it shows that the chip industry, as mature as it might seem from Wall Street's point of view, is far from reaching a point of absolute maturity.
There is plenty of room for improvement in chip performance and there are many traditional products that are not yet enabled with chips. Flying toasters? Maybe someday. But chips are replacing everyday devices from simple thermostat coils to audio amplifiers.
"Semiconductors are becoming more pervasive in every aspect of our lives," Greenagel said.
dd: Challenges to Nigeria's Advert Industry
http://allafrica.com/stories/200504210391.html
ot DD: Good Listen: Georges Harik Director of Googlettes, Google
(mentioned in the NY Times Article and kind of off topic, but gives a "taste" of what they are working on in this area...)
http://www.itconversations.com/shows/detail302.html
ot: Sorry, Credit For last Article and link:
http://www.nytimes.com/2005/05/04/technology/techspecial/04guernsey.html?pagewanted=print&positi...
Have a good evening...
SonOfGodzilla
DD: THE INTERNET IN HAND
The Cellphone's Potential as a Search Tool Gets Tapped
By LISA GUERNSEY
May 4, 2005
A few months ago, a group of friends in Austin, Tex., were dining out when the talk turned to the N.C.A.A. basketball tournament. Someone asked, When does the first round start? No one knew.
So Mohit Goyal, a business analyst with a software company, opened his phone and typed in a few keywords. Mr. Goyal found the answer in seconds, and the group made plans to get together for the first-round game. "I love the fact that no matter where I am, I can get this information," he said.
Mr. Goyal is an early adopter of technology, and his experience will most likely sound too good to be true to most cellular users. But he was not using an extraordinarily high-tech phone. He was simply adept at using the features on his Nokia 6820.
Search engines like Google and Yahoo are betting that most consumers will catch on to what Mr. Goyal has already figured out - that mobile phones can search the Web when a computer is not nearby.
In the last six months, the potential of mobile search has been promoted in a flurry of news releases. Yahoo announced local and image search services intended for phone screens. Google, which has been offering a mobile Internet search service for several years, now takes questions via another channel of communication, the short-message system SMS. Fast Search and Transfer, a search company that powers enterprises like Lexis-Nexis and IBM.com, created a service called FAST mSearch for use by cellphone carriers and content providers.
Still, searching the Web on a phone is frustrating for most users. Screens are tiny. Waiting for a page to appear can take 10 seconds or more. And when signals fade, the lag times can be unbearable.
"The experience is much like text browsing in the early days of the Internet," said Allen Tsai, founder of Mobiledia.com, a site for comparison shopping of cellphones.
Only 1 to 2 percent of Web-enabled phones are used for anything more than voice calls and text messaging, said Paul Budde, founder of BuddeComm, an Australian researcher of telecommunications issues.
But just as the Internet went mainstream as broadband became more widely available, the mobile Internet will become popular as bandwidth increases, search-engine companies say. With those improvements, they add, payment plans will need to provide incentives for searching.
"If you are charged 10 cents a minute, you're not going to care to find the answer to a 'Seinfeld' episode you are talking about at dinner," said Chris Winfield, president and co-founder of 10e20, a search-engine marketing firm. "But if it's free and fast, you're going to use it."
Mobile operators are already creating faster networks, motivated by the prospect of customers downloading movies and music. Hardware and software makers are tweaking their products to speed delivery.
Technology that recodes Web pages for display on a cellphone is also advancing. "Most Web pages are not written for cellphones; they are written for a computer screen," said Georges Harik, a director of mobile search at Google. "This is a chicken and egg problem that is going to go away," he added, as soon as Web developers realize "there are 1.2 billion phones out there that can connect to information."
But "when you go out to dinner with friends, do you take your laptop with you?" asks Susan Aldrich, a senior vice president with the Patricia Seybold Group, a consulting and research company. Searching with a phone may be "in the early stages," she said, "but I think it is going to explode."
Your Phone Can Do It
No matter what kind of phone you have - short of those made more than five years ago - you can search the Web. Here's how:
Just a Plain Phone? Here's a Solution: Almost all cellphones made in the last few years can exchange short text messages via SMS. By using that messaging system, you can ask search engines to deliver bits of information to your phone.
Google's SMS search service, for example, enables anyone to send a query to its search engine. Want to know the weather in Atlanta? Go to the SMS feature on your phone and use your number keypad to tap in the words "weather Atlanta." Send the message to phone number 46645, or GOOGL on most phones. Within seconds, a three-day forecast appears.
Other providers of SMS search include Synfonic (650-430-7183), 4Info Mobile Search (4-INFO) and Smarter.com (610-SMARTER), which uses a product name or part number to search for its lowest available price online. Just be aware that your carrier may charge a few cents for each message you send unless you have signed up for a monthly flat fee for messaging.
Phones bought in the last year or so: Many of these phones already come with small browsers that can show you limited sections of the Internet. Yahoo, for example, translates all pages from its local-search database, including full-color maps and user reviews, for display on phones.
Google has multiple mobile-search services, including Froogle, which helps you comparison shop. Answers.com (go to mobile.answers.com on your phone) delivers word definitions and information on famous people and places.
Or do you have a smart phone? Smart-phone users are the early adopters who pay hundreds of dollars for hand-held devices with color screens and heaps of features, like cameras and e-mail access. Think the Treo, the Web-browsable Blackberry, the Sidekick and others.
If you have one of these phones, you can search the entire Web from your phone by clicking to the Web. Many smart phones are preset with search engines. But most Web pages were not designed for viewing on a phone, so they may be hard to decipher.
I used to think the same, but the more I dd (like last night and today, lol), the more I wish I had more cash to pick up a few more. As it is, I am glad to have gotten in when I did, with what I was able to accumulate with the kahunas I had to sell off other Dogs and buy in on pullbacks the past few months... with that said, I sold off a little of one of my last remaining "dogs" today and I did manage to pick up another 1500 today on one of the dips..... I agree with Yellowjacket; with all due respect, you might want to buy GE or something similar like a P&G, etc...
DD: Google's Mission in Context
by David Berkowitz, Tuesday, May 3, 2005
"WE'RE NOT JUST A SEARCH engine," she said.
I almost wept. It's like hearing a friend who's been an acclaimed pediatrician say he's switching careers to become a stockbroker.
The woman quoted is Jane Butler, Google's head of travel, who participated in one of the three "Next Big Thing in Search" panels at TravelCom last week. Butler was discussing Google's new advertising developments. Let's first review them and then see what they mean in the context of Google's mission. One new offering is site targeting, allowing advertisers to select specific sites where their contextual ads run. Another is the expansion of Google's image ads program: enhancements include testing Flash ads and a new wide skyscraper ad format. Additionally, for contextual site targeting, Google introduced CPM (cost-per-thousand) bidding, complementing the cost-per-click model. Back when the banner ad was king, CPM was the predominant advertising model. To quote JupiterResearch Analyst Nate Elliott on his blog, "Welcome back to 1997."
To get our terminology straight, publishers sign up through AdSense; advertisers sign up through AdWords. Ads purchased through AdWords may appear on Google (or one of its licensees such as AOL); those are triggered by search. The ads may also appear on Google's network of publishers that have signed up through AdSense. Advertisers might have control over where the ads appear (as they do with the new site targeting), or the advertiser might not have control.
So what do Google's developments have to do with search? On one level, absolutely nothing.
This makes the latest developments all the more confusing. Contextual advertising is not search. With search, a user types in a query to help complete a mission. With contextual advertising, a user reads an article or other content, and ads run alongside it. If the article is about price comparisons of flights to Haiti or DVD burners, the reader is likely planning a purchase. If the article is about Haitian cooking or Sony's stock, the consumer mindset is unclear. And if the article is about Iraq, the Pope, social security, Michael Jackson, or the basketball playoffs, it's unlikely the reader is in the market for anything at all.
In this regard, it doesn't matter whether the advertiser uses a text or image ad, chooses which sites display the ad, or bids per click or per impression. None of these options connect to search. Contextual advertising works for many advertisers, publishers, and contextual ad networks, yet search ads have about as much in common with contextual ads as search ads do with billboards. That's not to belittle outdoor advertising. I just don't see Viacom getting into the search engine business.
According to the Interactive Advertising Bureau (IAB) and PricewaterhouseCoopers (PwC), online display advertising accounted for 39 percent of online advertising in 2004, compared with 40 percent for search. Together, that's nearly an $8 billion opportunity, with plenty of growth ahead. Google has a brand name, advertisers, publishers, and partners to make its contextual initiatives work.
Yet they don't fit in with Google's mission statement: "Google's mission is to organize the world's information and make it universally accessible and useful." Google, the search engine, accomplishes this. Froogle and Gmail can apply. Google News, Local, Mobile, and Scholar exemplify the mission.
AdSense, however, doesn't fit. AdSense is a way for advertisers, publishers, and Google to make money, and it often works well for all of the above. There's nothing wrong with that; I'm a big fan of capitalism. But Google needs to either adhere to its mission statement or change it. Psychologists refer to such an imbalance between beliefs and actions as cognitive dissonance. When the dissonance exists, something must give.
What about Yahoo!? Yahoo! plans to test a contextual image ad network in the coming weeks, according to eWeek. How does this fit with Yahoo!'s mission? "Our mission is to be the most essential global Internet service for consumers and businesses."
Want to play a video game, set up a fantasy sports league, chat, download a pop-up blocker, or access the Internet? All fit with Yahoo!'s mission, as do Yahoo! Search, HotJobs, Yahoo! Mail, and other services. According to Yahoo!'s mission, it can basically do whatever it wants online and stay true to its vision. Not bad, right?
Google's another story though. Its mission is more tightly defined, though there are countless creative and profitable ways to achieve it.
As I was dying to tell Butler before, Google's not just a search engine, but it's so good at being a search engine - organizing information and delivering it in a useful way for consumers. The advertising generally provides relevant options for consumers and benefits all parties.
Google, your mission has rallied consumers, advertisers, investors, the press, and your thousands of team members in shouting your praises. Stay true to your mission.
And if at any time you're unclear what your mission is, Google yourself.
David Berkowitz is director of marketing at icrossing, a search engine marketing agency. He can be reached at david.berkowitz@icrossing.com.
Search Insider for Tuesday, May 3, 2005: http://publications.mediapost.com/index.cfm?fuseaction=Articles.showArticle&art_aid=29805
Forrester Research Releases US Online Advertising And Marketing Forecast — Market To Reach $26 Billion By 2010
Survey Finds That 84 Percent Of Marketers Plan To Increase US Online Ad Budgets In 2005
Cambridge, Mass., May 3, 2005 . . . Almost half of marketers plan to decrease spending in traditional advertising channels like magazines, direct mail, and newspapers to fund an increase in online ad spending in 2005. Total US online advertising and marketing spending will reach $14.7 billion in 2005, a 23 percent increase over 2004. According to a new five-year forecast from Forrester Research, Inc. (Nasdaq: FORR), online marketing and advertising will represent 8 percent of total advertising spending in 2010 — rivaling ad spending on cable/satellite TV and radio.
"Despite significant changes in consumer behavior, there is a large disparity between the amount of time consumers are spending online and the money marketers are spending trying to reach them online," says Forrester Research Principal Analyst Charlene Li. "When at-work Internet use is taken into consideration, online consumers spend more than one-third of their time online — roughly the same amount of time they spend watching TV. Yet marketers spend only 4 percent of ad budgets online versus 25 percent on TV."
While marketers surveyed believe that online advertising channels, such as search engine marketing, online display ads, and email marketing will continue to become more effective relative to traditional channels, barriers that include a lack of online advertising standards and hands-on experience have kept marketers from fully embracing online channels.
The report includes data from an online survey of 99 leading marketers and four forecasts: US Online Advertising And Marketing Spending, US Search Marketing Spending, US Online Classifieds Advertising, and US Email Marketing Spending.
Key data points include:
Search engine marketing will grow by 33 percent in 2005, reaching $11.6 billion by 2010. Display advertising, which includes traditional banners and sponsorships, will grow at the average rate of 11 percent over the next five years to $8 billion by 2010.
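Working the stated 11 percent average growth backwards from the $8 billion 2010 display figure gives an implied 2005 base of roughly $4.7 billion (my arithmetic from the quoted rates, not a figure in the release):

```python
# Back-solve the implied 2005 display-ad base from Forrester's quoted numbers.
target_2010 = 8.0   # display advertising in 2010, $ billions (from the release)
growth = 1.11       # 11% average annual growth over five years

base_2005 = target_2010 / growth ** 5
print(round(base_2005, 2))  # ~4.75 ($ billions, implied)
```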
New advertising channels will draw interest and spending from marketers. Sixty-four percent of respondents are interested in advertising on blogs, 57 percent through RSS, and 52 percent on mobile devices, including phones and PDAs.
Marketers are quickly losing confidence in the effectiveness of traditional advertising channels and feel that online channels will become more effective over the next three years. Seventy-eight percent of survey respondents said that they think search engine marketing will be more effective, compared with 53 percent of respondents who said TV advertising would become less effective.
The only nondigital advertising channel to reach the same level of confidence as online channels with marketers is product placement — only 8 percent of respondents believe that product placement will become less effective over the next three years.
"US Online Marketing Forecast: 2005 To 2010" is available to WholeView 2™ clients and can be found at www.forrester.com.
Forrester is an independent technology research company that provides pragmatic and forward-thinking advice about technology's impact on business. Business, marketing, and IT professionals worldwide collaborate with Forrester to align their technology investments with their business goals. Forrester offers products and services in four major areas: Research, Data, Consulting, and Community. Established in 1983, Forrester is headquartered in Cambridge, Mass. For additional information, visit www.forrester.com.
Contact:
Erica Cantwell
Manager, Public Relations
Forrester Research
+1 212/672-1757
ecantwell@forrester.com
http://www.forrester.com/ER/Press/Release/0,1769,1003,00.html
Forrester: Search Engine Marketing Will Grow 33% in 2005
http://blog.searchenginewatch.com/blog/050503-103953
Online advertising on upswing
Published: May 3, 2005, 9:10 AM PDT
By Dinesh C. Sharma
Special to CNET News.com
http://news.com.com/Online+advertising+on+upswing/2100-1024_3-5693617.html
Microsoft, Groove Networks to Combine Forces to Create Anytime, Anywhere Collaboration
Microsoft's plans to acquire Groove Networks, a leading provider of collaboration software for ad-hoc workgroups, will allow the company to better meet the needs of large and small organizations for borderless project teams, as well as bring Groove founder Ray Ozzie and other top executives to Microsoft.
http://www.microsoft.com/presspass/features/2005/mar05/03-10GrooveQA.asp
BONUS DD: Ray Ozzie CEO Groove Interview 10/04
Part 1
http://www.gartner.com/research/fellows/asset_115813_1176.jsp
Part 2
http://www.gartner.com/research/fellows/asset_117359_1176.jsp
2.) Sorry, here's the link I left out to post14875 http://www.startribune.com/stories/535/5348943.html
Business Forum: Can Ray Ozzie save Microsoft?
Isaac Cheifetz
April 18, 2005
Is Ray Ozzie the Clark Kent of the software industry? Ozzie is the founder of Groove Networks, a collaborative software vendor that Microsoft acquired in March. Previously, he created Lotus Notes, the first "groupware" killer app.
Ozzie has long been respected in the computer industry as a visionary software architect. But career software architects are like writers in Hollywood; essential yet unappreciated, compensated generously but not extravagantly. Bill Gates, Steve Jobs and Larry Ellison are public figures; their senior architects are not.
Power Lunch Break Read: Can Sony Dominate with Cell?
NE Asia MAY 2005 Issue
Cover Story
SCE has begun pushing the Cell microprocessor as its next strategy. If the firm's aim can be realized, the Sony Group could become a semiconductor major.
Can Sony Computer Entertainment Inc (SCE) of Japan pull off its third major success? The first was in 1994, when the company utilized new compact disk read-only memory (CD-ROM) media to shoe-horn itself into a leading position in the home game system market, succeeding in spite of the fact that the market was almost entirely locked up by leaders Nintendo Co, Ltd of Japan and Sega Enterprises Ltd of Japan. The second success was in 2000, when the home game system market was in the doldrums with old technology, and SCE introduced the latest semiconductor technology to attain an unmovable position in the game industry even while being condemned for its "epic game" approach.
But will there be a third success? In 2005, SCE has begun pushing the Cell next-generation microprocessor as its next strategy. The Cell IC is not designed only for use in game systems, but is intended for application in everything from home servers to TVs, mobile phones and workstations. The firm also plans to aggressively push Cell on the merchant market, nurturing technology born from game systems into a platform for diverse networked equipment. If the firm's dream can be realized it will mean that the Sony Group holds a core part of the network era, which could make it into a semiconductor major. This is part of the reason that Ken Kutaragi, executive deputy president and chief operating officer (COO) of Sony Corp of Japan, always seems to mention Intel Corp of the US as a potential competitor in various developments.
Long-Term Strategy
The presentation at the International Solid-State Circuits Conference (ISSCC) 2005, where the Cell was revealed, was standing-room-only as people packed the several hundred seats for a glimpse.
Is Cell really that great? Of the chip outline presented at the conference, the audience was especially intrigued by the very high floating-point operation speed, hitting 256 GFLOPS at 4GHz. (256 GFLOPS is over 40 times higher than the Emotion Engine mounted in SCE's PlayStation 2, and over 15 times higher than Intel's Pentium 4.)
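For a sense of scale, the quoted ratios can be back-solved into approximate peak figures for the comparison chips (illustrative arithmetic from the article's "40 times" and "15 times" claims, not official specs):

```python
cell_gflops = 256.0  # prototype Cell at 4 GHz (ISSCC 2005 disclosure)

# Implied peaks of the comparison chips, back-solved from the stated ratios:
emotion_engine = cell_gflops / 40  # PlayStation 2 Emotion Engine, ~6.4 GFLOPS
pentium4 = cell_gflops / 15        # Pentium 4, ~17 GFLOPS

print(round(emotion_engine, 1), round(pentium4, 1))  # 6.4 17.1
```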
The real quality of Cell is not in the operating frequency or number-crunching prowess of the prototype chip, however, but in the internal architecture. Advances in semiconductor manufacturing technology and the sharp rise in the number of internal operators have made this structure essential to continue to meet diversifying applications from digital appliances to computers. In addition, engineers are also working on an environment that will make it possible to network multiple Cells together to act like a single computer. The goal is to leverage the chip's flexibility and expandability to make it a core component for the electronics industry, and keep it that way over the long term. "We wanted to make an architecture that would be valid for at least a decade," said James Kahle, IBM fellow, Broadband Processor Technology, Microelectronics Div, IBM Corp of the US, emphasizing the future-oriented design of the chip. The prototype chip is merely the first step in realizing this goal, a starting point.
The basic concept of Cell was firmed up in the spring of 2001, when the joint development lab was established by SCE, IBM and Toshiba Corp of Japan in Austin, Texas. SCE and Toshiba engineers flew to the US for the initial meeting with IBM on the Cell concept, meeting a host of top IBM engineers, such as people in charge of developing the POWER4 server microprocessor. The scale of the development team was gradually boosted to several hundred people, mostly engineers from IBM. The fact that IBM, the former leader in the mainframe world, contributed so heavily to the development of an IC for home game systems clearly demonstrates how the key driver in electronics technology has shifted from computers to home electronics (Figs 1 and 2).
Product Development
The disclosed specs for the prototype chip were not maxed-out data created for the conference. The development team has confirmed operation at up to 5.2GHz on the first prototype chip obtained in April 2004, but the ISSCC presentations on Cell merely stated "4GHz or higher". More than likely, the companies are expecting to use about 4GHz in actual equipment for reasons of higher IC yield, lower dissipation and simplified board design. The initial chip exhibited no problems with logical operations, and was able to boot the operating system (OS). Dissipation, however, was a major issue. Masakazu Suzuoki, VP, Microprocessor Development Dept, Semiconductor Business Div at SCE, feels that this has been resolved: "We had a difficult time reducing dissipation at the start, but finally found the solution in the second half of 2004."
Cell chips will be used in home game systems by SCE, high-definition TV (HDTV)-capable digital TVs and home servers by Sony, and HDTV-capable digital TVs by Toshiba by 2006. Hardware and software for these products is now being developed simultaneously at multiple sites in the US and Japan. Entry into the development areas is strictly controlled, so very few engineers have actually seen Cell chips in operation. In these secret labs there are development boards the size of pillows, mounting twin Cell chips with little air-cooled heat sinks small enough to sit in the palm of your hand. Development is under way on 3D graphic draw libraries for gaming, HDTV demodulation software, and more.
Leading the Era
The Cell chip is a multicore design, single-chipping the general-purpose central processing unit (CPU) core to run the OS and handle other tasks, and multiple signal processors called synergistic processing elements (SPE). The prototype chip has the IBM Power-architecture general-purpose CPU core and eight SPEs.
The circuit configuration has been simplified as much as possible so that the CPU core and the SPEs can operate together at 4GHz or higher. This is because the complex instruction scheduling that has become so common in high-performance microprocessors lately tends to boost core footprints and dissipation both.
The quantity of SPEs per Cell will vary with the performance the equipment requires and the scale of the circuits to be integrated into the chip, but will always be an even number. The CPU core is not dependent on any specific architecture and, business-related factors aside, could easily be designed to use ARM for mobile phones or MIPS for desktop equipment, for example. In fact, IBM appears to be developing a separate Cell chip using a totally different CPU core.
The Cell design approach based on the simplified CPU core and signal processors is leading the way for design trends in microprocessors as they move towards multicore design. As Justin Rattner, senior fellow, Corporate Technology Group and senior director, Microprocessor Technology Lab at Intel explained, top people in the industry share the same opinion: "In the future, it will be crucial to design microprocessors by single-chipping multiple simple CPU cores."
Flexible Interfaces
The design approach aiming for application in diverse systems is evident in the system interface linking Cell to peripheral ICs, too. The physical layer is the FlexIO high-speed parallel transfer technology developed by Rambus Inc of the US. The interface is 12 bytes wide, with seven bytes used for output and five for input. Depending on the specific peripheral ICs used, the widths can be freely adjusted in 1-byte units, supporting a maximum of two peripheral ICs (Fig 3).
The per-pin peak data rate for FlexIO is a high 6.4 Gbits/s, higher than the 2.5 Gbits/s delivered by existing PCI Express serial transfer, or even the 5 Gbits/s of second-generation PCI Express technology. As a result, the system interface offers a peak data rate of 76.8 Gbytes/s, roughly ten times that of the Pentium 4.
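The 76.8-Gbyte/s figure follows directly from the interface width and the per-pin rate (a back-of-envelope check; the one-pin-per-data-bit accounting is my assumption):

```python
pin_rate_gbits = 6.4    # FlexIO per-pin peak, Gbits/s
width_bytes = 12        # 7 bytes out + 5 bytes in

pins = width_bytes * 8                    # assuming one pin per data bit
aggregate_gbits = pins * pin_rate_gbits   # 614.4 Gbits/s aggregate
aggregate_gbytes = aggregate_gbits / 8    # convert to Gbytes/s
print(round(aggregate_gbytes, 1))  # 76.8, matching the article
```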
The adoption of FlexIO seems to have been due in part to the fact that it can be used with inexpensive clock ICs. This is crucial in keeping costs down in consumer electronics products costing hundreds of dollars. FlexIO incorporates a circuit to dynamically compensate for clock-signal jitter caused by supply-voltage fluctuation, making it possible to hit a per-pin rate of 6.4 Gbits/s even with clock ICs that have relatively high jitter.
Swallowing ASICs
Behind this major shift in design policy are the facts that it is time for another change in architecture, which generally occurs every five years as semiconductor manufacturing technology advances, and that application-specific ICs (ASIC) for individual products pose increased development load.
In the five years since the development of the Emotion Engine, semiconductor geometry has shrunk considerably. It has been possible for microprocessors on chips of given areas to boost processing performance by ten times over this period through architecture revamps. This is sufficient to even make the shift to a whole new platform worthwhile. The difference in performance between the prototype Cell and the first-generation Emotion Engine is 40x, but they are about the same size: 221mm2 for the former, and 226mm2 for the latter. This is on a par with the Pentium 4, manufactured with 180nm technology, at 217mm2.
With number-crunching performance of 256 GFLOPS, it becomes possible to implement almost all of the signal processing demanded by digital consumer electronics in software. Encoding demanded by Moving Picture Coding Experts Group Phase 2 (MPEG-2) for standard-definition TV (SDTV), for example, can be executed for several dozen streams in parallel. This means that all of the various signal processing circuits currently implemented in individual ASICs can be replaced by the Cell. For applications like mobile phones where signal processing performance does not need to be very high, the quantity of SPEs can be reduced in a special Cell, cutting chip footprint and dissipation.
Full Use of Silicon
One advantage of the Cell, which can vary the quantity of SPEs to control number-crunching capability, is that it will prove very handy in the future by providing the increasing performance digital consumer electronics needs.
Take H.264 encoding, for example. The prototype chip can handle encoding of multiple SDTV video streams in parallel, but only one HDTV stream. If HDTV imagery is being recorded to Blu-ray Disc media with H.264, for example, the system would require even higher performance in order to be able to simultaneously play a game or execute other applications. Other demands are also being raised calling for boosted performance in digital consumer electronics, such as an image recognition function to make it possible to search for a particular scene within massive imagery records.
With Cell it is possible to develop a microprocessor satisfying the requirements much faster than an ASIC, just by increasing the quantity of SPEs. A large number of signal processing operations in digital consumer electronics are executed in pixel units, making it fairly easy to execute them through parallel processing and gain maximum effect from an increase in SPE quantity.
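That scaling argument can be sketched with a toy model: split a frame's pixels evenly among the SPEs and the worst-loaded SPE sets the frame time, so doubling the SPE count roughly halves it (a simplified illustration, not Cell code; the per-SPE throughput constant is invented):

```python
def frame_time_ms(pixels, spes, pixels_per_ms_per_spe=50_000):
    """Toy model: time to process one frame when its pixels are
    divided evenly among `spes` signal processors."""
    worst_load = -(-pixels // spes)  # ceiling division: busiest SPE's share
    return worst_load / pixels_per_ms_per_spe

sd_frame = 720 * 480  # one SDTV frame
print(frame_time_ms(sd_frame, 8))   # prototype Cell: 8 SPEs -> 0.864
print(frame_time_ms(sd_frame, 16))  # twice the SPEs -> 0.432
```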
The fact that performance can be boosted without changing chip size, just by increasing the number of SPEs, also contributes to maintaining a high capacity usage ratio at the fab. If advances in semiconductor manufacturing technology are only used to shrink chips it will be necessary to produce cheap chips in volume, increasing the time needed to recover the capital investment into the facility (Fig 4).
Hardware, Software
Cell is more than just the IC: it only achieves full performance when it is used in conjunction with the software. It will not be a trivial task to apply all the power offered by the nine processors in the Cell, including the CPU core, to add value to the host equipment. Balancing the load effectively between the cores will require writing code from a solid understanding of Cell architecture, and that means sophisticated software technology. As one engineer involved in Cell development commented, "Engineers who have only been involved in developing software for general-purpose microprocessors are going to have to relearn everything from the ground up. People who have been involved in ASIC development might be better suited to writing code for Cell."
Each company is involved in its own software development project, and it appears, for example, that multiple varieties of Linux running on Cell already exist. While the firms cooperated in the development of the microprocessor, they remain rivals when it comes to Cell-driven products in the marketplace.
While software development methodology will have to be revamped for Cell chips, once the constituent technology required for digital consumer electronics development (OS, libraries and such) is available, it should become considerably simpler to actually develop the product. More and more functions can be used in multiple pieces of equipment, including H.264 and other Codec software and graphical user interfaces (GUI). Sony is already applying this development method in TVs mounting the Emotion Engine. By utilizing software libraries originally developed for the PlayStation 2, it was able to quickly develop the GUI used in the PSX, called the cross-media bar (XMB).
Outside Sales
In parallel with the adoption of Cell chips in their own products, it seems likely that the manufacturers will begin to push sales to other firms involved in consumer electronics and computers. The more products equipped with Cell chips, the easier it will be to achieve a distributed environment via networking, and that was one of the original concepts of the Cell development plan.
The Sony Group plans to provide not only Cell, but also peripheral and graphics ICs equipped with all the needed input/output (I/O) interfaces. The strategy makes one think of an Intel for the digital consumer electronics world. The firm will probably also provide homegrown OS and software. As mentioned above, the development of Cell software will not be trivial, but for the consumer electronics manufacturers, releasing product software to the competition would be the kiss of death because, along with the software, hard-won expertise would also be transferred.
In fact, Cell is provided with a framework to prevent such expertise from escaping. A function is implemented in hardware that can make it impossible for the dedicated SPE memory space to be addressed by the CPU core. This function could be used to prevent third parties from analyzing software libraries or other code in the SPEs.
In addition to sales to the merchant market, it is also possible that the Cell system interface could be disclosed. If third-party developers provide the peripheral ICs for use with Cell, it would rapidly increase the range of possible Cell variations.
To convince as many IC manufacturers as possible to make peripheral ICs for use with Cell, one possible strategy is to release the specs free of charge, as Intel did with its peripheral component interconnect (PCI) bus and accelerated graphics port (AGP) specs. However, Sony's Kutaragi suggested it is more likely that the information will be released only under a license agreement.
by Rocky Eda and Tomonori Shindo
Websites:
IBM: www.ibm.com
Intel: www.intel.com
Nintendo: www.nintendo.com
SCE: www.scei.co.jp/index_e.html
Sega Enterprises: www.sega.com
Sony: www.sony.com
Toshiba: www.toshiba.co.jp
(May 2005 Issue, Nikkei Electronics Asia)
http://neasia.nikkeibp.com/neasia/001090
Bonus DD: http://neasia.nikkeibp.com/newsarchivedetail/daily_news/001158
http://neasia.nikkeibp.com/newsarchivedetail/top/001130
http://neasia.nikkeibp.com/newsarchivedetail/top/001070
http://neasia.nikkeibp.com/mag_content/images/20050426153010/fig1.jpg
Fastclick Takes On Google With Contextual Ads
by Gavin O'Malley, Tuesday, May 3, 2005 7:01 AM EST
AS EXPECTED, ONLINE AD NETWORK Fastclick Monday began offering contextually relevant text ads--similar to Google's AdSense--to its publishing and advertising clients.
"This format provides publishers and advertisers with the flexibility they need to increase their revenue from previously under-utilized inventory," John Ellis, vice president of marketing and product management at Fastclick, said. "Text ads provide our advertisers with yet another opportunity to maximize campaign performance through Fastclick's optimization."
As Google and Yahoo! position themselves as ad networks, stand-alones like Fastclick and ValueClick are seeking revenue alternatives such as placing text ads relevant to a particular page's content.
Shar VanBoskirk, an analyst at Forrester Research, doubted whether Fastclick could challenge goliaths like Google in the space, but said the move was necessary all the same. "If Fastclick didn't expand into search, then all they'd ever be is an ad-network," VanBoskirk said. "What all of these companies are trying to do is create a company that can offer marketers the most complete solution, from keywords to banners."
Fastclick's entry into contextual search advertising and its initial public offering announced last month are directly related, analysts speculated. "Investors are now going to look to Fastclick to make sure they have a full quiver of arrows," said Gary Stein, a Jupiter Research analyst. "It's important that any company can round out all of their capabilities."
Stein, however, was not enthusiastic about Fastclick's pronouncement that its text ads are "customizable," and can be fit easily into publishers' Web sites. "That might have been a great selling point a year ago, but publishers already have a pretty fair degree of control today," Stein said. "Today, success is measured by relevancy algorithms and how many publishers a network has, and they have the latter, but relevancy can only be seen over time."
Ellis tied Fastclick's present and future success to optimization. "We have a strong tech platform, which allows us to optimize our clients' CPA goals," said Ellis.
In the future, Google will have to consider its competition "monolithically," said Stein, figuring that a growing number of ad networks could potentially cut into Google's bottom line.
Forrester's VanBoskirk, on the other hand, predicted that Google will have a clear competitive advantage for some time. "While Fastclick's search marketing is limited to demographics, Google has the ability to target consumers behaviorally and contextually, which gives them a clear edge."
http://publications.mediapost.com/index.cfm?fuseaction=Articles.showArticleHomePage&art_aid=2977...
ot:Time For Sleep...Snap Out Of It Already, lol
By Matt Hines
URL: http://news.zdnet.com/2100-9588_22-5692423.html
Google is seeking to patent a technology meant to help its Google News section sort stories based on their overall quality, which could augment the current methods of ranking results by date and relevance to search terms.
In separate filings with the U.S. and world patent offices, Google detailed a new formula it has developed to help rank news stories in Web search results. The system would allow the company to sort news by source, rather than based merely on a story's direct relation to a certain search term or the time at which articles were published.
The company's filing with the U.S. Patent and Trademark Office, submitted in September 2003, describes the news-rating technology as a tool that "ranks the list of (search results) based at least in part on a quality of (their) identified sources." The technology, based on work by researchers Michael Curtiss, Krishna Bharat and Michael Schmitt, would let Google prerank content from specific news outlets to ensure that those stories appear above other search results.
Company representatives did not immediately respond to requests seeking comment.
At present, Google generates results based on the search engine's perceived relevance of content to a particular term and the time at which any particular piece of data or story is first published online. In the patent filings, Google concedes that while its existing system often generates thousands of results in response to individual search terms, the stories it unearths have no degree of worth assigned to them and may not come from reputable publishers.
"While each of the hits in (a list of search results) may relate to (a) desired topic, the news sources associated with these hits, however, may not be of uniform quality," Google said in the filing. "Therefore, there exists a need for systems and methods for improving the ranking of news articles based on the quality of the news source with which the articles are associated."
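The blending the filing describes, ranking results "at least in part" on source quality rather than on relevance alone, can be sketched roughly as a weighted score. This is purely an illustrative guess, not Google's actual formula: the quality scores, the weight, and the linear blend are all invented for the example.

```python
# Hypothetical sketch of source-quality-weighted news ranking, in the spirit
# of the patent filing. The quality scores and the 50/50 blend are invented.

def rank_articles(articles, source_quality, quality_weight=0.5):
    """Order articles by a blend of search relevance and source quality.

    articles: list of dicts with 'title', 'source', and 'relevance' (0..1)
    source_quality: dict mapping source name -> quality score (0..1);
                    unknown sources default to 0.0
    """
    def score(article):
        quality = source_quality.get(article["source"], 0.0)
        return (1 - quality_weight) * article["relevance"] + quality_weight * quality

    return sorted(articles, key=score, reverse=True)

articles = [
    {"title": "A", "source": "unknown-blog", "relevance": 0.9},
    {"title": "B", "source": "cnn", "relevance": 0.7},
]
quality = {"cnn": 0.9, "bbc": 0.95}

ranked = rank_articles(articles, quality)
```

Under this toy weighting, the slightly less relevant story from the higher-quality source outranks the more relevant one from an unknown source, which is exactly the behavior the filing argues for.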
The company goes on to describe how content published by news outlets such as CNN and BBC, or companies that are "widely regarded as high quality sources of accuracy of reporting, professionalism in writing," may be of greater interest to its customers, and therefore should top news search results.
Google News has raised some hackles lately. In March, French news agency Agence France Presse sued Google, charging it with using the agency's articles and photos without authorization. The suit has forced Google to begin pulling thousands of photos and news stories. Critics have also attacked the search giant over its decision to include reports from National Vanguard, a publication that espouses white supremacy. In response, Google said it will remove the publication from its index.
http://news.zdnet.com/2102-9588_22-5692423.html?tag=printthis
"The best way to have a good idea is to have a lot of ideas." — Linus Pauling.
DD: 16st Coffee House: Open Source Paradigm Shift
http://tim.oreilly.com/pub/a/oreilly/tim/articles/paradigmshift_0504.html
http://www.miami.com/mld/miamiherald/business/national/11545475.htm
ot:As Mentioned, I have The operations,sales and marketing/ management experience to help anyone run an island;)
http://www.privateislandsonline.com/new.htm
(Resume available upon request)
Cheers!
SonOfGodzilla
The telephone wire, as we know it, has become too slow and too small to handle Internet traffic. It took 75 years for telephones to be used by 50 million customers, but it took only four years for the Internet to reach that many users.
- Lori Valigra
640K ought to be enough for anybody.
- Microsoft Chairman Bill Gates, 1981
That Last DD Was For Neom Mgt, eot
way ot: Talk about a growth industry. In June 2001, there were only 30,000 text messages sent in the United States. There were 14 billion text messages sent domestically in 2003, 25 billion in 2004 and this year the total number of text messages sent within the United States will be about 42 billion.
These figures are so seductive, serious investors can't resist flirting with the companies that are directly and indirectly involved in this industry. By 2010, text messaging is projected to be a $200 billion market.
http://www.abqtrib.com/albq/bu_columnists/article/0,2565,ALBQ_19837_3745525,00.html
ot: Evolving Along With The Transistor
http://reform.house.gov/UploadedFiles/Squires%20Testimony1.pdf
Spotlight on Spotlight
by Glenn Fleishman <glenn@tidbits.com>
http://db.tidbits.com/getbits.acgi?tbart=08087
Much will be written about Spotlight, one of Tiger's marquee features that takes system-wide search from a time-consuming annoyance to an efficient part of everyone's workflow. In fact, Spotlight works so well that the idea of filing email, files, and other data will eventually disappear - but not quite yet.
You'll read a lot about the general features of Spotlight: you can find any text in any file quickly, or use it to pinpoint menu items in System Preferences. I'd like to tell you quickly about how Spotlight works and then delve into areas you probably won't hear as much about elsewhere. I'll conclude with musings on how Spotlight might free us from the tedium of forcing organization on top of what we create.
Spotlight in a Nutshell -- Spotlight's approach is simple: everything is indexed quickly and efficiently in an ongoing manner. Install Tiger and reboot, and the first thing the operating system does is index your hard disk. In multiple test installations, I didn't even notice the indexing taking place, although some users report 50 percent of their processing power devoted to the task. You can't use Spotlight until this initial index is done, but clicking the blue Spotlight icon in the upper right of the system menu bar will reveal how long Tiger thinks it will take to be finished. A pulsating dot in the center of the magnifying glass icon lets you know indexing is taking place.
When it's done, Tiger automatically modifies the index for every changed document and adds every new document to it. This happens quietly as well. Let me restate this in case it didn't sink in: Spotlight doesn't run a full re-index of your hard drive every night requiring you to leave your computer on or causing loud drive access noises in the wee hours. All other overlay indexing programs and previous Apple attempts required that kind of churn.
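The difference between nightly re-indexing and Spotlight's approach is easy to see in miniature. The toy inverted index below (my own sketch, nothing to do with Apple's actual on-disk format) updates only the postings touched by a changed file, rather than rebuilding everything:

```python
# Toy sketch of incremental indexing: only a changed file's own postings are
# touched, instead of re-indexing the whole disk. Not Apple's implementation.

class TinyIndex:
    def __init__(self):
        self.postings = {}    # word -> set of file paths containing it
        self.file_words = {}  # file path -> set of words it last contributed

    def update(self, path, text):
        """Re-index a single file after it is created or modified."""
        new_words = set(text.lower().split())
        # Remove postings for words the file no longer contains.
        for word in self.file_words.get(path, set()) - new_words:
            self.postings[word].discard(path)
        # Add postings for the file's current words.
        for word in new_words:
            self.postings.setdefault(word, set()).add(path)
        self.file_words[path] = new_words

    def search(self, word):
        return self.postings.get(word.lower(), set())

idx = TinyIndex()
idx.update("/notes/a.txt", "spotlight indexes metadata")
idx.update("/notes/a.txt", "spotlight indexes content")  # the file changed
```

After the second `update`, a search for "metadata" finds nothing and a search for "content" finds the file, and no other file's entries were ever visited.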
I haven't stress-tested Tiger yet by, say, using Automator to create 1,000 one-megabyte-sized files of random text, but that would be a good way to see Spotlight's ongoing indexing in action.
By integrating index updates into the operating system at the filesystem level, Tiger avoids patching the system at a low level (always dangerous), the above-mentioned overnight reindexing, and subset indexing that omits potentially useful data.
Apple also seems to have pulled off a neat trick: using some kind of optimized index to produce some results right away, Spotlight searches start running as soon as you start typing. By the time you finish typing, either through predictive word finding or sheer good programming, the search is almost done.
I've found Spotlight incredibly zippy on a 1 GHz 15-inch aluminum PowerBook G4 and a dual 1.25 GHz Power Mac G4. I'll be curious to hear about how it feels on the lowest-end machines that Apple supports.
Spotlight is available at any time from the upper right by clicking its icon, or pressing Command-Space. It also appears in every Finder window by default, and, most critically, within any Open and Save dialog box. No more navigating to find files to open! No more navigating to find the right folder to save! I will still love and cherish Default Folder, but it will be much less important to my future workflow.
<http://www.stclairsoft.com/DefaultFolder/>
Apple has made Spotlight available from the command line, too. The mdls command lets you see the metadata associated with any file. The mdfind command is essentially a Spotlight search.
<http://developer.apple.com/macosx/tiger/spotlight.html>
Narrowing Spotlight Searches -- Spotlight rewards those who need more sophisticated searches by allowing you to add phrases that constrain date and time, file names, and other metadata. Metadata is data that describes data, like the last modified time, the F-stop of a camera, a QuickTime movie's format or length, or the photographer's name embedded into a TIFF image's header.
Most searches will start with keywords, but you will quickly want to drill into subsets if you have many results. Apple has built a nomenclature for searching that they haven't yet exposed well - the special words that you can use to restrict searches. Unfortunately, these words aren't currently documented anywhere on their site or within Spotlight Help in the release of Tiger.
You can experiment with restrictive phrases. Apple's page on Spotlight suggests that you might add "Date:yesterday" after keywords to find just files created in the last day. If you wanted to find all images created yesterday you could enter "Date:yesterday Kind:image". I expect this nomenclature will be fully documented over time. These restrictive words will be especially useful in Open and Save dialog boxes, where Spotlight could produce daunting results.
<http://www.apple.com/macosx/features/spotlight/>
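Since Apple hasn't documented the nomenclature, one can only guess at how these tokens are handled; a plausible reading is that anything of the form `Key:value` is peeled off as a restriction and the rest is treated as keywords. The parser below is purely my own hypothetical illustration of that split, not Apple's actual syntax or code:

```python
# Hypothetical parser for Spotlight-style restriction tokens such as
# "Date:yesterday" or "Kind:image". The real, undocumented syntax may differ.

def parse_query(query):
    """Split a query into free-text keywords and key:value restrictions."""
    keywords, restrictions = [], {}
    for token in query.split():
        if ":" in token:
            key, _, value = token.partition(":")
            restrictions[key.lower()] = value.lower()
        else:
            keywords.append(token)
    return keywords, restrictions

keywords, restrictions = parse_query("vacation Date:yesterday Kind:image")
```

On the example query this yields the keyword "vacation" plus date and kind restrictions, which matches the behavior Apple's Spotlight page describes for "Date:yesterday Kind:image".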
The capability to make use of some of the increasingly rich metadata produced by digital media devices is a boon. Imagine finding all pictures you've taken on a particular Canon camera model at a specific resolution. Right now you need to use a cataloging program such as iView Media Pro and keep that catalog constantly up to date.
There's another way to use these restrictive add-ons without knowing Apple's secret narrowing words - via Smart Folders.
Folders as Search Results -- A couple versions of Entourage ago, Microsoft added pseudo-mailboxes that were actually search parameters presented as a mailbox. Unfortunately, for those of us with zillions of messages, a search took an unbelievably long time with the search engine Microsoft used at the time.
Spotlight has taken that concept and extended it to the Desktop in the form of Smart Folders, which are essentially the live results of a set of search parameters you define. Spotlight's performance is good enough that you don't notice the fact that a Smart Folder is populated dynamically.
Along the way, Apple removed Panther's advanced searching from the Finder; selecting Find from the Edit menu effectively creates a new Smart Folder (using the same dialog as the New Smart Folder command) that isn't yet saved. To create a search that narrows down beyond keywords, you either learn the incompletely documented nomenclature described above, or use Smart Folders.
When creating a Smart Folder, the default parameters are Kind: Any, and Last Opened: Any Date. The buttons above the search parameters list Servers, Computer, Home, and Others. If you leave it set to Home, the search is restricted to the current user's Home directory. I prefer setting it to Computer to take full advantage of Spotlight's capabilities, and because I keep documents and other files stored throughout my hard drive, not just in my Home directory as Apple would prefer. (Click Others to add or remove specific folders or hard drives.)
You can create a Smart Folder, too, in any Finder window by typing a search in the Spotlight field. That Smart Folder doesn't show the default scope of Kind and Last Opened, but you can click the plus sign at the upper right next to the Save button to add bounds.
Smart Folders let you mix the contents of the Spotlight field, in which you might enter keywords, with restricting conditions. Click the plus sign next to any condition to add more. Select the pop-up menu that's the condition's name and you can select one of several favorite conditions, or select Other.
In Other, you will see the full range of predefined metadata that's supported in Spotlight. For instance, select URL and you can choose to find any document that contains that URL. Check the Add to Favorites box and that attribute now shows up in the condition pop-up menu.
I don't want to turn this into 10,000 words on Smart Folders, but there's more: you can show the top 5 or all results for a given document category; sort by date or kind; click the "i" button next to a file to see a summary of its information; view PDFs by a thumbnail of their first page; show images; and so forth.
Rethinking Filing -- Filing is a tedious activity that computers were supposed to save us from, right? That's why I was so excited to see Creo's Six Degrees program a few years ago. Six Degrees integrated with certain email programs under Mac and Windows so that recipients, subject lines (discussion threads), and attachments were the three points of a triangle. You could rotate your email-world around to view it through the window of who you corresponded with, what you talked about, and what files were involved. (The product was sold to Ralston Technology Group and is now marketed as Clarity.)
<http://www.ralstontech.com/>
Spotlight expands that notion far, far beyond those modest but significant goals. Six Degrees was trying to free people from ever having to decide in which mailbox an email message should be stored, and in which folder a file belonged.
I don't think Spotlight yet allows us to break down all barriers and use one giant email folder to store all messages, and one giant Finder folder to store every file we create or receive. But, it is moving us closer to what I think people actually want from their computers: not to spend a good percentage of time categorizing.
Perhaps it will take some time yet, but I perceive the future of information to be much more amorphous. Instead of discrete information chunks, every graphic, letter, report, presentation, movie, or other project piece is just a blob in the middle of some kind of data medium that we navigate through in many different ways: by date, by content, by visual presentation, by keywords, by attributes.
That is, the interface to our data is no longer the worn-out metaphor of files and folders, but a rich interactive approach that mediates between an underlying structure we don't need to understand and our desire to find things by the way we remember them. Say goodbye to descriptive file names, for instance.
I didn't come up with this way of viewing the future of desktop information, nor did Apple. David Gelernter, a Yale University computer science professor, has been talking about this since at least 1991. Although a company he founded to implement these ideas seems to have disappeared, his ideas are well represented in a 2003 interview: read the section on Information Beams.
<http://java.sun.com/developer/technicalArticles/Interviews/gelernter_qa.html>
In that interview, he said, "When I acquire a new piece of 'real-life' (versus electronic) information - a new memory of (let's say) talking to Melissa on a sunny afternoon outside the Red Parrot - I don't have to give this memory a name, or stuff it in a directory. I can use anything in the memory as a retrieval key."
Spotlight is probably the first mainstream operating system or program to take a big step towards Gelernter's humanist view that maps how we think to what we have stored.
Unless otherwise noted, this article is copyright 2005 Glenn Fleishman, published in TidBITS 778, copyright 2005 TidBITS Electronic Publishing, all rights reserved.
"Success consists of a series of little daily victories"
-Laddie F. Hutar
ot: Some Reading To Take The "Edge" Off the Day:
Sounds like the future
Record labels embrace DualDisc, a combination of CD and DVD
By Bob Gendron
Special to the Tribune
May 1, 2005
http://www.chicagotribune.com/technology/reviews/chi-0505010464may01,1,332374,print.story?coll=chi-t...
There you are at the record store, faced with paying $15 and change for a full-length CD that has those two songs that you can't get out of your head -- the same two tracks you can get as a $2 computer download. The once-impervious music industry is coming to terms with the same reality, as well as the fact that it no longer controls its own destiny and needs to provide consumers with more. That industry realization is paying off in efforts that offer better choices, convenience, accessibility and value.
Until a few years ago, labels automatically took their shiny bread-and-butter format for granted and all the way to the bank. Customers had few other options and rarely balked when list prices were raised. But the unlimited possibilities of music downloads have transformed the playing field, turning the glamorous CD into a badly aging star that needs a major face-lift. For a music business desperate to steer people into stores and away from illegal download sites, that cosmetic surgery couldn't come soon enough.
Infamous for infighting and contentious disagreements, all four major record labels are setting their differences aside and supporting DualDisc, a new format that they believe offers the best of the audio and visual worlds. A standard 5-inch disc that has two operational sides -- one is a CD, the other a DVD that also contains the entire album plus video and potential treats such as surround sound, lyrics, photos, links and enhanced audio -- DualDiscs are designed to work with CD and DVD players.
For labels, linking records with DVD is a no-brainer. According to the Consumer Electronics Association, there are 127 million DVD players in 70 million households, and by the end of 2005, it is estimated that more than 80 percent of U.S. homes will own at least one. And since revenue generated by packaged media outstrips that of the various digital options, DualDiscs are a logical priority for an industry seeking to boost sales of physical software.
Bruce Springsteen fans will likely be among the first to encounter the technology. The singer-songwriter's new studio album, "Devils & Dust" (Columbia), is on DualDisc, with a DVD side that features footage of Springsteen talking about and performing five songs, in 5.1 surround sound. These flip-side discs are the latest entry in the race to sell music in the 21st Century, a challenge that has record labels experimenting with new products, copy protection, DVDs, online media, cell phone ringtones and unlikely collaborations.
Those thinking that they've only recently heard about the latest and greatest new format aren't delusional. Over the last few years, music lovers have been inundated with an alphabet soup of acronym-laden audio media. Shortly before sales of CDs peaked in 2000, Sony trotted out its Super Audio Compact Disc (SACD), boasting superior sonics and surround-sound capability that claimed to provide listeners an entirely new and vastly better experience.
On the surface, Sony wasn't bluffing about SACD's advantages. But poor business decisions and the presence of a competing format prevented SACD from establishing a foothold. Sony's original decision to make its SACDs single-layer rather than hybrid compatible (the latter indicating that the discs play as CDs on any player, but require an SACD-capable unit to transmit high-resolution and multichannel audio) meant that listeners had to purchase expensive SACD equipment. By the time Sony began issuing hybrids and lowering prices, the format was 4 years old and had barely made a dent in the marketplace.
DVD-A and its reincarnation
As Sony touted SACD, Warner Bros. threw its weight behind DVD-Audio (DVD-A), which boasts similar sonic benefits but plays back on any DVD player and can hold videos, images and text. Many early DVD-A's lacked impressive visuals and had faulty onscreen menus. Moreover, the discs were packaged in bulky jewel cases that didn't fit in conventional retailer racks or home-shelving units. As with SACD, a paucity of first-rate software and compatibility dilemmas relegated the format to the background. But DualDisc, launched by Warner and fellow DVD-A proponent 5.1 Entertainment, implements many of DVD-A's fundamentals in a more mainstream-friendly package.
Before Simple Plan's "Still Not Getting Any" (Lava) became the first DualDisc release last October, labels had been including bonus DVDs with anticipated CD releases as a way of giving fans something unobtainable via the Internet. For the industry, the trick was to produce a disc that could store both CD and DVD, and make it thin enough to play in all machines.
Thickness hasn't been an issue, but DualDiscs don't conform to the industry's compact-disc standard and can't be read by all players. Pioneer, Toshiba and Onkyo were among manufacturers to have initially issued warnings against playing DualDiscs, noting that doing so may damage the machine. But the problem isn't widespread, and has been limited to select older players and multidisc changers.
That hitch hasn't stopped labels and merchants from embracing DualDisc and developing materials to promote and explain the software. Most DualDiscs retail for the price of a new CD ($18.99), though many are on sale for less. Currently, manufacturing costs are higher and production capacity is limited, but if music lovers respond at the cash register, expect labels to ramp up the volume.
Miyk Camacho, operations manager at Tower Records on Clark Street, cites label cooperation, consumer education and universal compatibility as reasons he believes DualDisc may finally be a format that sticks. "There's really no reason not to buy the DualDisc over a CD. The price is right, the content is good and the education is easy. When DualDisc initially came out, there was a bit of confusion. But it's really easy to explain that there's a CD on one side and a DVD on the other. Part of the goal is to educate people with pamphlets and displays. As long as customers are aware of what it is, there are no compatibility issues. Whereas we made separate stock areas for SACD and DVD-A, the majors don't want any division with DualDisc. In our stores, we file them right in with the CDs."
Though fewer than 100 titles are presently available, momentum is building. In the last two months, Sony BMG issued more than 25 catalog albums -- including Miles Davis' "Kind of Blue," Lamb of God's "Ashes of the Wake" and AC/DC's "Back In Black" -- as well as new releases such as Omarion's "O" and Jennifer Lopez's "Rebirth," which have accounted for a third of each record's overall sales.
Thomas Heffe, President of Global Digital Business at Sony BMG, says the conglomerate has sensed huge demand and received very positive responses, despite having released its first titles only in February.
"Since the launch, we have already sold over 600,000 copies," Heffe says. "DualDisc is a great way to . . . give them more value. It's meant as a parallel product to CD, and designed to give people more choice. In the future, there will be some day-and-date releases that only come out on DualDisc (like the Springsteen), and on some occasions, maybe a CD will come out later. But one format needn't come at the expense of the other."
Other parties at the dance
Though it has been the most aggressive, Sony BMG isn't the only player. Atlantic just issued "Something To Be," the first solo album from Matchbox Twenty vocalist Rob Thomas. Like the new Springsteen record, it's only available on DualDisc and in the first week of release sold more than 250,000 copies to debut at the top of the Billboard charts. EMI is presently deciding what titles to offer, and 5.1 Entertainment is reissuing previous DVD-A's as DualDiscs that have high-resolution DVD-Audio.
Recognizing consumers' love affair with multimedia, Universal Music Group is also gearing up its efforts. Paul Bishow, the company's vice president, marketing -- new formats, thinks that such expansion is necessary. "For the last 20 years, the industry basically had one product to sell. We're now in a world in which it's imperative that labels provide music listeners a wide variety of products."
Universal has issued DualDiscs from Snow Patrol and Diana Krall, but its biggest release to date comes this Tuesday, when Nine Inch Nails' new "With Teeth" (Interscope) debuts. Trent Reznor, lead singer of the industrial-rock band, had a hand in authoring the group's 1994 album "The Downward Spiral" for DualDisc. Because he mixed the record's stereo and surround programs concurrently in the studio, Reznor was even more involved with "With Teeth."
As they become aware of the format, Bishow thinks more artists such as Reznor will jump at the opportunity to expand their creativity, and that consumers will respond in kind. "Up to this point, DVD has primarily been viewed as video. DualDisc shows everyone that DVD is a music and video product. Many discs will have DVD-Audio, and given the proliferation of DVD-A players and inroads from car audio, we think that DVD-A may still have a place in the market."
Whether or not the inclusion of audiophile-minded advantages lures the public remains to be seen. For most consumers, better-sounding CD is the answer to a question that nobody asked. But the home-theater approach seems to be working, and because surround sound, high-resolution audio and video content cannot be easily copied over the Web, DualDiscs offer a type of copy protection the industry desires, to thwart a thriving pirating underworld that constantly invents new ways to defeat protection schemes.
No matter what DualDisc's fate, consumers won't be left holding the bag -- the software will still play on standard equipment. Nonetheless, any early adopter of new technology can tell you there's a chance that two years from now DualDiscs will have gone the way of the MiniDisc or exist on the fringes like SACD. But given the format's universal compatibility and competitive pricing, such a scenario is doubtful.
Yet it's clear that labels still have a long way to go before they sell the DualDisc concept to the masses, especially Web-savvy individuals such as Vita Martinelli, who buys about 15 records a year. Martinelli's view on DualDisc, after auditioning one, is that the medium is redundant.
A regular guy
"It's not something that applies to me," Martinelli says. "Other CDs have similar multimedia material and you don't have to have a DVD drive to see it. And you can get music videos on iTunes or go to the musician's Web site. But I can see how DualDisc would be useful to someone who doesn't use the Internet that much."
Given the choice between a DualDisc and a normal CD version of an album, which would Martinelli pick?
"I'd buy the regular one. I wouldn't care about the other stuff."
That kind of stance is one reason labels are taking no chances, branching out into several other fields, including the booming download market. While downloads are often viewed as the biggest threat to packaged media, reports often fail to note that they account for just 2 percent of the industry's total revenue. More than 140 million songs were purchased online in 2004. By comparison, nearly 770 million full-length albums were shipped to retailers. Doomsayers who have been predicting the CD's imminent demise since as early as 2002 fail to take into account that many Americans don't own a computer, let alone an MP3 device.
Yet it's impossible to deny the ubiquitous presence, simple interface, fun appeal and addictive options of Apple's iTunes. Other online sources such as Napster, Rhapsody, Real Networks, Microsoft Music Store and even Wal-Mart are also vying for Web surfers to visit their sites, click the mouse and assemble a digital library.
Also on the horizon is the impending arrival of HD DVD and Blu-ray, two video-based discs in line to replace DVD and scheduled to launch by late 2005. Both can include high-resolution audio and could be developed as substitutes for CD, though a looming format war could send them to cut-out bins, à la Laserdisc.
3G broadband phones, constantly connected to networks that allow users to access music whenever they want, are also coming. According to Heffe, Sony BMG's mobile music market is already as big as its download market. As the ringtone fad continues to sweep the country, the next wave of cell phones will be music-enabled and could theoretically double as portable music devices. In anticipation of what it's forecasting as the next trend, Sony BMG has aligned itself with mobile network operators with the hope of selling music to anyone, anytime and anywhere in the world. Still, the probability of phones replacing stereos and portable MP3 players is very slim.
Those who refused to part with their vinyl records may see the music industry's marketing chess game as highly ironic, another way in which the past becomes the future. Left for dead in the early '90s, new LPs are still pressed by hundreds of labels and sold everywhere from neighborhood shops to local concerts.
To many ears, vinyl still sounds more lifelike than any digital format that has ever been invented.
- - -
Music DVDs making dent in marketplace
Once afterthoughts filled with videos already seen on MTV, music DVDs have since diversified and helped shore up labels' bottom lines. Riding the explosive home-theater trend, music-video sales grew by 26.3 percent in 2004 and accounted for $2.7 billion in revenue.
According to Tower Records manager Miyk Camacho, music DVD sales at the Clark Street Tower Records doubled from 2003 to '04. In February, Best Buy announced it was reducing CD inventory and giving more space to DVDs, including the growing list of music titles.
Such releases are no longer restricted to household names, though multidisc sets from perennial favorites such as Elton John, the Rolling Stones and Led Zeppelin are among the all-time top sellers. High-profile titles have come from indie stalwarts White Stripes, Drive-By Truckers and Morrissey. Concert films, ranging from Neil Young's "Rust Never Sleeps" to David Bowie's "Ziggy Stardust and the Spiders From Mars," along with movies and documentaries such as the Ramones' "End of the Century," are drawing tremendous interest. Publicity pushes often consist of screening events at theaters. Mature, cutting-edge artists such as Sonic Youth and Weezer have tapped into nostalgia by offering their entire music-video anthologies on single discs loaded with hours of extras.
And labels have only begun to open the archives. Two examples riding high on Billboard charts that champion deceased legends are AC/DC's "Family Jewels" (Epic), which features rare footage of the band's original lead singer, Bon Scott, and Johnny Cash's "Live At Montreux 1994" (Eagle Vision). As large concerts are increasingly filmed on location, DVD releases that commemorate major tours will become commonplace. Those headed to U2's Vertigo Tour or Paul McCartney's upcoming trek will likely get to relive the experiences in the comfort of their homes in the not-too-distant future.
-- Bob Gendron
Copyright © 2005, Chicago Tribune