eDigital to sell voice-controlled iPod look-alike
By Ana Letícia Sigvartsen
InfoSatellite.com
July 24, 2002
One more iPod look-alike is heading toward the MP3 player market. Following the MP3 player fever stoked by Apple's iPod, other companies have released very similar products. One example is Toshiba's new Gigabeat MEG50JS, now selling in Japan.
More recently, e.Digital announced another device very similar to the iPod. Called the Odyssey 1000, the player can store 4,800 songs, or roughly 400 CDs' worth, and transfer them quickly. Although e.Digital brags that the device is PC and Mac compatible, this is a feature that no longer contrasts with Apple's strategy. The iPod maker had been letting other companies take the profit by releasing software that made the iPod compatible with Windows. At the most recent Macworld New York, however, Steve Jobs' company changed its mind and announced official Windows support for the player, as well as a new 20 GB model.
The Odyssey 1000 is powered by e.Digital's MicroOS 2.0 and is the result of the company's first joint project with strategic development partner Digitalway. The player will be available to consumers this fall, boasting a minimum of 13 hours of playback time. The Odyssey has a 20 GB hard drive for MP3 and Windows Media (WMA) playback and doubles as a data storage unit for movies, spreadsheets and e-books. It also offers a USB 2.0 connection and compatibility with Apple's iTunes on the Mac.
One special feature not present in many of the Odyssey's competitors is voice navigation. The device uses the VoiceNav user interface, based on Lucent's speech recognition technology, which lets users navigate their music libraries using their own voice. It also comes with a built-in microphone for voice recording.
In addition to playing MP3 and WMA files, the Odyssey 1000 includes an FM tuner with 12 station presets and 16 MB of DRAM buffering for anti-skip protection.
http://www.infosatellite.com/news/2002/07/a240702odyssey.html
culater
Update: Forgent Claims Rights To JPEG Patent
July 18, 2002
By: Mark Hachman
Viewing a digital photo album, scanning a picture, even browsing the Web—all of these could become a little more expensive now that a small video firm has claimed it owns the fundamental rights to the implementation of the JPEG standard.
Forgent Networks, an Austin, Tex.-based video networking firm, has laid claim to one of the fundamental technologies underlying the World Wide Web. Even more significantly, the firm has already convinced two firms, including Sony Corp., to pay millions of dollars in royalties.
Forgent representatives declined to make executives available for interviews, although a company spokeswoman answered some of ExtremeTech's questions. According to the spokeswoman, Forgent began reassessing its patent portfolio after a management restructuring about eighteen months ago.
"We own the rights to this patent, we feel it's a legitimate business procedure, and companies do this all the time," the Forgent spokeswoman said.
What the patents cover
The patent in question is No. 4,698,672, titled "Coding system for reducing redundancy", approved in 1987 and assigned to Compression Labs Inc., a Forgent subsidiary. The patent specifically references patent No. 4,302,775, "Digital video compression system and methods utilizing scene adaptive coding with rate buffer feedback", also assigned to Compression Labs but in 1981. Neither patent specifically contains the acronym "JPEG", which stands for "Joint Photographic Experts Group", the organization that standardized JPEG through the International Organization for Standardization in 1990.
"Forgent has the sole and exclusive right to use and license all the claims under the '672 patent that implement JPEG in all 'fields of use' except in the satellite broadcast business," according to a statement on the company's Web site. The spokeswoman could not answer why satellite broadcasting was excluded from the fields of use.
"Forgent's 'fields of use' for licensing opportunities include digital cameras, digital still image devices, personal digital assistants (PDA's), cellular telephones that download images, browsers, digital camcorders with a still image function, scanners and other devices used to compress, store, manipulate, print or transmit digital images," the statement continues.
Interestingly, both patents Forgent cites refer to data compression schemes used in video and other applications, although Forgent is claiming they refer to static JPEG images, as well. Forgent owns approximately 40 patents, some through acquisitions, such as the 1997 merger with Compression Labs. Approximately 35 more patent applications are pending approval, the Forgent spokeswoman said.
Although claims on patents and technologies are issued frequently, enforcing them is often far more difficult. However, Forgent convinced Sony Corp. and another company to license the patent, as evidenced by the company's June 17 quarterly filing with the Securities and Exchange Commission. In it, Forgent recorded $22.3 million in revenue--$15 million of which derived from licensing the patent to a third-party company, which the SEC filing does not name. The spokeswoman said Forgent was not asking for licenses, but one-time fees.
"In May 2002, Forgent signed a multi-million dollar patent license agreement with Sony Corporation, a leading manufacturer of audio, video, communications and information technology products for the consumer and professional markets," the filing adds. The Forgent spokeswoman characterized the unnamed company as an "unnamed multinational global consumer electronics company," which she later added was also based in Japan.
"We wanted to ensure the investment community and the general public are clear about the terms of our valuable JPEG data compression technology, one of the many technologies we have in our patent portfolio," stated Richard Snyder, chairman and chief executive officer at Forgent, in a statement. "We are in ongoing discussions with other manufacturers of digital still cameras, printers, scanners and other products that use JPEG technology for licensing opportunities." The Forgent spokeswoman declined to comment on which firms Forgent was negotiating with.
In fact, the so-called JPEG standard is not the standard in common use on the Internet. Instead, most JPEG files are apparently built around the JFIF standard, which was developed by the Independent JPEG Group and was placed into the public domain by C-Cube Microsystems, according to the JPEG home page, before C-Cube was acquired by LSI Logic.
A Unisys patent on LZW, which covers the compression algorithm used in GIF, TIFF and other graphics file formats, should expire in June 2003. Electronics For Imaging Inc. also filed suit in January against almost a hundred companies including Microsoft Corp. and IBM, alleging that those companies' products infringe EFI's own imaging patents. EFI's patent enters the public domain later this year, meaning it has until then to procure royalties.
Mining for patents
Although the patents were filed in the 1980s, the Forgent spokeswoman said the company only began reassessing its portfolio after a new management team concluded that the company could not survive in the video hardware business. Employees bought out the products division in January, which is operating as Vtel Products Corp. Vtel Corp. changed its name to Forgent in early 2001. Forgent still maintains a services organization. Forgent contracted with a third-party law firm--whose name Forgent is contractually prohibited from disclosing, according to the spokeswoman--to contact companies about paying fees. Patents typically expire after 20 years.
"We do have other patents we could pursue with licensing agreements," the spokeswoman said.
Interestingly, Vtel Corp. hired Gordon Matthews, the so-called "father of voice mail", as Forgent's chief patent officer. Soon after his appointment, Matthews began a "strategic patent program," offering bonuses to employees for ideas they could patent.
"For us, the [strategic patent program] is not just a way to create a portfolio of meaningful patents," Matthews said in a May 2001 interview with CIO Magazine. "It's also a way to attract and retain world-class employees. If employees know there will be a payoff later for ideas they come up with today, they're likely to stay around."
Some Japanese companies have settled patent litigation of this sort rather than go to trial; for example, Toshiba Corp. paid out about $2 billion in settlements for a class-action suit alleging it shipped floppy disk drives with defective controllers; other American PC manufacturers refused to settle.
If the patent is successfully enforced, however, the Internet and consumer-electronics industry could face another enforcement process such as that embarked upon by Rambus Inc.
Rambus, a Los Altos, Calif.-based memory technology designer, began asking companies to license a DRAM patent, which Rambus claimed to be the basis of all SDRAM and DDR DRAM, as well as its own Rambus DRAM. Rambus' patent rights were acknowledged by smaller DRAM companies, like Hyundai, but contested by the largest DRAM firms, such as Micron Technology. Those suits continue today.
Clarification: An earlier version of the headline claimed that Forgent was a startup company. Forgent has been in business for nearly 20 years, according to a company spokeswoman, but changed its name from Vtel Corp. to Forgent early last year.
http://www.extremetech.com/article2/0,3973,389261,00.asp
culater
Thanks JB-New Digital Audio Players From e.Digital Have Style, Features & Economy All Wrapped Up For Both PC & Mac Users
The gap between PC and Mac users will be bridged by the new Odyssey line of digital audio players coming from e.Digital Corp.
Each of the three new Odyssey pocket-sized, flash-memory-based digital audio players has unique features, giving consumers in many budget categories reason to sing. All come with embedded flash memory, a SmartMedia card expansion slot allowing consumers to add up to 128 MB of additional storage, and e.Digital Odyssey Manager software, which is compatible with both PC and Macintosh platforms.
The extraordinary Odyssey 100 MP3 player features up to 30 hours of battery life from a standard AA Alkaline battery. With a standard 64 MB of embedded flash memory, stereo earphones, and an intuitive user interface, this stylish MP3 player squeezes every last minute out of a standard AA battery, making it the ideal choice for economy-minded consumers. The Odyssey 100 is priced at $129 and is both compact and durable, making it a perfect travel companion.
The Odyssey 200 features not only MP3 file playback, but also an FM tuner, 20 customizable FM station presets with digital tuning, and high-quality voice recording using the built-in omni-directional microphone. Students, businesspeople, and writers of any kind can create and store over eight hours of voice recordings using the Odyssey 200's standard 128 MB built-in flash memory. A SmartMedia card expansion slot lets consumers add up to 128 MB of removable flash. A handy, wired remote control is standard with the Odyssey 200 and can be clipped to a shirt collar, pocket, belt or tie to control playback and other functions while the player sits in a backpack or pocket. A standard carrying case with belt clip is also included, expanding the user's options. The Odyssey 200 is priced at $179 MSRP.
The Odyssey 300's direct MP3 encoding allows users to rip files from their CD collection without the use of a computer and play them back immediately. This tops the list of its features, but its digital voice recorder, FM tuner, FM recorder, and up to 12-hour battery life make it the premier design of the Odyssey line. With a standard 128 MB of flash built-in, the Odyssey 300 holds voice, music and FM recordings totaling up to 8.5 hours, and users who choose to add a SmartMedia card via the expansion slot can double that amount. Its simple, intuitive joystick control is easy to use, and the alpha-numeric blue backlit LCD is easy to read. It also includes a carrying case with belt clip, and a practical wired remote control that can be clipped to a shirt or belt for out-of-pocket control of playback and other functions. The Odyssey 300 will sell for $229 MSRP.
Tom Boksa, vice president of Consumer Electronics for e.Digital, stated, "Retail interest in the Odyssey line is very strong and they will be available in a number of stores in time for the traditional back-to-school sales window. Retail buyers have been extremely impressed with the unique features, very high quality, stylish designs and competitive pricing of these new products. They are going to be a very big hit with consumers."
http://www.dvinsider.com/newsletter/
culater
MPEG LA Issues Revised Licensing Terms
By Mark Long -- e-inSITE, 7/17/2002
The MPEG LA and its member patent holders have announced their agreement on final licensing terms for the MPEG-4 Visual Patent Portfolio License, MPEG-4 Systems (without MPEG-J) Patent Portfolio License and MPEG-J Patent Portfolio License. The official licenses for the MPEG-4 Patent Portfolio are scheduled to roll out beginning this September.
After taking into consideration the views, interests and concerns of prospective licensees, the organization and its member patent holders have decided that the MPEG-4 Visual Patent Portfolio License Agreement will cover not only the Simple and Core profiles of the Visual standard, as previously announced, but all other current MPEG-4 Visual standard profiles as well. Parties who elect to sign the license agreement within six months of its first offering will not be required to pay royalties on licensed products sold between January 1, 2000 (the effective date of the license) and December 31, 2003.
'By affording access to patents that are essential not only to the Simple and Core profiles but also to additional MPEG-4 Visual profiles without additional royalties, the MPEG-4 Visual Patent Portfolio License provides convenience to the marketplace and adapts seamlessly to marketplace change,' commented MPEG LA CEO Baryn S. Futa in a prepared statement. 'It tells users they are welcome here for the long haul as the technology develops and applications migrate to other MPEG-4 Visual profiles.'
To enhance the widespread use of MPEG-4 Visual technology across various business models, the patent holders have agreed to adopt 'reasonable' annual limitations on certain royalties in order to provide customers more cost predictability. The licensing models will also feature royalty options that require no royalty reports. The MPEG LA has also established threshold levels below which certain use-based royalties would not be charged to encourage early-stage users to adopt MPEG-4 Visual technology. In addition, the licensing agreement includes alternative royalty options to accommodate the needs of various business models.
A Favorable Response
The MPEG-4 Industry Forum (M4IF) responded favorably to the MPEG LA's announcement of licensing term revisions. Although M4IF President Rob Koenen believes that the viability of the license still needs to be proven in the marketplace, he says that its usage fee concerns have been substantially addressed by the MPEG LA's revisions.
'I am delighted that terms are finally known,' said Koenen. 'This sounds the starting bell for the whole broadcast and multimedia industry to start releasing MPEG-4 products and services. The licenses are the long-expected prerequisite for MPEG-4 being fully accepted and deployed.'
'We now have version 1.0 of the product, which is great. It does not yet cover all the requirements, but I am confident that MPEG LA and the licensors will work with potential licensees, as they have done to date, to come up with a version 2 license that covers more services and usage models,' said Koenen. The MPEG-4 industry needs a license that allows competitive products to be offered in all of MPEG-4's target markets, consistent with the opportunity presented by MPEG-4 as a truly horizontal, cross-platform standard. Although there is no single, comprehensive, open alternative for all markets, there are alternatives in each of them.
Licensing Details
The MPEG-4 Visual Patent Portfolio License applies three models to three separate business environments. Manufacturers of products for cable TV, satellite TV and over-the-air broadcast applications will pay a royalty of US $0.25 for the right to manufacture and sell each decoder and encoder, subject to annual caps per legal entity. Cable and satellite TV content providers will pay a royalty of US $1.25 for the right to use a decoder to decode and use encoded MPEG-4 Visual information.
Manufacturers of decoders and/or encoders who offer Internet or mobile products for sale or distribution will pay US $0.25 per activated decoder and/or encoder, subject to an annual cap per legal entity of $1,000,000 for decoders and $1,000,000 for encoders. However, there will be no royalty charged for the first 50,000 decoders and 50,000 encoders sold or distributed in any calendar year.
Internet and wireless mobile content providers can either choose to pay a flat rate of $0.25 per subscriber per year or $0.000333 per minute of MPEG-4 video used, with either option subject to an annual cap of $1,000,000 per legal entity. Alternatively the content provider may elect to pay a flat annual fee of $1,000,000 with no royalty reporting obligation. In addition, no royalty is payable on the first 50,000 subscribers during any one calendar year.
With regards to advertiser supported services or any other content provider that does not receive direct payments from subscribers, the MPEG LA says that it will work directly with Licensees to come up with a consistent method of counting subscribers that will accommodate the specific business models of the customer.
Replicators and content providers specializing in stored video will pay either $0.01 per 30 minutes or a maximum of $0.04 per movie. For content more than five years old, the provider will pay $0.005 per 30 minutes or a maximum of US $0.02 per movie. For stored video that is 12 minutes in length or less, the royalty fee will be $0.002.
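The fee schedules above lend themselves to a quick sanity check. The sketch below (function names are illustrative, not from MPEG LA) encodes two of the published formulas: the Internet/mobile device royalty (first 50,000 units free, $0.25 each thereafter, $1,000,000 annual cap) and the per-title stored-video royalty ($0.01 per 30 minutes, capped at $0.04 per movie):

```python
import math

def device_royalty(units, rate=0.25, free_units=50_000, cap=1_000_000.0):
    """Annual Internet/mobile royalty per legal entity:
    first 50,000 decoders (or encoders) free, $0.25 each
    thereafter, capped at $1,000,000 per year."""
    owed = max(units - free_units, 0) * rate
    return min(owed, cap)

def stored_video_royalty(minutes, per_30min=0.01, per_title_cap=0.04):
    """Per-title stored-video royalty: $0.01 per 30 minutes
    (rounded up to whole 30-minute blocks), at most $0.04 per movie.
    Halve both figures for content more than five years old."""
    blocks = math.ceil(minutes / 30)
    return min(blocks * per_30min, per_title_cap)
```

Under this reading, a vendor shipping 60,000 decoders in a year would owe (60,000 − 50,000) × $0.25 = $2,500, while a two-hour movie hits the $0.04 per-title cap. Whether partial 30-minute blocks round up is an assumption here; the announcement does not say.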
The initial term of the license will run through Dec. 31, 2008, and will be renewable on what the MPEG LA characterizes as reasonable terms and conditions throughout the useful life of any Portfolio patent with rate protection for similar license grants, subject to MPEG LA's right to change terms to meet changing market conditions.
In addition to the MPEG-4 Visual Patent Portfolio License, the MPEG LA intends to offer optional licenses under essential patents for the MPEG-4 Systems Standard (without MPEG-J) as well as the MPEG-J part of the MPEG-4 Systems Standard.
http://www.e-insite.net/commvergemag/index.asp?layout=article&articleid=CA233979&spacedesc=n...
culater
OT-Enter the Dragon
China will soon be the biggest PC market in the world, and everyone wants a piece of it.
One problem: A homegrown powerhouse called Legend. http://www.wired.com/wired/archive/10.08/legend.html
culater
OT-What’s In a Name? Not Much Without a Branding Strategy
In suburban Philadelphia, not too many miles from Wharton’s campus, is a retail establishment called Ed’s Beer Store. It’s a wonderfully prosaic name. Customers know what they can buy there, and if they have a complaint, they know whom to talk to.
But what about companies with names like Agere, Agilent, or Altria? Or Diageo, Monday and Verizon? Or Accenture, Cingular and Protiviti?
Except for Monday, which may be a strange thing to call a company but is nonetheless a real word, all these names are fabricated. What’s more, none of them, even Monday, tells potential customers anything about the businesses they are in. Plus, they sound so contrived that you might conclude they will do nothing but elicit snickering and confusion in the marketplace.
According to marketing professors at Wharton, however, that is not necessarily the case. They say peculiar names, by themselves, may mean nothing to begin with. But if backed by a successful branding campaign, they will come to signify whatever the companies want them to mean.
“My general sense is the name doesn’t make much difference,” says professor David J. Reibstein. “What companies end up doing is a significant amount of advertising and creating an image around the name.”
He suggests that relatively new names like Diageo, which owns Pillsbury, Burger King, Guinness and major liquor brands, and Agere, a maker of communications components that was acquired, and later spun off, by Lucent Technologies, can seem strange. But they mean no more or less than Ford, Marriott, Coca-Cola and other venerable brands. Lucent, itself a spin-off from AT&T, means “marked by clarity” or “glowing with light,” according to the company. Diageo is based on the Latin word for “day” and the Greek word for “world.” Why was Diageo chosen? The company says that every day, all over the world, consumers buy its products.
“What does Pillsbury mean? Pillsbury means a lot because of the doughboy character in those ads,” Reibstein says. “What are two of the biggest names that have emerged in the past decade? Amazon and Starbucks. Does Starbucks mean coffee? Absolutely not. For the most part, these names don’t mean much of anything. But we get to know a company and that starts to create an image. General Motors tells us something about what the company does, but Ford communicates only the name of the founder of the company.”
And how about AFLAC, a large international insurer but hardly a household name until the last few years? “It’s amazing the amount of awareness people have of AFLAC because of some duck on television,” Reibstein says.
“I don’t think the name of a company is hugely important in the long run,” agrees professor David Schmittlein. He adds that even fabricated names are “real names” in the sense that “they are pronounceable words. I think of them as largely empty vessels – reliable and durable empty vessels that can be filled up with positive associations.”
In the opinion of professor Barbara Kahn, “a lot of these names are oddball.” But, in the end, that does not matter, she says. “The success of a name is much more a function of the implementation of the branding strategy” than of the name itself.
Some of the unusual names companies give themselves nowadays are not really much different from the kinds of names that pharmaceutical firms have given drugs (Claritin, Celebrex, Lipitor) or car companies have given automobiles (Corvette, Celica) over the years.
When you think about it, pharmaceutical and technology firms have long had funky names, such as Xerox, Cephalon or ImClone. In his book, A Random Walk Down Wall Street, Princeton professor Burton G. Malkiel remembers the “tronics boom” of the 1960s when companies thought investors would be attracted to any name reminiscent of electronics and the space age.
Malkiel writes: “There were a host of ‘trons’ such as Astron, Dutron, Vulcatron, and Transitron, and a number of ‘onics’ such as Circuitronics, Supronics, Videotronics, and several Electrosonics companies. Leaving nothing to chance, one group put together the winning combination of Powertron Ultrasonics.”
In choosing a name for itself or one of its products, Reibstein says, the three most important things for a company to do are to make sure it has the legal rights to use the word, that the word does not translate into something embarrassing or negative in a foreign language, and that the word carries no other undesirable connotations. “For the most part, as long as you come up with a name that does not have negative connotations, you can create whatever image you want.”
Schmittlein points out that many company names, by being a play on words, can be linked in customers’ minds with positive attributes or benefits. Two successful examples, in his view, are Agilent, which invokes the word agility, and Cingular, which the company hopes will call to mind the idea of singularity or self-expression. Agilent, a communications, electronics and life sciences company, was spun off from Hewlett-Packard. Cingular, a joint venture between SBC Communications and BellSouth, is a wireless phone company.
In contrast, Schmittlein says, the name Accenture is not as successful as Agilent or Cingular in immediately conjuring up a positive association. Accenture, the former Andersen Consulting, has taken steps to disassociate itself from its previous name. Its website contains only a few references to Andersen Consulting, and all of its news releases issued prior to its name change have been rewritten to eliminate the words Andersen Consulting and replace them with Accenture.
Another company name that Schmittlein says is not as successful as others is Monday, the recently announced moniker for PwC Consulting, formerly part of PricewaterhouseCoopers, the accounting firm. With the negative associations related to Monday – the start of a grinding workweek in the minds of many – the firm should have chosen otherwise. “I wish them well with the new name but I wouldn’t have picked it,” Schmittlein says. “If I have to pick a name, I’m picking Friday.”
Are some names – even those that have been invented by corporations and brand consultants – inherently better than others? “One thing that’s important is its pronounceability,” Schmittlein says. “Another is how culture-bound the name seems to be. Sometimes that’s good and sometimes it’s bad. If the name stands for a culture, that can be good.” For instance, spring water sold by a company with a distinctively European name may be more exotic, and hence more appealing, to U.S. consumers than water bottled locally.
As strange as many company names can be, they are not pulled from thin air:
· Altria is the new name that Philip Morris, the cigarette and food company, has chosen for itself. The name is derived from the Latin word “altus,” meaning to “reach higher”, according to the company.
· Verizon, created by the merger of Bell Atlantic and GTE, is a telecommunications company. The name Verizon combines the Latin word veritas, meaning truth, with the word horizon. The company says its “veritas values” include integrity and respect, while its “horizon values” include imagination and passion.
· Protiviti, a recently created subsidiary of Robert Half International, provides internal audit and business and technology risk consulting services. The new firm is composed of people formerly employed by Arthur Andersen LLP’s U.S. internal audit and business risk consulting practices, which operated separately from Andersen’s external audit services. According to the company, the name Protiviti is intended to communicate “independence, professionalism, proactivity and integrity.”
Kahn says that as companies increasingly adapt global branding strategies, they seek names, for themselves and their brands, that have few or no existing associations. “With global branding now you definitely want to pick a name that does not have different meanings in different languages,” Kahn says. “You can build brand names for anything that’s pronounceable through imagery. If you want to build associations, if you have a simple consistent message, you can create the sense of associations you want for a brand. What you’re looking for in a brand is something unique and identifiable that is differentiated from other brands. Think about what many brand names are: They’re somebody’s name, which is a made-up name, too.”
Clever names can help companies, but their benefits only go so far, especially if customers are unhappy with a product or service. Says Schmittlein: “Names don’t seem to have any large impact that would override customers’ experiences with the company.”
http://knowledge.wharton.upenn.edu/articles.cfm?catid=4&articleid=593&homepage=yes
culater
iPod for Windows: Apple's Trojan Horse
By Jay Lyman
NewsFactor Network
July 17, 2002
The move to introduce a Windows-compatible iPod represents a serious effort on Apple's part to serve the broader Windows market and to induce Windows users to switch sides.
Confirming its intent to use the wildly popular iPod MP3 player as a "Trojan horse" to infiltrate Microsoft territory and convert Windows users to Macs, Apple on Wednesday unveiled an iPod for Windows at the Macworld trade show in New York.
In his keynote address, Apple CEO Steve Jobs joked that other new iPod features and products -- including a new version of iTunes organization software, a 20 GB iPod, a wired remote control and a carrying case -- will be made available to Mac users before PC owners can buy the Windows iPod.
However, the move represents a serious effort on Apple's part to serve the broader Windows market and to induce Windows users to switch sides, according to analysts.
"They really appear to be embracing the Windows market," Giga Information Group research fellow Rob Enderle told NewsFactor. "[iPod] is probably the best MP3 player on the market. Moving it over [to Windows] gives people a reason to walk into an Apple store."
Switch Strategy
Enderle said Apple is doing anything it can to get PC users into its stores, and the Windows iPod is likely to help accomplish that goal.
"Every opportunity they get to convince people to buy Apple is an opportunity for market share, which they desperately need," he said.
However, IDC analyst Alan Promisel questioned the iPod's role as a tool to convince Windows users to buy Apple computers.
"The flip side is that it's compatible, so there's no need to switch," Promisel said. "They've eliminated the reason to switch over."
Software Not the Same
Apple is in a similar Catch-22 position with its iTunes software for iPod. While Apple touts features of its iPod software, such as browsing, ratings and playcount, the PC version of the MP3 player will come with MusicMatch software for playlists and music libraries.
"It's not iTunes, but it's better than anything on the Windows side," Jobs said at Macworld.
Giga's Enderle said Apple is likely to form more partnerships with migration software makers like MusicMatch, which will automatically launch and transfer songs and playlists after an iPod is plugged into a PC.
Apple's iPod Autosync features, which allow automatic updates, will also be functional in the PC version of the player.
Reaching for Revenue
Promisel said tremendous support and demand for the iPod will make it a hit with the "broader market and greater spectrum of users" who are on PCs.
"I think there's going to be considerable enthusiasm over this announcement," he said. "I think Apple's going to do very well with this move."
Promisel added that in light of current computer market conditions, Apple is wise to leverage a moderately priced consumer product like iPod to generate revenue.
"It's an excellent opportunity to drive an additional revenue stream," he said.
http://www.newsfactor.com/perl/story/18645.html
culater
Why free downloads help, not hurt
By Janis Ian
July 17, 2002, 12:00 PM PT
When researching an article, I normally send e-mails to friends and acquaintances, who answer my request with opinions and anecdotes. But when I said I was planning to argue that free Internet downloads are good for the music industry and its artists, I was swamped.
I received over 300 replies--and every single one from someone legitimately in the music business.
Even more interesting than the e-mails were the phone calls. I don't know anyone at the National Academy of Recording Arts & Sciences (home of the Grammy Awards), and I know Hilary Rosen (head of the Recording Industry Association of America, or RIAA) only in passing. Yet within 24 hours of sending my original e-mail, I'd received two messages from Rosen and four from NARAS, requesting that I call to "discuss the article."
Huh. Didn't know I was that widely read.
Ms. Rosen, to be fair, stressed that she was only interested in presenting RIAA's side of the issue, and was kind enough to send me a fair amount of statistics and documentation, including a number of focus group studies RIAA had run on the matter.
However, the problem with focus groups is the same problem anthropologists have when studying peoples in the field: the moment the anthropologist's presence is known, everything changes. Hundreds of scientific studies have shown that any experimental group wants to please the examiner. For focus groups, this is particularly true. Coffee and donuts are the least of the payoffs.
The NARAS people were a bit more pushy. They told me downloads were "destroying sales," "ruining the music industry," and "costing you money."
Costing me money? I don't pretend to be an expert on intellectual property law, but I do know one thing. If a music industry executive claims I should agree with their agenda because it will make me more money, I put my hand on my wallet...and check it after they leave, just to make sure nothing's missing.
Am I suspicious of all this hysteria? You bet. Do I think the issue has been badly handled? Absolutely. Am I concerned about losing friends, opportunities, my 10th Grammy nomination, by publishing this article? Yeah. I am. But sometimes things are just wrong, and when they're that wrong, they have to be addressed.
The premise of all this ballyhoo is that the industry (and its artists) are being harmed by free downloading.
Nonsense.
Let's take it from my personal experience. My site gets an average of 75,000 hits a year. Not bad for someone whose last hit record was in 1975. When Napster was running full-tilt, we received about 100 hits a month from people who'd downloaded Society's Child or At Seventeen for free, then decided they wanted more information. Of those 100 people (and these are only the ones who let us know how they'd found the site), 15 bought CDs.
Not huge sales, right? No record company is interested in 180 extra sales a year. But that translates into $2,700, which is a lot of money in my book. And that doesn't include the people who bought the CDs in stores, or came to my shows.
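Her back-of-envelope math holds together. A minimal sketch of the arithmetic, assuming a flat $15 price per CD (that price is inferred from her own figures of $2,700 over 180 sales; the article never states it):

```python
# Check of the sales figures quoted above.
# Assumption: a flat $15 average CD price, implied by $2,700 / 180 sales.
buyers_per_month = 15           # of the ~100 monthly Napster-referred visitors
sales_per_year = buyers_per_month * 12
price_per_cd = 15.00            # hypothetical average price, not from the article
revenue = sales_per_year * price_per_cd
print(sales_per_year, revenue)  # 180 sales, $2700.0 a year
```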
RIAA, NARAS and most of the entrenched music industry argue that free downloads hurt sales. More than hurt--that they're destroying the industry.
Alas, the music industry needs no outside help to destroy itself. We're doing a very adequate job of that on our own, thank you.
The music industry had exactly the same response to the advent of reel-to-reel home tape recorders, cassettes, DATs, minidiscs, videos, MTV ("Why buy the record when you can tape it?") and a host of other technological advances designed to make the consumer's life easier and better. I know because I was there.
The only reason they didn't react that way publicly to the advent of CDs was because they believed CDs were uncopyable. I was told this personally by a former head of Sony marketing, when they asked me to license Between the Lines in CD format at a reduced royalty rate. ("Because it's a brand new technology.")
Realistically, why do most people download music? To hear new music, and to find old, out-of-print music--not to avoid paying $5 at the local used CD store, or taping it off the radio, but to hear music they can't find anywhere else. Face it: Most people can't afford to spend $15.99 to experiment. And an awful lot of records are out of print; I have a few myself!
Everyone is forgetting the main way an artist becomes successful--exposure. Without exposure, no one comes to shows, no one buys CDs, no one enables you to earn a living doing what you love.
There is zero evidence that material available for free online downloading is financially harming anyone. In fact, most of the hard evidence is to the contrary.
Again, from personal experience: In 37 years as a recording artist, I've created 25-plus albums for major labels, and I've never received a royalty statement that didn't show I owed them money. Label accounting practices are right up there with Enron. I make the bulk of my living from live touring, doing my own show. Live shows are pushed by my Web site, which is pushed by the live shows, and both are pushed by the availability of my music, for free, online.
Who gets hurt by free downloads? Save a handful of super-successes like Celine Dion, none of us. We only get helped.
Most consumers have no problem paying for entertainment. If the music industry had a shred of sense, they'd have addressed this problem seven years ago, when people like Michael Camp were trying to obtain legitimate licenses for music online. Instead, the industrywide attitude was, "It'll go away." That's the same attitude CBS Records had about rock 'n' roll when Mitch Miller was head of A&R. (And you wondered why they passed on The Beatles and The Rolling Stones.)
NARAS and RIAA are moaning about the little mom-and-pop stores being shoved out of business; no one worked harder to shove them out than our own industry, which greeted every new mega-music store with glee, and offered steep discounts to Target, WalMart, et al, for stocking their CDs. The Internet has zero to do with store closings and lowered sales.
And for those of us with major label contracts who want some of our music available for free downloading...well, the record companies own our masters, our outtakes, even our demos, and they won't allow it. Furthermore, they own our voices for the duration of the contract, so we can't post a live track for downloading even if we want to.
If you think about it, the music industry should be rejoicing at this new technological advance. Here's a foolproof way to deliver music to millions who might otherwise never purchase a CD in a store. The cross-marketing opportunities are unbelievable. Costs are minimal, shipping nonexistent--a staggering vehicle for higher earnings and lower costs. Instead, they're running around like chickens with their heads cut off, bleeding on everyone and making no sense.
The RIAA is correct in one thing--these are times of great change in our industry. But at a time when there are arguably only four record labels left in America (Sony, AOL Time Warner, Universal, BMG--and where is the RICO act when we need it?), when entire genres are glorifying the gangster mentality and losing their biggest voices to violence, when executives change positions as often as Zsa Zsa Gabor changed clothes, and "A&R" has become a euphemism for "Absent & Redundant," we have other things to worry about.
We'll turn into Microsoft if we're not careful, folks, insisting that any household wanting an extra copy for the car, the kids, or the portable CD player, has to go out and "license" multiple copies.
As artists, we have the ear of the masses. We have the trust of the masses. By speaking out in our concerts and in the press, we can do a great deal to dampen this hysteria, and put the blame for the sad state of our industry right back where it belongs--in the laps of record companies, radio programmers, and our own apparent inability to organize ourselves in order to better our own lives--and those of our fans.
If we don't take the reins, no one will.
http://news.com.com/2010-1078-944488.html?tag=fd_nc_1
Telematics in autos may get second life
BY ED GARSTEN
Associated Press
DETROIT - Very rarely has the death of a venture elicited as many ''told you so's'' as when Ford Motor Co. pulled the plug last month on Wingcast LLC, its attempt to give vehicles all sorts of onboard communications capabilities.
Analysts said the 18-month-old venture, which never brought a product to market, was an expensive stab at using outdated analog technology to perform ambitious communications tasks known as telematics.
Now the fitful automotive telematics industry is looking to a new savior: the short-range digital wireless communications standard known as Bluetooth.
A Nordic invention named for a 10th century Viking king, Bluetooth allows various components of telematics systems to ''talk'' to each other through radio frequencies, allowing a driver to check e-mail, get directions, call for help or even unlock the car if the keys are left inside.
With a 30-foot range, Bluetooth makes it possible to operate a cellphone with voice commands instead of hands -- even from outside the vehicle. The technology is already being used by consumers to network cellphones, handheld computers, laptops and printers.
The technology would allow a cellphone to work as a modem, downloading movies, music and navigation information and funneling it into the car's onboard computer and onto displays.
The current leader in auto telematics is General Motors Corp.'s hard-wired OnStar system, which doesn't use Bluetooth.
That's fine with DaimlerChrysler AG's Chrysler Group, which this fall will begin offering a Bluetooth system called UConnect as a dealer-installed option at a suggested retail price of $299 plus labor. A second version, to be offered as a factory-installed option, will be available in early 2003.
''It has to be about flexibility, simplicity and affordability, or telematics will continue to struggle,'' said Chrysler Group telematics chief Jack Witherow.
The initial version of UConnect will offer voice dialing and an audio address book capable of storing up to 32 telephone numbers. Other yet-to-be-announced features will be available in the factory-installed version.
Designed to handle up to five phones per car, the services will appear as charges on a monthly cellphone bill.
Chrysler's move represents automakers' growing realization that developing telematics technology and services is best left to companies specializing in those fields.
''We think we ought to stick to our core strengths,'' said Witherow.
Such thinking prodded Ford to abandon Wingcast, a joint venture with Qualcomm Inc.
''We're still committed to telematics, but how we'll make good on that commitment has changed,'' said Ford spokesman Paul Wood.
The telematics industry is growing at just two to three percent a year, according to a study by Cap Gemini Ernst and Young, but the potential is much greater.
Globally, the telematics market for hardware and subscription services will grow to $27 billion by 2005, from $3.6 billion in 2000, the study predicts.
In 2001, Americans bought 1.85 million vehicles equipped with some sort of telematics, according to a report released in April by the Telematics Research Group. That number is expected to grow to 2.6 million next year and 7.6 million by 2007, the report said.
Besides wireless downloads, Bluetooth will allow the car's components to ''speak'' to the driver.
For example, an alternator that may be six months from failing could trigger the telematics system to advise the driver and automatically call the dealer to order a replacement, said Jim Geschke, who runs the telematics business for automotive supplier Johnson Controls Inc.
For now, OnStar is the standard bearer in auto telematics with more than 2.5 million subscribers, according to Don Butler, vice president of OnStar planning and business development.
Launched in 1996, OnStar is not turning a profit despite its adoption by several Japanese and German automakers. Experts expect OnStar, which uses wires along with built-in wireless connectivity, to be eclipsed by Bluetooth-fueled systems.
''Customers want some cell phone connectivity but they don't want to worry about wires and microphones,'' said Mike Wujciak, an analyst with Cap Gemini Ernst and Young.
Butler insists his service's technology is more reliable than Bluetooth because it doesn't depend on a portable phone that can be lost.
Chrysler's UConnect, however, takes the automaker out of the cellphone and service-providing business. Instead, Chrysler hopes to earn revenue from the sale of Bluetooth units.
''It will be the flexible system that's going to win in the end,'' said Wujciak.
http://www.miami.com/mld/miamiherald/business/technology/3672767.htm
Memory enhancers
Flash vendors tailor their chips for convergence
Margot Suydam, Technology Editor -- CommVerge, 7/17/2002
The list of electronic products depending on flash memory continues to grow more diverse. Thus product teams need memory that fulfills different needs from application to application. Where one device needs capacity above all else, another might put more of a premium on space or read/write speed. Flash vendors are responding with products that cater to convergence products.
For example, Hitachi Semiconductor is touting a 1-Gbit AND-type monolithic device that achieves a write speed of 10 Mbytes/sec. According to the company, that's a new industry record for a chip of this type and is five times faster than previous Hitachi products. The new device, the HN29V1G91, is the first product to be based on Hitachi's proprietary AG-AND (Assist Gate AND) technology, a multilevel cell approach.
The chip is capable of recording 128 Mbytes of data (equivalent to two hours of CD-quality MP3 music), in about 13 seconds. The company says the chip can also be used as a recording medium for high-end digital cameras that capture large, high-resolution image files and offer moving-picture capabilities. Other key applications include mobile phones and wireless PDAs that can send and receive pictures.
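The two performance claims are mutually consistent; a quick check of the arithmetic:

```python
# Sanity check: at the claimed 10 Mbytes/sec write speed, how long does it
# take to record the chip's 128 Mbytes of data?
capacity_mb = 128
write_speed_mb_per_s = 10
seconds = capacity_mb / write_speed_mb_per_s
print(seconds)  # 12.8 -- matching the "about 13 seconds" figure
```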
The 1-Gbit device should enable differentiating features whether it's used in discrete form or packaged into next-generation flash-memory cards, the company says. Samples will be available in October.
Also targeting the memory requirements of convergence products, STMicroelectronics has announced its first flash memories with dual-bank operation. Part of the company’s M29 series, the chips target set-top boxes, digital still cameras, MP3 players, cellular phones, and other digital consumer applications.
Dual-bank operation allows a system to read data from one of the chip's memory banks while simultaneously writing or erasing data in the other bank. The chip comes in two varieties: The M29DW323D has one 8-Mbit bank and one 24-Mbit bank, while the M29DW324D has two 16-Mbit banks.
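The read-while-write idea can be illustrated with a toy model. This is purely a sketch of the concept, not ST's actual command interface; the class name and method names are invented for illustration:

```python
# Toy model of a dual-bank flash device: reads from one bank can proceed
# while the other bank is busy with a program/erase operation.
class DualBankFlash:
    # Default bank sizes mirror the M29DW323D: one 8-Mbit and one 24-Mbit bank.
    def __init__(self, bank_bits=(8 * 2**20, 24 * 2**20)):
        self.banks = [bytearray(bits // 8) for bits in bank_bits]
        self.busy = [False, False]  # True while a bank is programming/erasing

    def read(self, bank, addr):
        if self.busy[bank]:
            raise RuntimeError("bank busy: read would stall")
        return self.banks[bank][addr]

    def begin_erase(self, bank):
        self.busy[bank] = True      # this bank is unavailable for the duration

    def end_erase(self, bank):
        self.banks[bank][:] = b"\xff" * len(self.banks[bank])  # flash erases to all-ones
        self.busy[bank] = False

flash = DualBankFlash()
flash.begin_erase(1)        # start erasing the larger bank...
value = flash.read(0, 0)    # ...while the smaller bank still answers reads
flash.end_erase(1)
```

In a single-bank part, the read during the erase would stall (or return garbage), which is exactly the limitation dual-bank operation removes for code-plus-data designs.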
In addition, ST contends that these are the smallest 32-Mbit standard flash memories on the market, available in packages that occupy as little as 7 by 11 millimeters of circuit-board real estate. Samples are available now.
http://www.e-insite.net/commvergemag/index.asp?layout=article&articleid=CA234065&pubdate=7/1...
Microsoft's automotive platform drives in-car telematics systems
Automakers and suppliers demonstrate broad support for Microsoft technology (7/15/2002)
Microsoft Corp. has announced that Microsoft® technology is featured in the computing and communications systems in 12 car models from five auto manufacturers.
Unveiled this year in the U.S., Japan and European markets are BMW, Citroën, Mitsubishi, Subaru and Volvo models showcasing navigation, communication and infotainment devices that deliver safe and reliable services to drivers and passengers on the road today.
According to the U.S. Department of Transportation, Americans spend more than 500 million hours every week commuting in their cars. Delivering personalized services, diagnostics, hands-free communication capabilities, and other convenience and productivity applications to the car requires a flexible software platform that can bring a sophisticated computing environment into the automobile economically.
Microsoft is leading this trend, making in-car computing a reality. Microsoft provides Siemens VDO Automotive AG, the navigation supplier of BMW of North America LLC; Clarion Co. Ltd., the navigation supplier of Citroën; and Mitsubishi Electric Corp., the navigation supplier of Mitsubishi Motors Corp., Fuji Heavy Industries Ltd. (manufacturer of Subaru-brand cars) and Volvo Cars of North America LLC with Microsoft technology.
The Windows® Powered solution is implemented in the following ways:
BMW's 7 Series includes a navigation application, part of the innovative BMW iDrive concept.
Citroën's C5 and Xsara telematics solution includes turn-by-turn navigation capabilities, hands-free cellular phone control, voice recognition and text-to-speech, maintenance status, and wireless synchronization of data with mobile devices.
Mitsubishi's Mirage Dingo, Airtrek, Lancer Cedia and Chariot Grandis include expert navigation capabilities.
Subaru's Legacy Lancaster ADA (Japanese model) includes a leading-edge navigation system, part of the latest Active Driving Assist (ADA) feature, which helps drivers negotiate driving challenges such as sharp turns.
Volvo's S60, S80, V70 and Cross Country models include a high-performance Road and Traffic Information (RTI) navigation system that incorporates locally based services and the Traffic Messenger Channel (TMC).
"We've invested considerable time and resources to build flexible, reliable and cost-effective technology to fit every type of car, from an entry-level sedan to a luxury model," said Bob McKenzie, general manager of the Automotive Business Unit at Microsoft. "It's tremendously satisfying to see our innovations implemented in today's vehicles. Once consumers become accustomed to telematics products and services in the car, opportunities will soar."
Earlier this month, an independent research report revealed that shipments of the company's Windows embedded operating systems had more than doubled this year, demonstrating continuing industry wide support for Microsoft's platform technologies.
In addition, suppliers and automakers continue to line up behind Windows CE for Automotive, committing to offering drivers and passengers a unique, rich and safe telematics experience. "This Windows Powered technology has a key competitive advantage, and our customers have a first-rate in-car computing experience," said Didier Cruse, chief strategy officer at Citroën. "The in-vehicle computing system provides drivers and passengers with turn-by-turn navigation capabilities, hands-free cellular phone control, AM/FM/CD entertainment, roadside assistance, voice recognition and text-to-speech, maintenance status, and synchronization of data with mobile devices."
http://www.telematicsupdate.com/homepage2.asp?news=30136
Is speech recognition technology the killer application for telematics?
Are speech driven telematics services what the end users really want or is the consumer being kept out of the loop? (7/15/2002)
During the speech workshop at the Telematics Update Gothenburg conference in June there was an air of confusion surrounding the key question - what do the end users really want out of a speech recognition system? We are told that the capability is already here. Companies are confident about the accuracy of their speech engines and application service providers are chomping at the bit to tell everyone about their new products. But do these companies really know what we as customers want? From the questions posed by many delegates at the conference this didn't seem to be the case.
Speech systems can enable drivers to send and hear emails, take dictation, or organise their social lives in the car without causing dangerous distraction. But has anyone asked us, the end users, what we want and whether we would use this kind of system? And is speech technology really the answer to driver distraction, or is it a distraction in itself?
Do I really want to send an email while driving? If I want to drive safely, then it's unlikely that I do. Who are these products aimed at? Is it the salesman spending all day driving to see clients, the commuter spending hours driving to work, the parent taking the kids to school every day, the holidaymaker wanting to know where the nearest hotel is? Are these systems going to suit everyone? Can they really deliver what they claim to be able to do? As consumers, are we being told that we need these systems by companies who have invested so much in them that they don't want us to believe any different?
The only speech application that I see as compelling whilst driving is the simple hands free voice dial command, and possibly location based information which I can activate by voice without having to take my eyes off the road. But speech systems boast they can do much more.
This issue stretches further than just speech; it extends as far as the services themselves. How are telematics companies going to educate the end user and encourage us not only to use these systems but to use them safely?
http://www.eyeforauto.com/Subpages.asp?news=30100
Plug.In: CARP Battle, Devices Dominate
July 12, 2002
By Ryan Naraine
Things have not been working out so well lately for the streaming media/digital music sector. News of shutdowns, layoffs, and poor financials continues to hog the headlines and, to make matters worse, a controversial copyright royalty payment structure is being described as the death knell for the struggling industry.
Yet beneath the apathy, Jupiter's Plug.In digital music conference on "the future of digital music" raised a glimmer of excitement, especially on the device technology side, where the industry is racing to keep up with demand for portability -- and accessibility -- of downloaded digital content.
As expected, the discussions at the conference focused on the ramifications of the controversial Copyright Office's Copyright Arbitration Royalty Panel (CARP) ruling (see in-depth internetnews.com coverage here) with U.S. Congressman Rick Boucher, D-Va., starting the buzz with a well-timed swipe at the major record labels for what he described as an inherent distrust of the consumer market.
Boucher's keynote, which included a call for the labels to put all their music online with permanent downloads, transferability and per-track availability, set off a string of upbeat discussions that included a rare admission from EMI executive John Rose that the paid-subscription music download model "appears to be dead."
As Boucher outlined plans for a 'Fair Use' Fight at the Congressional level, EMI's Rose conceded the music industry must turn to alternative business models as it attempts to stamp out piracy. "We cannot build viable business models when competing with 'free.' It is as simple as that. We cannot beat 'free' but we have to work hard to make 'free' less available," Rose declared, urging the recording industry to experiment and restructure the way digital content is distributed and sold on the Internet.
Rose hinted that the industry would be wise to focus on the a-la-carte space and even suggested the answer lies in partnering with ISPs to let service providers absorb some of the costs of distribution.
"There is no one-size-fits-all answer. The existing models have the consumers paying 100 percent of the costs. That is clearly not working. We need to diversify and find revenue streams outside of paid subscriptions," the EMI Vice President added.
Pro-CARP/Anti-CARP
Rose's upbeat keynote could not overshadow the pro-CARP/anti-CARP lobbies, which seemed to infiltrate and dominate most of the two-day discussions. From Boucher's "we need to scrap the CARP" declaration to RIAA CEO Hilary Rosen's "CARP was a pretty thoughtful decision," it was clear the Webcasters and the recording industry remain miles apart on the issue.
John Jeffrey, a Live365 executive who manages the business development and legal affairs teams for the streaming media firm, was among the anti-CARP crowd, dismissing the CARP decision as a "terrible result that is strangling an industry in its infancy." The ruling set royalty rates at 0.07 cents per performance for Internet-only broadcasts and AM/FM retransmissions.
Jeffrey, who shared the stage with Jon Simson of royalty-collection agency SoundExchange and Ann Chaitovitz of the American Federation of Television and Radio Artists (AFTRA) -- the union which represents musicians -- maintained the royalty rates were an "unfair burden" borne by Internet broadcasters, a sentiment predictably shared by Tom Des Jardins of Net radio ad agency LightningCast.
"It's clear CARP is a devastating blow. Webcasters simply cannot afford the rates set," Des Jardins said, arguing that the soft advertising market would force thousands of small Internet radio firms out of business.
SoundExchange's Simson disagreed. "The Webcasters that have shut down have very little to do with (the CARP decision). They were shutting down long before CARP," Simson said, insisting musicians should not subsidize the bigger Internet radio companies like America Online (AOL) and MTV Online.
Both Simson and Chaitovitz criticized the CARP arbitration process as cumbersome and expensive and insisted it must be reformed when the current payment structure expires. "Looking back at how much money was spent to simply participate in the arbitration process, it is clear that CARP needs to be changed. The arbitrator's fee alone was more than $1 million," Chaitovitz said.
Jonathan Potter, executive director of the Digital Media Association (DiMA), predicted the royalty rates would force about 85 percent of Internet radio firms out of business. "The per-performance fee structure is debilitating...We believe the major labels will successfully weed out Net radio," Potter said, adding that the 0.07 cents per performance fee would suck 100 percent of the profits of many companies.
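To see why the per-performance structure frightens small webcasters, it helps to run the numbers. Only the 0.07-cent ($0.0007) rate comes from the article; the audience figures below are invented purely for illustration:

```python
# Illustration of the CARP rate's impact on a hypothetical small webcaster.
# A "performance" is one song streamed to one listener.
rate = 0.0007              # dollars per performance (0.07 cents), per the ruling
avg_listeners = 1000       # hypothetical average concurrent audience
songs_per_hour = 15        # hypothetical playlist density
hours_per_month = 24 * 30  # streaming around the clock
performances = avg_listeners * songs_per_hour * hours_per_month
monthly_royalty = performances * rate
print(performances, round(monthly_royalty))  # 10,800,000 performances, ~$7,560/month
```

The cost scales linearly with audience size, so unlike a flat license fee it cannot be outgrown -- which is the substance of the "suck 100 percent of the profits" complaint.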
While the CARP debate raged, the Plug.In conference also featured displays from a small group of device makers showing off new digital entertainment technologies. With the industry showing a trend towards portability of digital content, companies like iRiver America, Xitel and Simple Devices were all on hand to show off the future of gadgets and devices that will be used to shuttle content from PCs to home-based entertainment centers and vehicles.
It's all about portability
iRiver America, based in San Jose, Calif., debuted its MCU-less VCD software and multi-CODEC player, styling the iDP-100 device as the world's first DataPlay Engine-adapted music player. A company spokesman told internetnews.com the newest DataPlay technology will be enabled in its device.
DataPlay's format, which is scheduled for launch in August, is a coin-sized recording/playback medium that can handle up to 500 MB of music (the equivalent of 11 hours of MP3 files or five full compact discs).
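Those capacity figures imply a particular encoding bitrate, which is easy to back out (assuming decimal megabytes; the article doesn't say which convention DataPlay used):

```python
# What MP3 bitrate do DataPlay's numbers imply? 500 MB holding 11 hours:
capacity_bits = 500 * 10**6 * 8  # 500 MB, decimal megabytes assumed
seconds = 11 * 3600
implied_kbps = capacity_bits / seconds / 1000
print(round(implied_kbps))  # ~101 kbps, in the range of typical 2002-era MP3 encodings
```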
"It's all about portability these days. Consumers don't just want their music files on the PC. They want to access it from the home entertainment system or in the car," said Christine Gasparac, an executive at Burlingame, Calif.-based Simple Devices. Gasparac's company is hawking a multi-device platform that delivers premium content wirelessly beyond the PC.
The platform, centered around the company's SimpleServe device-networking middleware, transforms an Internet-connected computer into a network server that streams digital media and data to SimpleDevices-enabled products. With a high-profile deal with Motorola in place, Gasparac said a home PC can now be wirelessly linked to home and car stereos, allowing all three to share music files. The system includes a SimpleFi receiver, a transmitter attached to the PC and SimpleAuto, a receiver with either 10 gigs or 20 gigs of memory.
The company's business includes a SimpleMedia product set that targets network service providers. Gasparac said SimpleMedia combines content, applications and e-commerce tools with a distribution engine that allows network service providers to offer tiered digital media service levels to subscribers. SimpleMedia would sell hosting and technology to let service providers deliver content from third-party content providers and to monetize that service through advertising, e-commerce opportunities and subscriptions.
Xitel, based in Australia, showed off its HiFi-Link product, another device that links PCs with home stereo systems. HiFi-Link, which plugs into a USB port on a computer and into the RCA inputs of a stereo system, shuttles digital music from the PC to the home entertainment system. The device, which retails for $49, promises to bypass the soundcard while processing digital audio directly from the USB without signal degradation.
"Millions of people have created play lists on their computers. But listening to that music on a built-in soundcard gives a less than optimal music experience. (Our product) bypasses the soundcard altogether, allowing consumers to turn their computers into a powerful HiFi component," said Barrie David, managing director of Xitel.
The Corona factor
As the digital media industry tinkers with business models, argues about copyright laws and creates new gadgets to keep up with consumers' needs, all eyes are fixed on powerhouses Microsoft (MSFT) and America Online (AOL), for entirely different reasons.
Microsoft is preparing for the worldwide release of Corona, the controversial iteration of the Windows Media Engine, which will be integrated into the company's .NET web services architecture. Corona, which is being tested by software developers, powers the compression and decoding of video and is considered a key to the next version of the WMP software.
It is Microsoft's answer to the MPEG-4 open standard, which was adopted by competitors RealNetworks' (RNWK) RealPlayer and Apple Computer's (AAPL) QuickTime player. Microsoft's Corona push has ruffled feathers within the digital media distribution space because it is proprietary and, critics argue, the company's dominance of the software market gives it an unfair advantage.
According to published reports, AOL is also making waves in the space with plans to develop technology to reduce audio/video streaming costs, a move that many believe will provide a shot in the arm for a struggling sector. If AOL's initiative can deliver on the early hype, the argument is that reduced costs for streaming audio and video to the consumer market would jump start innovation.
Recently, the sector has been chock full of bad news: Yahoo pulled the plug on its FinanceVision and Yahoo Radio streams, Loudeye Corp. fired its CFO and cut nearly 40 percent of its workforce, SomaFM shut down entirely and bellwether RealNetworks (RNWK) warned revenues would come in much lower than previously expected.
The depressing news is a dramatic wake-up call for the audio/video streaming media industry, hailed only a year ago by the Yankee Group as the hot sector of the moment.
http://www.internetnews.com/bus-news/article.php/1403491
http://www.internetnews.com/infra/article.php/10796_1403491_2
http://www.internetnews.com/infra/article.php/10796_1403491_3
"...DivXNetworks, Inc. announced a partnership with e.Digital Corporation to jointly develop and market a range of consumer electronics devices that play back DivX video, including handhelds, DVD players, set-top boxes and digital cameras.
Also, MovieLink, a video-on-demand provider backed by a brigade of 5 firms (Sony Pictures, Viacom's Paramount, Metro-Goldwyn-Mayer, AOL Time Warner's Warner Bros. and Vivendi Universal), said that it plans to launch its VOD service later this year using MPEG-4." http://siliconvalley.internet.com/news/article.php/1006491
culater
DivX Video Codec Dips Into TI Chip
April 30, 2002
DivXNetworks Tuesday made strides towards the next generation of full-motion, high-quality video on the Internet with a deal with Dallas-based Texas Instruments (NYSE:TXN).
The San Diego-based codec maker said it has embedded its video codec in TI's digital signal processor (DSP)-based solution, the TMS320DSC25. The single chip features a high-performance audio/visual image-processing engine containing all functions required for communication with external devices. A multimedia system-on-a-chip (SOC) solution, the DSC25 supports DivX playback at full video frame rate and frame size.
The company said the chip is the first embedded solution capable of playing back DivX video, which allows for video at 7-10 times greater compression than MPEG-2.
"A key ingredient in making video convergence a reality is the emergence of a robust embedded solution to power the next generation of multimedia devices that bridge the gap between the PC and the TV, and by enabling high-performance playback of widely popular DivX video, TI's DSC family is well-positioned to lead the convergence charge in the CE space," said DivXNetworks CEO and co-founder Jordan Greenhall. "Over the next six to eighteen months, we will see the roll-out of a variety of hardware devices capable of playing back DivX video, from handheld players to digital set-top boxes. The first port of DivX technology to a high-performance multimedia DSP-based solution such as the TI DSC25 is a major milestone toward that end."
The DivX-enabled chips are expected to support DivX playback on next-generation consumer electronics devices, including digital set-top boxes, handheld video players, next-generation DVD players and digital video cameras.
"By working together with DivXNetworks to optimize the DSC25 for the high-quality playback of DivX video, we can maximize the value proposition we offer to our customers and partners," said TI's Imaging and Audio Group VP and manager Dr. Kun Shan Lin. "We believe the addition of DivX technology to the DSC25 will significantly increase the functionality of our multimedia DSP product line."
DivXNetworks recently announced a partnership with e.Digital Corporation to design and develop DivX-powered consumer electronics devices, with the first device expected to be available by the end of the year.
http://siliconvalley.internet.com/news/article.php/1025471
culater
e.Digital Adds Web Portables [sorry if posted]
Staff
TWICE
6/24/2002
San Diego— e.Digital plans July 1 shipments of a trio of Internet audio portables, one of which rips directly from a CD player without a PC intermediary.
All feature MP3 playback, embedded flash memory, and SmartMedia-card expansion slots. Included software works on PCs and Macs.
The compact Odyssey 100, at a suggested $129, features 64MB of embedded memory and 30 hours of battery life from a single AA alkaline. The step-up Odyssey 200 at $179 features 128MB embedded memory, digital FM tuner, voice recorder with built-in microphone, and wired remote.
The $229 Odyssey 300 adds ripping/encoding of songs via direct connection to a standard CD player.
http://www.tvinsite.com/twice/index.asp?layout=story&articleId=CA224407&pubdate=06/24/2002&a...
culater
OT-Best Buy's Best Laid Plans: A Strategic Overview
By Alan Wolf
TWICE
7/8/2002
MINNEAPOLIS— As part of their presentation to shareholders, Best Buy officers led by president/COO Allen Lenzmeier broke out the company's business units and outlined their short-term goals for each during last month's annual meeting here.
The overarching strategy is to service what the company perceives as two distinct "eco-systems" — entertainment and in-home — with a multi-brand approach. At the same time, it plans to bring Best Buy's best practices — including innovative store design and knowledge management — to each of its new divisions while sharing the best ideas from Musicland, Magnolia and Future Shop with all.
The company is also looking to leverage its vendor relationships to develop new, complementary businesses that will further entrench it with consumers.
Here's a look at the game plan:
Best Buy Stores
Best Buy caters to techno-savvy 15- to 44-year-olds with annual household incomes of $50,000 or more. This group encompasses technophiles, active singles, and young fun-seekers who like the stores' low-pressure atmosphere and grab-and-go merchandising displays.
The company has been adding about 60 stores a year and will have some 540 units by 2003. As large markets reach saturation with the standard 45,000-square-foot stores, it will increasingly tap into secondary markets with populations of about 200,000 with a new generation of 30,000-square-foot stores. Some 90 of those configurations will be built during the next three years, while the balance will be in the larger, newly developed Concept 5 format, which was designed to better showcase digital products.
At the same time, the company will begin drilling down to tertiary markets with populations of 100,000 via a 20,000-square-foot prototype store. Tests of two such units will begin this October in Texas and the state of Washington.
Lenzmeier said the company will eventually top out at 800 Best Buy stores in total.
Best Buy places a tremendous emphasis on customer satisfaction and loyalty as a means to increase the amount of money consumers spend in its stores. Customer loyalty is measured monthly, with the goal of attaining a 50-percent "very favorable" rating by shoppers.
On the product front, several initiatives are in the works. Targeted areas include:
Appliances. Best Buy has spent the past two years "re-engineering" its major appliance business in an effort to improve execution. The company currently has a 5 percent market share in white goods, the smallest of any category it carries. Major appliances ("majaps") also have the lowest close rates of any Best Buy department. The company believes there is a "significant opportunity" to double its majap market share through improved assortment and better systems to manage that business, Lenzmeier said.
Computers. Taking a page from the Dell playbook, Best Buy is stepping up its after-sale service by bringing fulltime repairmen into its stores. The move follows a six-month pilot project that resulted in improved turn times and a reduction in complaints. "Service after sale is critical," Lenzmeier said. "Now, customers will be able to bring their broken PC to the store."
He said the new service will help increase customer loyalty, and added that the company is also looking at providing installation services for in-home networks.
Video. In anticipation of significant growth in high definition, plasma and LCD TVs during the next five years, Best Buy will begin testing a new design for its video departments this fall that better demonstrates the features of high end products. A national rollout will follow.
In addition, the company may eventually offer in-home installation service for plasma TVs that would either be outsourced or developed in-house.
As Mike Keskey, president of Best Buy Stores noted, the idea is to offer consumers "complete solutions" that include accessories, services and extended warranties, rather than just products alone. One graphic demonstration of that is in Best Buy's home theater area, dubbed Project Living Room, where more services are being incorporated into the sales floor.
Bestbuy.com. The company has rolled its e-commerce unit into the Best Buy Stores operation under Keskey. As Lenzmeier explained, a significant opportunity exists to sell more by providing customers with access to the Web site in stores.
Musicland
The company plans to significantly expand the group's stores into small and rural markets, where they can benefit from lower real estate and labor costs. In light of the soft music software market, the group will expand its assortment of video games, and some 200 Sam Goody stores will be re-merchandised with DVD software. The company will also "intensify" its re-merchandising of Sam Goody with personal electronics, Lenzmeier said.
Longer term, the group is "looking at broader entertainment areas like satellite dishes, satellite radio and concert ticket sales," said Musicland president Kevin Freeland. He added that despite a sharp decline in CD volume, the entertainment software category is still growing via DVDs and video games.
"We're allocating floor space toward growing businesses and managing the resources behind declining businesses," Freeland said. "We have the numbers, the total area is growing, but we just wish it was more robust on the music side."
Magnolia
The specialty A/V chain boasts an affluent customer base of consumers ages 35 to 54 with annual household incomes of $75,000 and higher. There is little overlap between Magnolia's SKUs and those of Best Buy.
The company is opening six new stores in San Francisco, which will bring the store count up to 19. It ultimately envisions 150 Magnolia stores nationwide, and believes it can build the chain into a $3 billion business by expanding its product and service offerings.
International
At this point the company's international endeavors are limited to Canada, which is serving as a training ground for eventual offshore expansion.
As the first order of business, Best Buy is beefing up its Future Shop chain by bringing over such best practices as inventory management, advertising effectiveness and complete solutions selling. Through these measures the company expects that it can increase Future Shop's profitability from 2 percent to 4 percent, which would match that of Best Buy stores.
The company also believes the Canadian market is ripe for expansion, and plans to build 30 new Future Shop stores, remodel existing ones, and open upwards of 65 Best Buy stores north of the border. The Best Buy brand will first appear in Toronto, where the first of eight units will open this fall.
According to Lenzmeier, Future Shop presently commands a 15 percent share of the Canadian CE market. With the addition of new stores and Best Buy, the company believes it can boost its total market share to 35 percent.
As chairman Dick Schulze explained, "Historically, there are two key leaders that outperform the field in any industry. Rather than have competing brands like Wal-Mart and Target, or Circuit City and Best Buy, we'll have a co-branded situation. We'll control both brands."
Tom Healy, president of Best Buy International, added that the only way to reach a 30 percent to 35 percent market share in Canada is with two brands, and he assured shareholders that the two franchises would be differentiated in the marketplace.
The company wouldn't pinpoint its first overseas venture, although Europe is the likely target. "We're looking at opportunities everywhere," was all Healy would say on the subject, "and Canada is our current direction."
http://www.tvinsite.com/twice/index.asp?layout=story_stocks&articleid=CA232216&display=Retai...
culater
DataPlay: Circuit City To Launch DataPlay Device
By Doug Olenick
TWICE
7/8/2002
New York— Circuit City is expected to launch a house-branded version of a DataPlay portable audio player at the end of July, according to a DataPlay executive.
Raymond Uhlir, DataPlay's marketing VP, said Circuit City's product will use the Classic brand name and cost about $300. Uhlir displayed a working model of the Classic player at PC Expo last week.
A Circuit City spokesman would neither confirm nor deny Uhlir's statement, other than to say that the company is looking into selling DataPlay devices at some point.
Circuit's introduction will come just after iRiver and Evolution launch their players, which also were on exhibit, in early July.
"The OEM brand option offers a better profit model for the retailer," Uhlir said, although he expects Circuit City to also sell DataPlay devices from other brands.
Samsung is slated to bring out a player for the holidays, Uhlir said. At that time about 12 different hardware devices should be on the market. About 50 prerecorded music titles should be on the market for the July launch with 10 to 20 being added each week going forward, Uhlir said.
Toshiba, the only other top-tier CE maker with DataPlay plans, has not made any announcements on whether it will introduce a product.
DataPlay hardware was originally slated to hit retail last fall, but a self-imposed manufacturing delay pushed back the launch.
Imation has readied its blank DataPlay media product to coincide with the hardware launch. Jane Rodmyre Payfer, Imation's personal storage solutions marketing manager, said Imation has exclusive worldwide distribution and manufacturing rights for DataPlay media. Imation intends to start off slowly by shipping just one SKU, a three-pack for $29.99, in July, she said.
Payfer believes DataPlay hardware and software will make a big splash during the holiday season. "DataPlay is one of the very few new technologies being introduced for this holiday season," she said, which should bring the technology a certain amount of cachet among consumers.
http://www.tvinsite.com/twice/index.asp?layout=story_stocks&articleid=CA232228&display=Compu...
culater
Labels defend MusicNet, Pressplay
By Jim Hu
Staff Writer, CNET News.com
July 8, 2002, 10:40 AM PT
NEW YORK--Record industry executives and critics are trading barbs at an industry conference this week, with an outspoken legislator saying major labels' online subscription services may amount to a "duopoly."
Rep. Rick Boucher, D-Va., warned Monday that the recording industry's efforts to sell music on the Internet could have anti-competitive implications.
During a keynote address at the Jupiter PlugIn online music conference here, the congressman also roundly criticized the recording industry's moves to prevent piracy of their copyrighted works. Among the topics targeted by Boucher's address were the recording industry's attempts to protect CDs, existing federal copyright law, and future legislation that could give content providers more power to take matters into their own hands.
But the creation of the recording industry's online music services Pressplay and MusicNet has long been the focus of Boucher's criticism. He called Monday for legislation that requires both services, which combined own the copyrights to 80 percent of all recorded music, to offer the same license terms to competitors. Boucher added that offering preferential licensing terms could have anti-competition implications.
"That level of duopoly...of content ownership and the ownership of distribution is threatening to the arrival of competition in the delivery of music on the Web," he said.
Record industry executives, however, said Boucher's complaints merely reflected his desire for more regulation.
"I think he (makes such statements) to try to give more weight to his desire to regulate the music industry," said Hilary Rosen, CEO of the Recording Industry Association of America. Boucher has "taken a piecemeal approach to individual complaints and tried to put it in the context of a whole strategy. But it's not a strategy."
The two industry-backed services were created last year in attempts to convince consumers they should be paying for music they access through the Web. Pressplay is a joint venture between Sony Music Entertainment and Vivendi Universal's Universal Music Group, while MusicNet has the backing of EMI Recorded Music, Bertelsmann's BMG Entertainment, AOL Time Warner's Warner Music Group and streaming media company RealNetworks.
Pressplay and MusicNet were reactions to file-swapping services such as Napster, Gnutella and Kazaa, which have lured millions of people by allowing them to share music files for free. The recording industry has successfully sued and crippled Napster, requiring it to create a service that respects copyrights.
Since launching, consumers using Pressplay and MusicNet have criticized the services as inadequate, citing catalog limitations and clunky technology. Meanwhile, online music start-ups have alleged that the recording industry offered them unfavorable licensing terms to hinder their businesses. Such complaints prompted the Justice Department to begin investigating the services.
Boucher also took a shot against the recording industry's attempts to introduce copy-protection technology in new CDs, predicting the industry would "pay a heavy price" among angry consumers. He noted that such technology routinely has been thwarted by homemade measures and that continued practice would only push people toward free file-sharing services.
"I'm quizzical about why this approach is going forward," Boucher said.
RIAA's Rosen, however, argued that technological controls could gain consumer support.
"I am optimistic that if handled right and explained fully, consumers will support formats that give them flexibility and also help us stop piracy," she said.
http://news.com.com/2100-1023-942066.html?tag=fd_top
culater
Blockbuster movies via broadband still a pipedream - for most of us at least
By David Walker
July 2 2002
Back in 1996, your columnist did a few sums and realised that by 2001 or thereabouts, computers would be able to receive and display compressed video over the broadband cable network being built in Sydney and Melbourne. It all seemed to add up: the computer and the Internet would become a powerful entertainment medium.
These days, your columnist thanks his lucky stars that he never turned this into a full-fledged prediction because it has proved thoroughly wrong. The Net Equity household did indeed watch Harry Potter on computer last week but the movie came on a DVD delivered in the family car, not via the Internet.
It is technically possible to provide high-quality movies over the Internet.
An hour of TV-quality video encoded in the DivX format or Apple's new QuickTime 6 will use as little as 500MB of disk space - not much on today's 40GB-plus hard drives and $1 recordable CDs.
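That 500MB-per-hour figure can be sanity-checked with a back-of-envelope calculation (a sketch added here, not from the article; the function name is illustrative): it implies an average bitrate of roughly 1.1 Mbit/s, in the range early DivX encoders targeted.

```python
# Back-of-envelope check of the 500MB-per-hour figure above.
# Assumes 1 MB = 10^6 bytes.
def avg_bitrate_mbps(size_mb: float, duration_s: float) -> float:
    """Average bitrate (Mbit/s) of a file of size_mb spanning duration_s seconds."""
    return (size_mb * 8) / duration_s

# One hour of video in 500MB:
print(f"{avg_bitrate_mbps(500, 3600):.2f} Mbit/s")  # ~1.11
```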
Indeed, such files do circulate clandestinely among computer users who create them from DVDs; illegal copies of Spider-Man and Attack Of The Clones reportedly circulated even before their official premieres. Broadband providers even cite the burden of these huge movie files as a reason for abandoning unlimited-download broadband Internet offerings.
But while technology makes Internet-delivered movies feasible, it has not yet made them an economic success. If you've seen Spider-Man, you almost certainly lined up at the cinema and passed your money over the ticket counter.
Where is legal Internet-delivered video succeeding? So far, in niches:
Poor prices for Web ads ensure only a few video content sites will support themselves with ads. News sites actually seem to be reducing their video content offerings. Loss-making news sites have awoken to the reality that streaming video gives you few economies of scale: the millionth streaming customer costs as much as the first.
Online streaming video subscriptions may find more success, with RealNetworks, CNN and FoxSports among the groups reportedly launching subscription offerings in the US, even as they wind back free content. But streaming video quality, though improving, remains poor. And the lousy economics remain a problem for subscription sellers.
Pay-to-download video looks more promising, since users at least show enthusiasm for it. But content providers hate this model because of its piracy potential. Production companies can't easily prevent ordinary users from passing on downloaded digital video to all their friends for free - and what would happen to DVD and tape revenues then?
The real success of online video has been in the marketing of movies and television programs. Online movie trailers make a neat fit with broadband Internet: they are short enough to download in reasonable time, and they use attractive, already-created content. The marketers of Peter Jackson's Lord Of The Rings movie, who brought a new boldness to Internet movie promotion generally, allowed fans to download a two-and-a-half minute trailer to their hard disks, rather than just viewing a low-quality streamed version.
Online video will keep on evolving over the next few years, as broadband reaches more users, digital technologies spread through homes and offices and new hardware and software appears. But you'd be foolish to predict the outcome of that evolution. Remember, back in 1996 almost nobody would have expected the movie trailer to be 2002's most successful variety of Internet movie.
David Walker (david@shorewalker.com) is general manager, site, at Internet loan-finding service eChoice. http://www.smh.com.au/articles/2002/07/012/1023864702738.html
culater
Voice technology in the telematics market
Strategy Analytics Report - one of a series of analyses that form part of the In-vehicle Telematics and Multimedia Service (ITMS) (6/19/2002)
The major changes over the last year in the automotive voice technology market have been mainly in North America. The decline of Lernout and Hauspie (L&H) has led to the advancement of a number of voice recognition suppliers. North American voice technology suppliers have moved across the value chain and have taken a significant role in the development of telematics system user interface.
The primary application of voice activation technology in the in-vehicle environment is still hands-free mobile phone operation; however, much of the new development activity is focused on integration into on-board systems and adding value in service delivery. This is broadening the range of partnerships and opportunities for voice technology vendors.
Although voice technology is still developing through in-vehicle use of mobile phones, OEMs are embedding VR as an integral part of telematics system HMI (human-machine-interface), for safer and more convenient operation.
Japan is a market leader in automotive voice technology adoption, due to strict hands-free regulations for in-vehicle use of mobile phones, plus its application in autonomous navigation systems in the domestic marketplace. However, Japanese systems will not grow in sophistication as quickly as those in the North American market, which is younger and more flexible.
Movement throughout the value chain was a key development through the past year. Although most companies are partnered in order to be part of end-to-end product solutions, some have recently developed their own new products to expand across the chain.
Since voice portal providers developed highly functional systems with numerous applications, they have become more valuable to the telematics providers. Voice portals such as BeVocal and Tellme use the voice technology services of providers such as Nuance to allow the user to access a call center. These services also provide the ability to log in and personalize the services, which was shown in the Strategy Analytics' End User Dynamics Report to be a key quality of a successful telematics system.
Strategy Analytics expects that due to safety concerns and market competition, installation of voice systems will grow steadily through the period to 2008. Almost 20 million voice systems are expected to be OEM installed in 2008, representing 60% penetration of telematics systems.
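Taken together, the forecast's two figures pin down the implied size of the 2008 telematics installed base (a trivial calculation, included only to make the total explicit; not stated in the report excerpt):

```python
# Implied 2008 telematics installed base from the forecast above:
# 20 million voice systems at 60% penetration of telematics systems.
voice_systems = 20_000_000
penetration = 0.60
telematics_systems = voice_systems / penetration
print(f"{telematics_systems:,.0f}")  # ~33,333,333 OEM telematics systems
```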
Voice technology development has changed from an on-board/off-board dichotomy to developing through combinations of portals and simplified embedded devices. On-board systems tend to be more precise and useful when the system has a limited set of necessary commands. However efficient, these systems tend not to meet all the needs of users who want a more natural, flowing interaction.
ITMS Report Analyst:
Suzanne Murtha
email: smurtha@strategyanalytics.com
http://www.telematicsupdate.com/homepage2.asp?news=29508
culater
"Embedded processors also tie into everyday life in the real world (RW) more than other children of technology. Next year's hot gift will probably be driven by a new embedded processor or two. Or four. It's not about "where do you want to go today" it's about "what do you want for your birthday?" A key factor in embedded success is getting to the one-spouse decision: the price point at which you can buy yourself a cool new gadget without asking permission first. Economists say that point is around $299; below that, sales take off." http://www.extremetech.com/print_article/0,3428,a=21424,00.asp
culater
A Gadget Burns Hours of Music Onto Discs the Size of Quarters
By SARAH MILSTEIN
Slip a quarter into a slot, and you used to get a handful of peanuts or a three-minute phone call. Now iRiver is introducing a digital music player that both reads and burns discs no bigger than a quarter. Slip a disc into the iRiver iDP-100 player, and it's good for up to 11 hours of music.
The discs, made by DataPlay, can hold up to 500 megabytes of data, enough for about 150 songs in any of several formats, including MP3 and WMA. The tunes can be downloaded to the discs by using a U.S.B. cable. DataPlay also has deals to sell prerecorded discs by artists on several major labels, including Universal, EMI, BMG and Zomba. Blank discs will cost $5 (250 megabytes) to $10 (500MB); prerecorded ones will cost about as much as a regular CD.
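The capacity and song-count figures can be cross-checked with a quick sketch (the 128 kbit/s MP3 bitrate below is a typical value of the era, assumed here rather than stated in the article):

```python
# How many minutes of audio fit on a DataPlay disc at a given bitrate?
# Assumes 1 MB = 10^6 bytes and 1 kbit = 1000 bits.
def minutes_of_audio(capacity_mb: float, bitrate_kbps: float) -> float:
    """Playback minutes that fit in capacity_mb at a constant bitrate_kbps."""
    seconds = (capacity_mb * 8 * 1000) / bitrate_kbps
    return seconds / 60

# 500MB at 128 kbit/s MP3: ~521 minutes, i.e. roughly 150 songs of
# about 3.5 minutes each, matching the article's estimate. The quoted
# "11 hours" implies a somewhat lower average bitrate, around 100 kbit/s.
print(round(minutes_of_audio(500, 128)))  # 521
```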
The blank DataPlay discs can also be used to store MPEG-4 files, digital photographs, games and documents in any digital format. While the iDP-100 cannot read such files, it can be used to transfer them between computers with U.S.B. connections.
The iDP-100, which is expected to go on sale in mid-July for about $350, looks like a yo-yo in a green plastic frame and weighs 7.5 ounces. A standard-issue rechargeable battery affords 11 hours of play time, and an optional 20-hour rechargeable battery will be available.
http://www.nytimes.com/2002/06/13/technology/circuits/13GEE5.html?ex=1024632000&en=2099e7e6b4004...
culater
OT-The Technology Innovation Squeeze
By Teri Robinson
www.EcommerceTimes.com,
Part of the NewsFactor Network
June 12, 2002
http://www.ecommercetimes.com/perl/story/18185.html
IDC's Stephen Minton said a sluggish economy is not always bad for R&D over the long term, noting there is some evidence that a downturn often precedes innovation.
An economic downturn means fewer dollars for salaries, business resources and, unfortunately, research and development. But is a poor economy synonymous with a dip in technical innovation, particularly at bellwether companies like Cisco and IBM?
Not necessarily, according to analysts. "There's less money across the board, but the dollars that once would have gone into marketing have been put into R&D," Kent Allen, an analyst with the Aberdeen Group, told the E-Commerce Times. "Most companies' boards are not demanding that they spend US$1 million a month to build brand."
During the economy's nearly two-year slide, technological innovation did not come to a complete standstill. For example, industry giant Intel was able to produce smaller, more powerful chips, and Proxim and Research In Motion took wireless innovation to new heights.
But this economic downturn has been characterized by a more measured approach to innovation, meaning that some Star Wars-like technologies, such as smart cards and artificial intelligence, have languished.
From Anything Goes
When the economy was red hot, every innovative idea came to fruition -- regardless of its utility. Just look at Pennsylvania-based e-Vend.net, which installed Internet-connected washing machines in the dormitories of the Massachusetts Institute of Technology (MIT). Students can check washing machine availability online and can receive e-mail alerts from newly vacated machines.
"Wireless was a big area because it was fashionable," Aberdeen Group analyst Andre Arkhipov told the E-Commerce Times. Vendors worked hard to apply location technology to wireless devices in hopes of stimulating marketing opportunities for e-commerce-driven enterprises.
But the ability to send a 99-cent Big Mac coupon via cell phone to someone about to pass a McDonald's hardly seems necessary, or even prudent, in a less-than-exuberant economy.
Instead, the focus has shifted to the practical rather than the breathtaking. "On the financial side of the world, there has been a definite shift," Arkhipov said. "Most companies are looking to streamline processes. They are looking for near-term results. They are not looking for fashionable stuff."
Practical Magic
Vendors are responding in kind with technology designed to ease enterprise woes rather than with gee-whiz innovations.
Sun Laboratories is one group that has taken up the mantle of practicality, noting on its Web site, "Even though our research may push the boundaries of what is possible, we work hard to keep our development focused on what is practical and profitable."
There is also a change in who is doing the innovation. "Smaller companies that I cover actually kind of got their technology going in the wake of the downturn," Allen said. On the whole, he noted, smaller and private companies "have done a better job of forcing R&D" because public companies must meet their financials and have less latitude with budget dollars.
Technologies left on the back burner during the economic downturn now are beginning to make their way to the forefront, more out of necessity than because of a true economic recovery.
The federal government, dogged by older technologies and an aging workforce, will be spending more on its IT needs -- and outsourcing more of them. And while federal dollars represent a small slice of IT spending, the private sector is expected to join in by reviving technologies like smart cards and pushing biometrics even further.
Recent examples include Palm Beach, Florida-based Applied Digital Solutions' "Verichip," an implantable device that the company hopes can be used for defense and security purposes; and a deal announced last week by Littleton, Massachusetts-based Viisage to provide its face-recognition technology to the U.S. Army.
Terrorism Spurs Innovation
Yankee Group program manager Andy Efstathiou told the E-Commerce Times that the atmosphere created by the terrorist attacks of September 11th will add to the demand for new technology. "There is an increased interest and increased demand," he noted.
"I would assume that this type of environment -- post-September 11th -- would propel the adoption of smart card technology," Efstathiou added.
Indeed, such technology, which is widely adopted in Europe but lags in the United States, is just one example of an initiative that has been revived in the fight against terrorism.
According to Efstathiou, disaster recovery and business continuity technology and systems also will "be given additional impetus as a result of September 11th."
Will AI Beat the Hype?
Other innovative technologies also are making progress despite the economic downturn.
Advances in artificial intelligence (AI) eventually could turbo-charge customer analytics, giving companies speedier insights into individual buying patterns and a host of other consumer habits.
AI has gotten a bad rap in the past and has been characterized as too much hype and not enough practical application. But that is about to change, according to the technology's proponents.
SAS Institute, a privately held software company that specializes in business intelligence, is among the companies studying the use of AI-enhanced data mining.
SAS officials claim that one client, financial services firm Dreyfus, cut its customer attrition rate by more than 40 percent through use of an advanced analytics program, which helped identify and fix problems that were sending customers elsewhere.
Finding Fraud
Anne Milley, director of analytics strategy at SAS, said AI speeds up the analytics process and points users toward deep logical patterns that algorithms alone might not pick up.
"Neural networks go after patterns that might be significant indicators of fraud," Milley told the E-Commerce Times. The technology holds particular promise for alerting insurance companies to false claims, she said.
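The pattern-spotting Milley describes can be reduced to a toy sketch: a single logistic "neuron" that learns to score claims from a couple of features. This is strictly an invented illustration -- the features, data, and training setup here are hypothetical and have nothing to do with SAS's actual system, which would use far richer data and full networks.

```python
import math

# Toy illustration of neural-network-style fraud scoring (invented data;
# not any vendor's real system): one logistic "neuron" trained by
# gradient descent on log loss.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, epochs=2000, lr=0.5):
    """Fit weights and bias with plain stochastic gradient descent."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log loss w.r.t. the pre-activation
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Hypothetical features per claim: (claim size relative to policy average,
# fraction of the policy term elapsed before the claim), both scaled 0..1.
claims = [(0.1, 0.9), (0.2, 0.8), (0.15, 0.7),    # routine claims
          (0.9, 0.1), (0.95, 0.05), (0.85, 0.2)]  # suspicious pattern
labels = [0, 0, 0, 1, 1, 1]

w, b = train(claims, labels)

def fraud_score(x):
    """Probability-like score in [0, 1]; higher means more suspicious."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

print(round(fraud_score((0.9, 0.1)), 2))  # large, early claim: high score
print(round(fraud_score((0.1, 0.9)), 2))  # small, late claim: low score
```

On this invented data, a large claim filed early in the policy term scores near 1 while a routine claim scores near 0; the point is only that the network learns the pattern from examples rather than from hand-written rules.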
Michael Wellman, director of the University of Michigan's Artificial Intelligence Lab, told the E-Commerce Times that "the state of the science [of AI] and the practice, including application to business, has never been healthier."
Innovation Ahead
Regardless of the reason, however, the innovation landscape should change soon. IDC research director Stephen Minton and others said they expect a tech recovery could occur as soon as next year, spurred by an improving economy and a need for greater security in the wake of September 11th.
Minton explained that a sluggish economy is not always a bad thing for R&D over the long term. There is some evidence that a downturn often precedes innovation, he noted.
"In a funny kind of way, a downturn is the best thing that can happen to innovation," he said. "If you look back at the 1980s, it was the crash of the PC stocks and, in the end, everyone was predicting the end of PCs.
"Out of the ruins, there were a lot of companies like Microsoft, Cisco and Compaq that really started to appear and become the huge, global innovators they were over the next 10 years," Minton added.
http://www.techextreme.com/perl/story/18185.html
culater
Bluetooth Heads to the Highways
June 12, 2002
By: Carmen Nobel, eWEEK
The Bluetooth industry is hoping to find opportunity in failure.
Supporters of the short-range wireless technology have been searching for a sweet spot since Ericsson AB introduced it more than three years ago as a way for cell phones to communicate with headsets, PDAs, laptops and other nearby devices.
Since then, industry players have been pitching the technology for applications ranging from billboards to wireless LANs, but it has been slow to mature and even slower to be adopted.
When Ford Motor Co. pulled out of Wingcast LLC, its joint venture with Qualcomm Inc. to produce telematics services in vehicles, the automaker said it was still looking at Bluetooth. Wingcast was based on cell phone technology.
And at the Bluetooth Congress in Amsterdam this week, several Bluetooth components manufacturers will be making the case for Bluetooth as the ultimate technology for automotive telematics.
For starters, a telematics executive from Chrysler USA will deliver a keynote speech that champions Bluetooth in the car. Chrysler plans to include Bluetooth support in the consoles of some of its 2003 models.
"There was so much early hype about telematics that when it didn't happen immediately people said it was going down the gutter," said Ken Noblitt, business development director at Cambridge Silicon Radio Inc. in Richardson, Texas. "Telematics is not slowing down. That's a load of bull."
At the Bluetooth Congress, CSR will be discussing partnerships with several telematics companies that are planning Bluetooth-enabled applications for a handful of major auto manufacturers, officials said.
Initial applications will focus on hands-free communication between a console in the car and devices such as cell phones that drivers are likely to carry with them. More complex applications will be available when the industry comes up with Bluetooth solutions that can reside in a car's engine, but current Bluetooth radios can't survive the heat. CSR officials said that auto companies other than Chrysler will announce Bluetooth plans by fall.
Bluetooth veteran Extended Systems Inc. is teaming up with Visteon Corp. and BMW to make the case for Bluetooth in the car. The companies at the show will demonstrate a way to control various devices and car console functions via a Bluetooth-powered voice connection.
Widcomm Inc. will demonstrate what the San Diego, Calif., company calls "CD Quality" audio, an application also aimed at the telematics as well as the entertainment industry, officials said.
The auto industry is far from universally embracing Bluetooth, however. For one thing, the Bluetooth Special Interest Group has yet to ratify a standard profile specifically for hands-free communications.
There is also the money issue that has been dogging Bluetooth from its inception. A few radio companies have created single-chip solutions that cost about $5 in volumes of millions, but it has taken nearly four years to lower the price of Bluetooth radios.
Sources at Toyota Motor Co. said that the company is exploring Bluetooth, but is far from committing because prices haven't dropped as quickly as anticipated.
http://www.extremetech.com/article/0,3396,s=201&a=28038,00.asp
culater
e.Digital goes on an Odyssey (The Jerusalem Post, 6/9/2002)
e.Digital Corporation plans to introduce a new family of digital audio players called Odyssey, which can be used on both PCs and Macs.
Each of the three new pocket-sized, flash-memory-based Odyssey digital audio players has its own range of features and is aimed at a different budget category. All come with embedded flash memory, a SmartMedia card expansion slot allowing consumers to add up to 128 MB of additional storage, and e.Digital Odyssey Manager software, which is compatible with both PC and Macintosh platforms.
The compact Odyssey 100 MP3 player features up to 30 hours of battery life from a standard AA Alkaline battery. With a standard 64 MB of embedded flash memory, stereo earphones, and an intuitive user interface, this MP3 player squeezes every last minute out of a standard AA battery, making it the ideal choice for economy-minded consumers.
The Odyssey 200 features not only MP3 file playback, but also an FM tuner, 20 customizable FM station presets with digital tuning, and high-quality voice recording using the built-in omnidirectional microphone. Students, businesspeople, and writers of any kind can create and store over eight hours of voice recordings using the Odyssey 200's standard 128 MB built-in flash memory. A SmartMedia card expansion slot lets consumers add up to 128 MB of removable flash. A handy, wired remote control is standard with the Odyssey 200 and can be clipped to a shirt collar, pocket, belt, or tie to control playback and other functions while the player sits in a backpack or pocket.
The Odyssey 300's direct MP3 encoding allows users to rip files from their CD collection without the use of a computer, and play them back immediately. Other features include a digital voice recorder, FM tuner, FM recorder, and up to 12 hours of battery life. With a standard 128 MB of built-in flash, the Odyssey 300 holds voice, music, and FM recordings totaling up to 8.5 hours, and users who choose to add a SmartMedia card via the expansion slot can double that amount. Its simple, intuitive joystick control is easy to use, and the alphanumeric blue backlit LCD is easy to read.
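A back-of-the-envelope check of those capacity figures (my arithmetic, not e.Digital's): 128 MB holding up to 8.5 hours of recordings implies a modest, voice-grade average bitrate, well below music-grade MP3 rates.

```python
# What average bitrate do the Odyssey 300's claimed figures imply?
flash_bits = 128 * 1024 * 1024 * 8      # 128 MB of flash, in bits
seconds = 8.5 * 3600                    # "up to 8.5 hours" of recordings
avg_kbps = flash_bits / seconds / 1000  # implied average bitrate, kbit/s

print(round(avg_kbps))  # 35
```

Roughly 35 kbps on average, which is consistent with compressed voice recording rather than the 128 kbps typical of music MP3s.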
The Odyssey line is scheduled to begin shipping July 1. Prices are: $129 for the Odyssey 100; $179 for the Odyssey 200; and $229 for the Odyssey 300. www.edigital-store.com
http://www.e-insite.net/index.asp?layout=article&articleId=LN461H-WR30-002T-32S8-00000-00&ti...
culater
Dueling Audio Technologies To Wage War in Space, On the Ground
By Mark Long -- 6/11/2002
e-inSITE
According to the findings of a new report from Allied Business Intelligence (ABI), the launch next month of the new Sirius digital radio service is just one sign of the impending fragmentation of the US digital radio market, which is projected to continue well into the decade as terrestrial digital radio comes of age. The fragmentation may parallel the deployment of digital video technology, both in the sky and on the ground, during the 1990s.
The AM/FM broadcasting community will be entering the digital domain shortly, with help from iBiquity's IBOC (In-Band On-Channel) technology. IBOC-ready receivers are slated to reach both the US automotive OEM and after markets in 1Q03.
'XM, Sirius, and iBiquity have created their own proprietary chipset specifications, with any hopes for standardization still several years away,' said senior ABI automotive analyst and report author Frank Viquez in a prepared statement. 'In the interim, major chipmakers will need to cater to these various US digital audio standards in addition to global standards such as DAB (Digital Audio Broadcasting), and DRM (Digital Radio Mondiale). This promises to keep silicon costs high until significant mass production levels are reached.'
Sirius Satellite Radio Technology
Sirius Satellite Radio is introducing a new technology for enhancing audio listening performance just in time for the launch of its nationwide service next month. The technology will employ a new version of the PAC v4 audio codec developed by iBiquity Digital, which is based on the latest generation of psycho-acoustic modeling techniques for understanding the physiology of hearing. According to iBiquity, the new PAC v4 audio codec responds rapidly to the dynamics of audio signals to provide an 'open sound-stage' effect, with high dynamic range and stereo separation. The codec also employs a sophisticated model for encoding audio signal harmonics to achieve higher coding efficiency for complex signals. In addition, the technology incorporates unique features such as an adaptive filtering capability and efficient multi-stage noiseless coding.
XM Satellite Radio's CT-aacPlus Technology
Last April, XM Satellite Radio unveiled the details behind its own sound-enhancing technology, which features near-CD-quality CT-aacPlus audio encoding with Neural Audio optimization. The company's CT-aacPlus-enabled digital radios are based on custom chips developed by STMicroelectronics and the Fraunhofer Institute, which have been designed to process the satellite and repeater signals and then decode the music, speech and data.
According to XM Radio, CT-aacPlus combines Advanced Audio Coding (AAC) with Coding Technologies' Spectral Band Replication (SBR) technology to create additional bit-rate efficiencies.
The XM platforms devote a considerable amount of bandwidth to error correction and concealment processes to ensure an extremely reliable delivery mode that keeps momentary signal losses from interrupting the continuous flow of music. The XM Radio sound channels are further optimized through the use of proprietary pre-processing software from Neural Audio, which employs neural network computing techniques to implement algorithms based on models of the brain's perception of sound. Neural Audio has created a customized version of its process, designed to enhance CT-aacPlus results by optimizing temporal and spectral elements prior to encoding, to improve soundstage clarity as well as increase signal intelligibility.
Neural Audio's 'stereo transcoder' algorithm has been designed to preserve the imaging and spatiality of stereo and surround-sound content. XM customers with matrix-style Dolby surround sound equipment can therefore receive a full surround sound experience in their automobiles.
The Terrestrial Transition To Digital Audio
The terrestrial transition to digital is expected to offer radio listeners significantly enhanced audio quality: equivalent to CD quality on the FM band, with new digital AM transmissions matching the quality already achieved by the current analog FM standard. In addition, the new digital radio broadcasting standard will give radio stations the ability to offer new wireless data services.
IBOC technology has been designed to operate simultaneously with each station's existing analog transmission and within the existing spectrum allocation. Existing radio receivers will continue to receive analog signals while new IBOC-based receivers will feature the ability to receive both analog and digital signals, with stations continuing to be found at their current locations on the radio dial.
The new consumer-oriented IBOC receivers are scheduled to become commercially available early next year after taking their initial bows at the Consumer Electronics Show (CES) in Las Vegas. Radio stations will be able to use the IBOC technology's wireless data service capabilities to transmit artist and song identification tags, as well as local and station information, so that it can be displayed on the front-panel screens of the new IBOC receivers.
Philips Semiconductors will be marketing an ASIC for iBiquity Digital's In-Band On-Channel (IBOC) digital broadcast technology, combining iBiquity's IBOC design with its own IC expertise to produce IBOC-capable chips for digital radio applications. The Philips ASIC is reportedly capable of supporting multiple platforms for consumer product applications. According to ABI, domestic shipments of digital radio receivers are expected to increase from approximately 650,000 units in 2002 to over 33 million by 2007.
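ABI's forecast implies a striking compound annual growth rate; the quick calculation below is my own illustration, not part of the report.

```python
# Compound annual growth rate implied by ABI's receiver-shipment forecast.
units_2002, units_2007, years = 650_000, 33_000_000, 5
cagr = (units_2007 / units_2002) ** (1 / years) - 1

print(f"{cagr:.0%}")  # 119%
```

Shipments would have to more than double every year for five years straight, which underlines how aggressive the forecast is.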
http://www.e-insite.net/commvergemag/index.asp?layout=article&articleid=CA221952&spacedesc=n...
culater
Hardware challenges may create de-facto standards
The challenge of keeping up with the rate of change in electronics, a rate far swifter than the product lifecycle of automobiles, will be a major hurdle to would-be telematics providers. (6/11/2002)
By Tim Moran
Running in the race may be the only way to arrive at generally accepted telematics standards, though, a panel discussion at Telematics 2002 Detroit, held at Cobo Center in mid-May, implied.
"Whatever happens, you have to have a flexible and agile vehicle-based system," said Jim Geschke, vice president of electronics integration for Johnson Controls.
Geschke was one of five panelists convened to discuss reducing the cost of in-vehicle electronics and configurations. Others on the panel included Ken Khangura, Ford chief electrical engineer; Hany Neoman, Wavecom chief operating officer; Rich Pearlman, Denso International America, Inc., vice president for telematics; and Mark Peters, Robert Bosch Corp. director of information electronics. The panel was moderated by Neil Cox, SAIC telecommunications sector executive vice president.
Cox asked the panelists if they foresee a common operating structure for future telematics systems, so that a consumer could go from one vehicle to another of a different manufacturer and have a very similar experience without having to get the owner's manual out.
"We all are sitting here with cellular phones, and we all know we have to push 'send' to make a call," he said, setting the example.
Geschke said that making the cell phone, not the car, the technology "brick" can enable that common approach. JCI's module being supplied as part of DaimlerChrysler's U-Connect program relies on that concept, mating a Bluetooth-enabled cell phone with an electronic module that connects to in-car systems.
"The handheld cellular phone assures compatibility car to car," Geschke said.
Pearlman said that discovering what people want in their car will drive standards, rather than any form of hardware creating operating systems and protocols.
"The issue is trying to restart, or drive, a stalled telematics industry in North America," he said.
Neoman, of Wavecom, disagreed, saying the interface between the car and the customer is probably the key. Khangura, of Ford, said the crux of the matter was enhancing the consumer experience in the vehicle through a well-developed human-machine interface (HMI).
"The challenge really there for the automotive manufacturers is, what is the brand, what is the HMI, how do you optimize the whole interior? What is brand distinction? That is the front end," Khangura said.
Cox next asked how standards for telematics communication are evolving, and who sets them. He noted that the US wireless industry couldn't agree on a standard, and now has three -- GSM, CDMA and TDMA -- a costly differentiation that doesn't add much value for consumers.
Pearlman said standardization needs to follow consumer desires.
"It's too much effort trying to standardize something that doesn't exist yet. The standards won't make the market. It's applications: What do people want in their car?" he said.
Geschke said standards may be the wrong concept to use; instead, a unified way to deal with how drivers interact with telematics may be the key.
"We think we owe it to the consumer public to manage all these functions that are coming into the vehicle," he said.
Doing that with electronics that rapidly become obsolete, in vehicles that may not even begin being manufactured before their controller chip technology is out of date, provides a real block to developing a cost model for telematics hardware.
"How is the industry going to deal with obsolescence? The cars are going to outlast any RF scheme that's used in the industry; cellular's about 12 years old in this country, and it's on its third generation of technology," said Cox.
Khangura said the question is still puzzling automakers.
"We haven't come up with good solutions yet. What you've got going here is a vehicle that, from a customer's expectation, has longevity that's supposed to last for 10 years. It's a huge investment, second only to buying a house. You've got to start at, first, electrical architecture. Then the vehicle. Then the consumer electronics; how do you bring those pieces together, then bring to it the constant changes to freshen it?" Khangura mused.
Geschke reiterated the flexible, handset-focused solution as an answer to the problem, but others on the panel pointed out that such a system does not offer some of the "always on" safety and security features telematics relies on. In addition, panel members said the cellular telephone industry has created universal expectations for consumers that connectivity is a commodity item available at a low price, and that handsets are a give-away item. Phone-familiar consumers may resist paying for telematics hardware if the service model is apparently cellular phone based, they noted.
In addition, automakers might not be able to implement the kinds of cost breaks they usually get from mass-market pricing of supplier parts when it comes to telematics devices.
"An automotive product that's going to launch in two years, today's chipsets aren't going to be available in two years. It's not just cost, it's what kind of service and what you're going to provide to the consumer," said Pearlman.
Rapid consumer electronics obsolescence may mean that telematics must make all of its money at the initial vehicle sale point, panel members said. Cox, the moderator, answering an audience question about the residual value of a three-year-old telematics system, said: "The presence of the technology on a resold car means nothing to the value of the vehicle."
With component cost already a touchy, strategic issue between automakers, and with OEMs now discussing "de-contenting" some consumer cars to reduce manufacturing expense, cutting even accepted systems such as anti-lock brakes, the panel said it was unlikely that any universal cost model for telematics-equipped vehicles would be available soon.
And panelists said cost may be the wrong target to stalk, too. With electronics content in cars leaping in the future to represent one-quarter of a vehicle's cost, and with some cars carrying more than 30 different microprocessors, the cost of electronic content may be less important than the perceived value that content delivers.
Telematics still faces growing pains and challenges, but until it works through them and faces the market test of consumer demand, a true general cost model can't be accurately developed.
http://www.telematicsupdate.com/homepage2.asp?news=29309
culater
Wingcast down; telematics still flying - By Tim Moran
Like Icarus of the Greek myth, Wingcast LLC launched itself too high and in late May crashed to earth. (6/11/2002)
But the regrettable end of a concept company, even one with 200 employees, doesn't mean that telematics is dead, or even that Ford Motor Company's enthusiasm for telematics will be lessened.
Industry participants said the Wingcast dissolution seems like prudent business management, rather than any sign of trouble for telematics.
"Ford was on top of the world" when its original Wingcast investment was made, said Steve Millstein, president and CEO of telematics service provider ATX, Inc. "Hell, dot-com wasn't even a bad word back then. Since then, we've got the dot-com bust, September 11th, the economy isn't doing well -- Ford took a step back and said 'Do we need to own this?'"
The answer was a clear "no." The chance to monetize Wingcast through an initial public stock offering had long since disappeared; the venture was costing Ford real dollars with no sign of a near-future payback.
Wingcast's demise does spell the likely end for automakers going into business as telematics service providers, and may add power to the push toward non-embedded systems for suppliers other than OnStar. The Wingcast crash may give Bluetooth-enabled cell phones the momentum needed to create a new standard model in telematics service provision.
As late as May 15, Wingcast CEO and president Harel Kodesh could tell a telematics audience that Wingcast's business model would break conventional boundaries, offering a broad service range without the insular branding that characterized early automaker-backed telematics service offerings.
"Brands are really not our assets. They are the OEM's assets," Kodesh said in a speech before more than 500 attendees at Telematics Update's 'Telematics Detroit 2002' conference.
And Kodesh suggested that Wingcast would help lead a change to the telematics market itself, broadening its user base.
"Most of the telematics market today is really addressed to high-tier customers, and people just automatically pay the bills. But in order to build something that people who drive consumer cars will be willing to pay, we felt we need to do more than just safety and security. So we came out with a package of architecture to allow us to offer services both today and in the future."
But a Ford spokesman said that discussions were underway throughout May that had put the handwriting on the wall for Wingcast, and that several attempts to find a buyer or investor to take over the operation failed.
"Wingcast management and Ford management had had numerous discussions throughout the month of May. It wasn't looking good: The business case was getting weaker and weaker, and, ultimately, we would be ceasing our funding in the very near term," said Ford spokesman Dave Reuter.
Reuter said Ford had been the only cash source for Wingcast since the beginning of May, 2001, and as part of Ford's "back to basics" program initiated in January that cash flow to a non-core business had to be cut off.
"We said we were going to look at the businesses we were involved in, and anything that was not core to delivering great trucks and cars to our customers could be divested," Reuter said.
Qualcomm Inc. had invested $25 million at the start of the venture, in exchange for 15 percent ownership of Wingcast. A Qualcomm statement said that the company had recorded about $14 million in losses related to Wingcast, and the company expects to write off the remaining $11 million as a loss in the third fiscal quarter.
Reuter, at Ford, said that when the automaker turned off the funding tap, Wingcast management itself made the decision to cease business and to dissolve the company. He said Ford was accepting resumes from any of the 200 Wingcast employees who wished to apply through regular channels, but noted that there were no guarantees of employment, even at the highest level.
"Harel (Kodesh) was treated just like any of the other employees at Wingcast; he was laid off as of Monday (June 3)," Reuter said.
The Ford spokesman said the dissolution of Wingcast simply means that Ford doesn't feel the need to have an equity interest in a telematics service provider company -- and that Ford remains committed to telematics in current and future vehicles. The 2003 Lincoln Town Car will have an ATX-provided system; the 2002 Lincoln Continental, which carried a telematics system, was already planned to go out of production at the end of this model year. That leaves the Lincoln LS and the Aviator SUV looking for telematics service provider options, and Ford apparently is directly approaching both hardware and service provider companies to look for the best value it can get.
"We think telematics provides value. It's just a question of how you get there. We feel, going forward, that we don't necessarily have to own the company," said Reuter.
Ford was faced with the paradox that the hardware cost to install telematics systems in the high-end Lincolns was too high, at $1,295 for the option, to allow the system to be easily marketed on consumer-volume cars. One option Ford is now considering to avoid that pricey hardware is to adopt Bluetooth-enabled, non-embedded cell phone technology.
"It won't be embedded CDMA technology; it will also help the customer because they won't have to have an additional telematics bill," Reuter said.
Such a system is currently embodied in DaimlerChrysler's "U-Connect" approach, using a special Johnson Controls-supplied electronics module and in-car systems connecting a Bluetooth-equipped cell phone with the vehicle's stereo speakers, mirror-mounted controls from Gentex and voice processing software from IBM. To be introduced on the Chrysler Pacifica, the system is meant to be device- and service-agnostic; customers will be able to accept AT&T service, or bring their own cell phone service into the car as long as their phone carries a Bluetooth antenna, or "dongle."
Wingcast's bowing out, though, doesn't bring any immediate windfall to others in the telematics service provider (TSP) market. It simply ushers in competition for a market that continues to change, one in which Millstein, of ATX, says initial misconceptions are still being reluctantly abandoned.
"People have misdefined what telematics is," Millstein said.
Instead of being entertainment and back-seat-services driven, telematics is very much engine compartment and driver information based, what the ATX leader calls "block-and-tackle" automotive.
"There is a role for telematics to play in the back seat, but it's not to substitute for products that are already available today. All the car is, is another access node. To think that you'll pay for another access subscription, I think, is wrong. The consumers, at least in the United States, want their AOL wherever they are. They don't want another subscription for it," Millstein said.
He believes the automotive telematics market will exist partly to help auto manufacturers get better at knowing what car buyers want, or to help move expensive warranty update service out of the service bay and onto the wireless network.
"Getting to the customer before they make a decision to buy the car, before you have to give them $2,000 to get to your showroom floor -- how about you touch them a few times in 'negative time,' before they make that decision? Or (for warranty service) how about you do software patches through telematics rather than bringing those people into the dealership to do a patch on all of those microprocessors that are on the car?" Millstein asked.
Far from seeing telematics as limited, he pictures future gains and growth that are slower, but potentially bigger, than when the telematics concept was viewed as a bonanza market that automakers could dominate on their own.
http://www.telematicsupdate.com/homepage2.asp?news=29307
culater
Archos Unveils 'World's Smallest Portable Hard Drive'
By Tim McDonald
www.NewsFactor.com
Part of the NewsFactor Network
June 05, 2002
http://www.newsfactor.com/perl/story/18081.html
The device can carry computer files or digital music or images. Files can be dragged and dropped to and from the portable drive, which is "hot swappable."
Archos Technology this week unveiled what it called "the world's smallest, lightest portable hard drive."
The California company's Mini HD is a 20 GB hard drive that fits in a shirt pocket or the palm of a hand, and is ideal for "carrying computer files to and from work or while traveling," according to the company.
The unit is powered directly by a notebook or desktop computer. At 4.7 by 3 inches and 1/2-inch thick, it is slightly larger than a deck of cards and weighs 6.5 ounces.
'Hot Swappable'
Files can be dragged and dropped to and from the portable drive, which is "hot swappable," meaning the user need not shut down and reboot the computer when connecting or disconnecting the device.
The drive is also multiplatform: it is compatible with both PCs and Macs, notebooks and desktops, and connects via USB, with USB 2.0 compatibility. It retails for US$200.
Music and Data
Archos, an Irvine, California-based subsidiary of France-based Archos SA, has been designing and producing peripherals for notebooks and desktops since 1998. It was one of the first companies to release a high-capacity, portable MP3 player based on a hard drive.
In March, the company unveiled its Jukebox Recorder 20, which found favor with most reviewers. The device was one of the first portable players to offer instant MP3 encoding of any audio source, allowing users to record up to 20,000 minutes of 128 kbps audio.
The Jukebox Recorder functions as both an MP3 player and a 20 GB portable hard drive, and it holds about 330 hours of MP3 music. It costs $369, or $240 without the built-in MP3 player.
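Those figures are easy to sanity-check (my arithmetic, not Archos's): at 128 kbps, a 20 GB drive, counted in decimal gigabytes as drive makers do, holds close to the number of hours Archos claims, with the shortfall plausibly down to formatting and filesystem overhead.

```python
# How many hours of 128 kbps MP3 fit on a 20 GB (decimal) drive?
drive_bits = 20 * 10**9 * 8            # 20 GB in bits
hours = drive_bits / (128_000 * 3600)  # one hour of a 128 kbit/s stream

print(round(hours))  # 347
```

About 347 theoretical hours versus the roughly 330 hours (20,000 minutes) quoted, so the marketing numbers hold up.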
Portable Storage Shrinking
Music and data storage devices continue to evolve, adding more storage capacity while shrinking in physical size.
"External storage is definitely going to be a growing market," IDC hardware analyst David Reinsel told NewsFactor. "One of the things that prevented it was a lack of a data port with sufficient bandwidth, and USB 2.0 gives the market a pretty nice range, up to 60 MB a second."
In April, Iomega launched its newest external hard drives, each weighing about eight ounces, with 40 GB, 80 GB and 120 GB capacities. Those drives started shipping in May.
And, of course, there is Apple's iPod, which the company considers primarily a music player but that also has myriad other uses. The new 10 GB iPod sells for $500 and can be used like a Palm or other handheld computer. With data transfer speeds of up to 400 megabits per second (Mbps), the iPod is fast enough to transmit live video.
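The transfer-rate claims above line up once put in consistent units (a rough illustration of theoretical ceilings; real-world throughput is lower): 480 Mbps is exactly the "up to 60 MB a second" Reinsel cites, and the iPod's 400 Mbps link (FireWire) works out to 50 MB/s.

```python
# Converting the quoted link speeds from megabits to megabytes per second.
usb2_mbps, ipod_mbps = 480, 400

print(usb2_mbps / 8)  # 60.0 MB/s for USB 2.0
print(ipod_mbps / 8)  # 50.0 MB/s for the iPod's 400 Mbps link

# Even a full 10 GB iPod fills in a few minutes at the theoretical limit.
fill_minutes = 10_000 / (ipod_mbps / 8) / 60  # 10 GB ~= 10,000 MB
print(round(fill_minutes, 1))  # 3.3
```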
Microdrive: 16 Grams
Toshiba is touting its 10 GB and 20 GB portable hard drives, the smallest of which is 1.8 inches long -- smaller than a credit card in length and width.
Meanwhile, IBM's 1-inch Microdrive offers storage capacities ranging from 170 MB to 1 GB. Weighing 16 grams -- less than one AA battery -- it has a sustained data rate of 4 Mbps and is a "significant performance improvement over current flash memory," according to IBM.
The Microdrive is used in portable electronics, such as digital cameras, handheld PCs and digital audio players.
Let's Get Small
Archos, founded in France, specializes in miniaturization. It claims to have designed the first PCMCIA memory card, and to have patented technology that regulates power for peripherals directly from the notebook. With Iomega, it also designed the first palm-size Zip drive.
The company's partners include Toshiba, Fujitsu and Iomega.
http://www.techextreme.com/perl/story/18081.html
culater
OT-A Scent the Bears Are Missing
It's called operating leverage. The U.S. now has plenty of it -- and it can mean explosive earnings growth is at hand
The economy is clearly in recovery, but for the past two weeks the bears have been growling louder than ever. The explanation for this paradox is that the pickup in economic activity is happening so gradually that a major jump in corporate profits this year seems less and less likely. The prospect of yet another disappointing earnings season, combined with worries over escalating global political risks, is keeping most investors out of the game.
However, they may be making the mistake of confusing a tepid economic rebound with a weak earnings recovery. The former doesn't necessarily lead to the latter. Even though sales at most companies aren't about to suddenly take off, corporate earnings still can. The financial markets seem loath to recognize this fact, but smart investors who get in early will reap the rewards if they remember the concept of "operating leverage."
Operating leverage is the idea that companies can make more money from each additional sale if they don't have to increase fixed costs to produce more. "The benefits of operating leverage are immense," says Jay Mueller, chief economist at Strong Funds. "That's because the minute business picks up, the existing workforce and the existing plant and equipment can do a whole lot more without adding additional costs." Profit margins expand, and profits boom.
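The arithmetic behind operating leverage can be sketched with hypothetical numbers (the figures below are illustrative, not from the article): when fixed costs stay flat, a modest rise in sales produces a much larger rise in profit.

```python
# Hypothetical illustration of operating leverage: with fixed costs held
# constant, a 10% rise in sales produces a 70% rise in profit.

def profit(sales, fixed_costs, variable_cost_ratio):
    """Profit = sales - fixed costs - variable costs (a fraction of sales)."""
    return sales - fixed_costs - variable_cost_ratio * sales

base = profit(100.0, 60.0, 0.30)    # sales of 100: 100 - 60 - 30 = 10
bumped = profit(110.0, 60.0, 0.30)  # sales up 10%: 110 - 60 - 33 = 17

print(f"Sales +10% -> profit {base:.0f} -> {bumped:.0f} (+{bumped / base - 1:.0%})")
```

The ratio of variable to fixed costs determines the multiplier: the leaner the cost base, the more each incremental sales dollar drops to the bottom line.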
BACK FROM THE BOTTOM. Right now, the broad economy is rife with operating leverage. Companies have cut costs to the bone. "For two years, all they've been doing is cutting," says Bob Kippes, portfolio manager of AIM Aggressive Growth (AAGFX ). "When the top line improves, companies are so lean and mean that it will all go straight to the bottom line," he predicts. As for the impact on earnings, "it should be explosive."
The best evidence of that operating leverage at work is the very low rate of industrial-capacity utilization. That figure bottomed in December, 2001, at 74.4% -- a 19-year low -- and has ticked up this year to April's 75.5%. Despite the modest improvement, utilization is still at levels not seen since the 1982 recession -- and is much lower than the 1990-91 recession's bottom of 78.1%.
On one hand, that's a sign that businesses continue to suffer from overcapacity. Looked at another way, however, that low-but-improving rate of capacity utilization is probably the best indication that profits can jump later this year, says John Lonski, chief economist at Moody's Investors Service. The last time rates were this low, profits of nonfinancial companies jumped 34% annually, on average, one year later, he says.
INFLATION BUFFER. "Investors might be surprised by the earnings growth later this year," says Lonski. "Why not go ahead and assume more risk, either as a business or an investor?" Given that capacity utilization is already ticking up, he thinks earnings could rise 15% in the fourth quarter from a year earlier (albeit off a low base), while gross domestic product likely will rise at only about a 5.8% annual rate by that time.
Investors can also see low capacity utilization as good news since it means there's little near-term risk of demand getting ahead of supply, sparking inflation. And that means the Federal Reserve shouldn't have to raise interest rates for quite some time. Says Lonski: "Large amounts of unused production capacity acts as a buffer that diminishes the inflation risks implicit to rising demand."
The other clear sign of operating leverage at work in the U.S. economy is the jump in productivity, says Mueller. A measure of business efficiency, productivity grew 8.4% in the first quarter (the largest jump since exiting the 1982 recession). That more than offset the 2.8% rise in hourly compensation and resulted in a 5.2% drop in unit labor costs. More reductions like that, and a profit jump can't be far behind.
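The three figures in that paragraph hang together arithmetically: unit labor cost is compensation per unit of output, so its growth rate is (to a close approximation) compensation growth net of productivity growth. A quick check of the quoted numbers:

```python
# Back-of-the-envelope check of the quoted Q1 figures: unit labor cost is
# hourly compensation divided by output per hour, so its percentage change
# is the ratio of the two growth factors, minus one.

productivity_growth = 0.084   # output per hour up 8.4%
compensation_growth = 0.028   # hourly compensation up 2.8%

unit_labor_cost_change = (1 + compensation_growth) / (1 + productivity_growth) - 1
print(f"Unit labor costs: {unit_labor_cost_change:.1%}")  # about -5.2%
```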
COMING ATTRACTION. Of course, for productivity increases and low capacity utilization to work their magic on corporate profits, business sales still need to pick up. As Fed Chairman Alan Greenspan keeps pointing out, final demand has to increase.
It's coming. A recent stream of economic reports shows that business in both the service and manufacturing sectors is picking up faster than expected. Retail sales and durable-goods orders are growing. Home sales continue to be robust, and consumer sentiment is improving. "It has already begun to turn the corner," says Mueller.
Spurring greater demand for U.S. goods is the falling dollar. In recent weeks, the greenback weakened substantially against the euro and the yen. That should result in more demand for American products overseas, since they'll be more price-competitive. Meantime, the monetary and fiscal stimulus put in place in 2001 is still kicking in.
"CHANGE IN SENTIMENT." Of course, for the economy to really soar, businesses have to start spending again -- on hiring workers, new technology, and capital improvements. And they will, once the evidence clearly shows that profits are perking up. "It's these improvements in profitability that lower the risk aversion of Corporate America," says Lonski. Then, companies will spend more, and investors can feel good about buying stocks again.
Thanks to operating leverage, it shouldn't take much of a pickup in sales to get that. "We don't need a global economic recovery," says Kippes. For the small companies he invests in, "All we need is a subtle change in sentiment, and these companies are going to do well."
He notes that investors are so disenchanted with the stock market that they don't really care that the signs of an earnings recovery are clear. "Eventually, they'll respond as the weight of evidence builds," he says. "It's not a matter of 'if' but 'when.'"
http://www.businessweek.com/bwdaily/dnflash/jun2002/nf2002066_9508.htm
culater
OT-New Market Trend: Short, Distort
By Joanna Glasner
2:00 a.m. June 3, 2002 PDT
For every successful stock market scam, there is bound to be a buzzword to go along with it.
In the bull market of the late 1990s, that buzzword was "pump and dump." Predicated on the relentless optimism of the times, the scam involved promoting the hell out of some worthless company, boosting its stock to unsustainable heights, and then selling quickly before it fell apart.
So long as investors were willing to believe good things about obscure companies with businesses they didn't understand, the strategy worked.
Now that investors are loath to believe anything good about a public company, con artists are finding that a new tactic -- the "short and distort" scam -- is working better. With this strategy, scammers profit by selling short -- or placing a bet that a stock will decline -- and subsequently forcing shares down by spreading nasty rumors about the company.
Investment advisers say the technique is particularly effective for tech stocks and other so-called new economy companies that have borne the brunt of the market downturn.
"In a bear market, no one's going to believe that Amazon's going to go to $100. It's easier to say they've got some accounting problems, and the stock goes down in a heartbeat," said Rick Wayman, a financial advisor and co-founder of ResearchStock.com.
A recent case in point was the arrest in late May of Anthony Elgindy, a well-known short-seller and publisher of the now-defunct InsideTruth.com, a website with a history of warning of regulatory troubles at small- and mid-sized firms.
The federal government charged that Elgindy conspired with FBI agents to illegally obtain information about publicly traded companies under investigation. Elgindy would profit from the knowledge by short selling shares of the company -- a transaction that involves borrowing stock and selling it on the open market -- then buying the shares back at a lower price after the investigation became public knowledge and the stock declined.
Elgindy's case wasn't a classic short-and-distort scam, because much of the information he published, although illegally obtained, was in fact accurate. However, the high-profile arrest did draw attention to a trend that Wayman believes has been largely overlooked by regulators: the booming trade in spreading bad news.
While most traders have learned to cast suspicion on overly positive corporate press releases or puffy analyst reports, market psychologists say investors are less aware of the threat of negative rumormongering. Post-Enron skittishness exacerbates the situation.
"The psychological biases that are strong right now would help these scams," said John Nofsinger, a finance professor at Washington State University and author of Investment Madness: How Psychology Affects Your Investing.
Nofsinger says a consistent bias among investors is to expect whatever happened in the recent past to repeat itself. Because they've lost money recently on tech stocks and firms caught in accounting scandals, investors naturally believe there's more bad news ahead.
Wayman says short-and-distort tactics work best with smaller companies, since their stock prices tend to be more volatile. Companies hit by short-sellers say it's generally difficult to fight back, given the speed at which rumors can be disseminated online.
"These websites don't go back and correct things generally. That's the problem," said Wil Williams, spokesman for SureBeam (SURE), a maker of equipment for irradiating food and other products that has been a target of Elgindy and other short sellers.
Although he doesn't mind legitimate short selling -- which can help keep stocks from reaching unsustainable heights -- Williams said the company has had issues with shorts attempting to profit by spreading false information.
Wall Street has been well aware of the Internet's power in spreading malicious untruths since the notorious Emulex hoax of 2000. In that scam, a college student sent out a phony negative press release that caused shares of the high-flying technology stock Emulex (EMLX) to tumble, costing investors nearly $110 million.
Companies whose stocks are riding high are favorite targets of shorts, said Erik Schielke, director of business development at CFG media, an investor relations firm.
"They'll say you can't sustain this growth rate forever," Schielke said.
But according to Wayman, even companies that haven't been performing well, and have been largely out of the public eye, make appealing targets for rip-off artists.
"The reason this is so effective for the undervalued stock is because the market is starved for information," he said. Without other news to compete with, "the shorts and distorts take center stage."
http://www.wired.com/news/print/0,1294,52785,00.html
culater
Hard Disk Drives Answer Call for Greater Storage Capacity in Consumer Electronics Products
As the number of broadband Internet users and digital video subscribers has increased over the past few years, there has been a concurrent demand for advanced consumer electronics products that can tap into and play back digital files. According to In-Stat/MDR, today’s consumers of digital content are also looking for a more “personalized” experience that allows them to customize and store vast amounts of digital video, music, and data. Many consumer electronics manufacturers have responded to the demand for greater storage capacity by turning to the magnetic hard disk drive, an option that meets their customers’ desires for increased storage while maintaining reasonable product price points. Hard disk drives are now found in products like personal video recorders (PVRs), in video game consoles like Microsoft’s Xbox, in portable digital music players like Apple’s popular iPod, as well as in other products like digital audio jukeboxes and even television sets.
Many of the leading hard disk drive manufacturers have been developing products for the consumer market for several years, and they are finally beginning to taste success. While the PC industry remains the most important market for hard disk drives, the consumer electronics market is developing into an excellent secondary market segment. While use of hard disk drives will grow in the consumer segment, other storage formats will provide plenty of competition. Flash, RAM, and optical storage solutions will continue to be widely used in the consumer products space, although the advantages that hard disk drives offer to consumer electronics manufacturers -- reliability, cost competitiveness, and capacity -- should continue to provide hard disk drive suppliers with a distinct advantage.
In-Stat/MDR also found that:
Worldwide unit shipments of PVRs are forecasted to increase from 1.2 million in 2001 to over 6 million in 2003. Over the same period, PVR product revenues are forecasted to increase from $550 million to over $2.3 billion.
Satellite set top boxes with integrated hard disk drives will continue to make up the bulk of PVR unit shipments, although cable set top box-based PVR unit shipments are expected to grow rapidly over the next two years.
Portable digital music players that integrate hard disk drives have proven to be extremely popular. Led by models like the iPod and the Rio Riot, worldwide unit shipments are forecasted to increase from 230,000 in 2001 to over 950,000 in 2003.
Product development of consumer electronics products with hard disk drive-based storage is continuing. The next generation of products will include hard disk drive-enabled PDAs, cell phones, and even wearable computers.
This Market Alert is drawn from the In-Stat/MDR report From TiVo to the iPod, Hard Disk Drives Penetrate Consumer Electronics Products (#IN020009MI), which examines and updates the integration of hard disk drive-based storage in consumer electronics products. The report offers worldwide unit shipment and revenue forecasts for PVRs, digital audio jukeboxes, portable digital music players with integrated hard disk drives, television sets with integrated hard disk drives and video game consoles with hard disk drives.
http://www.instat.com/rh/commverge/newmk.asp?id=221&SourceID=00000024000000000000
culater
Inventors breathe easier after Supreme Court patent ruling
By Margaret Quan, EE Times
May 31, 2002 (12:12 PM)
URL: http://www.eetimes.com/story/OEG20020531S0075
WASHINGTON — Engineers, inventors, electronics executives and patent attorneys breathed a collective sigh of relief this week when the U.S. Supreme Court, in a unanimous decision, partially reversed a lower-court ruling that restricted the scope of patents and inventors' ability to defend them against infringement.
The ruling in the case of Festo Corp. vs. Shoketsu Kinzoku Kogyo Kabushiki Co. Ltd. partially restores an inventor's right to use the "doctrine of equivalents," a concept that allows broad interpretation of patents, to defend against infringement — as long as the inventor can prove that he or she could not have foreseen the development of a competing technology.
The Supreme Court sent the case back to the U.S. Court of Appeals for the Federal Circuit, which in effect overthrew the doctrine-of-equivalents concept in a November 2000 decision.
The doctrine lets patent holders amend their patents after filing to guard them against competing inventions that — save for small changes — are substantially the same as the patented one. Doing so is standard operating procedure for most technology companies.
The Supreme Court's Festo ruling "is a positive decision" for the high-tech industry, said Michael Kirk, executive director of the American Intellectual Property Law Association. He said narrowing amendments to patent claims would now mark the outer limit of patent protections.
The case stems from Festo's 1988 suit claiming that pneumatics manufacturer Shoketsu Kinzoku Kogyo Kabushiki had infringed two of its patents for magnetically coupled rodless pistons. Following the doctrine of equivalents, Festo had amended its original patents to cover the differences between the Japanese company's equipment and its own.
The Supreme Court rejected the "absolute bar" standard against such amendments set by the Appeals Court. At the same time, the justices gave Festo's proposal of a "flexible bar" standard a thumbs-down, opting instead for the "foreseeable bar" proposed by IEEE-USA in an amicus curiae (friend-of-the-court) brief.
The IEEE-USA brief maintains that "the doctrine of equivalents should be permitted to apply unless the limiting effect of the amended language with respect to an accused device would have been foreseeable at the time of the amendment. Applied objectively, from the perspective of a reasonable person skilled in the art, this 'foreseeable bar' applies principles that are readily, if not commonly, understood by both the public and the judiciary."
Justice Anthony M. Kennedy, writing for the court, said, "The patentee must show that at the time of the amendment one skilled in the art could not reasonably be expected to have drafted a claim that would have literally encompassed the alleged equivalent."
Alexander Poltorak, chairman and chief executive officer of General Patent Corp. (Suffern, N.Y.), called the Supreme Court decision "a mixed bag," adding that it will have "profound consequences for all technologies, all existing and future patent cases, and all existing patents."
Though the decision does not abolish the doctrine of equivalents, as the Appeals Court ruling had done, Poltorak termed it a "new reading of patent law" that will narrow the scope of some patents. In the past, he said, the doctrine of equivalents sometimes led to a lack of clarity about patent rights and a monopoly situation in particular industries.
Inventors' victory
Industry observers termed the decision a victory for inventors because it effectively prevents patent holders from using their monopoly to close off entire areas of technology innovation. Critics have argued that some software patents have blunted innovation in that field.
Attorneys involved in the case said the high-court ruling balanced competing standards vital to continuing innovation in the electronics industry. "We have a new set of rules [on the scope of patent protections that] is clearer than ever before," said Andrew Greenberg, a member of the IEEE-USA's Intellectual Property Committee who helped prepare its friend-of-the-court brief in the Festo case.
Patent examiners often require inventors to file amended applications that narrow their claims. Under the court's new guidelines, holders of amended patents give up protection for only that which is foreseeable by others familiar with the technology.
An inventor's "decision to narrow his claims through amendment may be presumed to be a general disclaimer of the territory between the original claim and the amended claim," Justice Kennedy wrote.
Intellectual-property experts said the ruling places the burden of proof on patent applicants to show that their amended claims do not surrender broader patent protections against equivalent claims. They also noted that lower courts ignored a 1997 Supreme Court decision instructing courts to think twice before adopting changes in the patent system that disrupt the expectations of inventors.
The court struck a balance between pioneering inventions and incremental improvements to technologies that make inventions more practical, said Greenberg.
A key example, he said, is the Internet, where refinements in network technology have made the Internet ubiquitous.
"The new ruling helps protect the patent rights of inventors, many of whom are IEEE members," said John W. Steadman, IEEE-USA's vice president for career activities and head of the University of Wyoming's electrical and computer engineering department. "Had Festo been allowed to stand as it was, the very process in which a person is required to amend the patent during the patent process would have significantly restricted what courts called the doctrine of equivalents."
Steadman explained that the decision means an inventor has a right to say, "If I could have foreseen the invention or method or technique employed by the infringing device to create much the same function in much the same way to get the same result, then I give up my right to it. But if the development was not foreseeable, then I couldn't give up my right to it."
Engineers take heed
Meanwhile, patent experts this week advised engineers to take care in drafting their applications.
"What this decision means for engineers and inventors is that they must be very diligent in researching the patent history of previous developments," said Edward A. Suominen, an engineer who is a patent agent with Louis Hoffman P.C., a law firm in Scottsdale, Ariz.
He advised EEs to "keep up with developments in the field of technology they are working in, work with someone who knows how to write a patent and be consistent about what they claim in a patent."
— George Leopold contributed to this story. http://www.eet.com/sys/news/OEG20020531S0075
culater
Apple Videoconferencing Device Coming Soon
Contributed by Kelly McNeill
osOpinion.com
May 31, 2002
http://www.osopinion.com/perl/story/18014.html
Compared with cell phones and PDAs (personal digital assistants), which have limited connectivity, low storage capacity, minimal upgradeability and exceptionally small screens, the desktop computer offers extensive options for connectivity and expandability.
It's no wonder that both Steve Jobs and Bill Gates have deemed the desktop computer the ideal product for maximum external device connectivity, or, as Jobs calls it (only to be echoed by Gates), a "digital hub."
Although PDAs, MP3 players, digital cameras and the like have a long history with regard to their ability to synchronize with personal computers, only recently have they become regarded as key drivers of desktop computer sales.
Premier Platform
An oft-repeated mantra among Apple executives in the recent past has been that the Macintosh is the premier digital hub for these types of devices.
Products like Apple's iPod MP3 player have allowed the company to leverage its control of hardware, software and OS to maintain the best user experience possible -- something that isn't quite as easy for separately governed, component-based PCs.
When Apple unveiled the iPod, we got our first peek at the company's digital device strategy.
Give Me a Hint
During Apple's April shareholders meeting, CEO Steve Jobs alluded to the company's plan to offer a best-of-breed videoconferencing device in the near future. I believe this product is likely to be the second in Apple's series of digital hub devices.
At last February's QuickTime Live conference, the company hyped a product called QuickTime broadcaster, which reportedly will capture and encode QuickTime content, including MPEG-4, for live streaming via the Web.
I believe Apple was planning to release the videoconferencing device Jobs referred to at the shareholders meeting but was forced to put it on hold because of licensing problems with MPEG-LA (the largest group of MPEG-4 patent holders).
All Signs Say Yes
When you also consider the company's exceptionally strong push toward providing both consumers and professionals with the best hardware and software products for video editing, it seems that the company is poised to make a big splash in the video production market and thus make large strides with its QuickTime file format.
In recent statements, Apple has hinted that it is close to securing appropriate terms with MPEG-LA for the licensing of MPEG content. Those statements suggest that the company may have the product ready for July's Macworld Expo.
Coming Soon
If my speculations are correct, this portable device would allow consumers (at least those using a Macintosh, considering Apple's stance with the iPod) to stream video content to the Web at any time from any place.
The potential for such a device seems incredible.
http://www.osopinion.com/perl/printer/18014/
culater
Car Dashboards Look More and More Like Jet Cockpits
By David Kiley and Earle Eldridge
May 31, 2002
http://www.wirelessnewsfactor.com/perl/story/18011.html
In its new flagship 745 sedans, German automaker BMW has filled the cockpit with more than 700 features that can be controlled by buttons and switches, voice command and -- the piece de resistance -- a joystick in the center console that runs a cursor on a dashboard screen.
It's the latest attempt by automakers to stuff an ever-growing array of gadgets into cars, everything from hands-free cell phones, navigation systems and DVD players to sensors that watch for road obstacles, cruise control that reads the distance of the car ahead and cameras that give rear views on an in-dash screen.
Industry watchers expect such gadgetry to be offered in 80 percent of cars sold by 2006. Already, it's moving from luxury cars to youth-oriented vehicles such as the under-$20,000 Pontiac Vibe and Toyota Matrix sport-utility wagons and to mass-market cars such as the Nissan Maxima and the Toyota Camry.
No wonder. Automakers can make big money on cars filled with stuff while a growing number of younger, tech-savvy Americans are driven to be constantly connected -- even while they're driving.
Rumble of Concern
Below the drumbeat for the newest in-car gadget is a rumble of concern: from those who wonder how hungry Americans really are for multi-tasking on the road, from those who say that driving safely is enough of a job in a car, and from those who question how reliable all the equipment is and how much it will cost to fix.
Those who want more stuff seem to be winning, at least with luxury automakers such as BMW.
Some people rave about BMW's iDrive system. "I am in love .... Everything about the car is amazing," says Andrew Bullen of Ramsey, N.J.
'Car Was Driving Me'
But for a lot of people, it's just too much stuff. "The car drives wonderfully, but I found the whole business too much of everything," says Ned Connelly of Boston, a BMW 5 Series owner who was looking to trade up. "I enjoy driving, but I felt like that car was driving me."
Even among automakers, there is no consensus about how much is too much. BMW says it will make iDrive standard in all future models, including the bottom-priced $23,000 1 Series, which will make its debut in 2004.
"IDrive is a system to improve on driver-distraction issues, and as a driver gets more accustomed to it, that's what happens," says Thomas Jefferson, the 7 Series product manager. "It's also a system that allows us to make the whole cockpit more user-friendly."
A Different Approach
But when Bob Lutz became General Motors product boss last summer, he threw out plans to load up the new Cadillac XLR coupe with an array of technology as standard equipment, ordering it to remain optional. "Added technology and gadgets are only attractive to people who want and use them, and not everybody does," he says.
Some automakers like the bragging rights of using the newest technology, says Jan Zverina, Chrysler's head of product communication. "The more stuff you have, the more advanced you can appear."
What do we want?
Figuring out how much drivers want in their cars is tricky, experts say. "There is a certain portion of drivers who view the car as sanctuary from the world," says Frank Forkin of J.D. Power and Associates. "And then there are technophiles who want everything, and it's hard to convince that crowd that there is a limit to what they should be able to do."
Page 2
"Our buyers like new technology, but only if it delivers a real human benefit," says Jack Collins, Nissan/Infiniti product planning chief.
When Infiniti, Nissan's luxury brand, launched its redesigned Q45 sedan last year, it ran an ad showing a driver backing down a spiral parking garage exit using a camera that projects an image of what's behind onto a dashboard screen. "The response to that ad was phenomenal," Collins says. He says 60 percent of buyers choose the rear camera and voice command controls, both part of an option package costing $9,000.
By the Numbers
Other numbers on sales of optional technology:
Cadillac says about 12 percent of its DeVille buyers opt for the $2,200 Night Vision, which projects images of people, animals and other things that are beyond the headlight beams onto the windshield.
Chrysler says about 5 percent of its minivan buyers are opting for a $1,600 rear-seat DVD or VHS entertainment system. It's standard equipment on an up-market version of the minivan.
GM says the renewal rate for OnStar service after an initial free period is 56 percent. Cost ranges from $16.95 per month for the basic safety and security package, which notifies authorities if an air bag deploys and will open a car by satellite, to $69.95 monthly for the top-of-the-line package that includes Internet access and e-mail. The lion's share of subscribers pick the basic package. Internet and e-mail interest has been minimal, Power's Forkin says.
Acura says about 17 percent of its CL coupe and TL sedan buyers opt for a $2,150 option that includes a navigation system and OnStar. The rate is twice that among buyers of its MDX sport-utility vehicle.
Perfect the Low-Tech First
Some industry experts wonder whether automakers are getting ahead of themselves.
"I definitely think there is a large body of people to whom less can be more when it comes to equipment like telematics, navigation systems and voice command," says Dan Gorrell of Strategic Vision, who consults with automakers on what to put in and what to leave out.
Gorrell tells clients to perfect lower-tech features -- like better grocery holders in minivans, standard two-pronged electrical outlets, reliable hands-free cell phone service and real-time traffic reports -- before going on to things like on-board Internet access.
"What you leave out of a car is just as important as what you put in," says Rich Schaum, Chrysler's product development chief. He says Chrysler has gone back to old-fashioned volume and tuning knobs for radios after customers said too many small switches and buttons complicated operations.
Study the Manual
Schaum's standard for designers to meet before he commits to a technology: "It has to be operable without the (owner's) manual.... totally intuitive to use."
People who have driven BMW's 745i say it takes at least 45 minutes of owner's manual study before the car can be driven with confidence. The car comes with a cheat sheet to stick on the steering wheel for valet parkers.
Safety First
Is it safe?
While automakers try to figure out how much car owners want, they and others also are trying to figure out how much drivers can handle safely.
"These products are rolling out much faster than we can complete the research to understand it and how it affects the driver and traffic safety," warns Paul Green, who directs human factors research at the University of Michigan's Transportation Research Institute.
Researchers say people think they can handle multi-tasking, even while driving. "But there is no research that shows humans have evolved in the last 10 years to where they are better at doing more than one thing at a time," Green says.
Doing More Than Driving
"Most drivers think driving is a waste of time," says Andy Norton, who tries to figure out how GM vehicles should be equipped. "They say, 'I have the capability to do more while I'm driving.' People who study human factors usually tell them they are wrong."
Numbers on how technology in cars affects safety have been hard to come by, largely because drivers can't be trusted to tell the truth about what they were doing at the time of an accident.
Automakers and the government are trying to remedy the void in research:
GM is running 14 test cars around the Detroit area this year equipped with adaptive cruise control, a device that uses radar to slow a vehicle down automatically if it gets too close to another car. Cameras and computers in the car capture how the driver acts and feels as the device is engaged.
The National Highway Traffic Safety Administration is sending 100 cars equipped with cameras onto Northern Virginia roads to get as much information as possible about distracted drivers.
The National Transportation Safety Board is investigating the role cell phones play in fatal accidents because Fatality Analysis Reporting System data doesn't accurately capture it.
Ford Motor has a driving simulator that looks for the point at which multi-tasking in a car leads to reckless driving.
Jeff Greenberg, who directs the simulator work, is wary of multi-tasking in cars. "A typical multi-task for people is playing voice mail while checking e-mail at your desk, and often you miss the voice mail message and have to go back. Now apply that failure to driving."
But Is It Reliable?
Will it work?
Another worry about technology-loaded cars is reliability. Scott Cutler, 55, of Wilmington, Del., is one of a group of 745i owners sharing stories on the Internet about glitches ranging from a faulty navigation system to hardware that doesn't mesh with the software and key fobs that won't open doors. "I like to be on the cutting edge," he says, but the bugs are bothersome.
"I like hands-free calling, a kick-ass stereo and maybe a nice, easy-to-use navigation system, but that's it," says Peter Brown, 30, of Hackettstown, N.J., who owns a Camry and has been eyeballing the 745i. "But I don't want any of it if it's not very seamless, easy to use and reliable.... There is no charm in stuff that doesn't work."
Too Early To Tell
Mechanics say it's too early to know how reliable the technology will be. But William Filley, director of the mechanical division of the Automotive Service Association, which represents mechanics, says "with consolidation of all the computer systems, it may become easier to diagnose a problem."
There is a hint of the types of problems the technology might present in service bulletins that automakers send out, according to Alldata, which supplies automaker information to mechanics and others.
Alldata says that in the past two years, automakers have warned about problems, including a navigation system that might not connect to a communication system; a power window screen that could cause radio interference; a navigation system that could give incorrect distances; and a key fob that sometimes didn't work.
Future Costs
Gabriel Shenhar, senior auto test engineer for Consumer Reports magazine, says the manufacturer's warranty will cover repairs for these problems if they show up early in ownership. But owners could face costly repairs for gadgetry that breaks down after the warranty expires.
Despite the concerns about all the stuff, Forrester Research predicts technology in cars will soar from $3 billion in annual sales and subscription revenue in 2001 to $20 billion by 2006.
Some say there has to be a limit.
"I've had cab drivers in Germany watching cartoons on the dashboard while driving 140 on the autobahn," Chrysler's Schaum says. "I don't ever want to see that in the U.S."
http://www.techextreme.com/perl/story/18011.html
http://www.techextreme.com/perl/story/18020.html
culater
Can USB 2.0 deliver on inflated claims?(edig mention-see bold)
Maury Wright, Editor-in-Chief -- CommVerge, 4/1/2002
For going on three years now, we've heard Intel harp on the merits of USB 2.0. At first the USB camp claimed that the second version of the interface would boost data rates from 12 Mbits/sec to about 200 Mbits/sec. Later the USB group raised the upper limit to 480 Mbits/sec—presumably topping the current version of IEEE 1394 (FireWire), which maxes out at 400 Mbits/sec. A faster USB should enable speedier links to PDAs and music players as well as other convergence-related applications.
So with USB 2.0 products finally coming to market, I decided to see just how close the interface comes to the stated maximum data rates and how it compares to FireWire.
USB 1.1, which is in wide use today, has been practically limited to connecting relatively low-speed peripherals. Joysticks, printers, low-end scanners, and even mice are the most popular USB 1.1 products. The interface also shows up in low-resolution, low-frame-rate Web cameras and cameras for kids' applications like Lego Studios. In addition, the link is employed by audio products like the Microsoft Game Voice, which mixes audio input and speaker control.
Companies also offer storage products like external hard drives and CD burners based on USB 1.1. But while these products offer the advantage of installation without opening the PC, their performance is unacceptable. With CD burners, the slow USB data rate contributes to buffer-underrun problems, which result in failed burns. As for hard drives, modern ones can sustain data transfer rates of several hundred megabits per second, so the 12-Mbit/sec USB 1.1 interface is essentially intolerable for use with hard drives.
With USB 2.0 products finally coming to market, I decided to see just how close the interface comes to its stated maximum data rates.
Today, USB 2.0 proponents claim that the revved-up bus can offer data rates superior to FireWire while maintaining forward and backward compatibility. Existing USB 1.1 peripherals should work fine when plugged into a USB 2.0 host, although the older peripherals clearly can't take advantage of the faster data rates. Likewise, a USB 2.0 peripheral should still work fine when connected to a USB 1.1 host, albeit at the throttled-back rate of 12 Mbits/sec.
Before we get into the details of my work with USB 2.0 and FireWire, I had best admit that I've been an open proponent of FireWire and a frequent critic of USB 2.0. In fact, I once called USB 2.0 a "gravely ill dog" that should be allowed to die (see "Universal it's not," June 2000). I believe that the peer-to-peer and intelligent nature of FireWire is superior to the host-centric nature of USB, in which the host must poll all the connected devices to see which ones need to use the interface. Still, USB has generally offered lower cost than FireWire, and I'll welcome a faster USB for applications like syncing my PDA or downloading music to an MP3 player.
My USB 2.0 evaluation started when Maxtor shipped me one of its new 120-Gbyte 3000LE disk drives (left). A PCI card with five USB 2.0 connectors (four external and one internal) accompanied the drive. The add-in card is similar to several on the market that use a USB 2.0 chip developed by NEC. Companies like IOGEAR offer such cards for around $60. Maxtor sells its card for $50, but the company is really in the card business only to support drive sales.
I didn't expect the installation of the card and drive to be complicated. Still, I was surprised that nothing more than a brief quick-start guide accompanied each product. Perhaps there was more documentation on the driver CD or floppy disk, but I never needed to look. The drive comes preformatted for Windows using large disk support, and I had the card and drive working inside 10 minutes. The most important thing to remember is to install the software before adding the hardware—a common sequence with FireWire and USB products but the opposite procedure from the old days of add-in cards.
Once I had the card and drive working, I immediately sought to test the USB 2.0 compatibility claims. I had a USB 1.1 hub connected to the USB 1.1 interface that's built into the motherboard on my PC. In turn, the hub was hosting a Microsoft Game Voice unit, a Microsoft Sidewinder joystick, and a CompactFlash reader. I moved the hub connection to the new USB 2.0 PCI card, and only the CompactFlash reader balked. I also connected several other peripherals to the USB 2.0 card, either directly or through the USB 1.1 hub. Products including a Lego Studio camera and an Intel QX3 USB microscope worked without a hitch. I've yet to go back and troubleshoot the CompactFlash reader, but generally I'm convinced that the USB community has delivered on its promise of compatibility.
IOGEAR's USB 2.0 PCI card
My reservations about USB 2.0 going into the project, however, centered on two issues. First, I've long doubted that USB 2.0 peripherals would come anywhere near the specified 480-Mbit/sec data rate. In fairness, computer interfaces rarely meet such maximum specs. For instance, in the case of 10-Mbit/sec Ethernet, you'd be lucky to measure better than 6 Mbits/sec in practical use. The discrepancy between specified maximums and realized data rates is generally due both to time wasted while many nodes arbitrate for access to the interface and to inefficiencies or overhead in communication protocols and data formats. And on any shared medium, a higher number of active nodes means lower bandwidth per node. So I knew USB 2.0 wouldn't deliver 480 Mbits/sec, but I also expected that the polled, host-centric bus might be, relative to the stated maximum, far less efficient than a peer-to-peer link like FireWire.
The second issue that concerned me was the probable mix of low- and high-speed peripherals on a USB 2.0 interface. USB proponents love to point out that the connector is universal. You can plug any USB peripheral into any USB port on a hub or host and expect everything to work. But what would happen to the speed advantages afforded by USB 2.0 if a bunch of 12- or even 1.5-Mbit/sec peripherals were connected to the same bus as a high-speed device like a disk drive? Due to the polled nature of USB, I suspected that mixing and matching slow- and high-speed devices might further degrade performance.
Transfer pass
To test my theories, I created a folder with right around 100 Mbytes of data in a mixture of small and large files, and I stored that folder on my test system's C drive—an ATA-100 drive. I planned to record the time it took to copy the folder from the ATA drive to the USB drive under various conditions. I'd run each test multiple times and average the results to normalize small differences in performance and hopefully eliminate any inconsistency in how fast I could start and stop my stopwatch. I also planned to run the same test using a 60-Gbyte Western Digital FireWire drive.
I started by timing the performance of the USB and FireWire drives with no active task (other than Windows Explorer) running on the PC. Moreover, during this initial test of each drive, I made sure that nothing other than the drive in question was connected to the PC.
This initial test configuration ought to deliver the maximum data rate that USB 2.0 can achieve. The interface on the ATA drive that contained the test folder is significantly faster than USB 2.0 or FireWire. And the 1-GHz Pentium III processor in the system is certainly faster than the average deployed PC. I clocked the USB drive at just over 9 Mbytes/sec (about 73 Mbits/sec), and the FireWire drive at 14.3 Mbytes/sec (a little over 114 Mbits/sec).
Clearly USB 2.0 isn't going to come close to the promised 480 Mbits/sec. And it also proved significantly slower in practice than FireWire. I must admit, however, that its performance would be suitable for most consumers in most usage scenarios. And I was a little surprised that the FireWire drive didn't perform better.
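The gap between measured and advertised rates is simple arithmetic. The figures below are the article's own measurements; the 480- and 400-Mbit/sec ceilings are the stated spec maximums:

```python
# Measured sustained rates from the single-drive test (Mbytes/sec).
usb2_mbytes = 9.0
firewire_mbytes = 14.3

# Convert to Mbits/sec (1 byte = 8 bits).
usb2_mbits = usb2_mbytes * 8          # 72 Mbits/sec
firewire_mbits = firewire_mbytes * 8  # 114.4 Mbits/sec

# Fraction of the stated spec maximum actually achieved.
usb2_efficiency = usb2_mbits / 480        # 15% of 480 Mbits/sec
firewire_efficiency = firewire_mbits / 400  # ~29% of 400 Mbits/sec

print(f"USB 2.0: {usb2_mbits:.0f} Mbits/sec ({usb2_efficiency:.0%} of spec)")
print(f"FireWire: {firewire_mbits:.0f} Mbits/sec ({firewire_efficiency:.0%} of spec)")
```

By this measure FireWire ran at roughly twice USB 2.0's efficiency relative to its own spec, even though both fell far short of their stated maximums.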
In case you're wondering, I do realize that the two drives aren't identical and that their raw performance could possibly affect my results. But both of these external subsystems feature state-of-the-art disk drives capable of writing data to the media at around 400 Mbits/sec. Moreover, both have large buffers. I don't believe in either case that the drive itself had an effect on the data rate across the interface.
Board the bus
Next I planned to see what would happen to the USB 2.0 data rate if I added peripherals to the USB interface and increased the load on the Pentium III CPU. First I tested the USB data rate while the PC was playing an MP3 tune from the ATA drive. The PC relied on a USB-connected Microsoft Game Voice to switch the audio output, but the audio didn't traverse the USB bus in any way. Still, the load on the CPU reduced the USB 2.0 data rate to 7.7 Mbytes/sec.
I had requested some other USB 2.0 peripherals to test. I planned to see how different the test results might be when only USB 2.0 devices were connected to the 2.0 host, relative to the results when 1.1 peripherals were in the mix.
At that point, however, no other 2.0 peripherals had arrived, so I added a bunch of my USB 1.1 peripherals back onto the USB 2.0 connection. My first test centered on using the Lego Studio camera to stream a live video window in the background while I performed the folder-copy test. I was surprised to find that USB 2.0 performed pretty well, delivering just better than 7 Mbytes/sec. The presence of the video stream on the USB bus had only a slightly greater impact than using an MP3 player, and the latter should have affected only the host CPU.
I also tried to test the effects of having a USB 1.1 mouse and joystick attached to the USB 2.0 interfaces. The mouse had a small effect, much like the MP3 playback. I never managed to figure out a way to test the effect of the Sidewinder joystick. I tried running several Microsoft games that use the Sidewinder while I launched the copy test. In each case, however, the game would automatically enter pause mode when I brought Explorer to the foreground to begin the copy operation. I thought about trying to play the game in the foreground while the copy operation ran in the background, but couldn't come up with a way to discern the latency between the start of the copy operation and when it was swapped into the background.
Next, I connected my son's Intel USB Microscope to the USB 2.0 card. I also still had the mouse connected via USB, and the joystick was connected but not being used. With the bus loaded in this fashion and the Intel Microscope application (essentially streaming video) running in the background, I encountered a number of problems with the folder-copy test. In one case, the system hung and had to be reset. In numerous other runs, I measured data rates between 1.7 and 5 Mbytes/sec. Quite often the folder-copy operation would appear to stall for seconds at a time. I assume the video feed from the camera in the Intel Microscope is of a higher resolution than the one from the Lego camera.
Still, I wondered how much of the problem could be attributed to USB bus traffic and how much to the CPU load. So I connected the FireWire drive to the system and ran the folder copy to that drive several times while the Microscope application was active. This yielded copy performance of just over 5 Mbytes/sec on average. So the CPU load affected the time it took to copy the folder to the FireWire drive, but not to the extent it did with the USB drive. In addition, the FireWire drive never exhibited the stalls in the copy operation or the wide discrepancy in results that the USB drive did.
Standing room only
I still hadn't received any other USB 2.0 peripherals to test, so I decided to burden my FireWire interface and see how it handled a congested bus. I started with the most difficult test I could envision. I connected my DV camcorder to one port and the FireWire drive to the other port on my two-port FireWire adapter card. Generally, you can daisy-chain FireWire peripherals, whereas you must use hubs to cascade USB connections. But I knew from experience that the camera connection doesn't work when daisy-chained behind other peripherals.
With camera and drive attached, I launched neoDVD from Mediostream. This application, discussed in an earlier Digital Den (see "Now and Den," February 2002), moves video between a camcorder and a storage device. The camcorder outputs a 25-Mbit/sec video stream via FireWire and uses the isochronous capabilities of the interface to reserve that bandwidth. With live video streaming into a background window, I performed the folder copy to the FireWire drive.
The FireWire transfer rate dropped to 7.3 Mbytes/sec (compared with, as you'll recall, 14.3 Mbytes/sec for the copy operation alone). I was surprised that the video traffic imposed such a significant performance hit. However, I realize that the camcorder video stream is much higher in resolution and frame depth than any USB camera.
I also wondered how mixing data storage devices on FireWire might affect the measured rate in my tests. It occurred to me that perhaps ripping an audio CD would provide an interesting test obstacle. I had a USB 2.0 CD/DVD burner on the way that could read data and audio CDs, and I had a Yamaha FireWire CD burner on hand that could do the same. So I went ahead and tested the FireWire products. I started a rip operation from the FireWire CD drive to the main ATA drive. Then, with the rip in progress, I performed my folder-copy test to the FireWire disk drive, which was daisy-chained to the CD drive. The copy operation clocked in at 10 Mbytes/sec.
Setting aside my FireWire products, I was happy to see two more USB 2.0 peripherals arrive for evaluation. IOGEAR sent its MemoryBank CompactFlash reader/writer (left), and QPS sent a DVD/CD burner based on a Pioneer drive. The Pioneer drive can read and write CD-R, CD-RW, DVD-R, and DVD-RW media as well as reading data and audio CDs and video DVDs.
Unfortunately, I didn't have the same success installing either the IOGEAR or QPS products that I enjoyed with the Maxtor disk drive.
In the case of the IOGEAR MemoryBank, it appears the stumbling block might be a late change to either the CD that contains the drivers or the product manual, or both. The manual advises you to run an installation wizard, but the CD contains no such "setup" program. The CD does include what appear to be Windows drivers (.inf files) for the MemoryBank. Unfortunately, every attempt to manually install these from Windows Device Manager results in a message indicating that the appropriate driver isn't available. Even with tech support from IOGEAR, I had yet to rectify this problem when this issue went to press. Watch our Web site for an update.
I do think that the USB 2.0 MemoryBank will be extraordinarily useful. My family is big on MP3 players, such as e.Digital's MXP 100, which I've written about previously (see "Goin' mobile," December 2001). But filling that player's IBM Microdrive isn't quick via its integrated USB 1.1 link, and the 256-Mbyte CompactFlash cards take a while too. In contrast, a speedy FireWire connection is one of the features that's made Apple's iPod a success. A product like the MemoryBank will bring hard-disk-like data rates to the task of writing music to CompactFlash cards.
Bus-based burner
Meanwhile I ran into a different sort of problem with the QPS CD/DVD burner (right). The evaluation drive I received had already been reviewed by someone else, and it arrived with no installation CD. Moreover, the device appeared to have taken a hit at some point. After much trial and error, I believe the drive itself is damaged. I was able to download drivers from the QPS Web site, and the drive shows up in Windows Explorer and My Computer. But the drive, which should read almost any disc, currently won't even read a data or audio CD.
Again, watch our Web site for an update. I'm planning several things for the QPS drive. I hope to repeat my transfer-rate test while ripping a CD from the drive via USB 2.0. And the drive will also be part of an upcoming Digital Den where I'll test the various recordable and rewritable DVD formats.
I would like to acknowledge QPS for having one great idea. The QPS CD/DVD burner came with a padded carrying case. Given the proliferation of USB and FireWire peripherals, this is a good thing to have—not just to carry the drive between systems but to store the drive, power cable, and USB cable when it's not in use. When I first tested FireWire products (see "Digital director," March 2001), I noted the chaos that had grown on the desk around the PC I used to evaluate the products. While both USB and FireWire support hot plugging and don't require you to open the PC, they result in a tangle of power, data, and audio cables—not to mention stacked drives, printers, scanners, and so on. In my house, we're trying to control the mess by storing peripherals that aren't used every day. The original boxes are less than convenient, but a case like the one QPS provides is ideal.
Express or local?
So, are you wondering what a diehard FireWire proponent might have to say in closing about USB 2.0? Well, I won't give it an unqualified glowing endorsement, but I expect it to garner a semi-permanent spot in my PC. I must admit that USB 2.0 performed better than I expected. In fact, it seems to take the combination of a pesky application and a problematic device (like the Intel microscope and its control program) to create real trouble.
I won’t give USB 2.0 an unqualified glowing endorsement, but I must admit that it performed better than I expected.
I expected USB 1.1 peripherals to create much more havoc. In fact, I assumed I'd use this space to warn users to stringently separate USB 1.1 and USB 2.0 buses despite the common connector. I thought that a PC with only USB 2.0, like a recently introduced Gateway model, would be a disaster waiting to happen. Clearly, my tests show that my concerns were mostly unwarranted.
Still, give me a choice, and I'd definitely buy a FireWire peripheral rather than a similar USB device. The FireWire unit will generally perform better. Looking back, USB was a no-brainer because the USB host was built into every PC. Now, however, more than half of consumer-targeted PCs come with FireWire as well, so the cost differential is going away. On the peripheral side, the margin has narrowed as well. Maxtor sells its 120-Gbyte USB 2.0 drive for $299 and a 160-Gbyte FireWire drive for $399.
And I will continue to maintain that there was really no need for USB 2.0 in the first place. Had Intel swallowed its ego and endorsed FireWire a couple of years ago, we'd all be better off. The FireWire roadmap takes us to 3.2 Gbits/sec and links the computer to the digital living room. Moreover, FireWire can directly link any two devices without the services of a host PC—a capability that USB proponents have discussed for some time but have made no apparent progress on (see "Hot plugs," March 2001). All digital camcorders ship with FireWire, and it will soon be standard on DVD players and HDTV systems. Perhaps being stuck with both USB and FireWire won't be so bad, though. IOGEAR even offers a PCI card that has both kinds of connectors.
http://www.e-insite.net/commvergemag/index.asp?layout=article&articleId=CA203871
culater
Technology takes guts
By Rod Atkins, General Manager, IBM Pervasive Computing
May 2002
Often in the technology media, the tendency is to focus on cool gadgets and consumer applications. While cool is fun, we mustn't overlook the less glamorous aspects of computing as we look ahead.
Sure, it’s fashionable to talk about the newest PDA or mobile phone, but that’s not where the real action is. You’ll hear a lot of talk about 3G, about location-based services, about extending e-business and e-commerce applications to mobile devices, about being able to get SMS messages via cell phone in the States, and being able to conduct banking and other commercial transactions over PDA. You should hear a lot about it -- there are some amazing possibilities and capabilities coming.
But the truth is, all the cool devices in the world won’t help anyone unless there’s something there to connect them to, and that connects them with each other. Infrastructure matters, as much as or even more than the devices. Because while the focus may be on all the different gadgets we can put chips into and connect to the network, the reality is that the back end is where all the heavy lifting comes in. To make this happen, it takes guts. Technology guts.
The infrastructure is the system that supports the network, carries the information, and stores it where it can be used. All the data we’re talking about accessing via mobile devices, all the transaction records and histories that will allow mobile commerce to happen, are input through devices, like mobile phones and PDAs. But that information has to be stored somewhere. You have to be able to easily cross reference and combine all that data.
This data isn’t collected and used by a clerk sitting behind a desk, typing into a database. It’s collected by the system, stored and cross-referenced by servers and computer systems, without ever requiring human action. Those computers and functions on the back end are the infrastructure.
The system and networks have to work. People are going to give up on these systems if every time they try to get into them, they get turned away. What we’ve gone through in the last few years with the Web has made people smarter and more comfortable with technology. It’s also made them more demanding. They expect connections to work. They expect systems to do what we say they will do. Put simply, we don’t have second chances.
We all remember that as businesses made the transition to having a Web presence, there were plenty of highly publicized outages and crashes -- and that was just PCs accessing information. Now, we have to think about the same number of users, but with multiple devices and sources of input. And as we make the user experience easier and computerize more and more of the items in the average home, the number of users and inputs grows exponentially, too.
Which means transaction loads skyrocket. Scalability requirements escalate. The infrastructure must scale to tens of millions of users instead of hundreds of thousands.
There’s the paradox: the simpler something is on the front end, the more complex it is on the back end. That’s why infrastructure matters.
As we develop greater infrastructure capabilities, it’s paramount that we remember above all that it needs to work. It can’t matter what kind of a device is used, whose servers your network runs on, or whose middleware enables the applications. Everything must work together seamlessly. And that’s why open standards are critical to the future of computing.
Middleware matters. All the devices and all the powerful infrastructure only work together if the middleware lets them work together. And despite what some may tell you, proprietary operating systems and networks aren't going to drive the new world of pervasive computing. Middleware must support all modes of access, be they PDAs, cell phones, kiosks, Internet-connected appliances, embedded devices, or even traditional PCs.
The reality of a truly pervasive network is that it must be an aggregation of autonomic, specialized networks -- with that aggregation entirely transparent to the user. We need an entirely heterogeneous environment, platform-agnostic, dependent on open standards.
So as you read the articles about all the amazing consumer applications and cool new devices coming our way this year, be excited -- it's going to be an exciting year. Just remember that it's not the devices that will make it happen. As with most things in life, it's what goes on behind the scenes that is far more interesting, and what really makes all the flashy stuff possible.
http://www.wirelessreport.net/opinionandeditorial/may02/technologytakesguts.html
culater
OT-Middleware: A Brief History
Contributed by Adam Barr
osOpinion.com
May 28, 2002
http://www.osopinion.com/perl/story/17955.html
Windows began as a middleware layer on top of DOS, smoothing out differences between hardware that was functionally equivalent but technically different.
Middleware has been a hot topic in the news lately. One of the main goals of the nine holdout states in their lawsuit against Microsoft has been to mandate removal of middleware components from Windows.
Why is control of middleware seen as such a key issue? Can middleware generate significant revenue?
To understand the issues involved, we must go back to the elder days of computing. Before the introduction of the IBM PC and DOS, the personal computing industry was fragmented, with many different types of hardware running various operating systems.
All of the hardware exposed similar functionality to software developers: parallel port for printing, serial port for modem, keyboard for input and text display for output. It would have made sense to write a common layer that sat above the different operating systems and exposed the same API (application programming interface), so developers would have had an easier time porting their applications.
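The common-layer idea described above can be sketched as a thin dispatch shim: applications call one API, and per-platform backends do the hardware-specific work. The platform names and print routines below are hypothetical stand-ins, not real period APIs:

```python
# Hypothetical portability layer: one API, multiple platform backends.
# Each backend function stands in for what would have been a
# hardware-specific OS or BIOS call on that machine.

def _print_cpm(text):
    # stand-in for a CP/M-specific parallel-port routine
    return f"[CP/M parallel port] {text}"

def _print_apple(text):
    # stand-in for an Apple II-specific printer-card routine
    return f"[Apple II printer card] {text}"

BACKENDS = {"cpm": _print_cpm, "apple2": _print_apple}

def print_line(platform, text):
    """The single API an application would code against."""
    return BACKENDS[platform](text)

# Porting an application between machines changes only the platform key:
print(print_line("cpm", "hello"))
print(print_line("apple2", "hello"))
```

The application code above print_line never changes; only the backend table needs a new entry per machine, which is exactly the porting economy the article describes.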
Going in Circles
But this idea went nowhere. The reason was performance: On an Apple II with 16K of memory, there wasn't room for the overhead of a middleware layer on top of the native operating system. Programmers might not have been thrilled about having to rewrite their applications for each platform, but they did it.
Along came the IBM PC and DOS. Microsoft marketed DOS to developers with a middleware argument. It was apparent that 16-bit microprocessors, and the Intel 8086/8088 in particular, were going to take over, but it was not obvious that the IBM PC design would become ubiquitous.
So, Microsoft presented DOS as a middleware layer, between the application and the BIOS and raw hardware, that software developers could code to. Microsoft then would do the work of porting DOS to whatever 16-bit systems became popular.
One of the other operating systems that IBM sold for the original IBM PC, the UCSD p-system, was explicitly a middleware system that compiled binaries to intermediate "p-code" that was then interpreted by the operating system at run-time.
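The p-code approach (compile once to an intermediate form, then interpret on each machine) can be illustrated with a toy stack machine; the opcodes here are invented for illustration and are not actual UCSD p-code:

```python
# Toy stack-machine interpreter in the spirit of the UCSD p-system:
# the compiler emits portable intermediate code, and only this small
# interpreter needs porting to each new machine.

def run(program):
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack

# (2 + 3) * 4 expressed as intermediate code; running it leaves 20 on the stack.
pcode = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(pcode))
```

The trade-off is the same one the article cites for early middleware: portability in exchange for the memory and speed overhead of interpretation at run-time.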
Middle Man
With some prodding by Microsoft, the market for 16-bit computers quickly narrowed to just the IBM PC and its clones, all with the same BIOS interfaces and hardware. Those machines soon had enough memory and processing power that the performance issues of middleware became non-issues.
In the late 1980s, however, a different problem arose that prevented middleware from gaining traction. That problem was "feature lag." As the PC platform gained popularity, it became more heterogeneous, with graphics cards, joysticks, CD-ROM drives and sound cards appearing in some hardware configurations.
This situation complicated things for both middleware designers and software developers targeting middleware.
Applications need conditional code to handle optional hardware. Much worse, however, is the fact that when a new class of hardware appears, there is a delay before the middleware layer is altered to support it. Application writers who avoid middleware and write directly to the underlying system don't have this problem.
Too Much
Unfortunately, the direct coding approach can be a lot of work. For example, when DOS application developers wanted to go beyond basic text, they had to include per-application drivers for the various graphics printers on the market. Graphical applications had to deal separately with each video adapter. Supporting new hardware often required making hardware-specific BIOS calls from within an application.
This meant that the DOS/BIOS model was not ideal from Microsoft's point of view. While the executable format was DOS-specific, and such functions as memory management were exported by DOS, BIOS calls were made directly to the firmware and remained unchanged if an application was ported to another PC operating system.
Therefore, so-called "DOS" applications were only partially bound to DOS (keep in mind that the original IBM PC had no hard drive but instead booted from a floppy disk, so it was entirely reasonable that a user might use different operating systems for different tasks).
Then Microsoft came along with Windows, which solved all of those problems. Windows began as a middleware layer on top of DOS/BIOS, smoothing out differences between hardware that was functionally equivalent but technically different.
The Next Step
For application writers, Windows was a huge advance over DOS, just as DOS had been a huge advance over the fragmented 8-bit world. It also became ubiquitous, allowing it to overcome the feature lag problem: Today, no new piece of hardware is released without a manufacturer-provided Windows driver, and new functionality in the BIOS/firmware is coordinated with new releases of Windows.
Since the mid-1990s, middleware has attempted a comeback. In fact, there are two classes of middleware: the traditional kind that exposes an API for applications, such as Java and Web browser plug-in architectures, and a new kind that serves as a platform for data, such as media players, instant messaging systems and the page-viewing components of Web browsers.
What are the chances that these new systems will become strategic advantages for the companies that produce them? Stay tuned for Part 2.
http://www.osopinion.com/perl/story/17955.html
Major label first: unencrypted MP3 for sale online
By Kevin Featherly, Newsbytes.
May 25, 2002
For apparently the first time ever, a major record-label subsidiary is releasing an unencrypted MP3 file onto the Internet, hoping fans will fork over 99 cents for the right to own and use the song without constraints.
Maverick Records and Vivendi Universal Net USA jointly announced Thursday that a special dance remix of "Earth," a track by bassist Meshell Ndegeocello, marks the first time a major-label artist has ever put a downloadable MP3 song up for sale on the Internet.
The song became available for download for 99 cents Thursday at a number of VUNet USA sites, including MP3.com, Rollingstone.com, GetMusic.com and MP4.com. The 50,000 subscribers to the Emusic MP3 service also will be able to buy and download the tune.
"This is a case of the music labels seeing if the honor system is going to work online," said Steve Vonder Haar, an analyst with Interactive Media Strategies in Arlington, Texas.
Because the track is an unencrypted MP3, it will be possible for listeners to burn the song onto CD and to transfer it to portable players. And, like CD tracks that easily can be converted to MP3 files, the song inevitably will find its way onto the numerous illicit file-sharing networks.
Jonathan Lamy, spokesman for the Recording Industry Association of America (RIAA), said his organization customarily declines to speak publicly about the business practices of its individual member labels, and he would not comment on Maverick's MP3 release.
"This is a bold step for Maverick Records and Meshell Ndegeocello," said Derrick Oien, president of VUNet USA's Music and Media Group, in a written statement. "They deserve recognition for giving digital music fans a simple way to collect and enjoy this previously-unreleased new song."
Better Late Than Never?
GartnerG2 music industry analyst P.J. McNeely said the move is one other labels probably will watch closely. But it's also an experiment, McNeely said, that was very late in coming.
Napster, the notorious MP3 swapping service that set the music industry on its ear only to be later sued into submission, launched the MP3-download phenomenon in late 1999.
"Unfortunately," McNeely said, "we didn't see this like two years ago. The labels are slow to embrace new distribution channels and marketing methods. The fact that this technology isn't new points to the lack of speed with which the labels have been moving."
Nonetheless, McNeely saluted the move as at least a step in the right direction for the music industry. "It's good news," he said. "Hopefully the rest of the labels are taking note, and have plans to do the same in the near term."
Brad Hill, a digital-music industry observer and author of a forthcoming book, "The Digital Song Stream," expressed doubts about the way Maverick Records is approaching its MP3 experiment.
"The overriding principle here," Hill said, "is that they're trying to sell a CD on the basis of a single. That's what bothers me the most. The fact that they're even charging for the single makes it worse in my mind."
Indeed, Maverick and VUNet USA used space in their press release to promote "Cookie: The Anthropological Mixtape," Ndegeocello's latest CD, which is due out on June 4. The first single from the album is not "Earth," but instead a track called "Pocketbook." The original, unremixed version of "Earth" will be on the record, however.
Hill said that, despite what he considers the companies' market faux pas, the mere fact that a label finally has set at least one track free onto the Internet is at least marginally a good sign.
"I'm glad to see that," Hill said. "But it seems like such a tiny incremental release. Honestly, I'd just like to see something more bold. As far as the major labels are concerned and as far as the RIAA is concerned, they view the situation as panic in the streets, a collapsing industry. Something like this, which really just supports older analog marketing models, is completely unhelpful to me."
Analyst Vonder Haar said the MP3 release is clever on several levels. It is a ploy that will generate publicity and interest in a relatively unknown artist, he said. It is also a tentative attempt to use MP3s to tap into a new source of revenue, by making people pay for the chance to hear a portion of a product that they would then have to pay for again, in slightly different form, on CD.
And, he said, if the song generates more interest among pirates than buyers, it will provide ammunition labels can use to justify their attempts to squeeze the peer-to-peer pipe shut.
"That's why it's a no-lose situation to try this on a very limited basis," he said. "If a lot of people come and buy it, then P.T. Barnum was right. If they don't come and buy, or if only a few copies get sold and it starts showing up on all the various music file-sharing services, then the record industry can collectively say, 'A-ha! I told you so!'"
http://www.computeruser.com/news/02/05/25/news2.html