FED & SED Monitors: New Technology for 2009

Posted by mr bill | Posted on 8:56:00 AM

Sony presented a 19.2-inch FED (Field Emission Display) monitor with a resolution of 1,280 x 960, a brightness of 400 cd/m2, and an exceptional contrast ratio of 20,000:1. The refresh rate would range from 24 to 240 frames per second, and the viewing angle would be a full 180°: an impressive spec sheet all around.

Field emission display (FED) technology was invented in the 1970s as a possible alternative to the traditional cathode-ray tube TV, but it has never been commercialized.


What is a SED Monitor?

It’s the next generation of television screens: SED (Surface-conduction Electron-emitter Display) monitors offer vivid color images, high-definition displays and larger screens.
Research began in 1986 at Canon and Toshiba, which went on to form SED Inc. in October 2004.

FED and SED, Two Competitors?

Technically, FED is close to SED. Both technologies use hundreds of thousands of micro guns (electron emitters), each able to generate a pixel, which is where the advanced image quality comes from, with no warm-up, no heat and no backlight. FED uses cone-shaped micro guns, whereas SED uses slit-shaped ones. FED is as thin as SED, consumes less energy, offers a wide viewing angle, and has no dead pixels. Sony is aiming mainly at professional markets, in particular data-processing displays (computer monitors and televisions), whereas SED is intended for the large-screen market.

Toshiba Corp. and Canon plan to bring SED to market in 2009, with Full HD models larger than 26 inches.

Source:http://www.xcess.info

Why Are iPhone Users Willing to Pay for Content?

Posted by mr bill | Posted on 11:13:00 PM

It may be no surprise that the best-selling computer book so far this year is “iPhone: The Missing Manual,” by my colleague David Pogue (O’Reilly, 2007).

But here is something that did surprise me: The most popular edition of this book isn’t on paper or the PDF file that O’Reilly Media also sells. It is the downloadable application for the iPhone, according to Tim O’Reilly, the chief executive of O’Reilly Media.

Amid all the discussion of micropayments and other ways that the creators of news and other content can be paid for their work, the iTunes App store is shaping up to be a surprisingly viable way to sell all sorts of information and entertainment.

There is a lot of paid content of the sort you would have bought in the past but can now get free on the Web: a directory of Congressional offices, standup comedy routines, gym workout videos, Zagat restaurant guides and a growing library of books. There is also a fair bit of free content, public-domain e-books like the complete works of Shakespeare and lots of advertising-supported media. (BusinessWeek has a report this week on the App store's role in music.)

What’s most interesting is how iPhone users are willing to spend money in ways that Web users are not.

I’ve criticized Apple from time to time for not having a coherent approach to delivering free content with advertising. But in some ways, the development of a market for paid content is a bigger and less expected achievement.

Why has this happened? Apple has created an environment that makes buying digital goods easy and common. With an infrastructure that supports one-click purchases of songs and videos, it was easy to add applications in the same paradigm. Paying for software, especially games, is not new to Apple customers. So when you see the iPhone manual or the Frommer’s Paris guidebook, it feels natural to click. (And of course, your credit card is already on file with Apple.)

There are certainly other precedents. Many people who steal songs through LimeWire nonetheless pay $1.99 to use the same tunes as ringtones. And for avid book readers, Amazon's Kindle has found a market willing to pay for electronic books. Apple is also starting to sell subscriptions to bundles of music, video and images from certain bands, like Depeche Mode. This is technically a product of the Music store, not the App store, but it still shows how people may be willing to pay for various bundles of content online.

There is a lot of work to do here. For example, I find the O’Reilly iPhone book a little hard to use. The text doesn’t seem particularly well-formatted for the iPhone page. And I would love to see more interactive features that utilize the phone interface (including some of David’s videos).

Andrew Savikas, O’Reilly’s vice president for digital initiatives, agrees with me, saying that the iPhone manual was rushed to get it out before Christmas. The company now has 20 titles in development for the iPhone (and eventually other mobile phones), and it is spending more time weaving in hyperlinks and adding other features.

“There is a lot more we can do to take advantage of this as a new medium,” he said. O’Reilly, which sells to a lot of early adopters, has a range of digital distribution media.

“We try to say all of our writing is writing for the Web, and all of our publishing is digital publishing, so all our focus is building things into the content that make it more friendly to be digital,” he said.

Before media companies rejoice that Apple has found a way to persuade a generation used to getting everything free on the Web to pay for some content, they should look a bit more closely at O’Reilly’s experience with the iPhone manual.

The book, which sells for $24.99, was initially offered as an iPhone app for $4.99. When the publisher raised the price to $9.99, sales fell 75 percent. O’Reilly quickly dropped the price back down to the lower level.

“This audience is very price sensitive,” Mr. Savikas said.
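
To put that sensitivity in rough numbers, here is a back-of-the-envelope sketch in Python. The two price points and the 75 percent drop come from the article; the weekly baseline of 1,000 copies is a made-up assumption purely for illustration.

```python
# Back-of-the-envelope revenue comparison for the iPhone edition of the manual.
# Only the prices and the 75 percent sales drop come from the article; the
# baseline of 1,000 copies per week is a hypothetical figure for illustration.
baseline_units = 1000            # hypothetical weekly unit sales at $4.99
low_price, high_price = 4.99, 9.99
sales_drop = 0.75                # reported drop after the price increase

revenue_low = baseline_units * low_price
revenue_high = baseline_units * (1 - sales_drop) * high_price

print(f"Revenue at $4.99: ${revenue_low:,.2f}")    # $4,990.00
print(f"Revenue at $9.99: ${revenue_high:,.2f}")   # $2,497.50, roughly half
```

Whatever the baseline, the ratio works out the same: roughly half the revenue at the higher price, which is presumably why O'Reilly retreated so quickly.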

So even if all content doesn’t have to be free, it may well have to be cheap.

Source:http://bits.blogs.nytimes.com

Innovation: The battle for the paperless book

Posted by mr bill | Posted on 6:27:00 AM

E-book readers are set to grow in popularity as features improve and prices fall (Image: Sony)

Innovation is our new column that highlights the latest emerging technological ideas and where they may lead

I've just read War of the Worlds for the first time and found it pretty astonishing. Not only because H G Wells' tripods head for a staid corner of southeast England rather than the high-rise cities Hollywood aliens prefer, but because I read it on two portable electronic gadgets - a touch-screen smartphone and a dedicated e-book reader.

Although I had thought of this as a kind of head-to-head trial that would reveal which platform will take over from paper books, I was surprised to find myself unable to declare a winner.

Instead I became convinced that talk of one or the other emerging to take the place of the paper book is misguided. Instead, both forms will likely coexist and complement one another, providing a new way for us to access the long-form content hitherto trapped inside relatively bulky paper tomes.

Mini marvel

The two gadgets I used were a Sony Reader and an Apple iPhone. Like the newly launched Amazon Kindle 2, the Sony reader is a paperback-sized gadget that displays pages on a black and white E-Ink screen.

Both Kindle and Reader come bundled with 100 books from free, out-of-copyright collections like Project Gutenberg, and iTunes-like software that lets you load and buy more texts. Unlike the Reader, which requires a cable connected to a computer, the Kindle can even do this wirelessly over a 3G network.

But when you're on the move carrying an e-reader that's much more than pocket-sized is not always ideal. That's why I found myself experimenting with the e-reader software packages available on the iPhone to keep up with the progress of the Martian tripods.

A vast number of such "apps" are available to download through the App Store; I used a free one called Stanza. Just like the dedicated readers, it connects you to a large range of free and paid-for e-book download sites.

Text feed

As you would expect from a phone, the text is sharp but small. The smaller format, though, brings with it big benefits. Being able to take a phone out of your pocket and read books anywhere, even if you're strap-hanging on a crowded bus or standing in a supermarket queue, is a revelation.

Of course, one place this will be no surprise is Japan, where some of the top-selling printed books started life as e-books. Some are even written in text abbreviations and sent as a series of SMS instalments for cellphone consumption by commuters.

Of course, the brightly lit LCD touchscreen of a cellphone drains batteries fast and using a dedicated reader is preferable when the space is available. But it is only going to get easier to read books on a cellphone, as companies like Google, Nokia and Palm build their own app stores that can offer readers software and book content.

The point is, books are not about to be made obsolete by electronic readers. They will still be around but their content - the only thing that really matters - is increasingly going to be consumed electronically, on multiple platforms too. And I'm all for it.

Source:http://www.newscientist.com

IBM voice ace: Kindle no threat to audio books

Posted by mr bill | Posted on 6:17:00 AM

Executives at the Authors Guild say the text-to-speech feature in Amazon's Kindle 2 could hurt sales of audio books. Not every expert agrees, including one the guild itself has cited.

Andy Aaron, an IBM text-to-speech expert, says synthetic voices don't know when to add emphasis or inflection when reading.

Andy Aaron, an expert on text-to-speech technology, recently commented in an interview about how much such systems have advanced. In an op-ed piece published Tuesday in The New York Times titled "The Kindle Swindle?", Roy Blount Jr., president of the Authors Guild, used Aaron's quotes to support his argument that the Kindle's voice feature could threaten the future of audio books.

But when asked to elaborate, Aaron told CNET News on Wednesday that the audio-book market has little to fear from "synthetic voices."

"I'm a big believer in (text-to-speech) and a booster of it," said Aaron, who is with IBM's Watson Research Center. "But I don't think at this point, or for the foreseeable future, it's going to compete meaningfully with a professional book reader...Am I going to sit down and put my feet up and listen to text-to-speech read 'War And Peace' or Harry Potter for six to eight hours? For someone who has the choice, I think they would rather get an audio book."

Amazon appears headed towards a showdown with the Authors Guild over text-to-speech technology. This enables computers to read text in a lifelike voice. Paul Aiken, executive director of the Authors Guild, a trade group representing 9,000 authors, argues that Amazon isn't compensating authors for Kindle's text-to-speech feature. He claims authors' copyrights are being violated.

Amazon representatives did not respond to a request for comment.

Aiken generated a lot of attention when he first raised concerns about the Kindle following the debut earlier this month of the e-book reader. On Wednesday, Aiken said Amazon never informed the guild--or book publishers for that matter--of the retailer's plan to include the feature.

In the weeks since the Kindle debut, the guild has had discussions with Amazon and the online retailer is taking a "hard-line position," Aiken said. All this doesn't bode well for finding an amicable resolution.

Aiken wouldn't say what the guild's plans are but confirmed that guild administrators won't rule out filing a lawsuit.

"Anytime you have a new means of accessing content," said Aiken, "there's always some sort of aggregator that wants to control it and keep the value for themselves."

As for Aaron's assertions that text-to-speech systems won't threaten audio books for a long time, Aiken says nobody knows the future.

"Things move quickly," Aiken said. "I think the technology has made a generational leap in just the last few years."

To prove the point, the guild has posted demonstrations of text-to-speech technologies offered by Apple four years ago (the video posted above). The voice is monotone and unintelligible in places. It sounds like it was lifted from a bad sci-fi film.

The next clip is a recording of the Kindle's text-to-speech offering. (At right, I've included a humorous demonstration of the Kindle's text-to-speech function posted to YouTube by a user called Kindlejunkie.) The differences are sharp. The Kindle's voice pronounces words clearly and sounds far more lifelike. There is, however, no inflection or emphasis. The thing drones on.

It's not that the technology can't create dramatic effects. Aaron says the technology has advanced to a point where synthetic voices can be made to sound happy or apologetic. The major roadblock for these systems, however, is that they don't know when to insert these effects or choose the effect that is most appropriate.

What's missing in computers is the ability to understand what they're reading, said Aaron.

"Even a mediocre human reader is interacting with the text and understands every word that he or she is reading," Aaron said. "Text-to-speech doesn't. It can be really good. It can be really smooth. It can sound very lifelike. But it doesn't understand what it's reading. Do you want to listen to a reader that doesn't understand what they're reading?"

The obvious question here is: if text-to-speech systems can read something with a specific emotional tone, couldn't a publisher go into a digital book and mark where they want to insert a specific effect?

They could, says Aaron, but that would take an enormous amount of time and expense. At that point, it's easier to hire a human reader and create an audio book.
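
For a sense of what such markup could look like, here is a hypothetical sketch. The tag names follow the W3C's SSML (Speech Synthesis Markup Language) standard, but the annotate helper and the workflow around it are illustrative assumptions, not anything Amazon or IBM has described.

```python
# Hypothetical sketch of a publisher-side pass that wraps passages in
# SSML-style tags telling a synthetic voice how to deliver them. The tag names
# (<speak>, <emphasis>, <prosody>, <break>) come from the W3C SSML spec;
# everything else here is an illustrative assumption.

def annotate(text, emphasis=None, rate=None):
    """Wrap a passage in SSML-style tags describing how it should be spoken."""
    markup = text
    if emphasis:
        markup = '<emphasis level="%s">%s</emphasis>' % (emphasis, markup)
    if rate:
        markup = '<prosody rate="%s">%s</prosody>' % (rate, markup)
    return markup

page = "<speak>" + " ".join([
    annotate("The door creaked open.", rate="slow"),
    '<break time="700ms"/>',
    annotate("Run!", emphasis="strong"),
]) + "</speak>"

print(page)
```

Even in this toy form, someone still has to decide, passage by passage, which lines deserve which delivery, which is exactly the labor Aaron says makes a human narrator the cheaper option.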

Here's a little bit about how they create a voice for text-to-speech. First, a professional reader is hired to read text created for its "phonemic diversity." The sentences are designed to cover a wide range of word sounds. The process takes more than 60 hours to complete, Aaron said.

Algorithms are used to help figure out how to manipulate the sounds correctly.

Aiken concedes that text-to-speech systems can't provide many of the dramatic effects that a human can. But he does think they're good enough to erode sales of audio books.

One thing to remember is that the potential to compete with audio books is only one part of the guild's complaint. Aiken argues that Kindle's voice feature should be considered a separate derivative work, and that authors should share in its revenue.

What's for certain is guild managers don't believe Amazon should give text-to-speech away for free just to help market Kindles.

"This should be considered a legitimate new market for publishers and authors," Aiken said. "It's a technology that should be used for incremental revenue. With all the squeezing that's going on in publishing, you just can't let this one go."

Source:http://news.cnet.com

Microsoft lawyer 'won't speculate' on more Linux suits

Posted by mr bill | Posted on 6:10:00 AM

Microsoft's top intellectual property lawyer said that the company's legal action against TomTom over Linux was specific to that company, but he declined to say whether other suits over the open source operating system might follow.

"I wouldn't speculate at this point," Horacio Gutierrez told CNET News in an interview late Wednesday. Gutierrez did add that Microsoft's patent suit against TomTom, which includes three claims related to file management techniques used in the Linux kernel, was specific to that company.

It is the "TomTom implementation of the Linux kernel that infringes these claims," Gutierrez said. "There are many flavors of Linux (and) many implementations of the Linux kernel. Cases such as these are very fact-specific."

Microsoft filed complaints in federal court and with the International Trade Commission on Wednesday alleging eight counts of patent infringement by TomTom. While five of the patents relate to car navigation systems specifically, three of the claims pertain to TomTom's use of the Linux kernel in its products, Gutierrez said.

Gutierrez said Microsoft chose to include the open source claims alongside the proprietary GPS system claims because both related to TomTom. He characterized the suit as a dispute with TomTom as opposed to a new salvo against Linux.

"This is just a normal course-of-business dispute between two companies," he said, adding that no special thought was given to what it meant to include the Linux claims in the suit.

"That is not the focal point of the action," he said.

Asked whether that meant that Microsoft would seek compensation from all products that use the Linux kernel, Gutierrez said, "No. That is really not what we have in mind. This case is about TomTom's infringement."

He stressed Microsoft's preference for signing licensing deals with companies, including those using Linux.

"Our position is and has been that we believe licensing is the right way to approach and resolve these things," he said.

Gutierrez said that the move did not reflect a change in Microsoft's overall position toward open-source software. "I think there shouldn't be any ambiguity on our expectations as a company. We recognize that open-source software will continue to be a part of the industry."

But, he said that the company's "appreciation and respect for the open-source community is not inconsistent" with its desire to protect its intellectual property.

That said, he acknowledged the suit could hurt some of the efforts the company has tried to make in recent years to mend fences with the Linux world.

Sometimes, he said, disputes will lead to lawsuits. "Sometimes they will evoke hard feelings. Sometimes those feelings will make moving ahead with our open-source strategy more challenging, but there is no change to our open-source strategy and the work many teams across Microsoft do every day to move it forward."

Although Microsoft did not call out the Linux claims, Gutierrez said the company was not trying to hide them. While Linux is not mentioned in the federal lawsuit, he said, the Linux-related claims are noted in two paragraphs of the ITC complaint.

Source:http://news.cnet.com/

Quake boosts browser video games

Posted by mr bill | Posted on 12:26:00 AM

Quake Live is a reworking of the 1999 title Quake III: Arena

Classic game Quake III will be re-released for the web browser on Tuesday, highlighting the rapid development in web games.

It runs inside browsers after the installation of a software plug-in.

"It is a significant step which proves browser games can be sophisticated," said Michael French, editor of games industry magazine Develop.

Quake Live is a version of a PC game which was first launched in 1999.

The game is being released free of charge for browsers by id Software, and is supported by advertising. It opens to the public as a beta later on Tuesday.

Mr French said: "A lot of the foundations of the mechanics of modern shooters were established by Quake.

"It makes a lot of sense for id to be trying new avenues for their intellectual property.

"One of the things id has always been known for is being cutting edge in graphics but also for finding new ways to get their games to gamers."

Id Software is not the first company to offer browser versions of games that were once synonymous with physical formats: Garage Games offers web versions of games like Fallen Empire and Marble Blast Online, while there are also a number of online multiplayer titles such as PMOG.

High profile

But Quake is the most high-profile PC franchise to branch out into the browser space.

Mr French said: "It proves that consumers are willing to try these things. All kinds of people could now be exposed to games for the first time.

"There is no console or hardware in the way. This is gaming for people who are more used to using Facebook."

Mr French said browser-based games had already surpassed the graphical sophistication of titles that used to rely on console hardware such as the original PlayStation.

"You won't play this and be put off thinking it is old fashioned or ugly. It is very playable and watchable," Mr French said.

However, he said browser games were not yet a substitute for a dedicated piece of gaming hardware.

The games industry will be watching id Software's browser developments very closely.

"The Massively Multiplayer Online space is certainly the area most likely to move to browser. Some well-known role playing game franchises could also move to the browser and are probably already in development."

Source:http://news.bbc.co.uk

Safari 4 a big step up, but not as far as rivals

Posted by mr bill | Posted on 12:12:00 AM

With Safari 3, I admired Apple's chutzpah for bringing its browser to Windows. With the new Safari 4 beta, I'm actually starting to admire the browser, too.

A big user interface overhaul makes Safari look polished rather than clunky on Windows, builds in better search abilities, and makes good use of the fact that people often visit the same sites over and over.

However, the lack of something like the extensions architecture that Firefox pioneered still means Safari 4 (available for Windows and Mac OS X) is better only than Safari 3, not the competition.

The new software puts Safari 3's brushed-metal appearance on the scrap heap and bolts on a Windows-native appearance. I'm not one of those user interface conformists, but I found the brushed metal interface downright ugly on Windows, in part because of the blotchy font rendering.

Safari 4, though, generally looks slick. And I like its user interface, too, which, through what appears to be a case of convergent evolution, shares a lot with Google Chrome's and some of Firefox's as well.

There's plenty of other new material, though, and Safari's snappy performance makes it a viable contender in the browser wars. Competition really is making the browser better, which is of immense importance as the computing industry moves to a cloud-computing future where applications run on the Web as well as on personal computers.

It's still curious that Apple thinks it's worthwhile to bring Safari to Windows. The company's high-end software, such as Aperture and Final Cut Pro, works only on Mac OS X. Very mainstream software such as iTunes and QuickTime works on Windows, too. iPhoto and the other iLife programs sit in an intermediate realm but still work only on Mac OS X, indicating that Apple has limited appetite for the hassles of supporting a rival operating system.

I suspect that Apple concluded Safari for Windows could help the company tout its wares, possibly convincing Windows users that Apple has some software skills. And perhaps it's laying the groundwork for tighter integration with other Apple software, hardware, and Web services.

For a beta, the software is workable; I encountered one crash in a couple of hours' use.

User interface improvements
Like Chrome, Safari 4 puts all its tabs across the top of the screen, with no traditional title bar, with the address bar below and a row of bookmarks below that. Even the upper-right mini-menus are similar, with small icons for window management and tools. Also like Chrome, when a new tab or page is opened, Safari 4 will by default show an array of most frequently visited sites, a feature Apple calls Top Sites.

Pages in the Top Sites view can be moved, pinned, and deleted.

(Credit: screenshot by Stephen Shankland/CNET Networks)

Those differences compared to regular browsers are subtle, but I got used to them in Chrome and concluded I like them. Safari has some differences, though.

For one thing, I'm a keyboard shortcut user, and Safari grants me access to its menu items by hitting the Alt key, which is standard Windows protocol but which is missing from Chrome. For another, Top Sites is much more sophisticated, not just because it has a fancy 3D view, but because you can choose how many mini-pages to show, move them around, "pin" the ones you like to a fixed position, and delete ones you don't want showing.

I like showing tabs at the top, which I think devotes proper prominence to the multiple browser views. But I have nits: the font is too dim, making it hard to see the tab text, and I don't care for how the tabs sprawl to claim as much real estate as possible, because for me it makes them actually harder to recognize as tabs. Perhaps I'll get used to that in time.

Happily, middle-clicking on a link opens the Web page in a new tab now rather than a new browser window, something that bugged me with Safari 3.

But Safari doesn't do something deeper with tabs that Google did with Chrome: isolate each one into its own computing process. That isolation improves security and stability, though you pay a price in memory. Once Chrome offered it, I got annoyed whenever a problem that should have brought down only a single misbehaving tab brought down my entire browser.
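
As a toy illustration of why that isolation matters, here is a minimal Python sketch. It has nothing to do with how Chrome or WebKit are actually built; it just shows a crashed "tab" worker being reported to its parent instead of taking the whole program down.

```python
# Toy illustration of process-per-tab isolation: a crash in one "tab" worker is
# reported to the parent rather than bringing the whole "browser" down.
# This is a sketch of the concept only, not how Chrome or WebKit work.
import multiprocessing as mp

def render_tab(url):
    if "misbehaving" in url:
        raise RuntimeError("renderer crashed while loading " + url)
    print("rendered", url)

if __name__ == "__main__":
    tabs = ["http://example.com", "http://misbehaving.example", "http://news.example"]
    procs = [mp.Process(target=render_tab, args=(url,)) for url in tabs]
    for p in procs:
        p.start()
    for url, p in zip(tabs, procs):
        p.join()
        status = "ok" if p.exitcode == 0 else "crashed (exit code %d)" % p.exitcode
        print(url, "->", status)
```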

More eye candy
Another example besides Top Sites of Apple's polished user interface is the Cover Flow interface to browser history. I'm not a big user of history, so this looks cosmetic more than useful to me, but if you are going to use it, perhaps the visual images will help you find what you need more rapidly than scanning a list of text. A text search box also helps retrieve information.

More helpful by far is the Smart Address Field, which like Firefox 3's Awesome Bar suggests Web addresses from your history and bookmarks when you start typing. It's a much more effective way of returning to earlier sites than scrolling through text history lists.

Safari 4 gets a Cover Flow interface for looking at browser history.

(Credit: screenshot by Stephen Shankland/CNET Networks)

Given Apple's user interface chops, it's interesting the company didn't go as far as Chrome by integrating Web search in the address field. Instead, Safari still has a separate search field to the right of the address bar. Now, though, it's got what Apple calls the Smart Search Field, which is a fancy brand name for using Google Suggest to show possible options when you start typing. Google, Yahoo, and Microsoft all like this feature on their search engines, and I like it this way too: it can save you a few keystrokes, and sometimes it can even help you find material when you don't know exactly how to spell it.

Smart Search Assist uses Google Suggest to streamline search.

(Credit: screenshot by Stephen Shankland/CNET Networks)

Like rival browsers, Safari now can magnify or shrink the entire Web page--graphics and text--which is nice for those sites with microscopic type or for when you're showing a site to somebody farther away or dealing with a computer with super-tiny pixels. (Of course, such zooming also can reveal how inflexibly designed a Web page is.)

Skin deep?
User interface improvements aren't just superficial changes. People care about appearance, and making things faster and better adapted to actual humans is important. Kudos to Apple for making Safari look and often work better.

On a deeper level, it's good that the Safari 4 beta also offers better performance. Apple makes a variety of boasts about page-loading speed and, by virtue of its new Nitro engine (previously known as WebKit's SquirrelFish), faster JavaScript execution. I can confirm the SunSpider JavaScript speed test runs in less than two-thirds the time it takes on Safari 3, a big improvement, and performance likely will increase further once the final version is released. Check back for more coverage today on detailed performance results.

However, while the user interface improvements overall catch Safari up to the competition and in some cases surpass it, the fact that extensions are missing is an egregious oversight given how powerful Mozilla has shown them to be with Firefox. Not only do they improve the user experience and enable many new features, but they're an excellent way to attract developers and users to your browser. Unsupported options such as PimpMySafari aren't likely to match a real add-on ability.

Source:http://news.cnet.com

Making the rounds at TechFest

Posted by mr bill | Posted on 12:09:00 AM

REDMOND, Wash.--Microsoft already has several tools that stitch together a bunch of smaller photos to create a larger representation. With Photosynth, Microsoft even uses a collection of still images to re-create a three-dimensional experience.

Now a team of researchers is trying to do the same thing with video, in real time. The idea is that, at any given event, there are lots of people with cell phones capable of recording video. But the resolution of any one of those videos is pretty limited.

At the company's annual TechFest internal science fair on Tuesday, Microsoft showed how, in real time, multiple cell phone video streams can be stitched together to create one higher resolution video. The idea was developed by a trio of folks in Microsoft's Cairo, Egypt, labs as a way to provide video of class lectures. Pretty quickly, though, the team realized that the technique had much broader uses, everything from citizen journalism to live streaming a family wedding to distant well-wishers.
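
Microsoft hasn't published how the prototype works, but OpenCV's stitching module (a present-day open source tool, not what the researchers used) gives a feel for the core idea of aligning and blending overlapping frames. The sketch below works frame by frame rather than in real time, and the input file names are placeholders.

```python
# Rough feel for the idea: take one frame from each phone's stream and merge
# the overlapping views into a single larger frame with OpenCV's stitcher.
# Frame-by-frame sketch only, not the researchers' real-time pipeline; the
# capture sources are placeholder file names.
import cv2

sources = [cv2.VideoCapture(path) for path in ("phone_a.mp4", "phone_b.mp4", "phone_c.mp4")]
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)   # OpenCV 4.x API

while True:
    frames = []
    for cap in sources:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    if len(frames) != len(sources):
        break                                # one of the streams ended
    status, merged = stitcher.stitch(frames)
    if status == cv2.Stitcher_OK:
        cv2.imshow("stitched", merged)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

for cap in sources:
    cap.release()
cv2.destroyAllWindows()
```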

"There are lots of people that have mobile phones in their pockets," said Ayman Kaheel, a development manager at Microsoft's innovation center in Cairo and one of three people involved in that project. (I've embedded a video below of Kaheel talking about the project and giving a quick demo.)

A few steps over, Darren Edge was showing a project called Notes Scape that aims to create virtual sticky notes that travel with you wherever you go, appearing on any cell phone or laptop that you have nearby. I was a little fuzzy on the technology, but Edge said that someday the approach could help visualize and organize information, particularly once we all start walking around with the kinds of heads-up displays that remain largely the stuff of science fiction.

While many projects are aimed at evolving traditional objects into their ultra-high-tech equivalents, the team from Microsoft Research India takes a different tack. As part of its efforts to bring technology to the rural poor, the group often looks at what might be the lowest level of technology needed to solve a particular problem.

A few years back, the team discovered that a TV and a DVD player were a far more effective way of showing improved agricultural methods to rural farmers than laptops.

This year, the team from India is showing a couple of education projects that try to take advantage of the limited technology that is already pervasive. In one, the group has taken books and digitized them to play on a standard DVD, using the fast-forward button to move from page to page. At TechFest, Microsoft showed a Dr. Seuss book running from a standard DVD, with audio added in the background.

"DVDs are a very cheap medium, much cheaper than textbooks," said Microsoft researcher Sarubh Panjwani.

Thousands of books can fit on a single DVD, said Panjwani. That means a school that can't afford many books can still have a library. It also means the school has a way to send books home with students. Even in rural areas, more than 70 percent of people have access to a TV, and DVD players are also fairly common.

Plus, a book on TV can be shared by an entire classroom if need be, Panjwani said.

"A TV is big enough to share the content," Panjwani said.

I'll have more from TechFest in a little bit, including details on more research projects, more videos and pictures, as well as an interview with Microsoft Research head Rick Rashid.

Source:http://news.cnet.com

Designing the Kindle 2

Posted by mr bill | Posted on 7:01:00 AM

With the Kindle 2, Amazon gave its e-reader a slimmer design and more storage, but there are a lot of things it could have added and didn't. Features like a color display would not only make the device pricier and give it a shorter battery life, but would also make the gadget uncomfortable to hold.

Amazon CEO Jeff Bezos holds up a Kindle 2 at the device's recent launch in New York City.


"One of the great things about Kindle is it doesn't ever get hot," Amazon Vice President Ian Freed said in an interview at Amazon's downtown office here. That's important, Freed said, given that the company has one main goal with the Kindle--making the product as invisible to users as possible when they are reading.

"The most important thing for the Kindle to do is to disappear," Freed said. That was the goal with the first device and was also a key factor in deciding what would go in the sequel, which started shipping on Monday. There are the obvious factors, like the thinner, sleeker design. But there are also things like an improved cellular modem. As a result, Kindle users will find themselves out of range in fewer places to get updates or buy a new book.

One of the biggest new features is one that is impossible to see--the new Whisper Sync feature that will eventually let people read the same electronic book on multiple devices, including Kindles and cell phones.
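
Amazon hasn't described how Whisper Sync is built; a plausible mental model is simply a service that records, per account and per book, the furthest position any device has reported. The sketch below is purely illustrative and every name in it is made up.

```python
# Illustrative sketch of the "resume at the furthest point read" idea behind a
# Whisper Sync-style feature. All names here are invented; Amazon has not
# published how the real service works.
from dataclasses import dataclass, field

@dataclass
class ReadingPositionStore:
    positions: dict = field(default_factory=dict)   # (account, book) -> furthest location

    def report(self, account, book, location):
        key = (account, book)
        self.positions[key] = max(self.positions.get(key, 0), location)

    def resume_point(self, account, book):
        return self.positions.get((account, book), 0)

store = ReadingPositionStore()
store.report("alice", "kindle-manual", 120)   # read further on the Kindle
store.report("alice", "kindle-manual", 95)    # the phone lags behind and is ignored
print(store.resume_point("alice", "kindle-manual"))  # 120
```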

Although he wouldn't say just when people will have Kindle content on their cell phones, Freed did confirm that one won't need to have a Kindle device to read Kindle content, though he suspects some of those who try Kindle on a cell phone will ultimately buy Amazon's device.

The cell phone option, like a controversial new text-to-speech feature built into the Kindle 2, is more designed for short bits of content than as the primary mode of reading, Freed said. A cell phone is good for those unexpected times where one has a few minutes to read, while text-to-speech is good for those who are right in the middle of a cliffhanger and have to get in the car or cook dinner.

On the text-to-speech front, Amazon has come under fire for allegedly trying to take over the audio book market, but Freed noted that only a fraction of books even come out in audio form. He also noted that the feature works with blogs, newspapers, and other content.

"Audio books are a great experience with a trained narrator or sometimes the author (reading the book)," Freed said.

Getting more content onto the Kindle remains a goal, Freed said. Although the company has 240,000 books--and nearly all of The New York Times bestseller list--Freed said the company's long-term goal is to get every book, including out-of-print titles, onto the device.

Newspapers and blogs are also important, though Freed wouldn't say whether he bought into the notion that some newspapers would be wise to stop home delivery and instead pass out Kindles to subscribers.

"I'll leave it to others to figure out what the economic model will be for newspapers," he said. "Our newspaper customers have been happy working with us. It's a new source of revenue for them."

Amazon, he said, would certainly be happy to talk to newspapers interested in trying something more radical. "We'd certainly be open to working with any newspapers."

Source:http://news.cnet.com

Five hot storage technologies for 2009

Posted by mr bill | Posted on 6:51:00 AM

Each year, a handful of storage technologies seem poised to break out of the pack and become essential building blocks for new products that make storage easier to manage, less costly and better performing. For our annual forecast, these are the five technologies we think will be hot in 2009:

  1. 10Gb Ethernet (10GbE) and 6Gb/sec SAS are less-expensive alternatives to Fibre Channel (FC) networking and storage.
  2. Remote replication for disaster recovery, while not new, is becoming the cornerstone of DR plans.
  3. Global deduplication, which manages islands of dedupe and virtual tape library (VTL) appliances and shares dedupe data among them, is a much-needed innovation for next-generation dedupe products.
  4. Storage-as-a-Service (SaaS) offerings are becoming increasingly appealing in tough economic times.
  5. Self-healing systems, arrays that help cut management time and data loss, round out our list.

As we do each year, we'll cite several promising technologies that, for various reasons, will not be hot in 2009 (see "Not hot in 2009," below). Finally, we'll do a self-imposed reality check and rate the accuracy of the hot technology predictions we made last year (see "Report card on our 2008 predictions").

SAS-2 spec (6Gb/sec) and 10GbE

These are different technologies, but together they put pressure on FC's dominant position in storage networking. At 10Gb, Ethernet becomes more of a storage play because it boosts iSCSI's performance. And the serial-attached SCSI (SAS-2) standard will fortify that interface as "enterprise class" in 2009, largely because of its 6Gb/sec capabilities.

That means SAS should start to challenge FC for high-performance disk drives. Analysts say 6Gb/sec SAS will eventually become the new enterprise storage drive standard. "It's going to be the first time that this serial-attached SCSI interface is faster than FC," says John Rydning, research director for hard disk drives at analyst firm IDC. "We've seen a pretty rapid adoption for internal storage, and now it's going to get the attention for external storage," he says.

Doubling from its current 3Gb/sec bandwidth, 6Gb/sec SAS enables solid-state disk (SSD) adoption and compatibility with the SATA connection. In October, LSI Corp. brought out what it calls the industry's first 6Gb/sec SAS-to-SATA bridge cards and 16-port SAS storage processors.

Marty Czekalski, senior staff program manager at Seagate Technology and VP of the SCSI Trade Association board of directors, says 6Gb/sec SAS will start shipping in systems to customers about halfway through 2009.

"You're still going to have FC SANs between servers and external storage systems, but you will start to see 6Gb/sec SAS drives used on the back side of those controllers. They will replace the FC drives on the back side over time," notes Czekalski. Less cabling, doubled transfer rates, improved link utilisation and rack-to-rack distances are among the advantages 6Gb/sec SAS users can expect, he says. "And 6Gb/sec SAS is a great connection for SSDs," he adds, because users can get 6Gb/sec per link, low latency and high aggregate performance.

Brad Booth, president and chairman of the board of the Ethernet Alliance, says Ethernet has progressed from a technology that would carry only LAN traffic to "a unified fabric ... iSCSI, NAS, FCoE all rely on the Ethernet," he says. Booth, who's also senior principal engineer in the office of the CTO at AMCC, says the advent of electronic dispersion compensation (EDC), used in optical and backplane platforms as a means to compensate for some of the impairments in the transmission medium, is among the most recent developments giving 10GbE technology a boost.

The biggest obstacle for 10Gb remains price, but industry analysts agree that the price of 10Gb in 2009 will drop as the technology matures. In addition, SFP+, a new optical form factor, will permit greater port density and lower the price per port.

Data replication for DR

The use of remote data replication for DR isn't new, but it has taken off, thanks in large part to the increased role of server virtualisation. It was more than a decade ago that storage firms began offering capabilities in their storage arrays to copy or replicate data to a remote storage system, but that ability often relied on having identical server and storage hardware at both locations.

Today's server virtualisation technology, which allows the same images to run on different server types at each location, is simplifying replication for DR and lowering its cost. The "game changer" for replication in 2009, according to Stephen Foskett, director of data practice at consultancy Contoural, is the combination of widespread server virtualisation and virtual storage technology "and, most importantly, universal APIs and management systems to stitch the two together. VMware's SRM [Site Recovery Manager] is a good example of the kind of end-to-end technology that will finally allow storage replication to become a standard component of the data centre," he says.

Foskett predicts new products will make it easier for small- to medium-sized businesses (SMBs) to replicate data to DR sites, and for enterprises to move data generated from remote offices to their main data centre. In 2009, the bottom line is that replication for DR will become more affordable. According to Greg Schulz, founder and senior analyst at StorageIO Group, replication technology will no longer be for the "rich and famous," and will become more prevalent in organisations of all sizes.

Global dedupe

Deduplication is growing up and out. Global dedupe expands upon "plain vanilla" dedupe technology by working across multiple processors and dedupe appliances, which can share data. The technology creates a single dedupe database that can be managed from one console. This is a big deal because instead of trying to manage a bunch of separate deduplication appliances, a dedupe system can scale to meet the needs of growing companies and exploding storage systems.
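
A minimal sketch of the underlying idea, a single shared pool of content-addressed chunks, looks like the following. Fixed-size chunks and an in-memory dictionary are simplifying assumptions; shipping products use variable-size chunking and distributed, persistent indexes.

```python
# Minimal sketch of deduplication against one shared chunk store: data is split
# into chunks, each chunk is addressed by its hash, and a chunk already present
# is never stored twice. Fixed-size chunking and an in-memory dict are
# simplifying assumptions, not how commercial dedupe appliances are built.
import hashlib

CHUNK_SIZE = 4096
chunk_store = {}              # fingerprint -> chunk bytes (the shared dedupe domain)

def store(data):
    """Store a blob, returning the chunk fingerprints needed to rebuild it."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(fp, chunk)     # duplicates are stored only once
        recipe.append(fp)
    return recipe

def restore(recipe):
    return b"".join(chunk_store[fp] for fp in recipe)

monday = store(b"A" * 8192 + b"unique-monday-data")
tuesday = store(b"A" * 8192 + b"unique-tuesday-data")   # shares its first two chunks
print(len(chunk_store), "chunks stored for two 8 KB-plus backups")
assert restore(monday)[:8192] == b"A" * 8192
```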

In 2009, there'll be more users like Eric Zuspan, senior system administrator, SAN/Unix, at MultiCare Health System. Zuspan purchased Sepaton Inc.'s S2100-ES2 Enterprise VTL with DeltaStor software for data deduplication this year in hopes of easing his reliance on tape. His shop also owns a Data Domain Inc. appliance (which is handled by the WAN team that serves all the Windows applications). What Zuspan found in Sepaton is a single appliance that has multinodes, and the ability to add more nodes if he needs extra capacity or throughput.

"This appliance gives us that kind of flexibility," says Zuspan. (Rather than using the term multinode, the Storage Networking Industry Association refers to the capability as "single deduplication domain." This means that all data delivered from any node in a system participates in the same deduplicated pool of storage.) Data Domain says it expects to support clustered nodes in 2009.

SaaS

The new and improved Storage-as-a-Service (SaaS) offerings look a lot different than they used to, and are getting a lot of attention. EMC and IBM are among the storage heavyweights who have made significant SaaS acquisitions in recent years. This has 2009, with all its looming economic pressures, shaping up as the year that SaaS (the outsourcing of backup operations to a company with its own hosting facility and software to manage it) will become even more appealing. Whether the selling point is a "cloud" or an underground vault, SaaS will get plenty of attention from storage pros who may find it less expensive to engage a SaaS provider than to protect all or part of their data in-house.

SaaS is also a way to address pricey compliance requirements and new DR mandates. New data-encryption transfer technology and tiered SaaS offerings from vendors are making it an attractive alternative in shops where security concerns kept some IT executives from considering the possibility of shipping their data offsite. Is SaaS still targeted for SMBs? Yes, but large companies are starting to seriously consider SaaS because of the requirement to back up data from a growing number of remote offices and a new awareness of the importance of backing up critical information on laptops, says Stephanie Balaouras, principal analyst at Forrester Research.

There have been a lot of SaaS company acquisitions and repositioning recently, setting up 2009 as the year when users will have more SaaS choices. Dell made a $US155 million acquisition of MessageOne, and company executives say they plan to use the newly acquired SaaS technology for remote data protection and systems management. CommVault extended its managed services agreement with smaller SaaS player Incentra Solutions. In September, Seagate announced i365, A Seagate Company, a new umbrella company designed to bring together the service businesses Seagate has acquired in recent years. Companies such as AmeriVault, Intronis Technologies and Seagate (EVault) are all competing on things like laptop support, on-demand restores, open-file management and multiple data storage facility locations.

One caveat to this SaaS prediction is that some smaller SaaS vendors, already practically giving away their services to establish a clientele, are likely to be squeezed out of the SaaS picture as those veterans with the most clout (EMC, IBM, Iron Mountain Digital and Seagate) continue to mine this opportunity. "I see some of the hosting companies becoming pressured," says StorageIO Group's Schulz.

Self-healing systems

There's plenty of noise being made in this arena and it can be tough for users to decide what to focus on with so many vendors shouting so loudly. What can a self-healing system do for you? It can eliminate a potential single point of failure in a RAID array, for one. Self-healing systems reduce the risk of data loss on a disk drive caused by media defects by inspecting adjacent areas around the first defect. The system then reconstructs the data associated with the first defect using parity. All of this can be accomplished in the background to allow the host uninterrupted access to data.
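
The reconstruction step itself is the familiar RAID-style XOR trick: a missing block is recovered by XORing the surviving data blocks with the parity block. A tiny sketch:

```python
# Tiny sketch of single-block reconstruction from parity, the RAID-style XOR
# trick a self-healing array can run in the background after a media defect.
from functools import reduce

def xor_blocks(*blocks):
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

d0 = b"\x10\x20\x30\x40"
d1 = b"\x0a\x0b\x0c\x0d"
d2 = b"\xff\x00\xff\x00"
parity = xor_blocks(d0, d1, d2)          # written alongside the data blocks

rebuilt = xor_blocks(d0, d2, parity)     # suppose d1's sector hit a media defect
assert rebuilt == d1
print("reconstructed block:", rebuilt.hex())
```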

"The whole point of these systems is to reduce maintenance windows and to prolong technology investments," says Brian Babineau, senior analyst at Enterprise Strategy Group.

In April, Atrato and Xiotech introduced storage systems with a new pledge that customers could expect years of operation without needing any service. Pitching their products as self-healing systems with sealed components containing multiple disk drives, both vendors say they avoid RAID rebuilds by copying data off a troubled drive and, depending on whether the drive has failed, replacing it and copying data back to the new or repaired drive.

These so-called no-maintenance disk arrays are designed to avoid hard-drive swapping. Self-healing systems have been getting their share of press for many years, but as storage budgets tighten, any array that cuts the amount of time storage professionals spend maintaining and fixing troubled disks is time (and money) saved that can be spent on other data protection issues.

Self-healing storage technology isn't a new concept and some storage pros will remember IBM's Shark, its TotalStorage Enterprise Storage Server with self-healing capabilities that was introduced in 2002; yet it wasn't until 2005 that IBM said the "era of self-healing technology" had arrived. And archiving systems like EMC's Centera have been billed as self-healing for years.

We may be going out on a limb with this prediction, but we're betting that a demand for quicker, more efficient rebuilds and increased pressure to provide 24/7 support for business processes will propel self-healing systems into a "must-have" feature for all storage arrays in 2009.

Source:http://searchstorage.techtarget.com.au

Web 2.0

Posted by mr bill | Posted on 12:59:00 AM

The Web 2.0 phenomenon is unstoppable. Employees are turning in droves to blogs, wikis, mash-ups, social networking, crowdsourcing and other variations on the Web 2.0 theme. A recent Yankee Group survey found that 86% of non-IT workers are using at least one consumer Web 2.0 tool at work. As younger workers enter the enterprise workforce, access to Web 2.0 technologies will become only more of a given.

The challenge for IT executives is how best to harness Web 2.0 technologies in a way that's secure; serves such basic enterprise functions as collaboration; and adds to worker productivity, revenue generation and overall business benefits.

The possibilities are endless. A Gartner list of Web 2.0 applications includes answer marketplaces, collaborative product and service design, community-driven self-service, crowdsourcing, idea engines and prediction markets.

Many employees are using such social-networking sites as LinkedIn, MySpace or Twitter to communicate with peers and customers. In addition, a growing number of vendors aim to help companies set up and manage enterprise-grade Web 2.0 applications. For example, WorkLight offers Java-based software that will help authenticate, encrypt, store and manage Web 2.0 applications (see "10 start-ups to watch in '09"). Face Connector (formerly Faceforce) is a mash-up that brings Facebook profile and friend information seamlessly into Salesforce CRM. Socialtext 3.0 provides social networking, wikis and customizable home pages for the enterprise.

That's just the tip of the iceberg. The key is identifying an application that fits the culture of your company, then making it available and watching as the community takes off.


Source:http://www.networkworld.com

Data protection

Posted by mr bill | Posted on 12:33:00 AM

In today's world of mobile workers, teleworkers, thumb drives, BlackBerries and social-networking sites, IT executives can't worry about devices - they need to focus on protecting data wherever it is.

The obvious place to start - considering that an estimated 5,000 laptops are stolen or lost each year - is the laptop hard drive: It needs encryption.

Software vendors and such open source projects as TrueCrypt offer whole disk encryption across all operating systems, and Microsoft offers disk encryption in Vista, so IT executives have no excuse for not encrypting laptop data. In addition, such hardware vendors as Fujitsu, Hitachi and Seagate Technology offer hardware-based disk encryption.
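
Whole-disk products operate below the file system, but the underlying idea of encrypting data at rest can be illustrated on a much smaller scale with a symmetric cipher. The sketch below uses the third-party Python cryptography package and is an illustration of the principle only, not a substitute for disk encryption.

```python
# Small-scale illustration of encrypting data at rest with a symmetric key,
# using the third-party "cryptography" package. Whole-disk products such as
# TrueCrypt or BitLocker work below the file system; this only illustrates
# the principle and is not a substitute for them.
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # in practice derived from a passphrase or held in hardware
cipher = Fernet(key)

plaintext = b"Customer list: internal use only"
token = cipher.encrypt(plaintext)         # this is all a thief would find on the disk
print(token[:40], b"...")

# With the key, the data comes back intact; without it, the token is useless.
assert cipher.decrypt(token) == plaintext
```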

Another trouble spot is e-mail. A variety of e-mail encryption methods are available, but all of them run into the same problem - they require the recipient of the encrypted e-mail to go to a secure server and enter some form of identification before they can gain access to the decrypted e-mail. For most people, this is a nuisance that rises to the level of a deal-breaker.

Another way to approach e-mail security is through data-loss prevention. DLP tools scan outgoing e-mails for such information as Social Security numbers, sensitive keywords or other possible breaches. Then they flag the offending e-mail. Companies dictate how offending e-mails are handled: They can be returned to the sender, bounced to an IT manager or encrypted.
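
A toy version of that scanning step might look like the sketch below; the patterns and policy actions are illustrative only, and commercial DLP products go well beyond regular expressions.

```python
# Toy sketch of the DLP scanning step: check an outgoing message body against a
# few patterns and decide what to do with it. The rules and actions here are
# illustrative; real DLP products are far more sophisticated than regexes.
import re

RULES = [
    ("ssn", re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "encrypt"),
    ("keyword", re.compile(r"\b(confidential|merger|payroll)\b", re.IGNORECASE), "hold_for_review"),
]

def scan_outgoing(body):
    for name, pattern, action in RULES:
        if pattern.search(body):
            return action        # e.g. return to sender, bounce to IT, or encrypt
    return "deliver"

print(scan_outgoing("Quarterly numbers attached."))             # deliver
print(scan_outgoing("Employee SSN is 123-45-6789, see HR."))    # encrypt
```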

DLP products, however, can be difficult to get right. That's because companies have to hammer out policies for determining which types of data need watching, what happens when an e-mail is flagged, and whether the individual user should be required to decide whether to encrypt specific e-mails or types of e-mails. For example, the CIO might not appreciate it when he sends an e-mail to the CFO and it gets flagged, bounced back or held up.

Other potential problem areas - everything from thumb drives to smartphones - abound. Nevertheless, vendors today are offering encrypted USB drives and business phones with encryption features. IT executives need to make data security a requirement every step of the way.

Source:http://www.networkworld.com

Cloud computing: Hot technology for 2009

Posted by mr bill | Posted on 12:14:00 AM

As we arrive at 2009, cloud computing is the technology creating the most buzz. Cloud technology is in its infancy, however, and enterprises would be wise to limit their efforts to small, targeted projects until the technology matures and vendors address a variety of potentially deal-breaking problems.

First off, let's define cloud computing. Gartner says it is "a style of computing whose massively scalable and elastic, IT-related capabilities are provided 'as a service' to external customers using Internet technologies."

The two most commonly cited examples of cloud offerings come from Amazon.com and Google, both of which basically rent their data-center resources to outside customers.

For example, Amazon's Elastic Compute Cloud (EC2) lets customers rent virtual-machine instances and run their applications on Amazon's hardware. Other services under the EC2 umbrella include storage and databases in the cloud. Amazon uses Xen for virtualization and offers customers a choice of Linux, Solaris or Windows operating systems.
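
To make "renting a virtual-machine instance" concrete, this is roughly what launching one looked like with the boto library of that era; the AMI ID and key pair name below are placeholders, and configured AWS credentials are assumed.

```python
# Rough sketch of renting a VM on EC2 with the boto 2.x library of the time.
# The AMI ID and key pair name are placeholders, and AWS credentials are
# assumed to be configured in the environment; treat this as a sketch rather
# than a tested recipe.
import time
import boto.ec2

conn = boto.ec2.connect_to_region("us-east-1")
reservation = conn.run_instances(
    "ami-12345678",             # placeholder machine image
    instance_type="m1.small",   # a small instance class of the era
    key_name="my-keypair",      # placeholder SSH key pair
)
instance = reservation.instances[0]

while instance.state != "running":   # poll until Amazon reports the VM is up
    time.sleep(5)
    instance.update()

print("ssh to:", instance.public_dns_name)
conn.terminate_instances(instance_ids=[instance.id])   # you pay only while it runs
```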

The pitch is that customers can take advantage of Amazon's expertise in running large data centers, that customers pay only for the compute and storage resources they use, and that Amazon can scale up or down easily, depending on the demand.

That's the most basic level of cloud computing - infrastructure in the cloud. In this scenario, the customer is aware of and makes choices concerning the infrastructure itself.

The next level is cloud computing as a Web development platform. The best example is Google's App Engine, a place where Web application developers can upload code (as long as it's written in Python) and let Google's infrastructure take care of deploying the application and allocating compute resources.
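
For flavor, the canonical minimal handler for the original Python-only App Engine looked roughly like this, using the webapp framework Google documented at the time; the deployment configuration (app.yaml) is omitted, and the details should be treated as approximate.

```python
# Roughly the canonical "hello world" for the original Python-only App Engine:
# upload a handler like this plus an app.yaml and Google's infrastructure
# handles deployment and scaling. Based on the webapp framework Google
# documented at the time; treat the details as approximate.
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.out.write('Hello from the cloud!')

application = webapp.WSGIApplication([('/', MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
```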

The third level is running enterprise applications in the cloud. A cloud vendor could host an enterprise application and take responsibility for that application's availability and performance. Gartner predicts e-mail will become one of the first enterprise applications that move to the cloud.

How is that different from software-as-a-service (SaaS)? Without getting too tangled up in semantics, SaaS typically refers to a specific vendor - Salesforce.com, for example - offering its application to multiple customers in a hosted model. Theoretically, a SaaS vendor could use the cloud infrastructure to host its applications. Also theoretically, a cloud provider could host anybody's application.

That brings us to the ultimate cloud scenario, in which these "private" clouds owned by such companies as Amazon and Google melt into one giant, public cloud that contains all the user's data and applications and is accessible anytime on any device.

That's a long way off, however. In addition, the potential roadblocks are many. They include issues of licensing, privacy, security, compliance and network monitoring. A final potential stumbling block is that enterprise applications tend to be customized and intertwined, with one system feeding into or reporting back to another. That makes it pretty tough to pluck out an application and run it in the cloud without affecting every related application.

So for now, keep an eye on the cloud, but keep your feet firmly planted on the ground.

Source:http://www.networkworld.com

10 Hot Technologies for 2009

Posted by mr bill | Posted on 12:11:00 AM

What are the top technologies and technology trends that are likely to change the way you do business in 2009 and beyond? Let’s take a look at 10 of them---A to Z.

  1. Cloud Computing -- Not just for big companies anymore, “cloud computing” continues to grow in popularity. It allows even the smallest firms to affordably pay as they go for access to hosted off-site “in the cloud” computers and servers. Cloud services are now offered by everyone from Amazon's EC2 to Google to IBM, and allow firms to save money, storage space, and energy while making backup and recovery a snap. The downside? Cloud computing could lock companies into a never-ending cycle of escalating vendor costs -- all to access their own data.
  2. Computer Virtualization -- Listed first in Gartner Research’s list of top strategic technologies for 2009, virtualization continues to grow in popularity and generate buzz. The concept: that software now allows businesses to run multiple operating systems and multiple apps through one computer at the same time. The resulting system of virtual desktops and virtual storage can save businesses significantly on hardware, storage, and energy costs. However, even with its promise, Gartner projects that fewer than 40 percent of interested businesses -- mainly enterprise firms --will adopt virtualization by 2010.
  3. Green IT -- Green IT, the concept of adopting technologies designed to save energy and reduce carbon footprints, continues to gain steam as firms try to respond to concerns over climate change, dependence on foreign oil, and the prospect of tighter environmental regulations under a new president. Many technologies can fill this bill, including virtualization, cloud computing, increased use of Web conferencing and collaborative tools, and telecommuting. In 2009, look for companies to adopt those “green” technologies that also help them to cut costs in a lean economy.
  4. Mashups -- Corporate interest is growing in the seemingly infinite ways that Web applications can be “mashed up” to serve up information in entirely different ways, like fusing Google Maps technology with real-estate data to create sites like Zillow.com (see the sketch after this list), or using Sprout technology to add streaming video or a real-time polling app to an otherwise lackluster website. In 2009, look for more companies to enter the mashup arena, particularly to enhance their marketing capabilities.
  5. Memristor -- “Memory resistors” are tiny components that, because of the way they are structured, do not “forget” data stored in them even when turned off. They are likely to be cheaper and faster than flash storage. The ability to form a memristor was finally announced in April 2008 by Hewlett-Packard Labs, after nearly 40 years of research. Memristors “could be a strong competitor to the flash memory market in about five years,” writes HP Senior Fellow Stan Williams.
  6. Mobile Computing -- Mobile computing will remain white-hot in 2009. Devices such as the BlackBerry Storm -- the first BlackBerry with a touch-technology screen -- are expected to be a big hit. A new iPhone is out, too, as well as Google’s G1 Android-based phone with keyboard. In a tight economy, however, companies are sure to haggle for bargain contract deals before buying.
  7. Quad-Core Computing -- As the decade comes to a close, this technology could see a boost as Intel and AMD continue to battle over whose processor is more energy efficient. While many companies will not need this level of power and multitasking, those considering virtualization models might consider quad-core computers. But in 2009, many companies are likely to shy away from big new purchases given the tough economy.
  8. Social Software -- The overwhelming popularity of Facebook, MySpace, Flickr, YouTube, and Twitter is driving an increasing number of companies to add social network features to their websites -- for marketing, outreach, and internal collaboration. This is likely to continue in 2009, and companies would do well not to miss the boat, notes Gartner Vice President David Cearley, author of Gartner’s top-10 list.
  9. USB 3.0 -- Unveiled in August 2008 by Intel and others, the USB 3.0 personal interconnect technology would allow data transfer at 10 times the speed of the current USB 2.0 technology, all while being more energy efficient. In addition, 3.0 (also known as USB SuperSpeed) would allow much faster charging of cell phones, digital cameras, and other devices that use USB technology. Products incorporating 3.0, however, aren’t slated to hit the marketplace until late 2009.
  10. Web Browsers -- Will 2009 be the year companies switch their Web browsers? With the September release of Google’s Chrome, an arguably faster browser meant to handle rich-media content with more agility than Internet Explorer or Firefox, some may choose to make a change.
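
As a concrete illustration of the mashup idea in item 4, here is a toy Python sketch that joins made-up property listings with made-up geocoding results and emits GeoJSON points that a Google Maps-style widget could plot. All of the data and field names here are hypothetical; a real mashup would pull from live feeds or APIs rather than hard-coded lists.

    # Toy mashup: join hypothetical listings with hypothetical geocodes
    # and emit GeoJSON points a mapping widget could plot.
    import json

    listings = [  # made-up real-estate feed
        {"address": "12 Elm St", "price": 350000},
        {"address": "48 Oak Ave", "price": 425000},
    ]

    geocodes = [  # made-up geocoding results
        {"address": "12 Elm St", "lat": 37.42, "lng": -122.08},
        {"address": "48 Oak Ave", "lat": 37.44, "lng": -122.14},
    ]

    def build_geojson(listings, geocodes):
        """Join listings to coordinates by address; return a FeatureCollection."""
        coords = {g["address"]: [g["lng"], g["lat"]] for g in geocodes}
        features = [
            {
                "type": "Feature",
                "geometry": {"type": "Point", "coordinates": coords[l["address"]]},
                "properties": {"address": l["address"], "price": l["price"]},
            }
            for l in listings if l["address"] in coords
        ]
        return {"type": "FeatureCollection", "features": features}

    print(json.dumps(build_geojson(listings, geocodes), indent=2))

The join-by-key step is the whole trick: once two data sources share a common field, their combination can be rendered by any mapping or charting front end.
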
Source:http://technology.inc.com

Tesla builds excitement for Model S all-electric sedan with second teaser

Posted by mr bill | Posted in | Posted on 7:33:00 PM

0



Tesla Model S Teaser
The full reveal of the car is still scheduled for March 26 at the SpaceX rocket factory
Tesla's first all-original car is due to be unveiled to the public late next month, but the company has already released a couple of teaser shots: one showing the undercarriage and side sills of the car, and this latest one showing the new glasshouse.

The upcoming car is currently going by its codename ‘Model S’ and is already available for pre-order. A prototype will be previewed next month at Tesla's design studio inside CEO Elon Musk's SpaceX rocket factory. It will be a ‘street-drivable’ version, but the production model won’t be ready for delivery for at least another 18 months.

This latest teaser was released in an official brochure sent to the first 100 pre-order customers who have placed deposits for the special-edition Signature Series, according to one of the lucky buyers, Jason Calacanis. All that is visible is the new glasshouse and some minor details such as the side mirror and waistline design. The previous teaser, showing the lower portion of the Model S, revealed the car’s turbo-fan-like wheels, slotted and drilled brake disc rotors, a Tesla logo on the front quarter-panel, the general line of the front end, and an inward-drawing character curve at the lower portion of the doors.

The front-end's headlight area appeared to be highly streamlined and smooth, while the general curvature of the form underneath the tarp indicated a gently arching roof meeting with a fairly high decklid in the rear.

Tesla says it's doing well, and that all should be on schedule for Model S production now that government funding is on the way. The $40 million round of financing secured last year is twice what Tesla needed to achieve profitability, says the company, though it's still well short of the money it needs to build a new manufacturing facility for the Model S. So short, in fact, that Tesla has applied for a $350 million loan from the U.S. Department of Energy (DOE) for the construction of the plant.

The company has announced that the DOE has approved the loan and that distribution of the funds will begin within four to five months. The company credits the quick approval to the Obama administration's prioritizing of the Advanced Technology Vehicles Manufacturing loan program. Production of the Model S will then be on track for the planned 2011 debut.

Also, Tesla has disclosed the details of its Roadster battery replacement program. For an up-front payment of $12,000 (€10,000 or £9,000), buyers get an automatic replacement battery pack after seven years. That's the lifespan of a typical pack, which was designed for seven years of service or 100,000 miles of use.

The pack can also be replaced early at a slightly higher cost, or later with a partial refund. At present the real replacement cost of the battery pack is about three times that figure - $36,000 - according to Tesla.
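
Some rough per-mile arithmetic, using only the figures quoted above, shows what the prepay option implies (a sketch, not Tesla's own accounting):

    # Per-mile cost of the Roadster battery options, from the article's figures.
    prepay = 12000.0          # up-front replacement price, USD
    full_price = 36000.0      # Tesla's quoted replacement cost today, USD
    pack_life_miles = 100000  # designed pack life in miles

    print(prepay / pack_life_miles)      # 0.12 -> about $0.12 per mile prepaid
    print(full_price / pack_life_miles)  # 0.36 -> about $0.36 per mile at today's price

On those numbers, prepaying works out to roughly a third of today's replacement cost per mile, assuming the pack lasts its full designed life.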
Source:http://www.motorauthority.com

2009 Honda Civic Si

Posted by mr bill | Posted in | Posted on 7:15:00 PM

0

Product summary

The good: An impressive gearbox and good handling make the 2009 Honda Civic Si a blast to drive. An intuitive voice-command system makes hands-free use of the navigation system easy, and the new iPod/USB drive connectivity is a good update to the music system.

The bad: We have few problems with the Civic Si, besides the rough look of the navigation system and the cheap steering wheel. A little more torque would also be nice.

The bottom line: If you like a fun, fast ride, the 2009 Honda Civic Si will make you smile. With navigation, phone connectivity, and modern digital music sources, it also makes a practical daily driver.


The Honda Civic Si, one of the best values in performance, gets updated bodywork and some new electronics to keep it competitive for the 2009 model year. None of these changes are drastic--just a few add-ons and styling to comport with Honda's new look--because the Civic Si doesn't need much changed. It gives extraordinary driving pleasure, as it has since the introduction of the 2-liter engine version in 2002.

The 2009 Honda Civic Si evolves the design introduced in 2005, introducing a more angular grille with diamond-pattern inset. But the basic silhouette is the same, at least in our coupe model test car. Welcome additions to the cabin tech are a USB port in the console and a Bluetooth hands-free cell phone system.

Test the tech: Dynolicious performance
The Civic Si has long been the poster child for the boy-racer compact car set, with its combination of low price and class-leading performance. With Car Tech Editor Antuan Goodwin behind the wheel, we tested our Civic Si's performance in a manner befitting the Civic's young and tech-savvy target audience: with an iPhone app.


The Dynolicious app shows a steady acceleration line, marked by blips for gear shifts.

The Dynolicious application for iPhone and iPod Touch uses the device's accelerometer to measure vehicle acceleration on two axes. By integrating those readings over time, the app can estimate vehicle speed and, from that, distance traveled. For the purposes of our testing, we measured 0-to-60-mph time and skidpad lateral g-forces.
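
To make that method concrete, here is a minimal Python sketch of the same integration idea. It is not Dynolicious's actual code; the constant 0.34 g pull and the 100 Hz sample rate are made-up inputs chosen to land near the Si's real-world result.

    # Integrate forward acceleration (in g) to estimate a 0-60 mph time.
    G = 9.80665             # m/s^2 per g
    MPH_PER_MPS = 2.23694   # mph per m/s

    def time_to_60(accel_g, sample_rate_hz):
        """Return the elapsed time, in seconds, at which 60 mph is reached."""
        dt = 1.0 / sample_rate_hz
        speed = 0.0  # m/s
        for i, a in enumerate(accel_g):
            speed += a * G * dt  # v = v + a*dt
            if speed * MPH_PER_MPS >= 60.0:
                return (i + 1) * dt
        return None  # never reached 60 mph in this capture

    samples = [0.34] * 1000  # hypothetical capture: a steady 0.34 g pull at 100 Hz
    print(time_to_60(samples, 100))  # roughly 8 seconds, close to the Si's run

A real app also has to detect the moment of launch and filter sensor noise, which this sketch ignores.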

Securing our test iPod Touch to the windshield with a suction cup, we lined up for our 0 to 60 mph test. Previously, we'd tested the 2009 Honda Civic LX-S using the same Dynolicious application, reaching 60 mph in 9.76 seconds, and we wanted to see how much better we could do in the Si. On our first launch, we were a bit overzealous with the revs. The front wheels scrabbled helplessly for grip before the Vehicle Stability Assist (VSA) intervened, cutting the acceleration and resulting in an embarrassing time of 14.50 seconds. Subsequent launches were met with equal amounts of wheel spin and equally dismal times, even with the VSA disabled.


We took some rubber off of these tires in our initial launch attempts.

After a few more runs, Goodwin started to home in on the Civic's sweet spot for the perfect launch and lined up for a final pass. Using fewer revs this time, we dropped the hammer between 2,500 and 3,000rpm. The Si's front tires sounded a chirp before digging in and launching the vehicle forward. Unlike previous generations of Civic Si, the power no longer comes on like a light switch at 6,000rpm; instead the acceleration is a much more gentle and linear push toward redline. Sixty miles per hour arrived at the 8,000rpm redline of second gear, in 8.13 seconds. We were sure that another half-second could have been shaved off with practice, but Goodwin didn't want to abuse the Si's clutch any further. Having beaten the Civic LX-S' time by more than a second and a half, we moved on to the skidpad test.

Our skidpad consisted of a figure-eight loop on a closed course. Accelerating up to 35 mph, Goodwin piloted the Civic Si through the course under the watchful eye of the iPod Touch's accelerometers. Steering was a bit vaguer than we'd expected from a small performance Honda, but still precise. In practice, it was mostly judicious feathering of the throttle that kept the Si's slight understeer in check around the course. Checking the readouts in the Dynolicious app, we noted 0.84 g on the left-hand turn and an impressive 0.91 g on the right-hander.

Our better-than-the-average-Civic 0-to-60-mph time of 8.13 seconds is good, but not what we'd call impressive. Thanks to its lack of low-end torque, the Civic Si is no drag racer. However, with a peak 0.91 lateral g on all-season rubber, we think it would make a fantastic auto-crosser, and be even better with stickier tires.

In the cabin
Based on an economy model, the 2009 Honda Civic Si doesn't do luxury, but Honda fitted it out with some performance elements. The seats offer all the bolstering and grippy fabric you need to keep from sliding around the cabin as the car demonstrates its cornering. The console lid features fabric similar to the seats', providing a comfortable arm rest for cruising. The shifter's metal construction gives it a solid feel.


These sport seats keep you planted as you push the Si around corners.

For aesthetics, the big letdown is in the cabin electronics, where the double-DIN navigation and stereo unit looks like an aftermarket piece shoe-horned into the dashboard. Don't get us wrong, we like its functionality; it just doesn't have the same quality fit and finish as most of the cabin elements. But the touch screen is well-positioned for easy access, and it also includes Honda's intuitive voice-command system, which lets you control most infotainment functions.

The navigation system stores its maps on DVD, and the resolution is pretty bad by today's standards. But if you can get over that, it handles the basics well, and is very easy to use. And one of our favorite features is the complete points-of-interest database, which makes every type of business listing available. Route guidance works reasonably well, with an accurate location for the car on the maps. The main things this system lacks are information features, such as integrated traffic reporting and weather.

The stereo is a little goofy, as a CD player is hidden behind the motorized LCD, along with a PC Card slot. You can put MP3 CDs in the player, and you can get a PC Card flash drive or adaptor for SD cards to play MP3s in the PC Card slot. We imagine Honda Civic owners account for the majority of PC Card adaptor sales. Those audio sources, along with XM radio, have been in the Civic Si since at least the 2006 model. For 2009, Honda gets modern by adding a USB slot in the console. Plug in a USB drive with MP3s or an MP3 player that stores its music in a nonproprietary file structure, and you can browse your folders with the touch screen. Plug in an iPod, and you can select music organized by artist, genre, and album.


The USB and iPod port is new for 2009--a welcome addition--keeping the Si competitive for its young and tech-savvy demographic.

For an inexpensive car, the Civic Si's audio system is very good, with a subwoofer adding punch to a six-speaker setup and 350 watts of amplification. It's an appropriate rig for the fast and furious little Si, with meaty bass you can use to set off car alarms. Just don't expect fine separation and clarity: sound is a little muddy in the mid-ranges and the highs are lifeless.

Honda rounds out the Civic Si's cabin tech with a Bluetooth hands-free system, a nice addition considering more states are outlawing talking on a hand-held cell phone while driving. This Bluetooth system is pretty basic: it's voice controlled only and doesn't help you make calls unless you know the number. But people will be able to reach you as you blast the car around corners and rev its high-stepping engine past 7,000rpm. You can decide whether you actually want to answer.

Under the hood
The 2009 Honda Civic Si's redline goes all the way up to 8,000rpm, with peak horsepower of 197 at 7,800rpm from the 2-liter four-cylinder engine. Lacking a turbo, the Si achieves its horsepower with a double-overhead cam and Honda's brand of variable-valve timing. Torque is on the low side, at 139 pound-feet coming on around 6,100rpm. As we found in our acceleration tests detailed above, the Si is no drag racer, but it does step lively. The engine doesn't balk at high revs, so don't be afraid to downshift to second at 50 mph. Likewise, under acceleration you can let the tach needle slip past the two o'clock position before upshifting.
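
To see why the Si leans on revs rather than grunt, it helps to run the quoted figures through the standard relationship hp = torque (lb-ft) x rpm / 5252. This is just a back-of-the-envelope check on the spec sheet, not Honda's own data:

    # Cross-check the Si's spec sheet with hp = torque * rpm / 5252.
    def hp(torque_lbft, rpm):
        return torque_lbft * rpm / 5252.0

    def torque(hp_value, rpm):
        return hp_value * 5252.0 / rpm

    print(hp(139, 6100))      # ~161 hp available at the 6,100rpm torque peak
    print(torque(197, 7800))  # ~133 lb-ft left at the 7,800rpm power peak

In other words, the engine only makes its headline number near redline, which matches how the car feels: keep the tach high and it pulls; short-shift and it doesn't.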


We can't heap enough praise on this transmission, as it makes a huge difference in the Si's performance.

Honda's close-ratio six-speed gearbox, used on the Si, is one of our favorite transmissions. The short throw shifter lets you snick it from one gear to the next, and it clicks into place effortlessly. The high rev points make operating in second and third perfect for track and twists, but on long straights and ascents you will feel the lack of grunt from the engine.

The Si is most fun on tight turns, where its precise steering takes the car exactly where you point it. A limited slip differential keeps power running to both front wheels, pulling the car through the turns with good grip. As we found in our Dynolicious test above, the Si turned in an impressive rating on the skidpad. Our one complaint concerns the cheap-feeling steering wheel, but that has nothing to do with the steering mechanics.

As you would expect with a little racer like this, the ride isn't designed for comfort. Over rough surfaces you will feel the jolts, and bigger potholes can throw the little Si around. But it's no worse than in many compact cars, which makes the Si suitable for weekly commuting and weekend racing. The Si gets top-line equipment for the Civic model line, and that includes a stability program, which isn't standard in the lesser Civics.

Fuel economy looks good on paper, with a 21 mpg city and 29 mpg highway rating. But the Si doesn't encourage the economical driving needed to reach its best mileage figures. Even with all of our high-revving fun while we had the car, the tank average still came out to 21.4 mpg, a touch above the city number. If we had spent more time in sixth gear on the freeway, and less watching the tach needle cross 7,000rpm, our mileage would have been much better. For emissions, the Si merely meets the minimal LEV II rating from the California Air Resources Board.

In sum
Our 2009 Honda Civic Si test car was top-of-the-line, which included the navigation package, and came in at a base price of $24,005. With $670 for the destination charge, that adds up to $24,675. In this price range, the Mini Cooper S gives the Civic Si some good competition and offers similar cabin tech; the choice between the two cars comes down largely to their drastic styling differences. The Chevrolet Cobalt SS is an impressive upstart, definitely one to consider for performance, although not so much for the electronics.

We were impressed by the functionality of the Civic Si's cabin tech, if not its design. It's good to see a car at this price offer navigation, digital audio, and cell phone integration. Performance is excellent, too: the Si is fun to drive yet can still be reasonably economical. It also picks up some points for its sporty design.

Source:http://reviews.cnet.com