Are you lucky?

Thursday, April 15, 2010

The Power of Blogging


Students keen to get their online personality noticed will invariably have a domain name, a blog, a Twitter account and a Facebook page; they will have posted professional material on the web, commented thoughtfully on other blogs, and made a downloadable résumé available.


Mining the Web for Feelings, Not Facts
Computers may be good at crunching numbers, but can they crunch feelings?
The rise of blogs and social networks has fueled a bull market in personal opinion: reviews, ratings, recommendations and other forms of online expression. For computer scientists, this fast-growing mountain of data is opening a tantalizing window onto the collective consciousness of Internet users.

An emerging field known as sentiment analysis is taking shape around one of the computer world’s unexplored frontiers: translating the vagaries of human emotion into hard data.

This is more than just an interesting programming exercise. For many businesses, online opinion has turned into a kind of virtual currency that can make or break a product in the marketplace.

Yet many companies struggle to make sense of the caterwaul of complaints and compliments that now swirl around their products online. As sentiment analysis tools begin to take shape, they could not only help businesses improve their bottom lines, but also eventually transform the experience of searching for information online.

Several new sentiment analysis companies are trying to tap into the growing business interest in what is being said online.

“Social media used to be this cute project for 25-year-old consultants,” said Margaret Francis, vice president for product at Scout Labs in San Francisco. Now, she said, top executives “are recognizing it as an incredibly rich vein of market intelligence.”

Scout Labs, which is backed by the venture capital firm started by the CNet founder Halsey Minor, recently introduced a subscription service that allows customers to monitor blogs, news articles, online forums and social networking sites for trends in opinions about products, services or topics in the news.

In early May, the ticket marketplace StubHub used Scout Labs’ monitoring tool to identify a sudden surge of negative blog sentiment after rain delayed a Yankees-Red Sox game.

Stadium officials mistakenly told hundreds of fans that the game had been canceled, and StubHub denied fans’ requests for refunds, on the grounds that the game had actually been played. But after spotting trouble brewing online, the company offered discounts and credits to the affected fans. It is now re-evaluating its bad weather policy.

“This is a canary in a coal mine for us,” said John Whelan, StubHub’s director of customer service.

Jodange, based in Yonkers, offers a service geared toward online publishers that lets them incorporate opinion data drawn from over 450,000 sources, including mainstream news sources, blogs and Twitter.

Based on research by Claire Cardie, a Cornell computer science professor, and her students, the service uses a sophisticated algorithm that not only evaluates sentiments about particular topics, but also identifies the most influential opinion holders.

Jodange, which received an innovation research grant from the National Science Foundation last year, is currently working on a new algorithm that could use opinion data to predict future developments, like forecasting the impact of newspaper editorials on a company’s stock price.

In a similar vein, The Financial Times recently introduced Newssift, an experimental program that tracks sentiments about business topics in the news, coupled with a specialized search engine that allows users to organize their queries by topic, organization, place, person and theme.

Using Newssift, a search for Wal-Mart reveals that recent sentiment about the company is running positive by a ratio of slightly better than two to one. When that search is refined with the suggested term “Labor Force and Unions,” however, the ratio of positive to negative sentiments drops closer to one to one.

Such tools could help companies pinpoint the effect of specific issues on customer perceptions, helping them respond with appropriate marketing and public relations strategies.

For casual Web surfers, simpler incarnations of sentiment analysis are sprouting up in the form of lightweight tools like Tweetfeel, Twendz and Twitrratr. These sites allow users to take the pulse of Twitter users about particular topics.

A quick search on Tweetfeel, for example, reveals that 77 percent of recent tweeters liked the movie “Julie & Julia.” But the same search on Twitrratr reveals a few misfires. The site assigned a negative score to a tweet reading “julie and julia was truly delightful!!” That same message ended with “we all felt very hungry afterwards” — and the system took the word “hungry” to indicate a negative sentiment.

While the more sophisticated algorithms used by Scout Labs, Jodange and Newssift are designed to avoid such pitfalls, none of these services works perfectly. “Our algorithm is about 70 to 80 percent accurate,” said Ms. Francis, who added that its users can reclassify inaccurate results so the system learns from its mistakes.
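The idea of a system that learns from reclassified results can be illustrated with a toy feedback loop. This is a perceptron-style update assumed purely for illustration; the article does not describe Scout Labs’ actual mechanism, and the words and labels below are invented.

```python
from collections import defaultdict

# Toy feedback loop: when a user reclassifies a result, nudge the
# weight of every word in that text toward the corrected label.
# (Perceptron-style update; an assumption for illustration, not
# Scout Labs' actual algorithm.)
weights = defaultdict(float)

def predict(text: str) -> str:
    score = sum(weights[w] for w in text.lower().split())
    return "positive" if score >= 0 else "negative"

def reclassify(text: str, correct: str) -> None:
    # Only update when the current prediction is wrong.
    if predict(text) != correct:
        step = 1.0 if correct == "positive" else -1.0
        for w in text.lower().split():
            weights[w] += step

reclassify("boring and slow", "negative")  # misclassified, so weights shift
print(predict("boring and slow"))          # negative — the system has learned
```

Each correction only moves the words that actually appeared in the misclassified text, so accuracy improves exactly where users report mistakes.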


Translating the slippery stuff of human language into binary values will always be an imperfect science, however. “Sentiments are very different from conventional facts,” said Seth Grimes, the founder of the suburban Maryland consulting firm Alta Plana, who points to the many cultural factors and linguistic nuances that make it difficult to turn a string of written text into a simple pro or con sentiment. “ ‘Sinful’ is a good thing when applied to chocolate cake,” he said.
The simplest algorithms work by scanning keywords to categorize a statement as positive or negative, based on a simple binary analysis (“love” is good, “hate” is bad). But that approach fails to capture the subtleties that bring human language to life: irony, sarcasm, slang and other idiomatic expressions. Reliable sentiment analysis requires parsing many linguistic shades of gray.
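The binary keyword approach described above fits in a few lines of Python. The word lists here are invented for illustration, not drawn from any real service, and the example shows how a Twitrratr-style misfire arises: the literally “negative” word “hungry” cancels out a genuinely positive tweet.

```python
# A minimal sketch of the naive keyword approach: +1 for each "good"
# word, -1 for each "bad" word. The lexicons are illustrative only.
POSITIVE = {"love", "great", "delightful", "good", "wonderful"}
NEGATIVE = {"hate", "terrible", "awful", "bad", "hungry"}  # "hungry" shows the pitfall

def keyword_sentiment(text: str) -> str:
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(keyword_sentiment("I love this movie"))  # positive
# The misfire: "hungry" cancels "delightful", dragging a clearly
# positive tweet down to neutral.
tweet = "julie and julia was truly delightful!! we all felt very hungry afterwards"
print(keyword_sentiment(tweet))  # neutral
```

Because the score is a flat word count, the classifier has no way to see that “hungry” here describes the reader’s appetite, not the film.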

“We are dealing with sentiment that can be expressed in subtle ways,” said Bo Pang, a researcher at Yahoo who co-wrote “Opinion Mining and Sentiment Analysis,” one of the first academic books on sentiment analysis.

To get at the true intent of a statement, Ms. Pang developed software that looks at several different filters, including polarity (is the statement positive or negative?), intensity (what is the degree of emotion being expressed?) and subjectivity (how partial or impartial is the source?).

For example, a preponderance of adjectives often signals a high degree of subjectivity, while noun- and verb-heavy statements tend toward a more neutral point of view.
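That adjective-density heuristic can be approximated crudely as follows. The tiny hand-made part-of-speech lexicon is invented for illustration; a real system such as Ms. Pang’s would rely on a trained part-of-speech tagger rather than fixed word lists.

```python
# Crude sketch of the adjective-density heuristic: a higher share of
# adjectives suggests a more subjective statement. The mini lexicon
# below is invented; a real system would use a trained POS tagger.
ADJECTIVES = {"delightful", "terrible", "sinful", "amazing", "great"}
NOUNS_AND_VERBS = {"company", "store", "cake", "opened", "reported", "movie", "was"}

def subjectivity(text: str) -> float:
    """Fraction of recognized words that are adjectives (0.0 to 1.0)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    adj = sum(w in ADJECTIVES for w in words)
    nv = sum(w in NOUNS_AND_VERBS for w in words)
    tagged = adj + nv
    return adj / tagged if tagged else 0.0

print(subjectivity("The movie was amazing, truly delightful!"))  # 0.5
print(subjectivity("The company opened a store."))               # 0.0
```

An opinionated review scores high on this measure, while a noun- and verb-heavy news sentence scores near zero, matching the intuition in the paragraph above.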

As sentiment analysis algorithms grow more sophisticated, they should yield increasingly accurate results and may eventually point the way to filtering mechanisms that become part of everyday Web use.

“I see sentiment analysis becoming a standard feature of search engines,” said Mr. Grimes, who suggests that such algorithms could begin to influence both general-purpose Web searching and more specialized searches in areas like e-commerce, travel reservations and movie reviews.

Ms. Pang envisions a search engine that fine-tunes results for users based on sentiment. For example, it might influence the ordering of search results for certain kinds of queries like “best hotel in San Antonio.”

As search engines begin to incorporate more and more opinion data into their results, the distinction between fact and opinion may start blurring to the point where, as David Byrne once put it, “facts all come with points of view.”


Correction: August 27, 2009
An article on Monday about an analysis of the emotional slant of Web postings described incorrectly the National Science Foundation’s relationship with Jodange, a company in the field. While the agency awarded a grant to Jodange, it did not invest in the company. The article also misstated the origins of Jodange’s software. It was created by Claire Cardie and her computer science students at Cornell University; Jan Wiebe of the University of Pittsburgh was not involved. And the article misstated Professor Cardie’s job status. She is a current — not former — professor on the Cornell faculty.

source: nytimes
