Monday, February 18, 2008

Tracking Email Campaigns Using Google Analytics

Google Analytics is a free Web analytics tool that can be integrated with your email marketing campaigns to gain valuable information about the subscriber activity on your site. This data can be used to increase the effectiveness of future campaigns, and thus boost sales conversions, subscribers, or other campaign goals. In fact, according to a 2005 JupiterResearch report, using Web analytics to target email campaigns can produce nine times the revenues and 18 times the profits compared to regular mass email campaigns.

Today I am going to talk about how you can easily tag your email links in Google Analytics so you can better track your email marketing campaigns. Before we begin, make sure you have a Google Analytics account for your Web site, and verify that it is set up to track conversions. This involves placing a piece of code on every page you want to track on your site, including each conversion or order confirmation page.

What is Link Tagging?

Link tagging involves adding additional information (i.e., variables) to the destination URLs used in your online ads so Google Analytics can detect and associate each link with a specific campaign.
You can tag any number of online activities, including banner ads, paid search ads, or emails. Once a visitor responds to the ad, Google stores a cookie on his or her machine and can connect that visitor's ongoing actions with the original ad.


How to Tag Your Links

Tagging your links is very easy using Google's URL Builder. You merely need to identify the proper information to place into each of the following variables:

· Source

· Medium

· Term

· Content

· Campaign

Source

The source identifies who is delivering your message to the customer. It also defines the origin of your message. Examples include Google, Yahoo, a Web site you are advertising with, or the name of your newsletter.

Medium

The medium is the means that is used to deliver the message to the recipient (i.e., CPC, banner, email). For an email marketing campaign, you will use "email."

Term

This is the term or keyword you purchased and is only used in paid search tracking. Therefore, it will not be included in an email marketing campaign.

Content

The content variable can be used to perform A/B testing on two versions of an ad. For instance, you can send out two email newsletters and determine which one performs better for you by tracking them separately. You can also assign different content attributes to different parts of a single email. For instance, you may want to tag your header, special offer, footer, and product links. When you use a different content variable for each specific link in your creative, you are able to determine the effectiveness of each part of your email.

Campaign

This is the name of your campaign. You can be running one campaign on several different mediums. Use a descriptive term or slogan like "February Promotions" or "Get in Shape for Spring."

Once you have identified your specific campaign variables, simply enter them into the Google URL Builder, and click on "Generate URL." Then replace the original URL of the link in your email with the new one. You'll need to repeat this for each link in the message you're working on, as well as every future email broadcast you send.
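
If you prefer to script this step rather than paste every link into the URL Builder by hand, the sketch below shows the general idea. It is only an illustration, assuming the standard utm_* campaign parameters that Google Analytics reads; the destination URL, newsletter name and campaign values are made-up examples.

    # Python sketch: build a Google Analytics-tagged link for an email campaign.
    # The utm_* names are the standard campaign parameters; all values here are
    # hypothetical examples, not taken from a real campaign.
    from urllib.parse import urlencode

    def tag_link(url, source, medium, campaign, content=None, term=None):
        """Append campaign variables to a destination URL."""
        params = {
            "utm_source": source,        # who delivers the message (e.g. newsletter name)
            "utm_medium": medium,        # how it is delivered (email, cpc, banner)
            "utm_campaign": campaign,    # campaign name or slogan
        }
        if content:
            params["utm_content"] = content  # which link/version inside the creative
        if term:
            params["utm_term"] = term        # paid-search keyword (not used for email)
        separator = "&" if "?" in url else "?"
        return url + separator + urlencode(params)

    print(tag_link("http://www.example.com/offer",
                   source="february-newsletter",
                   medium="email",
                   campaign="February Promotions",
                   content="header-link"))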


How Is My Email Performing?

Now that you're successfully tracking email campaigns, it's important to know how to access the data. To begin, log into Google Analytics and click the "Traffic Sources" tab. Then click "Campaigns." All of your campaigns will be listed here for the time period you selected. You can click on a specific campaign (i.e., February Promotions) to see the full campaign summary. The "Segment" drop-down box has a long list of options, including "Source," "Medium," and "Content." You can use this feature to track the origin of your traffic, the specific email it's coming from, and the call-to-action that's generating the traffic.

Spend some time exploring Google Analytics and learning how to use the technology. The program's full capabilities surpass the scope of this article.

Integrating Google Analytics with your current email marketing reporting tools helps you understand how customers and prospects respond to your ads and interact with your web site. You will have instant access to all of the clickstream data users generate as they move from page to page across your site. This means you can find out who left your site after previewing your landing page, and who started the process of purchasing a product but strayed to another part of the site. You can see who reviewed product or service information, and who didn't. Essentially, this will allow you to identify what is working in your campaigns and what is not. Then you can tweak future campaigns to see an improvement in performance and ultimately, an increase in sales conversions.

New VideoCensus Service Provides Accurate Stream Counts and Granular Audience

Women Lead Online Network TV Viewing; Men Drawn to Consumer Generated Media


Nielsen Online, a service of The Nielsen Company, today announced the full release of VideoCensus, the first and only syndicated online video measurement service to combine patented panel and server research methodologies. Early findings reveal differences in how men and women consume video content and that online there is a new primetime.

“The growth projections for both online video consumption and video advertising revenue are phenomenal, and the market requires an innovative approach to measurement. The driving force behind the development of VideoCensus was the unanimous call from our clients to deliver the most relevant and accurate dataset possible,” said Dave Osborn, vice president, video measurement and media products, Nielsen Online. “With this release, we’ve taken a huge step in addressing the market’s need to harmonize panel- and server-based metrics and we are delighted by the accolades of major industry players who are supporting our forward thinking approach,” he continued.

INDUSTRY SUPPORT

“As a leader in online video, Turner requires excellence in research methodology,” said Jack Wakshlag, chief research officer for Turner Broadcasting System and VideoCensus client. “We've been debating the accuracy of panels and servers for years. VideoCensus measurement ends the debate about accurate counts of our video volume in the marketplace, while also providing Turner with relevant, high-quality audience demographics.”

“Comprehensive measurement of the online digital video landscape is equally important to marketers and publishers,” added Judit Nagy, vice president, consumer insights for Fox Interactive Media. “As with online audience measurement, having a reliable and trusted source for video metrics is a must have for the industry, and Nielsen Online is taking a leadership role by making technology advancements for the future.”

"As a leading provider of Internet video, it's imperative that MTV Networks has accurate tracking and reporting of streaming volume across our sites," said Colleen Fahey Rush, executive vice president, research, MTV Networks. "Products like Nielsen Online’s VideoCensus are essential in helping us meet that need, and capture the full value of our vast online library."

DECEMBER VIDEOCENSUS FINDINGS: TV NETWORKS VS. CONSUMER GENERATED MEDIA

WOMEN ENJOY NETWORK TV ONLINE; MEN DRAWN TO CGM

Video streams at broadcast network TV Web sites were nearly two times more likely to be viewed by women age 18-34 than men, who accounted for 22 percent and 12 percent of streams, respectively.


For the top four Consumer Generated Media Web sites, streams were two and a half times more likely to be viewed by men 18-34 than women, who accounted for 27 percent and 11 percent of streams, respectively.

“Network Web sites are destinations for fans to deepen their experience – they go to see favorite scenes, episodes and outtakes. These viewers are very loyal and engaged and the Web site is a place to become immersed in the program,” said Michael Pond, media analyst, Nielsen Online. ”With shorter clips and a viral nature, CGM Web sites are much more about discovery, and consumers are likely to view content on more than one.”

A NEW PRIMETIME ONLINE

Nielsen Online reported that streaming activity at the top network TV Web sites over-indexed during the weekday lunchtime hours of 12 p.m. – 2 p.m. At consumer generated media Web sites, the most popular time for viewing was during late night hours on the weekend, between 11 p.m. and 6 a.m.

“These results indicate that the largest appetite for streaming broadcast content is during the noontime hours, when viewers take a break from work to catch up on the shows they enjoy,” said Pond. “Primetime visitors to network Web sites primarily enhance their TV viewing experience with features like online voting, Web-only promotions and other program specific content, although there is some interest in streaming network content during the evening as well.”


U.S. TOPLINE VIDEO CONSUMPTION METRICS (EXCLUDES VIDEO ADVERTISING)

• 116.7 million unique viewers, or 73 percent of active Web users, watched approximately 6.2 billion video streams in December 2007


• The average viewer spent nearly two hours and 10 minutes watching online video content in the month of December


• Each viewer watched nearly 54 video streams during the month

• The No. 1 video site in December was YouTube, with 2.6 billion streams during the month, followed by Yahoo! with 371.9 million streams and Fox Interactive Media with 364.1 million streams (see Table 5).


Make the Most of SEO Competitive Research : Evaluating the Competition

Why bother about your competitors? Well, a stupid question, I know. You can’t possibly think that you can enter a new niche and get on top without looking into what has been done before you. When done properly, competitor analysis will answer your most important strategic planning questions:

  • Is it worth trying to enter this niche? Will I be able to outdo my competitors? How fast? Will a long, hard victory be worth the effort? What’s my expected ROI?
  • What should I do to succeed in this niche? What shouldn’t I?
  • Who are my prospective readers/customers? What are they used to? What do they like?
  • Well, and many more, but I will stop here for now so as not to miss the point.

Step 1. Evaluating your overall competition.

You can either do it ‘at home’ using Google search and Excel, or try paid tools that return a complete competitor report. I usually run every kind of analysis I can because I (1) cannot fully rely on reports compiled by someone else (be it an automatic tool or another person); and (2) do not feel I have a full understanding of a niche unless I spend long hours searching Google and compiling data into tables (yep, preferably multiple ones, then combining them into one table; but that’s just me, you can safely get along with a single solid report).

The idea is simple: you throw all your keywords into a spreadsheet and add the following information:

  1. Google daily/monthly estimated reach (I was using data provided by Aaron’s keyword research tool);
  2. Overall number of results in Google (broad match);
  3. The site ranked #1 for each term;
  4. Number of results for [intitle:keyword];
  5. Number of results for [inanchor:keyword];
  6. Number of results for both [intitle:”keyword” and inanchor:”keyword”] (hat tip to Ciaran) - this is your exact competition, i.e. those who use SEO (optimized titles and incoming links anchor text).


To save time you can get this information via the SEOmoz keyword difficulty tool (it will also provide you with lots of other useful information: average PageRank of the top 10 sites, how many root URLs can be found in the top 10 results, etc.). Naturally, the best combination is when #1 is high, #2 is low (not necessarily) and #4, #5 and #6 are the lowest possible - in the screenshot from the original post, these are the cases framed in green.
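
If you want to save some copy-and-paste work while building that spreadsheet, here is a rough scaffold of my own (not from the original post) that generates the intitle:/inanchor: queries for each keyword and starts a CSV with the columns above. The result counts, reach figures and #1 sites still have to be filled in by hand or from whatever keyword tool you use.

    # Python sketch: build the competition-check queries and a CSV skeleton
    # for the columns listed above. The keywords are hypothetical examples.
    import csv

    keywords = ["tennessee fsbo", "townsend tennessee fsbo"]

    def competition_queries(keyword):
        """Return the Google queries used for columns 2, 4, 5 and 6."""
        return [
            keyword,                                            # broad match
            'intitle:"%s"' % keyword,                           # optimized titles
            'inanchor:"%s"' % keyword,                          # optimized anchor text
            'intitle:"%s" inanchor:"%s"' % (keyword, keyword),  # exact competition
        ]

    with open("competition.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["keyword", "est. reach", "broad results", "#1 in G",
                         "intitle results", "inanchor results", "intitle+inanchor results"])
        for kw in keywords:
            writer.writerow([kw] + [""] * 6)   # counts get filled in manually

    print("Queries to run for each keyword:")
    for kw in keywords:
        for q in competition_queries(kw):
            print("  ", q)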



Step 2. Finding your direct competitors

After you have compiled your targeted keyword list, you can sort by the ‘#1 in G‘ column and see the sites that most often rank high in Google for your chosen keywords.



Be sure to explore your most successful competitors’ on-site optimization: titles, H1 and H2 tags, internal site architecture, etc. (a rough scraping sketch follows the list below). I have singled out two approaches that help me perform this kind of analysis:

  1. Don’t be too skeptical. Unfortunately, experienced SEOs analyzing on-page optimization most often think they could do much better themselves. That mindset can lead you to the wrong conclusions.
  2. Learn from their mistakes. (I know, this somewhat interferes with the first one, so the most important thing is balancing the two.) We all know how to do it right, so analyzing what a competitor did well doesn’t help a lot. The art of spotting mistakes while keeping yourself from underestimating the competitor (see #1) always leads to the right solution in the end.
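
As promised above, here is a rough scraping sketch of my own (not from the original post) for grabbing the basic on-page elements - title, meta description, H1 and H2 tags - from a competitor page. It assumes the third-party requests and BeautifulSoup libraries are installed; the URL is a placeholder.

    # Python sketch: snapshot the on-page optimization elements of a competitor URL.
    # Requires the third-party 'requests' and 'beautifulsoup4' packages.
    import requests
    from bs4 import BeautifulSoup

    def onpage_snapshot(url):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        meta = soup.find("meta", attrs={"name": "description"})
        return {
            "title": soup.title.get_text(strip=True) if soup.title else "",
            "meta description": meta.get("content", "") if meta else "",
            "h1": [h.get_text(strip=True) for h in soup.find_all("h1")],
            "h2": [h.get_text(strip=True) for h in soup.find_all("h2")],
        }

    for field, value in onpage_snapshot("http://www.example-competitor.com/").items():
        print(field, ":", value)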

And now a few tools that can also prove helpful:

1. Google Adwords Keyword Tool (free) is useful for comparing Google advertisers’ competition data with your own findings, and also for differentiating commercial terms from non-commercial ones. Keywords enjoying high advertiser competition are most likely targeting potential customers (while more informational [and hence less competitive] phrases usually attract people who are collecting information rather than people who are really willing to buy). A good way to overcome high competition while sticking to more commercial phrases is to turn (moderately) commercial phrases into long-tail ones (e.g. per our table: ‘Tennessee fsbo‘ into ‘townsend Tennessee fsbo‘).

2. Compete.com (paid, with a few trial searches) also provides some helpful types of analysis that can help you evaluate your competition:

  • “Keyword Share” shows the percentage of total referrals a site receives from a particular keyword compared to its other referrals (= this keyword’s referrals / other keywords’ referrals).
  • “Keyword Engagement” shows the average time visitors tend to spend on the site after being referred by this keyword.
  • “Keyword Effectiveness” relates the two: all people referred by this term / total time spent on the site.

While these metrics from Compete.com look really promising and useful, I mostly use them for self-education and out of curiosity - just because I am more used to the ‘old school’ method of looking into my own referrals and learning people’s actual behavior in practice. However, this can still be very useful for studying competitors’ referrals and visitors’ [probable] behavior.
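
For what it’s worth, the ‘old school’ version of the first two metrics is easy to compute from your own referral data. The sketch below is mine, with made-up (keyword, seconds-on-site) records standing in for an analytics export.

    # Python sketch: rough keyword share and engagement from your own referrals.
    # The records are hypothetical (keyword, seconds-on-site) pairs.
    from collections import defaultdict

    visits = [
        ("tennessee fsbo", 140), ("tennessee fsbo", 35),
        ("townsend tennessee fsbo", 420), ("nashville real estate", 60),
    ]

    by_keyword = defaultdict(list)
    for keyword, seconds in visits:
        by_keyword[keyword].append(seconds)

    total_referrals = len(visits)
    for keyword, times in by_keyword.items():
        share = len(times) / total_referrals    # this keyword's slice of all referrals
        engagement = sum(times) / len(times)    # average time on site for this keyword
        print("%-26s share %3.0f%%   avg. time %4.0fs" % (keyword, share * 100, engagement))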

Next time I will look into the most effective ways of analyzing competitors’ link building strategies. So stay tuned!


Source: http://www.searchenginejournal.com/make-the-most-of-seo-competitive-research-evaluating-the-competiton/6386/

Filings Watch: Google’s 10-K: Headcount; Acquisition Spend

Google (NSDQ: GOOG) filed its annual 10-K report with the SEC this Friday. There's not much in it that we don’t know already, but it's good to get a yearly perspective on the numbers, rather than the quarter-to-quarter view we all chase.

Some points that caught my eye:
-- Our full-time employee headcount has significantly increased over the last 12 months, growing from 10,674 at December 31, 2006 to 16,805 at December 31, 2007...consisting of 5,788 in research and development, 6,647 in sales and marketing, 2,844 in general and administrative and 1,526 in operations. All of Google’s employees are also equityholders, with significant collective employee ownership.

-- Minimum guaranteed payments to the likes of MySpace and others: At December 31, 2007, our aggregate outstanding non-cancelable guaranteed minimum revenue share commitments totaled $1.75 billion through 2012 compared to $1.17 billion at December 31, 2006. (issues here of lower-performance of MySpace inventory, which we have written about before)

-- Cash used in investing activities in 2007 of $3,681.6 million was attributable to capital expenditures of $2,402.8 million, cash consideration used in acquisitions and other investments of $941.2 million, of which $545.7 million related to the acquisition of Postini in the third quarter of 2007, and net purchases of marketable securities of $337.6 million.

-- Besides DoubleClick and Postini, during the year ended December 31, 2007, we also completed seventeen other acquisitions. Three of these transactions were accounted for as asset purchases; the remaining 14 transactions were accounted for as business combinations. The total initial purchase price for these transactions was $281.6 million.

-- In addition, during the year ended December 31, 2007, we capitalized intangible assets of $5.2 million, paid in cash, related to patent purchases.

-- Cost of revenues increased $2,424.1 million from 2006 to 2007. This increase was primarily the result of additional traffic acquisition costs, the depreciation of additional information technology assets purchased in the current and prior periods, other additional data center costs and additional credit card and other transaction fees. There was an increase in traffic acquisition costs of $1,625.1 million which includes an increase of $216.7 million in fees related to distribution arrangements.

-- Advertising revenues made up 99% of our revenues in 2005, 2006 and 2007. We derive the balance of our revenues from the license of our web search technology, the license of our search solutions to enterprises and the sale and license of other products and services.

-- Our international revenues have grown as a percentage of our total revenues to 48% in 2007 from 43% in 2006.

-- Aggregate paid clicks on our web sites and our Google Network members’ web sites increased approximately 9% from the three months ended September 30, 2007 to the three months ended December 31, 2007, approximately 43% from the year ended 2006 to the year ended 2007 and approximately 65% from the year ended 2005 to the year ended 2006.


Source: http://www.paidcontent.org/entry/419-filings-watch-googles-10-k-headcount-acquisition-spend/

Yahoo! Search Draws Younger Audience; Google Users Big Spenders Online

In my post earlier in the week, I mentioned that Yahoo! Search attracts a younger audience than Google. I promised a post with figures to back up my claim (sorry I am a day late - forgot it was Valentine's Day!). The following charts show the percentage of visits from each age group to Yahoo! Search and Google.com.



I cross checked this data against our Lifestyle data to be sure that we weren't missing the kids of these householders. Our Lifestyle data confirms that the groups that are highly indexed on Google tend to be older (55+) and the groups highly indexed on Yahoo! Search tend to be younger.

I mentioned this to my husband and he asked if the Google users spend more online. Good question (he seems to think young people have no money)! I created the following Lifestyle Quadrant Analysis to compare the online audience of Google.com and Yahoo! Search.

The figure summarizes the audience strengths and weaknesses for the two search engines. Visits by MOSAIC Group to Search.Yahoo.com are plotted on the y-axis and to Google.com on the x-axis. For example, the top left hand box indicates unique strengths for Yahoo! Search, in that they are groups that are over-indexed relative to the online population on Yahoo! Search but under-indexed on Google.com. The bigger the bubble the higher the propensity to have spent $500 online (based on offline data collected by Experian).



As you can see Google's relative audience strengths - i.e. the groups over-indexed on Google.com relative to the online population - are those that are among the most likely to have spent more than $500 online. This indicates that Google users are more likely to be big online spenders.


Source: http://weblogs.hitwise.com/us-heather-hopkins/2008/02/yahoo_search_draws_younger_aud.html

Yahoo Buzz: Next Digg Competitor



Valleywag has screen captures of a new Yahoo web site, reportedly launching February 26th, named Yahoo Buzz Beta.

Yahoo Buzz will be similar to Digg, but will start with only 100 sites allowed into the system. After the initial beta period, all sites that are accepted into the Yahoo Publisher Network will be allowed to be added to Yahoo Buzz.

By looking at the screen captures, it appears that each article is given a "buzz score." The buzz score seems to be generated by users clicking a "buzz it" icon. In addition, on the right side of Yahoo Buzz stories are recent "Top Searches," which are updated hourly.

Yahoo Buzz is currently a blog with data on top keyword searches from Yahoo Search. Here is the current Buzz FAQ, but reportedly, this new site will replace the current Yahoo Buzz section.

Let's not forget that AOL's Netscape tried to go the Digg route with Netscape.com but ultimately failed.



Source: http://searchengineland.com/080218-093307.php

Blueprints Of Google's Oregon Data Center


Harper's Magazine has a spread showcasing a blueprint of Google's Oregon data center in The Dalles. The article documents the data center at 68,680 square feet with three buildings, two of which are currently completed. The data center would require enough power to light up about 82,000 homes, equivalent to 103 megawatts of electricity.

Check out the full blueprints over here, and take note of the dormitory building and the surrounding landscape.


Source: http://searchengineland.com/080218-092213.php

Saturday, February 16, 2008

Guide to Video Marketing on YouTube

YouTube is the largest video sharing site to date, with the most traffic and the highest number of users, making it the definitive place to get your videos published and marketed. I have put together a guide on YouTube video marketing, and I think you will all get a lot out of it.

YouTube is the second most trafficked site globally according to Alexa. Quantcast estimates 60 million unique viewers per month, with a community base of predominantly Gen Y viewers. eMarketer recently did research on video viewership by age and found that the Gen Y generation views about 5-6 hours per day. The household incomes of these viewers are spread fairly evenly from $0 to $100k+.

YouTube is optimized for 18 different languages and has a large presence in the Asian marketplace. YouTube is also the 4th most trafficked site in the United States, so the power to reach your audience within America will be found on YouTube. To learn more about the history of YouTube, the Wikipedia page has some valuable info; even though I hate giving out Wikipedia pages, it is useful.

Looking at the Youtube Algorithm:

Making it to the home page on YouTube is going to give your video the most exposure. However, while you don’t need to hit the home page to get your million views, it sure would help.

To do this you are looking at roughly a 15 day window of marketing effort put into YouTube. This can be stretched out, however, depending on how you are trying to make it to the home page (top favorites, top views, top comments, etc.).

Keeping it Fresh

There is a freshness factor to your videos, and it is much more difficult to get a video honors and movement within the search algorithm if the video is old. You would be better off removing that video and re-submitting it if you are looking to earn honors in a given field and want a chance at hitting the home page with greater ease.

Keeping it Real

One factor to take into consideration when submitting a video is your profile authority: how many friends the profile has, how many subscribers, and how many channel views. This is important because using a power profile will help you get more initial exposure and more views. Submitting a video on an orphan account with no friends or subscribers may raise some suspicion over at YouTube headquarters. Make your profiles look natural and build them out.

Power of Views, Ratings, Favorites and Comments

YouTube has honors for each type of action that can be taken on a video: honors for comments, favorites, ratings and views. As a marketer, if you focus on any one of these and get a substantial number of votes, favorites, ratings or views in a day, you will see honors pretty quickly.

Remember the category you submit to and the type of channel you created has a lot to do with your success with getting honors. Gurus tend to get honors pretty easily within their channel area.

Tips for marketing on Youtube:

There are going to be a lot of tools Youtube has created for marketing your video. I will share them with you along with some other techniques we use.

    1. On videos there is a share option. You can share by email address or with friends you have attached to your account. Remember the more friends you have the more people you can send to. Also you can leverage social media sites when you try to share as well.

    2. Bulletin Boards: By posting a message this video will be displayed to all of your friends on your profile.

    3. Invite to subscribe: This feature is available in your account once you request a friend invite. See the post I did here.

    4. Add friends: Adding friends is a powerful way to gain exposure on Youtube. See my latest post here.

    5. Make sure your video appeals to the community

    6. Sharing videos with email: We like to go viral with our marketing campaigns for videos. Send videos out to your friends and family with a link to the video and an encouragement to share it.

    7. Use StumbleUpon.com: Submit the video to video.stumbleupon.com and then import your email addresses and send to your friends on stumble upon.

    8. Social Media it: You can leverage Digg and other social sites like Facebook and MySpace to drive traffic to videos.

With the tools and tips I have listed out you will absolutely see success with your video marketing campaigns.


Source: http://www.searchenginejournal.com/guide-to-video-marketing-on-youtube/6381/

How stale and dated is your website?

We have all had that money-maker website that ranks really well but that we are desperately afraid to touch, in case whatever it is about the site that Google’s secret sauce is so in love with gets destroyed in the process. Unfortunately, Google doesn’t necessarily like it either when a site hasn’t been updated in years, despite those killer rankings.

Not only that, humans don’t really like it when they can tell a site hasn’t been updated in ages either, and they couldn’t care less whether Google loves it or not. And after all, sure, Google can drive the traffic, but if the vast majority of your visitors leave out of disdain when they see your 1999 web design, is it really worth keeping it looking as it did when you first launched it with your FrontPage 97 design skills? Which brings me to the question…

So when was the last time you really updated your website? And then the next obvious thing…

What makes your website look stale, outdated and old?

Copyright date

Does your copyright date still say 2005? Or worse, 1999? Copyright date is a common way that people check how current a site is. And no, people are usually smarter than to be tricked by those javascript “today’s date” scripts that were so popular a few years ago. So make sure your site’s copyright notice is updated, and if you want to show that your site has been around since 1999, change it to 1999-2008 instead.
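
If your pages are generated server-side or at build time, keeping that range current is trivial. The snippet below is just a sketch of the idea (mine, not the author's) and only makes sense as part of actually maintaining the site, not as another client-side "today's date" trick; the launch year and site name are hypothetical.

    # Python sketch: render a copyright range like "1999-2008" at page-build time.
    from datetime import date

    LAUNCH_YEAR = 1999  # hypothetical launch year

    def copyright_notice(site_name="Example Site"):
        this_year = date.today().year
        years = str(LAUNCH_YEAR) if this_year == LAUNCH_YEAR else "%d-%d" % (LAUNCH_YEAR, this_year)
        return "Copyright %s %s" % (years, site_name)

    print(copyright_notice())   # e.g. "Copyright 1999-2008 Example Site" when built in 2008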

Font choice

Some fonts are, well, so 1999. If you have Comic Sans MS anywhere on your site, change it immediately (here is a link for the three people who have no idea what font this is). In fact, if the temptation is too great (and for many do-it-yourself webmasters, it seems to be the font of choice), go and remove it from your computer completely. Your conversion rate will thank you for it.

Background images

True, some sites definitely suit having a background image, especially if it is a tasteful Web 2.0-ish design that isn’t too distracting. Note the part about the tasteful design because not all webmasters get that part, especially when they have some photo they want to show off and have it sized to the entire browser window. But the only reason you should have a photo background image on your site is if it is directly related to what the site is about. For instance, a beach resort might have a background image showing the beach, which could be acceptable, so long as the text is still legible. Best practices would be to have a plain background beneath the text so people can actually read what you are trying to tell them, but too many people try and put the text right over top of the photo and inevitably, some of the text will be very difficult to see. And the biggest piece of advice when it comes to webpage backgrounds… if it is clouds, or anything space related, ditch it immediately.

Cache date

Look and see how many people are viewing your site via Google cache and you’ll probably be shocked. But if Google’s last cache date of your home page is six months ago, you’ve got some work to do.

Centered text

Yep, way back when the [center] tag was new, people went nuts with it and everything on the site, including all written text such as articles, was centered. Well, the [center] tag is not new anymore, and very few things should be centered except titles/headlines and subheaders.

Animated “under construction” signs

I also thought these went out about a year after animated GIFs first became all the rage, but I still see them not only on older sites, but also on newer sites that should know better. If a page is under construction, you are better off not making it accessible to the public in the first place, or at least putting a small amount of text on the page so it passes as a legitimate page rather than telling the world that you just haven’t had time to do it yet.

Sparkly anything

Sparkly animated gifs had almost died a quiet death when the MySpace crowd brought them back with a vengeance. There is no reason why any legitimate website should have sparkly images if it is targeting anyone over the age of 13, unless there is a damn good reason for it… and I am still waiting for anyone to supply me with a damn good reason!

Link exchange pages

If you still have link exchange pages on your site – usually aptly titled Link Exchange Page 1, Link Exchange Page 2, and so on, and of course with the pages being called something obvious like links1.html – be a little less obvious about it. Smart webmasters stopped calling them link exchange pages years ago. Sorry, no examples, to protect the guilty!

Homepage refers to outdated events

The last summer games were quite the event, but if your homepage is still showcasing them, you should really update the homepage or change it to a retrospective slant, so people aren’t wondering why you are featuring something that happened a couple of years ago as “new”.

Design

While design can be subjective to a certain extent, it can date a site especially if it is done without columns or a CSS file in sight.

Color scheme

Do you remember the old school html tags when colors used to be specified by name instead of HEX #, and the most popular colors were cyan, blue and purple, usually on a black background? Well, if your site still has them, you seriously need to consider a new color scheme for your site. Nothing can date a site faster than having a black background with cyan text… unless you happen to have a site catering to gamers, and then it seems to be the norm.

Last updated June 17, 2004

If you haven’t updated in the last six months – or worse, years – remove the last updated date from your homepage. The only time you should really use this is if it is the first time you have updated in years, or if you have a massive repeat visitor base that you want to alert to what has recently been updated.

Of course, there are always those odd ball exceptions. This site hasn’t changed much about its design in ten years, right down to using the same neon confetti background image.

When you have a website that has killer rankings, webmasters can be somewhat apprehensive about updating what is on the page in case Google’s secret sauce isn’t so happy about those changes. But you also need to ensure that your visitors don’t come to your site and immediately do an about-face because the site looks, well, old. If you have one of these websites, changing the above things can ensure your site doesn’t look outdated, even if you update the content and homepage very infrequently.

Still not convinced? Change bits and pieces at a time over a period of weeks (or even months!) so you can evaluate exactly what Google is thinking of your much needed changes. This will also give you the opportunity to backtrack if suddenly things start to tank and you think your updating had something to do with it. And I can’t stress enough… make sure you keep backups of everything before you make the changes, and keep them for each change you make. This will make it easy to undo your changes to figure out what went wrong.
So if you have any of those oldie but goodie websites, take some time to make sure you aren’t committing one of the above faux pas which immediately dates your website, even when the content – while not recently written – is still valuable and updated.


Source: http://www.jenniferslegg.com/2008/02/15/how-stale-and-dated-is-your-website/

Comparing Six Ways to Identify Top Blogs in Any Niche

In the early days of blogging you could go to the Technorati Blog Index, enter some identifying terms for a particular niche topic and discover what the top blogs were in the field.

Identifying top niche blogs is invaluable knowledge for anyone wanting to enter, study or market to people in a particular field. It's one of the fastest and most effective ways to learn the lay of the land and get involved in the community of successful artists, real estate agents or 4-H club leaders using social media. I've been seeing a lot of demand for this information lately so I thought I'd write up some quick pros and cons of the options I'm familiar with. Perhaps you'll add some of your own favorite methods in comments.

Unfortunately, Technorati's not what it used to be anymore. While we here at RWW are very proud to have climbed to the #14 spot in the Top 100 most linked-to blogs overall in the Technorati Index (look out Perez Hilton, you're next in line), the fact of the matter is that for everyday use Technorati doesn't feel very reliable any more.

How then can you identify the top blogs in a particular niche field? There are paid services you can use to identify influencers online but they are expensive and not appropriate for quick hits in a new topic. I'm all for paid services but in this case, let's talk about options that are fast and free. Given the need to classify a lot of content with minimal human intervention, this could be a great place for Semantic Web technology to come in.

Here's a comparison of the pros and cons of six different services you can use to do so. None are as solid a solution as the blogosphere deserves. This is a huge opportunity for indexes, but one that will be hard to fill since an index has to be wide and deep to be truly useful for this purpose.

Technorati

Pros:

The Technorati Blog Finder was set up for just this purpose, and in earlier days claiming and tagging your blog on Technorati was considered an essential step in getting started with a blog. I'm not so sure that's the case anymore.

Technorati offers a clear standard of authority and you can download the OPML file of the top 10 blogs in any category. Why only 10? I have no idea.

Cons:

After years of spotty service, seemingly random redesigns that made the site even worse than it was before, a crazy idea to get bloggers everywhere to point all their rel=tag links to Technorati (!) and the entry of bigger players into blog search - Technorati doesn't feel as active today as it once did. There are probably a lot of top blogs in any niche that haven't added themselves to the directory.

The directory is also organized according to the tags applied to a blog by its own author, typically when the blog just gets started.

The user experience is not good at Technorati but it's good enough to still warrant a look in hunting for top niche blogs.

Del.icio.us

Pros:

We wrote about how to find top niche blogs using Del.icio.us in a post last month. At the simplest level, go to http://del.icio.us/tag/topic+blog.

There's huge amounts of data on Del.icio.us and it's a very dynamic community. There's also RSS feeds, user comments, information about the people (users) who have done the classifying and a lot of other helpful features. I've been using Del.icio.us to find top niche blogs a lot lately and it's served me fairly well, even if I have to eyeball the last few yards to an answer.

Cons:

Del.icio.us hasn't been evolving very quickly, at least the publicly available version of the service. There are a lot of obnoxious qualities to it, like the fact that you can't search for most popular items with multiple tags - there's no such page as http://del.icio.us/popular/topic+blog.

Search results pages are funky, and tag/topic+blog just means that a URL has been saved at least once with both of those terms, not that any number of people used both terms at once. It's not intuitive to look up the tags given a URL, much less an entire domain. Finally, at least in the tech sector, a lot of hip cats are using Ma.gnolia now instead of Del.icio.us. It's a recommendation engine waiting, forever, to happen and I'm still heartbroken that it was acquired by Yahoo! instead of the Library of Congress.

StumbleUpon

Pros:

StumbleUpon has huge user numbers, very targeted interests and classifications, and an algorithm combined with human editorial judgment about the blogs in question.

Cons:

It's more "fun" than it is business, unless you're into SEO. There's no clear way to look at top sites in any category, and the search results page looks really random. Good for stopping by and doing some searches just to see if you've missed anything, but nothing you'd do as part of a structured search.

Google Reader Recommendations

Pros:

Google Reader's new recommendations are very high quality, in tech at least, because they have a large number of web-savvy users. I'm hoping that by starting a dedicated Google Reader account filled with just some known feeds in a niche, I can have other top sources in that same niche recommended to me.

Cons:

Recommendations don't come right away; you have to wait for a while. There's also a limit to the number of recommendations you can receive at one time. It is a tech-focused community, disproportionately so compared to the blogosphere in general. Finally, this is a pretty silly little hack at things and you find yourself getting tied up with trying to run multiple Google accounts, etc.

AideRSS

Pros:

I love AideRSS because the criteria for hotness is relatively clear and I find the service really useful in lots of contexts. In theory you can plug almost any RSS feed, including search feeds, into AideRSS and it will score items in that feed for popularity based on number of comments, diggs, del.icio.us saves and inbound links. You could put feeds from a blog search for niche specific language into RSS and find some niche hotness. Once you identify top niche blogs you can also run their feeds through AideRSS to quickly discover what their communities of readers find most engaging. It's magic, almost.

Cons:

The service only works most of the time and long URLs choke it up. It's also limited to feeds, which take some creative thinking in order to bend to our particular purpose of finding top blogs.

Ask.com Blogsearch

Pros:

Ask has the best blogsearch on the web, it uses Bloglines subscription numbers as a big weight in spam control. There's very little spam. You can search for niche specific language or a key niche link and sort by popularity of source.

Cons:

Ask does get overloaded sometimes, and the above method is hardly systematic anyway. I wouldn't rely on it alone. Ask blogsearch does index a lot of funky feeds that clutter search results even if they aren't spam. Try it out and you'll see what I mean.

Conclusion

See what I mean? Nobody quite does what we need. Used in concert and with a little work, these tools together can build you a pretty good reading list of top blogs in any niche. There's big room for improvement in this toolset though.


Source: http://www.readwriteweb.com/archives/identify_top_blogs.php

The Great Nofollow Link Debate of '08

Three years ago Google introduced the rel="nofollow" tag to the Web. Under a title benign to anyone who has never run a blog and experienced the headache, "Preventing comment spam" promised future bloggers that comment spam would be greatly devalued by this simple tag. The reason being, this little bugger would effectively kill the SEO value of any link "protected" by it.
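
For anyone who hasn't seen it in the wild, the effect is just an extra attribute on the anchor tag. The toy sketch below is mine (not from this column); it marks up every link in a reader comment before publishing, assumes the third-party BeautifulSoup library, and uses a made-up comment.

    # Python sketch: add rel="nofollow" to every link inside a blog comment.
    # Requires the third-party 'beautifulsoup4' package; the comment is made up.
    from bs4 import BeautifulSoup

    comment_html = '<p>Great post! Visit <a href="http://spammy-example.com">my site</a>.</p>'

    soup = BeautifulSoup(comment_html, "html.parser")
    for link in soup.find_all("a"):
        link["rel"] = "nofollow"   # engines are asked not to pass link value through this

    print(soup)
    # <p>Great post! Visit <a href="http://spammy-example.com" rel="nofollow">my site</a>.</p>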

The new nofollow tag quickly became known as a "link condom," and experts at conferences across the world started hailing it for its simple yet powerful purpose. One of the people involved in fighting the comment spam problem within Google was a then-unknown engineer named Matt Cutts, now Google's famous, unofficial chief Web spam fighter. (Ironically, Google thinks "spamfighter" should be one word, yet when you search, the Matt Cutts-related results still outrank spamfighter.com -- ah, the power of the SEO community is strong.)

Matt's "next little ditty" came out in September 2005 on his personal blog post that sentenced link sellers to eternal damnation. He laid out a compelling argument that buying links solely for PageRank value was polluting the Web. Publishers started using nofollow tags to show Google selling links wouldn't be based on the SEO value of the publisher's site to the buyer's site.

The Nofollow Evolution

While the paid links debate raged on, the nofollow attribute was quietly becoming used for more than just blocking outbound links. Before we go there, let's check in on early 2007, when Sebastian "X" asked if we really need this attribute after all. Sebastian succinctly highlighted the apparent shortcomings of the tag, and expressed the frustration felt by many that it was worthless when it came to stopping blog spam. Beyond that, there was confusion as to its real purpose, he said.

Not surprisingly, the "link condom" attribute -- much like the occasional cheap or damaged contraceptive device -- doesn't always work. Some feel Google still follows links and, in some cases, might not even strip the "juice" out based on the trust of the landing domain, still assigning value based on the anchor text.

Tests I've tried have been statistically inconclusive so far. I've noticed nofollow links at Wikipedia are scraped, dropping the attribute but keeping the link. (No names in order to protect the innocent.)

Spammers were still spamming, links were still being sold, and at least one war in the Middle East was still going on. Sebastian hinted at a better use for the tag -- specifically, to instruct spiders not to crawl printer-friendly versions of pages. Turns out a new revolution led SEOs to turn their nofollow focus inward.

The Nofollow Revolution

Eric Lander started an interesting debate about nofollow use triggering search engine algorithms to detect spammy sites. This theory has almost no legs to stand on. SEOs, though, certainly haven't forgotten about this little tag.

SEOs have been secretive about their use of nofollow. (I may in fact be hunted down for this post.) SEOs know nofollow lessens "PageRank dilution" that occurs between internal pages of a site. The idea, in a nutshell: if search engine ranking algorithms devalue administrative pages (privacy policy, TOS, contact us), no SEO value should be delivered from external links.

Often a home page gathers many inbound links. Focusing the links on the most important secondary and deeper site pages benefits internal pages with a relatively low number of inbound links.
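
To make the "dilution" idea concrete, here is a toy PageRank simulation of my own (not from the article, and vastly simpler than anything the engines actually run). It compares a five-page site where the home page links to everything against the same site with the privacy/TOS links effectively removed from the followed graph.

    # Python sketch: toy power-iteration PageRank over two internal link graphs,
    # illustrating how dropping followed links to admin pages shifts rank toward
    # the pages you care about. Purely illustrative numbers.
    def pagerank(graph, damping=0.85, iterations=50):
        pages = list(graph)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outlinks in graph.items():
                if not outlinks:
                    continue
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank
        return rank

    # Home page links to everything, including the admin pages.
    all_followed = {
        "home": ["products", "about", "privacy", "tos"],
        "products": ["home"], "about": ["home"], "privacy": ["home"], "tos": ["home"],
    }
    # Same site with the privacy/TOS links nofollowed (dropped from the followed graph).
    admin_nofollowed = dict(all_followed, home=["products", "about"])

    for label, graph in [("all links followed", all_followed),
                         ("admin links nofollowed", admin_nofollowed)]:
        print(label, "-> products page rank:", round(pagerank(graph)["products"], 3))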

Some people find this practice effective. I've tested it and I'm not completely sold yet. One commenter on Lander's post (who happens to work with me) took apparent offense at the idea the nofollow could not be used to help "craft the flow of Internal PageRank."

Bottom line: The rel="nofollow" attribute is here to stay, and a few more years of testing and a few dozen search engine algorithm updates from now, it will likely have far different value than it holds today. An interesting tag that started on a simple enough mission to rid the world of spammers has evolved into a tool to be used wisely and tested consistently. Please join us for discussion on this topic in the Search Engine Watch Forums.

Frank Watson Fires Back

Frank Watson: The initial intention of the nofollow tag was good -- but the option was that bloggers could use the tag as they saw fit. The problem now is that Google is forcing people to use the tag.

As Michael Gray pointed out, "Google is not the government." Google isn't even the Internet police. Yet the fact that Google is the source of the largest part of Web traffic forces people to follow their edicts.

Google may cause greater government intervention on the Web. Given Google's dominant share of Web searches and traffic, governments around the world may view this as a form of monopoly and start regulating Google, as they review the purchase of DoubleClick.

Will there be an antitrust case eventually brought by someone who lost his source of income by a change Google makes to their algorithm? Let's wait and see.


Source: http://searchenginewatch.com/showPage.html?page=3628452

Welcome to the All-New Yahoo! Video

Hey Yahoo! Video viewers and creators (and those that do both),

Notice anything different about Yahoo! Video? As you can tell we’ve really changed things around here and I’m really excited about it. I love online video, and our new site makes everything about viewing, uploading, organizing, and presenting online video better. Yahoo! Video is now your one-stop shop for all video on Yahoo!, including music, movies, TV, news, sports and a whole lot more.

  • A wider viewing experience: Yahoo! Video supports a 16:9 cinematastic player that’s far ahead of what most sites are offering. And the video looks great.
  • More content: Yahoo! Video is now about the whole spectrum of video found throughout Yahoo!, including music, movies, TV, news, sports, and a whole lot more. And we’re still committed to featuring the best videos from our talented community of independent video creators.
  • Bigger files: Speaking of our creators, we’ve raised the max file size to 150 megabytes, so you can upload longer, higher quality video.
  • An expanded browsing experience: Discover more video through networks, playlists, and related videos.
  • More sharing: Not only can you embed individual videos on your blog or site, you can create your own curated video experience with embeddable playlists. This is seriously cool. Take a look at this example.
  • More in-depth profiles: Your profile page now says a lot more about you. Pick your own nickname, fill the page with your favorite playlists and videos, add contacts and fans, and read and write comments.

And let’s be honest, there are a few things from the old Yahoo! Video you won’t miss:

  • Comment titles: You no longer have to title your comments and you can reply to other people’s comments. You can also delete comments you’ve made, as well as unwanted comments on videos you’ve uploaded.
  • Blank video players: Our embeddable player now carries the keyframe of your video, and it has a full-featured related-videos carousel. With our 16:9 aspect-ratio, I think it’s probably the best embeddable player out there.

Source: http://www.yvideoblog.com/blog/2008/02/15/welcome-to-the-all-new-yahoo-video/

Friday, February 15, 2008

Multi-National Search Marketing: Effective Strategies for Global Marketers

Yesterday, Chris Sherman from Search Marketing Now presented another webinar jam-packed with useful information. This session, entitled ‘Multi-National Search Marketing: Effective Strategies for Global Marketers’, was sponsored by iProspect and moderated by Claire Schoen.

Chris was a perfect speaker due to his experience covering search and search engines since 1994. He is the author of several books, including ‘Google Power’.

Chris started out the presentation by outlining that he’ll cover why we might want to go global, what types of campaigns work best and how to do it.

The first slide discussed the fact that a lot of people perceive that Google has taken over the earth. Yes, Google is dominant, and if you are running a search marketing campaign, you understand that Google’s reach can give you that global coverage. Chris said we may ask ourselves why we should bother going multinational since Google has a global reach. But Chris believes there are other opportunities out there.

The market share and what the reality really is.....

comScore does a monthly report on worldwide search share. Google does have the dominant share - about 63% - that’s just under 2/3 of all the market share worldwide. Despite all the media coverage of the Yahoo/Microsoft merger, the reality is that worldwide, Yahoo is number one in terms of the number of people who visit them. In terms of search, they are number 2. If you aren’t using Yahoo, you could be missing out on certain benefits - particularly worldwide.

The number 3 player is Baidu.com - they are the dominant player in China. Over time, as the Internet expands in China, Chris believes those numbers for Baidu will go into the double digits very quickly.

There is a tie for number 4 - Microsoft sites and NHN Corp. Finally, the other 90+ search engines have a 13% market share. If you actually start drilling down you’ll find some of these players are dominant in certain regions.

Key Players (source comScore)

  • Google totally dominates North America and most of Europe.
  • Yahoo is dominant in Asia, except for:
    • China (Baidu 61%, Google 20%)
    • South Korea (Naver 74%, Google 4%)
    • Russia (RIndex 57%, Google 23%)
Why Go Global

Almost 1/6 of the population is out there searching and doing about 61 billion searches per month. If you drill down, 75% of all searches are outside North America. To Chris, that is the most compelling reason you want to consider doing a multinational search marketing campaign.

To Translate or Not

Chris posed the big question…‘do we have to go through the details of translating?’

  • Depends on your goals. You can have a strictly English campaign and it will work best for global brands and products with the same name in all cultures.
  • An alternative is campaigns that mix and match English and the target market's language (keywords, ads, creative, landing pages, etc.). It takes a bit of experimenting to discover what works best with this mixed approach.
Key: Search Behavior

  • Varies from country to country.
  • Chris discussed how people use different search terms, have different eye tracking and result scanning patterns, and different click-through styles. Studies have proven the differences between North American searchers and others - like those from China.
  • This means you will likely need to do translation, search optimization and cultural optimization. You need to make your content appealing to the search engines and to searchers, through the various needs they express in their search behaviour.
PPC or SEO or Both

  • Chris counseled us that if we are going multinational, we will have to commit a reasonable amount of resources to ensure the campaign is successful. Even before we begin, we should consider whether a paid search or natural search campaign is more likely to be effective in a given market.
  • Sherman feels that in larger markets - you are probably o.k. to do either - or both if you have the resources.
  • However, for smaller markets, you should consider each individually. (In smaller countries where you could be targeting millions or hundreds of thousands of people, you need to carefully consider who the market leader is in those countries.)
Targeting Your Markets
  • Use your marketing department to carefully weigh the probability of ranking with both country size and internet reach. Remember that Internet reach varies greatly (example, China is under 20% while other countries have much greater Internet reach - but with a smaller population).
  • Do you have sales and logistics resources in a market? May not need them and may be able to successfully do it all online, but if you can’t you need to have the resources in place.
  • Are we able to handle shipping, different currencies, duties and taxes?
  • What about support? Do our support people speak the language? People may want to call or send an email. Do we have the people in place to handle this?

All of these things will impact whether the search marketing campaign can be successful.

Process

  • First thing is to find good translators
    • Must know local idioms (need to understand the local dialect, terms etc).
    • Must be able to translate unique or technical terms for your product, service or brand
  • Translate text, images and navigation. A lot of people overlook translating images and navigation - you need to make the user experience rock solid.
  • Crucial - Make sure SEO is involved from the beginning. You don’t want to do SEO as an afterthought - get them involved right from the beginning so that you’ll end up with a more effective website, and the SEO specialists can help you avoid potentially costly mistakes.
Optimized Translated Content
  • Chris warned us that English content that’s optimized does not automatically become search friendly when translated. Translation is an art and may alter the content in such a way that it doesn’t rank well at all in your target market.
  • Similarly, you can’t simply translate English PPC ads and landing pages.
  • Tip: Translate your keyword list first, before any other content. Get a good sense of those critical keywords that you are hoping to capture with the searchers.
The Long Tail Varies
  • The long tail means targeting less common keywords or phrases - that long tail exists in all other languages but it is not the same in all other languages.
  • Romance language searches tend to use fewer, more common words.
  • English & Dutch/German searchers tend to use more terms and less common terms. Right now, long tail will probably be more effective in those languages.
  • Paradox? In the UK - nine keywords account for 5% of all searches.
What About Duplicate Content
  • Chris said he is often questioned about duplicate content in multinational campaigns. If you have the same-language content on multiple servers in different countries, you may be subject to duplicate content penalties. Be aware, but don’t necessarily be alarmed - the search engines may only be trying to find the main source of your content, regardless of where you are located in the world. However, if you find that search engines are getting confused, you may need to do some work changing content, or putting it into other formats, etc. It really boils down to your individual situation.
  • Content translated into different languages and hosted in different countries is not duplicate content to search engines (at least today…) May change but today isn’t a concern.
The Right Domain

What should the right domain be - a dot.com or a country specific domain?

  • It boils down to your intent. If possible, go for both - especially for companies trying to target regions with regional pride, you will want to go for a country specific domain.
  • For example, IBM has one global website, with subdomains for individual countries.
  • Sony, by contrast has local domains in all countries.
  • Be careful of those ‘choose your country’ top level pages! If you have a dot.com and are going to redirect to another country, don’t make those pages search engine hostile! You want the search engine to be able to find the country specific domains. Make sure the navigation on your homepage - no matter what you do with it - does not block the crawlers from finding the content on all the different websites you create.
IP Address Considerations

IP Address gives the physical location of where the server is based.

  • Chris said he has found that when search engines are ranking content, they will look at the IP address to decide whether it should give more weight in country specific results to sites with local IP addresses.
  • Challenge: legal or residency requirements in some countries. In some cases, you may need to prove you have some legal presence in that country before you can get a local IP address.
  • Google’s webmaster tools allow you to specify country, and Microsoft says this capability is coming.
Crucial - Localized Links
  • Localized links are crucial when you are going into a multinational campaign. It is not enough to translate and set up a site in a different country.
  • As with any site, it needs links pointing to it to rank well in search engines.
  • And most of these links need to come from local authority sites, not from the mother ship or from out-of-country sites.
Multi Country PPC
  • Can be the most cost-effective way to have multinational reach.
  • Geotargeting can be very precise.
  • Translate both ads and landing pages.
  • Use PPC as a research tool to help identify most effective keywords in a specific language/country.
Global Brands in Multiple Countries

  • For global brands in multiple countries, Chris suggested we trust the offline brand experts here.
  • Some cultures like and accept global brands (ex. China) - others prefer homegrown, localized brands.
  • Especially important - emphasizing brand attributes in a culturally appropriate way. You have to be sensitive to that and don’t neglect images!
Consider Smaller Markets

  • The PDF report ‘Global Search Report 2007’ by einternet is available online and has a wealth of information.
  • Ex. China is on track to become number one in terms of searches on the Internet, but Internet penetration there is still low. Conversely, Denmark has 70% penetration, and there is another search engine beyond Google that is popular. The report is full of good information that can help companies choose smaller markets.
Conclusion
  • Chris concluded by stating that multinational search marketing offers a very appealing way to reach more customers, but it is not for everybody.
  • Success requires deep, localized knowledge of markets. If you don’t have that knowledge, you need to reach out to a partner that has that knowledge.
  • Campaigns must be optimized and tailored for both language and culture. It is not enough just to take optimized content, translate it, and expect it to work in another country. You have to tailor that information for the language and culture you are trying to target.

After this very informative presentation, there were a few minutes for questions.

How would we find out about legal residency requirements? If you are trying to establish a site in a specific country, the hosts in that country will spell out what is required. For example, in Australia you need an Australian Business Number. Go directly to a particular host based in that country and find out the requirements necessary to actually register a site there.

How do you find translation services in a specific country? You can go on the web and find translation services; the key is to get a translation service that can also work with an optimizer. The best thing to do is seek out a local SEM firm or a global SEM firm and see what they can do, because they will probably have the contacts to do this type of work. Do not rely on automated translation systems - they are rough at best and will backfire in an overall search marketing campaign.

Directories? They are emerging as a good resource and they are gradually becoming better at accommodating advertising needs. Directories are very good if you have the time and resources to find the good ones. They can give you very good reach for not a lot of cost/effort.


Source: http://www.10e20.com/blog/2008/02/13/multi-national-search-marketing-effective-strategies-for-global-marketers/

24 Metasearch Engines for Centralized & Efficient Searching

Metasearch engines have proliferated across the web over the years and continue to grow in number, even in the current Google-dominated search world.

Companies that create their own metasearch engines must have given up on trying to compete with Google and the other top search engines; instead, they take advantage of the APIs and technologies available from the big three or four search providers, integrating that data into one interface with a unique spin on ranking or listing formats.

Here at the Journal, we’ve been featuring some of these metasearch engines, especially the new ones. But we won’t be able to cover them all (although we try).

So the best we could do is come up with a list of these metasearch engines, made up of metasearch startups and others that have been around for quite some time now.

It is not, however, a comprehensive list, as we know there are still more out there. If you know of other metasearch engines, please feel free to add them in the comments. We will continuously update the list as we receive more comments.

The Wikipedia entry defines a metasearch engine as a search engine that sends user requests to several other search engines and/or databases and aggregates the results into a single list or displays them according to their source.
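
To make that definition concrete, here is a minimal sketch of the fan-out-and-merge step in Python. It assumes each engine exposes some callable that takes a query and returns an ordered list of result URLs; the stand-in engines below are purely illustrative, and real engines' APIs, rate limits, and terms of service are not modeled.

```python
# Minimal metasearch sketch: fan the query out to several engines,
# then interleave the ordered results and drop duplicate URLs.
# The "engines" below are stand-ins, not real search APIs.

def fake_engine(fixed_results):
    """Return a stand-in engine that ignores the query and returns fixed results."""
    return lambda query: list(fixed_results)

engines = {
    "engine_a": fake_engine(["http://example.com/a", "http://example.com/b"]),
    "engine_b": fake_engine(["http://example.com/b", "http://example.com/c"]),
}

def metasearch(query, engines):
    """Send the query to every engine, then merge results into one de-duplicated list."""
    per_engine = {name: engine(query) for name, engine in engines.items()}
    merged, seen = [], set()
    longest = max(len(results) for results in per_engine.values())
    # Walk the result lists position by position so every engine's top hits
    # appear before anyone's lower-ranked pages.
    for position in range(longest):
        for results in per_engine.values():
            if position < len(results) and results[position] not in seen:
                seen.add(results[position])
                merged.append(results[position])
    return merged

print(metasearch("metasearch engines", engines))
# ['http://example.com/a', 'http://example.com/b', 'http://example.com/c']
```

A production metasearch engine would issue these requests in parallel and handle timeouts, which is the "fast parallel technology" several of the engines below advertise.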

  1. MetaCrawler - Uses innovative metasearch technology to search the Internet’s top search engines, including Google, Yahoo! Search, MSN Search, Ask Jeeves, About, MIVA, LookSmart and more. Search refinement available.
  2. Dogpile - Puts the power of all the leading search engines together in one search box to deliver the best combined results. Toolbar download available.
  3. Mamma - “The mother of all search engines.”
  4. IxQuick - “The world’s most powerful metasearch engine”, includes universal power search, global search and search refinement. Toolbar download available.
  5. kartOO - A visual metasearch engine that, instead of showing traditional search results pages, displays a graphical map of search results. Honestly, I don’t get the point.
  6. Ithaki - Searches various search engines and ranks results based on an internal ranking system.
  7. Seekz - A parallel web search engine that queries many of the Internet’s top search engines and displays results in groups with the most relevant website appearing first. Removes duplicate results as well.
  8. iBoogie - Puts documents with similar content or with related topics into the same group. Each group is assigned a label based on the content of the documents.
  9. Zuula - Provides unaltered search results from various search engines, giving users the chance to check them before going to results from individual search engines. It even remembers which search engine you have been to so you can easily switch back and forth. Sponsored links are separated.
  10. inCrawler - A metasearch engine and a web directory in one. Works like your traditional search portal where categories are displayed upfront just below the search toolbox.
  11. WindSeek - Pulls results from many of the major search engines simultaneously with lightning speed and accuracy.
  12. Seek2Day - Gathers results from 17 different major search engines and in a very quick way decides which information is most relevant to the user. The results are then presented in a comprehensive format.
  13. ez2Find - A global metasearch engine that searches AlltheWeb, Teoma, Google, Yahoo!, AltaVista, Wisenut, ODP, and MSN, parses the results, removes duplicates, and includes links to relevant directory categories (directory results from the Open Directory) and to clustered results. It calculates result relevance with an algorithm that counts the number of times a link was found across the search engines and its position in each (see the sketch after this list).
  14. Vroosh - VROOSH.com is a metasearch engine that also acts as a metacrawler, utilizing fast parallel technology for speed and accuracy. Metasearch using keywords in any language, use Advanced Search to search in a specific country, or metacrawl for MP3 and FTP sites.
  15. qkSearch - Calls itself a 3-in-1 metasearch engine by providing clustering search, split search, and blended search (not functional yet).
  16. TurboScout - Searches 21 search engines without retyping your query. To limit your search to one search engine, all you have to do is enter your keyword and click on the name of the specific engine.
  17. FinQoo - The next-generation metasearch engine that lets you search, share, and summarize. It doesn’t have an About page yet, though, so I’m wondering who’s responsible for it. The Philippine flag right above the search box suggests it was probably made by some Filipino tech guys.
  18. Polymeta - An intelligent and advanced metasearch and clustering engine that enables organizations and individuals to simultaneously search diverse information resources on the web with a common interface. Search results are merged, ranked, and presented in relevance order. Uses natural language processing and information retrieval algorithms in its query analysis and search result refinement.
  19. Unabot - An all-in-one metasearch site that allows the user to choose from hundreds of different search engines, directories, and indices to query.
  20. vPinPoint - A parallel search engine that searches Google, Yahoo, MSN, and Ask.
  21. Draze - Crawls various search engines and lets users compare search results.
  22. SearchSalad - Search Salad brings together results from the top search engines and also from the major review sites all in one place, giving you the ability to see the top results from the major sites at once. This, combined with Search Salad’s customised search environment, means that you see the best of what the web can offer in terms of search.
  23. Clusty - Clusty queries several top search engines, combines the results, and generates an ordered list based on comparative ranking. This “metasearch” approach helps raise the best results to the top and push search engine spam to the bottom.
  24. AllPlus - Searches four major search engines, namely Google, Yahoo, Live and Ask.
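
Several of the engines above describe roughly the same ranking idea: a result counts for more if more engines return it and if it ranks high in each (see ez2Find in item 13, and the de-duplication Seekz mentions in item 7). None of these vendors publish their exact formulas, so the following Python snippet is only an assumed illustration of that scoring step.

```python
from collections import defaultdict

def fuse_rankings(result_lists):
    """Merge per-engine result lists by scoring each URL on presence and position.

    result_lists is a list of ordered URL lists, one per engine. A URL earns one
    point for each engine that returns it, plus a bonus that shrinks with its rank.
    """
    scores = defaultdict(float)
    for results in result_lists:
        for rank, url in enumerate(results, start=1):
            scores[url] += 1.0 + 1.0 / rank  # presence credit + position bonus
    # Highest combined score first; duplicates collapse into a single entry.
    return sorted(scores, key=scores.get, reverse=True)

# Toy example: three engines with overlapping results (not real data).
engine_results = [
    ["http://example.com/a", "http://example.com/b"],
    ["http://example.com/b", "http://example.com/c"],
    ["http://example.com/b", "http://example.com/a"],
]
print(fuse_rankings(engine_results))
# ['http://example.com/b', 'http://example.com/a', 'http://example.com/c']
```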

Source: http://www.searchenginejournal.com/24-metasearch-engines-for-centralized-efficient-searching/6375/

Paid and organic search trends

One of the most powerful Hitwise tools is our paid and organic search data. We’ve been analyzing this and have come up with some interesting trends for the travel and retail industries. Before I get into the analysis, a quick note on the methodology: this paid search data represents a weighted average of the proportion of upstream traffic from paid search to twenty leading sites in our Travel and Shopping & Classifieds categories – i.e., it illustrates the percentage of each industry’s traffic that comes from paid search. The chart below illustrates the proportion of upstream traffic that these two industries receive from paid search over the 15 months to last December.
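
As a rough illustration of that methodology: Hitwise does not spell out its weighting scheme here, so the sketch below simply assumes each site's paid-search share is weighted by its total visits, and the figures are made up.

```python
def paid_search_share(sites):
    """Visit-weighted share of traffic arriving from paid search, as a percentage.

    sites is a list of (total_visits, paid_search_visits) tuples, one per site.
    Weighting each site's share by its visits reduces to total paid / total visits.
    """
    total_visits = sum(visits for visits, _ in sites)
    total_paid = sum(paid for _, paid in sites)
    return 100.0 * total_paid / total_visits

# Hypothetical monthly figures for three sites in one category (not Hitwise data).
print(round(paid_search_share([(1000, 290), (500, 160), (250, 70)]), 1))  # 29.7
```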


So what does the data say? The first thing that jumps out is that the travel industry is more reliant on paid search than the retail industry. The retail industry actually receives more traffic overall from paid search, but this is simply because it receives more visits full stop. Proportionally, travel websites receive on average 50% more traffic from paid search than Shopping and Classifieds sites.

The second interesting conclusion relates to peaks and troughs in paid search traffic, which I’ve summarized in the table below. As you can see from the chart, the level of paid search activity in both industries is anything but consistent during the year. Naturally, traffic to sites in our Shopping and Classifieds category peaks during November and December, and this is also the period when search traffic in general to the category peaks. However, paid search activity peaks slightly earlier – the two biggest months are November and December – implying that retailers rely more on paid search during the pre-Christmas browsing / research period than during the peak purchasing weeks in December. On the other hand, paid search activity is at its lowest in the summer months, which coincides with the quietest period for the retail sector.


The travel industry experiences two peaks in traffic. Its busiest period is after Christmas, when people flock online to book their summer holidays, while there is a second peak during the summer months as people visit sites for information, check-in facilities and last-minute travel. What the paid and organic search data reveals for this sector is very interesting. Search drives the most traffic overall during the summer months, but this peak is primarily driven by organic (or natural) search. The peak for paid search happens during the autumn as people return to work after the summer, and their thoughts again turn to the prospect of warmer climates.

So, although the travel and retail industries experience peaks and troughs in paid search at different times, there is actually a common theme. Paid search activity peaks during the key ‘research’ period in the buying cycle – i.e. in the months before the surge in visits and purchases. This would imply that paid search is more effective if used earlier in the purchasing cycle.

The other question we wanted to answer in this analysis was: is paid search becoming more or less important? As the chart below illustrates, the answer depends on which industry you’re talking about. The travel sector received significantly more traffic from paid search in the last three months of 2007 than in 2006. This is driven by two factors: a growth in the amount of traffic that the category receives from search overall, and the fact that an increasing proportion of this search traffic (40% in December 2007) comes from paid search.


As you can see from the chart, the opposite is true for retailers. The amount of traffic that sites in our Shopping and Classifieds category receive from paid search decreased during the final quarter of last year. The interesting thing about this trend is that there was actually an increase in traffic from search to the sector during December, implying that retailers are switching from paid to organic search strategies. 29% of search traffic to our selection of 20 top retailers was paid in December 2007, down from 31% in 2006.


Source: http://weblogs.hitwise.com/robin-goad/2008/02/paid_and_organic_search_trends.html


The Inconvenient Truth About Social Media Marketing

Social media is hot. Everyone wants to be on Digg's home page. Link baiting, especially using things like numbered lists, imperative rules, or controversial hooks, is the SEM strategy du jour. There's just one -- major -- problem with spending so much time and effort on capturing the eyeballs of social media users.

Social media traffic does not monetize.

Social media is easy to hype because there is a lot of traffic on social media sites. But if you try to do anything with social media traffic to convert it to revenue, you will be hard-pressed -- unless you are selling CPM-based advertising.

Google, in its 2007 Q4 results press conference, complained that it was having trouble monetizing social media websites. When I set up an AdWords campaign today, Google recommended I try the MySpace network. They are desperate to get anyone they can advertising on that network because the traffic has so little implied intent and so little value, and Google is already locked into an expensive partnership.

Man eats world's largest turkey in single sitting

Who would read the above line and be inspired to buy?

Trying to appeal to larger web communities using shock and awe, at the expense of creating content that your community and subscribers find relevant, costs you trust and attention with each swing and a miss. When you finally hit a homerun and get exposure on social media sites, most of those people leave within 30 seconds. Few link to your site, few buy, and few subscribe.

Leading publishers even worry about too much social media traffic deflating their CPM ad rates. And StumbleUpon, a leading social media site, sells traffic for 5 cents a visitor. So is it really worth the effort to target social media?

Why social media usually fails

Within your own field you likely know what people care about, why they care about it, and key emotional touchpoints that you can appeal to. But if you can't get exposure in your own market it is going to be even harder to appeal to larger and less related markets.

When social media actually works

The most effective way to target social media is to find something on a network that already relates to what you are doing and co-brand it, rather than trying to create a new hit from scratch.

Community gold

Get featured or referenced on a site like Search Engine Land and thousands of people interested in SEO and SEM are going to see that reference. If you create useful content and get covered by leading editorial channels in your field, what are the odds that some of the people who passionately subscribe to content directly related to your topic will also read, subscribe to, link to, trust, and/or buy from your site? Many will.

The relevancy is so great that you do not need a large stream of traffic to create a lot of value. Why does search work? Great relevancy. Why does social media work? It usually doesn't, at least when you factor in opportunity cost.


Source: http://searchengineland.com/080214-080046.php